Hi, this is Rinat Abdullin, writing to you in a newsletter on ML, engineering, and product development. So, have you heard the news? Meta has trained a new GPT-3-class model called LLaMa, using publicly available data. They published the paper and invited researchers to apply for access to the model. Eventually, somebody just downloaded the model and shared it via torrent. Setting the question of copyright aside (are model weights even copyrightable?), the model is open source now, in a way. It is all over the internet.