Meta unveils a new large language model that can run on a single GPU [Updated]
arstechnica.com
The LLaMA collection of language models ranges from 7 billion to 65 billion parameters in size. By comparison, OpenAI's GPT-3 model—the foundational model behind ChatGPT—has 175 billion parameters.
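
The headline's single-GPU claim follows from back-of-envelope arithmetic on these parameter counts. The sketch below is a rough, weights-only estimate and not anything from the article or Meta's own figures; the helper function is hypothetical, and real inference needs extra memory for activations and the KV cache on top of the weights.

```python
# Rough sketch: estimate GPU memory needed just to hold model weights.
# Assumes 2 bytes per parameter at fp16/bf16 and 1 byte at int8;
# these are illustrative assumptions, not figures from the article.

def weight_memory_gb(num_params: float, bytes_per_param: int = 2) -> float:
    """Approximate weight storage in GB at a given numeric precision."""
    return num_params * bytes_per_param / 1e9

for name, params in [("LLaMA-7B", 7e9), ("LLaMA-65B", 65e9), ("GPT-3", 175e9)]:
    fp16 = weight_memory_gb(params, 2)
    int8 = weight_memory_gb(params, 1)
    print(f"{name}: ~{fp16:.0f} GB at fp16, ~{int8:.0f} GB at int8")

# LLaMA-7B comes to roughly 14 GB at fp16, which fits on a single
# high-memory GPU, while GPT-3's 175B parameters would need ~350 GB,
# i.e. a multi-GPU cluster.
```

Under these assumptions, only the smallest LLaMA variants fit on one GPU at 16-bit precision; the 65B model (~130 GB at fp16) would still require quantization or multiple GPUs.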
