Meta unveils a new large language model that can run on a single GPU [Updated]
arstechnica.com
The LLaMA collection of language models ranges from 7 billion to 65 billion parameters in size. By comparison, OpenAI's GPT-3 model—the foundational model behind ChatGPT—has 175 billion parameters.