
Does LLM Size Matter? How Many Billions of Parameters do you REALLY Need?

Gary Explains · 31,075 views · 2 months ago

Large Language Models (LLMs) are measured by the number of parameters they contain – the weights and biases within the neural network. More parameters mean a bigger, more complex model. Models that you can run on your PC range from roughly 1 billion to 70 billion parameters. Does size matter? And what about quantization – should you run models at full 32-bit precision, or could 4-bit or 8-bit quantization suffice? To find out, I put LLMs of various sizes and quantization levels to the test with some tough questions. Let's see which model emerges victorious!
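To get a feel for why quantization matters, here is a back-of-the-envelope sketch (not from the video) estimating the memory needed just to store a model's weights. It assumes memory ≈ parameters × bits per weight ÷ 8, ignoring activations, the KV cache, and quantization overhead:

```python
# Rough memory-footprint estimate for LLM weights at different
# quantization levels. Assumption: memory = params * bits / 8,
# ignoring activations, KV cache, and per-block quantization overhead.

def weight_memory_gb(params_billions: float, bits: int) -> float:
    """Approximate weight storage in gigabytes (10^9 bytes)."""
    return params_billions * 1e9 * bits / 8 / 1e9

for params in (1, 7, 70):        # model sizes in the range discussed
    for bits in (32, 8, 4):      # full precision vs 8-bit vs 4-bit
        gb = weight_memory_gb(params, bits)
        print(f"{params:>3}B params @ {bits:>2}-bit = ~{gb:.1f} GB")
```

By this estimate, a 70B model needs around 280 GB at 32-bit but only about 35 GB at 4-bit, which is why quantization is what makes larger models feasible on consumer hardware at all.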

---
Twitter: https://twitter.com/garyexplains
Instagram: https://www.instagram.com/garyexplains/

#garyexplains

Comment