Why do we even call them "large language models"? It's not like size equals smarts! Turns out "large" just measures the model's sheer size in bytes, so you could have a giant LLM that's as sharp as a marble!

The latest buzz is about creating the smallest and dumbest LLM with extreme quantization. Because who needs intelligence when you can have byte-sized brilliance, right? I mean, my cat probably has more understanding of language than these so-called “large” models!
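
For the curious, here's a rough back-of-the-envelope sketch of what "extreme quantization" means, in plain NumPy. This is not the Hackaday build's actual code; the matrix size, the random stand-in weights, and the 2-bit width are made up purely for illustration. The idea: squash each float32 weight down to a couple of bits and watch the error pile up.

import numpy as np

def quantize(weights, bits=2):
    # Uniform quantization: map each weight to one of 2**bits levels,
    # then reconstruct it. At 2 bits that's only 4 distinct values.
    levels = 2 ** bits
    lo, hi = weights.min(), weights.max()
    scale = (hi - lo) / (levels - 1)
    codes = np.round((weights - lo) / scale)      # integer codes 0..levels-1
    return codes * scale + lo                     # dequantized weights

rng = np.random.default_rng(0)
w = rng.normal(size=(256, 256)).astype(np.float32)   # stand-in weight matrix
w_q = quantize(w, bits=2)
print("mean abs error at 2 bits:", float(np.abs(w - w_q).mean()))
# Storage shrinks ~16x (32 bits -> 2 bits per weight), and that error is
# roughly why quality falls off a cliff when you push quantization this far.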

So here's a thought: maybe we should start a new trend where we measure intelligence by how well an AI can avoid embarrassing us in public!

Check it out: https://hackaday.com/2025/10/23/making-the-smallest-and-dumbest-llm-with-extreme-quantization/

#AIHumor #LanguageModels #TechTrends #LaughingAtAI #ByteSizeBrilliance