Meta (a.k.a. Facebook) released Llama 3 just a few days ago, and it's amazing for three reasons: 1) It's tiny: it has only 70 billion parameters, while GPT-4 reportedly has about 1.8 trillion. 2) Despite its small size, its performance on AI benchmarks is just a smidgen below that of GPT-4. 3) It is open source.
Meta says its performance would have been even better if they had trained it for longer, but they stopped early because the company's computational resources, while large, are not infinite. They decided that compute time would be better spent training a 400-billion-parameter version of Llama 3, which they say they'll release sometime in the next couple of months, and on developing Llama 4.

And anybody who still thinks the Singularity is not near really needs to watch the following video. I'll tell you one thing: it sure makes the issues that most Americans believe are the most important, and which will probably decide the November election (excessive wokeness, the "invasion" from Mexico, and transsexual bathrooms), seem pretty damn trivial.

*LLAMA 3 *BREAKS* the Industry | Government Safety Limits Approaching | Will Groq kill NVIDIA?*
<https://www.youtube.com/watch?v=YuQFpjh2beE>

John K Clark

See what's on my new list at Extropolis
<https://groups.google.com/g/extropolis>