Hello!

On Wed, 2025-02-05 at 17:41 -0500, Tim Daly wrote:
> I built a machine with a GPU board, downloaded and ran 71 LLMs to 
> rank them on their ability to "Prove the greatest common divisor
> theorem".
> The game was to find the LLMs that were best suited for the proof
> task.

GPUs with a lot of RAM are expensive, and a lot of RAM is needed for
big AI models. I run AI on the CPU with 128 GB of RAM attached to the
computer. This DDR4 RAM cost me only about 500 EUR :)

For DeepSeek, it feels like a minimum of 256 GB of RAM is appropriate.

I use llama.cpp as the inference engine and download AI models from
HuggingFace in GGUF format. Such a big model on CPU is too slow for
real-time chat, so I use it in batch mode: I give it many tasks and
every day it solves a few of them.
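
Roughly, the batch loop looks like the following minimal Python
sketch. The "llama-cli" binary name, the model path and the directory
names are assumptions here; adjust them for your own llama.cpp build
and model files.

    #!/usr/bin/env python3
    # Batch-mode driver: feed queued prompt files to llama.cpp one by
    # one and save the answers. Binary name, model path and directory
    # layout are assumptions, not part of llama.cpp itself.
    import pathlib
    import subprocess

    MODEL = "models/model.gguf"        # assumed GGUF model from HuggingFace
    TASKS = pathlib.Path("tasks")      # one plain-text prompt per file
    ANSWERS = pathlib.Path("answers")
    ANSWERS.mkdir(exist_ok=True)

    for task in sorted(TASKS.glob("*.txt")):
        out_file = ANSWERS / task.name
        if out_file.exists():          # skip tasks already solved earlier
            continue
        # -m: model file, -f: prompt file, -n: max tokens to generate
        result = subprocess.run(
            ["llama-cli", "-m", MODEL, "-f", str(task), "-n", "2048"],
            capture_output=True, text=True)
        out_file.write_text(result.stdout)

Run from cron overnight, it slowly works through the task queue even
though each single answer takes a long time on CPU.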

Best regards
-- 
Svjatoslav Agejenko
WWW: http://svjatoslav.eu
