A human-designed program is a very small part of what goes into making an LLM-type AI like ChatGPT. Most of it is the data it is trained on, which is a combination of raw web content and sample questions and answers, including corrected answers to questions it gets wrong, all of which is encoded into the neural network's parameters.
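A toy illustration of what "encoding into parameters" means: training repeatedly nudges numeric weights so the network's outputs match the example answers. Real models do this with billions of parameters; this sketch uses just one.

```python
# One-parameter "network": predict y = w * x. Training adjusts w so the
# stored number ends up encoding the pattern in the data (here, y = 3 * x).
data = [(1, 3), (2, 6), (3, 9)]   # example inputs and correct answers
w = 0.0                            # the single learnable parameter
lr = 0.05                          # learning rate (step size)

for _ in range(200):               # repeated passes over the training data
    for x, y in data:
        error = w * x - y          # how wrong the current prediction is
        w -= lr * error * x        # nudge the parameter to reduce the error

print(round(w, 2))                 # w has learned to be close to 3.0
```

After training, the "knowledge" that y is three times x lives entirely in the stored value of w, not in the program's logic; scaled up enormously, that is where an LLM's knowledge lives too.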

Also, both ChatGPT and Gemini, and probably others, do search the web, but they then feed the results into the main model as input, rather than repeating them word for word.
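A rough sketch of that flow (the function names are hypothetical placeholders, not any vendor's real API): the search results are pasted into the model's input, and the model then writes its own answer.

```python
# Hypothetical sketch: web search results become part of the model's
# input prompt; the model generates an answer in its own words.

def web_search(query):
    # Placeholder for a real search backend; returns text snippets.
    return ["Snippet one about the topic.", "Snippet two about the topic."]

def run_model(prompt):
    # Placeholder for the neural network itself.
    return "An answer in the model's own words, informed by the snippets."

def answer(question):
    snippets = web_search(question)
    prompt = "Use these search results to answer:\n"
    prompt += "\n".join(snippets)
    prompt += "\nQuestion: " + question
    return run_model(prompt)

print(answer("What is an LLM?"))
```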

What has made LLMs possible is hardware that can do extremely large numbers of simple calculations quickly, much more than any complexity in the program that runs those calculations. ChatGPT's own position on the amount of conventional code involved is:

"- The neural network: Makes up 99%+ of the intelligence and data volume — hundreds of gigabytes or more.

- The conventional code: Relatively small — a few million lines at most — but critical for running and interfacing with the model."
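That size ratio follows from simple arithmetic. A sketch, where the parameter count is an assumed illustrative figure, not a published number for any particular model:

```python
# Back-of-envelope: why the weights dwarf the conventional code.
params = 100e9          # assumed 100 billion parameters (illustrative)
bytes_per_param = 2     # 16-bit (half-precision) storage per parameter
weights_gb = params * bytes_per_param / 1e9
print(f"Weights: about {weights_gb:.0f} GB")

code_lines = 3e6        # "a few million lines" of conventional code
bytes_per_line = 40     # rough average line length in bytes
code_gb = code_lines * bytes_per_line / 1e9
print(f"Code: about {code_gb:.2f} GB")
```

On those assumptions the weights come to roughly 200 GB against about an eighth of a gigabyte of source code, a ratio in the thousands.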

Some of the conventional code is used for training rather than for answering queries, some only in relation to image generation, and some to support specific user interfaces. Some of it is also used to stop the model from answering inappropriate questions.

Conventional programming generally has predictable behaviour, but the neural network attempts to learn from the reading material it is given and can often get that wrong, especially if there isn't a lot of reliable material available.


--
David Woolley

On 28/09/2025 18:14, Karl W Hubbard wrote:
Stalin: What matters is who counts the votes.
AI: What matters is who writes the programs.

______________________________________________________________
Elecraft mailing list
Home: http://mailman.qth.net/mailman/listinfo/elecraft
Help: http://mailman.qth.net/mmfaq.htm
Post: mailto:[email protected]

This list hosted by: http://www.qsl.net
Please help support this email list: http://www.qsl.net/donate.html
Message delivered to [email protected] 
