On Sun, Feb 25, 2024 at 3:15 PM Just Kant via cctalk
<cctalk@classiccmp.org> wrote:

> So the portions of code belonging to chatgpt which produce the hallucinations 
> have been isolated?

It's a massive deep neural network, so you can't really isolate
anything. But there are parameters you can tune at inference time, such
as how much of the earlier conversation it keeps in context, and some
people speculate that a change to something like that caused the recent
issues.
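
To make that concrete, here's a toy sketch (not OpenAI's code, and the
numbers are made up) of the kind of knob meant by "forgetting": the chat
history gets truncated to a fixed context budget before each reply, so
older messages simply drop out.

def truncate_history(messages, max_words=2048):
    """Keep only the most recent messages that fit in the budget."""
    kept, used = [], 0
    for msg in reversed(messages):            # newest to oldest
        words = len(msg["content"].split())
        if used + words > max_words:
            break                             # older messages are "forgotten"
        kept.append(msg)
        used += words
    return list(reversed(kept))               # back to chronological order

chat = [
    {"role": "user", "content": "Tell me about PDP-11s."},
    {"role": "assistant", "content": "The PDP-11 was a 16-bit mini..."},
    {"role": "user", "content": "And the VAX?"},
]
print(truncate_history(chat, max_words=50))

Real systems count tokens rather than words, but the effect is the same.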

> Which languages were used to build it?

One could say Python, but the Python mostly sits on top of C++, which in
turn invokes CUDA (or TPU or similar) kernels; at the bottom it's all
just matrix multiplication.
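
As a rough sketch of that stack (using PyTorch as a stand-in, since I
don't know exactly what OpenAI runs internally): the Python line below is
a thin wrapper that dispatches to C++, which calls a CUDA kernel if a GPU
is present, and that kernel is doing a big matrix multiplication.

import torch

device = "cuda" if torch.cuda.is_available() else "cpu"
a = torch.randn(4096, 4096, device=device)
b = torch.randn(4096, 4096, device=device)
c = a @ b    # one matmul; a transformer forward pass is mostly these
print(c.shape, c.device)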

Prolog and other logic-programming tools aren't applicable to the
machine-learning approaches that are all anyone is interested in for "AI"
these days. If you want to build a rules-based expert system, and you
know what all the rules are, then Prolog might still be a useful tool.
Turbo Prolog is even still available: its original developers took it
back from Borland and turned it into Visual Prolog for Windows, which has
a free personal edition (the commercial license is only 100 euros too).
There's also GNU Prolog if you just want to futz around with Prolog.
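
To show what "rules-based" means here, a toy version of the classic
grandparent example, sketched in Python rather than Prolog (the facts are
made up): you write the facts and rules by hand and query them; nothing
is learned from data. In Prolog the rule is a single clause:
grandparent(X, Z) :- parent(X, Y), parent(Y, Z).

facts = {("parent", "alice", "bob"), ("parent", "bob", "carol")}

def parent(x, y):
    return ("parent", x, y) in facts

def grandparent(x, z):
    # grandparent(X, Z) :- parent(X, Y), parent(Y, Z).
    return any(parent(x, y) and parent(y, z) for (_, _, y) in facts)

print(grandparent("alice", "carol"))   # True
print(grandparent("bob", "alice"))     # False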
