On Wednesday, 4 September 2024 at 17:02:55 UTC, Vladimir Marchevsky wrote:
On Wednesday, 4 September 2024 at 12:24:07 UTC, FeepingCreature wrote:
Anyone who says large language models aren't *really* intelligent now has to argue that programming doesn't require intelligence.

In case that really needs arguing, I would say translation is not programming.

The other day I was watching a video of a programmer (Jason Turner, from the C++ world) writing a raycaster[1]. It seems he didn't know the math behind the intersection of line segments, so he asked ChatGPT to come up with an answer based on the attributes/properties he had already written, and the AI generated a version from that.

There was a problem with one sign in an expression and the raycasting was a mess, but the programmer couldn't fix it because he was just copying and pasting and, by his own admission, didn't know the math. He was only able to fix it when someone in chat pointed out the "sign problem".
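For illustration, here is a minimal standalone sketch of segment-segment intersection in Python (my own version, not the code from the video). The parameters t and u come from a pair of cross-product expressions, and flipping a single sign in any of them silently produces wrong intersections, which is exactly the kind of bug that is hard to spot if you don't know the underlying math:

```python
def segment_intersection(p1, p2, p3, p4):
    """Intersection point of segments p1-p2 and p3-p4, or None.

    Uses the standard parametric form: each t and u is a ratio of
    2D cross products. Note the minus signs: flip any one of them
    and the function still runs, but returns garbage.
    """
    x1, y1 = p1; x2, y2 = p2; x3, y3 = p3; x4, y4 = p4
    denom = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if denom == 0:
        return None  # segments are parallel or collinear
    t = ((x1 - x3) * (y3 - y4) - (y1 - y3) * (x3 - x4)) / denom
    u = ((x1 - x3) * (y1 - y2) - (y1 - y3) * (x1 - x2)) / denom
    if 0 <= t <= 1 and 0 <= u <= 1:
        return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))
    return None  # lines cross, but outside the segments

# Two diagonals of the unit-ish square cross in the middle:
print(segment_intersection((0, 0), (2, 2), (0, 2), (2, 0)))  # (1.0, 1.0)
```

The point is that the math is short but sign-sensitive; a reviewer who knows the derivation spots a flipped sign instantly, while someone pasting generated code can only debug it by observing broken output.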

I found the state of all that sad. I mean, will people not use their brains anymore? On the other hand, there is clearly something going on, since the AI was able to generate an algorithm from specs given beforehand in this specific language (Python), and I have seen other videos with other languages too.

What I mean by all this is that we are at the beginning of this trend and I can't imagine the outcome. I don't know yet, for example, whether the case in this topic is a good or a bad thing, but I keep wondering what the new programmers of the future will face.

Finally, I didn't want to derail the topic, but the subject was already raised by the original poster.

[1] - (https://yewtu.be/watch?v=0lSqedQau6w) you can replace yewtu.be with the Google one with ads if you wish.

Matheus.
