On Sat, Jul 15, 2023 at 05:40:11PM +0000, 'spudboy...@aol.com' via Everything List wrote:
> Even the scientists who build AI can’t tell you how it works (msn.com)
>
> Interview with NYU professor:
>
> ''Sam Bowman
>
> So there’s two connected big concerning unknowns. The first is that
> we don’t really know what they’re doing in any deep sense. If we
> open up ChatGPT or a system like it and look inside, you just see
> millions of numbers flipping around a few hundred times a second,
> and we just have no idea what any of it means. With only the tiniest
> of exceptions, we can’t look inside these things and say, “Oh,
> here’s what concepts it’s using, here’s what kind of rules of
> reasoning it’s using. Here’s what it does and doesn’t know in any
> deep way.” We just don’t understand what’s going on here. We built
> it, we trained it, but we don’t know what it’s doing.'
Yeah. But I think this was obvious from the very beginning, at least to anyone paying attention. A lot of people, and this unfortunately includes many decision makers (I hope I am not being too optimistic), do not want to busy themselves with details.

What is going on with Chad Gepettos and their ilk is a bit like building a car by throwing parts into a box. In a computer, one can do this really fast. Oh, something has built up. Let's call it a schoolbus. This is being done by software firms who have already mastered the art of licence writing - they cannot be held liable for software errors (unless something has changed in the last few years), and you agreed to this. Yes, you did. Now go read the licence.

Sooo, if somebody's life gets screwed up... Mr Jeffery Battle (veteran, businessman and professor) is now suing Microsoft because its Bing conflated him with a person of a similar name, who apparently is a convicted wannabe Taliban.

[ https://reason.com/volokh/2023/07/13/new-lawsuit-against-bing-based-on-allegedly-ai-hallucinated-libelous-statements/ ]

We will see how it unfolds.

> Noam Hassenfeld
>
> Very big unknown.''
>
> If accurate, are we now looking at Pantheism? Should we?

Such questions are loaded with a suggestion, intentional or not. Only a cretin makes an atomic mushroom and then prays to it. Oh, wait a minute, who is the dominant species here...

I suspect "we" were a lot smarter in the past. The engineers who built steam engines did not pray to them. The coal diggers did not pray to their shovels. Tailors did not pray to their sewing machines. Nobody (mentally capable) treated the devices as magical or impossible to understand. (Or so I think.)

Machines are machines, like any others. If they are doing what they are supposed to do and I can repair them when they break, then I have no problem. The problem starts when they do not break and yet do not do what they are expected to.

Anyway, I like this quote, it is absolutely thrilling:

"And it also plays into some of the concerns about these systems.
That sometimes the skill that emerges in one of these models will be something you really don’t want. The paper describing GPT-4 talks about how when they first trained it, it could do a decent job of walking a layperson through building a biological weapons lab. And they definitely did not want to deploy that as a product. They built it by accident. And then they had to spend months and months figuring out how to clean it up, how to nudge the neural network around so that it would not actually do that when they deployed it in the real world."

The humans knew what they did not want to release, and with huge effort they were able to rub it out. But what about the things they did not know they did not want - the things they could not have known about at that moment, because nobody knew about them yet? Ask Ding, or Brad, eh. Chad Gepetto has those things buried inside it, waiting for the right question to autocomplete.

-- 
Regards,
Tomasz Rola

--
** A C programmer asked whether computer had Buddha's nature.  **
** As the answer, master did "rm -rif" on the programmer's home**
** directory. And then the C programmer became enlightened...  **
**                                                             **
** Tomasz Rola          mailto:tomasz_r...@bigfoot.com         **

To view this discussion on the web visit https://groups.google.com/d/msgid/everything-list/ZLc3CaKfD1XAsh6Y%40tau1.ceti.pl.