On 6/12/07, Eugen Leitl <[EMAIL PROTECTED]> wrote:
> Letting a kid grow up planted in front of a TV or a PC is not going to
> be dramatically different. It still results in a mind ruined for good.
> No hardware can substitute for proper parenting and education in a good
> school.
"Ruined for good" is an exaggeration - plenty of productive people have come out of that environment, and indeed out of many considerably less promising ones. That having been said, I'm certainly not claiming hardware can substitute for parenting and a school with decent standards! Only that as hardware goes, the PC is a superior substitute for the TV. Okay, I was tongue-in-cheek, but this is not the complete answer
to what is going on. There's a change in social climate or some chronical disease spreading across the old industrialized places. Whatever it is, it's depressing as hell. We need something to resurge the old post-war enthusiasm, preferrably something which is a not yet another war.
Yeah. I think the fundamental cause is the interaction between K-strategist genes and parasite memes, but in any event it's a problem for which there is no easy solution.

> Computing power is not equal to computing power. The computer is no
> longer an all-purpose machine, though it certainly has gotten faster.
> I do like things that go beyond SMP, such as Cell and all-purpose
> vertex shaders, and do hope to see more of it on the classical PC. It
> is rather sad to see all these transistors sitting there and doing
> nothing but heat up your room.
Yep! It's money, but it's dumb money. If it was smart money, we'd have had massive parallelism a decade or two ago.
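To make "beyond SMP" concrete - a minimal, purely illustrative sketch in CUDA C (roughly the direction those all-purpose vertex shaders are headed; the kernel, names, and sizes below are invented for this example, not anything from the thread):

    /* Each GPU thread handles one array element: thousands of otherwise
       idle shader ALUs work in parallel, instead of one CPU core
       walking the loop serially. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <cuda_runtime.h>

    __global__ void saxpy(int n, float a, const float *x, float *y) {
        int i = blockIdx.x * blockDim.x + threadIdx.x; /* global thread index */
        if (i < n)
            y[i] = a * x[i] + y[i];
    }

    int main(void) {
        const int n = 1 << 20;
        size_t bytes = n * sizeof(float);
        float *hx = (float *)malloc(bytes), *hy = (float *)malloc(bytes);
        for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

        float *dx, *dy;                      /* device copies */
        cudaMalloc((void **)&dx, bytes);
        cudaMalloc((void **)&dy, bytes);
        cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
        cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

        /* enough 256-thread blocks to cover all n elements */
        saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, dx, dy);

        cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);
        printf("y[0] = %f\n", hy[0]);        /* expect 5.0 */

        cudaFree(dx); cudaFree(dy); free(hx); free(hy);
        return 0;
    }

The point being that a million-element loop becomes a single kernel launch across the whole shader array - exactly the kind of work those "wasted" transistors are good at.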
> All right, it is much better than no money at all, or we'd all still
> be timesharing into the corporate mainframes by glass teletype.
Oh indeed. If I had the authority to do so, and if people would follow me, I'd declare war on Death and throw wartime levels of funding at it. Then we could afford chip factories dedicated to hardware optimized for bio/nanosimulation. But the current state of affairs (e.g. the Playstation 3 now contributing hundreds of teraflops to Folding@home, and high-end graphics cards doing better still on a per-unit basis) is much better than it might have been.

> Google is doing rather a piss-poor job of explaining what IA is. It is
> probably not the Irrigation Association, nor does it have anything to
> do with Iowa. I'm hazarding it's something like Intelligence
> Augmentation, or some such. A smart person would have designed an
> acronym with enough letters that Google would have resolved it easily.
Don't blame me, I didn't design it :)

> (And, in case you haven't noticed, Google is rapidly losing brownie
> points even among the fanboys, due to their cavalier attitude towards
> privacy and their rapid turn into a mainstream corporation. Maybe you
> can do something about the privacy thing, but the progressive
> coprophagation part is utterly resistant to any treatment save
> starting from scratch.)
Well, it always happens when a company gets big: bureaucracy starts to creep in. Starting from scratch is always necessary - but not necessarily as a replacement for the older company! Microsoft didn't replace IBM in the mainframe market; they got into microcomputers. Google didn't replace Microsoft on the office desktop; they got into search. The next big thing (and I'll put forward smart general CAD with a global pool of reusable procedural knowledge as the biggest thing that needs to be done in software) will undoubtedly require new startups.
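To gesture at what "reusable procedural knowledge" could mean in practice - a toy sketch (plain C; every type and function name here is invented for illustration): a design rule like a bolt circle is captured once as a parameterized function, so any model drawing on the shared pool reuses the rule instead of redrawing the holes.

    #include <stdio.h>
    #include <math.h>

    #define PI 3.14159265358979323846

    typedef struct { double x, y; } Point2;

    /* One reusable piece of procedural knowledge: a bolt circle.
       Given a circle radius and a hole count, it yields the hole
       centers; the knowledge lives in the function, not in any one
       frozen drawing. */
    static void bolt_circle(double radius, int holes, Point2 *out) {
        for (int i = 0; i < holes; ++i) {
            double a = 2.0 * PI * i / holes;
            out[i].x = radius * cos(a);
            out[i].y = radius * sin(a);
        }
    }

    int main(void) {
        Point2 centers[6];
        bolt_circle(40.0, 6, centers);   /* a 6-hole, 40 mm bolt circle */
        for (int i = 0; i < 6; ++i)
            printf("hole %d at (%.2f, %.2f)\n",
                   i, centers[i].x, centers[i].y);
        return 0;
    }

Scale that idea up from bolt circles to whole parametric assemblies, add search over the shared pool, and you get something like the smart general CAD I mean.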
> There's plenty of progress to be made, for sure. It's too bad I
> haven't seen a lot of it in the roughly three decades that I've been a
> conscious observer, and I only have about that much time left as a
> conscious observer (yes, cryonics, but), and I would rather see
> something moving visibly.
Yeah, me too. Still, there is visible progress being made, even if it's not as fast as we'd like; let's try to encourage it.