Below is a program that can feel pain. It is a simulation of a programmable 2-input logic gate that you train using reinforcement conditioning.
/* pain.cpp

This program simulates a programmable 2-input logic gate. You train it
by reinforcement conditioning. You provide a pair of input bits (00,
01, 10, or 11). It will output a 0 or 1. If the output is correct, you
"reward" it by entering "+". If it is wrong, you "punish" it by
entering "-". You can program it this way to implement any 2-input
logic function (AND, OR, XOR, NAND, etc.).
*/
#include <iostream>
#include <cstdlib>
using namespace std;

int main() {
  // probability of output 1 given input 00, 01, 10, 11
  double wt[4] = {0.5, 0.5, 0.5, 0.5};
  while (1) {
    cout << "Please input 2 bits (00, 01, 10, 11): ";
    char b1, b2;
    cin >> b1 >> b2;
    int input = (b1 - '0') * 2 + (b2 - '0');
    if (input >= 0 && input < 4) {
      int response = double(rand()) / RAND_MAX < wt[input];
      cout << "Output = " << response
           << ". Please enter + if right, - if wrong: ";
      char reinforcement;
      cin >> reinforcement;
      if (reinforcement == '+') cout << "aah! :-)\n";
      else if (reinforcement == '-') cout << "ouch! :-(\n";
      else continue;
      int adjustment = (reinforcement == '-') ^ response;
      if (adjustment == 0) wt[input] /= 2;            // move P(1) toward 0
      else wt[input] = 1 - (1 - wt[input]) / 2;       // move P(1) toward 1
    }
  }
}

---

Jiri Jelinek <[EMAIL PROTECTED]> wrote:

> Mark,
>
> Again, simulation - sure, why not. On VNA (von Neumann architecture)
> - I don't think so - IMO not advanced enough to support qualia. Yes,
> I do believe qualia exists (= I do not agree with all Dennett's
> views, but I think his views are important to consider). I wrote tons
> of pro software (using many languages) for a bunch of major projects,
> but I have absolutely no idea how to write some kind of
> feelPain(intensity) fn that could cause a real pain sensation to an
> AI system running on my (VNA based) computer. BTW I often do
> test-driven development, so I would probably first want to write a
> test procedure for real pain. If you can write at least pseudo-code
> for that, then let me know. When talking about VNA, this is IMO pure
> fiction.
> And even *IF* it actually was somehow possible, I don't think it
> would be clever to allow adding such code to our AGI. In VNA
> processing, there is no room for subjective feelings. VNA = "cold"
> data & "cold" logic (no matter how complex your algorithms get)
> because the CPU (with its set of primitive instructions) - just like
> the other components - was not designed to handle anything more.
>
> Jiri
>
> On 6/10/07, Mark Waser <[EMAIL PROTECTED]> wrote:
> >
> > > For feelings - like pain - there is a problem. But I don't feel
> > > like spending much time explaining it little by little through
> > > many emails. There are books and articles on this topic.
> >
> > Indeed there are, and they are entirely unconvincing. Anyone who
> > writes something can get it published.
> >
> > If you can't prove that you're not a simulation, then you certainly
> > can't prove that "pain that really *hurts*" isn't possible. I'll
> > simply argue that you *are* a simulation, that you do experience
> > "pain that really *hurts*", and therefore my point is proved. I'd
> > say that the burden of proof is upon you or anyone else who makes
> > claims like "Why you can't make a computer that feels pain".
> >
> > I've read all of Dennett's books. I would argue that there are far
> > more people with credentials who disagree with him than agree. His
> > arguments really don't boil down to anything better than "I don't
> > see how it happens or how to do it, so it isn't possible."
> >
> > I still haven't seen you respond to the simulation argument (which
> > I feel *is* the stake through Dennett's argument), but if you want
> > to stop debating without doing so, that's certainly cool.
> >
> > Mark

--
Matt Mahoney, [EMAIL PROTECTED]

-----
This list is sponsored by AGIRI: http://www.agiri.org/email