Your funds are already being managed by software programs. AI programs are just a few deployments down the road; that is, if they're not already trading securities with neural nets, case-based reasoning, fuzzy logic, or rule-based systems.
Nothing new there. ~PM

Date: Mon, 4 May 2015 21:18:57 -0500
Subject: Re: [agi] AGI as adaptive control
From: [email protected]
To: [email protected]

Also, I think the idea of turning funds over to AI-managed software is a little scary. In the event of some major market-shaking event, what ethics of opportunity topology shuts down the bots, rather than them seeing it as just another arbitrage opportunity? I don't see many good ways for the SEC to punish a rogue algorithm. At the least, the Enron traders and their bosses eventually got their asses kicked by the legal profession, even if it was a day late and a dollar short. We had similar nonsense go off when the mortgage meltdown turned a bunch of sophisticated mathematical candy floss into a smoking crater in the ground (thank you, Congress, and a bunch of over-educated Ivy League kids). Hey, what the hell do I know, I'm just a metallurgist and inventor and occasional consultant and... never mind. I just worry; failure analysts are good at anxiety and paranoia. -GJS

On Mon, May 4, 2015 at 9:07 PM, Greg Staskowski <[email protected]> wrote:

All right! #HAJIME! #ITSON! I like seeing the hardware vs. software argument presented very rationally and point by point in terms of the adaptive control systems paradigm (ecch, I know, I couldn't think of a less overused word). I come down on the side of hardware for this stuff, and what I really want to understand is the quantum biophysics of the brain. If we are really going for "the whole enchilada" that is "embodied or unembodied reasoning, Turing or non-Turing, consciousness," rather than "just another algorithm, JA^2," I think this thing has to start from the hardware or wetware up, and that means we need to really understand the quantum biophysics of a living, functioning human or whale brain. Good luck getting the funding for whale brains, of course. This is a "non-trivial" problem.
Now, Wolfram et al. seem to think this whole deal starts down at the level of cellular automata, but don't quote me on that because the d00d is kind of a genius at this stuff. I tend to come out more on the side of ethics and applications, especially in terms of diversified system operators, the IoT (Internet of Things), and agents for selection of what I call "opportunity topology." If we want to start experimenting with silicon, though, Northwestern has a NuFab facility where you can work out new silicon at what are basically bargain rates. Check it out. I think this stuff is way too complex to simulate, and the only way to get there is Wright Brothers style: iterate, re-design, iterate, re-design, till you run out of money, of course. That is my opinion; I could be wrong. -GJS

On Mon, May 4, 2015 at 6:06 PM, Stefan Pernar <[email protected]> wrote:

The page limit apparently applies to publication in the proceedings. A fair bit of Bible references, for sure. Have a read. I really appreciate every pair of eyes that goes over it.

On 05/05/2015 9:04 AM, "Anastasios Tsiolakidis" <[email protected]> wrote:

On Tue, May 5, 2015 at 12:30 AM, Stefan Pernar <[email protected]> wrote:

My paper building on the subject has been accepted for presentation at the AGI Conference Ben organizes in July in Berlin: http://rationalmorality.info/wp-content/uploads/2015/04/TranshumanPhilosophy_formatted.pdf

Isn't this a bit too long, considering the 10-page limit? Do I detect a certain Judaism in the text? Surely it will go down with Ben better than my Christian references, lol.
AT

-------------------------------------------
AGI Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/21088071-f452e424
Modify Your Subscription: https://www.listbox.com/member/?member_id=21088071&id_secret=21088071-58d57657
Powered by Listbox: http://www.listbox.com
