From: [email protected]
Jim, You don’t get it, i.e. creativity, and you seem almost incapable of getting it.

Mike, there are different kinds of creativity. Self-expression is a form of creativity, so we are both capable of getting it (we are both capable of being creative, at least to the extent that we are writing messages like this). So when you claim that I am incapable of getting it, you must be talking more about understanding how a computer program could be creative. (If you are saying that I am incapable of being creative, then you are unquestionably wrong, because, as I said, self-expression is a form of creativity. We are both creating something as we write these messages. So you must be saying that there is something about artificial computer creativity that I don't get.)

The problem with this is that it would mean you are claiming to know more about computer programming than I do. That is unlikely. Although there are many things I don't know about computer programming, I have been programming for 30 years, and I am working on a program this evening. Since you are not a programmer, I am confident that I do know a great deal more about programming than you do. So you must be saying that you know more about making computer programs 'creative' than I do. This happens to be something that I have been thinking about a lot longer than you have, and it is a subject that I am quite confident I have by the mane. The idea that you know more about programming a computer to exhibit artificial creativity is seriously preposterous.

I have seen the solution to AGI in the distance for some time now, but it has been a blur up until recently. I still can't quite see the mechanism clearly, but I know a lot more about how it will work than I did a few months ago. I would never be able to explain it to someone who refuses to accept the idea that they might not understand something that I have been trying to explain.

Jim Bromer
 From: [email protected]
To: [email protected]
Subject: Re: [agi] A General O.D. (Operational Definition) for all AGI projects
Date: Sat, 11 May 2013 14:02:40 +0100







Jim,
 
You don’t get it, i.e. creativity, and you seem almost incapable of getting 
it.
 
What you do below is, having been told the solution to a creative problem (i.e. the meaning of a new derivative object, BALL BOX), work backward to how it can be *logically* generated from a set of logical propositions. And so it all seems “easy” to you.
 
The whole point of creativity/generativity is that you/your program DON’T, you DON’T have the answer/method-of-solution before you begin, as you do in all narrow AI and rationality. You DON’T have a prior set of logical propositions which will enable you to understand “BALL BOX”. Creative/AGI problems can’t be solved by logic.
 
This is why AGI as a whole is an unsolved problem.
 
This does help, though, in refining my General O.D.:
 
it must be added that your program must solve not just one but AN ENDLESS CLASS of such operations: handle not just one radically new object, but an ENDLESS CLASS of new objects, an endless class of new ball-box conjunctions.
 
For example, let’s say you set up your text program with definitions/propositions that do encompass “ball boxes.” Now let it try to explain:
 
HOW DO YOU SQUARE A CIRCLE?
 
Hey, this program knows that boxes can be square, and balls are circular, and it even knows that there are such things as “ball boxes”. According to you, it ought to be able to do something about squaring a circle, but in fact it isn’t going to be able to do a damn thing.
 
But YOU, a real AGI, can start having ideas right now about how to do that, square a circle: new, non-standard ideas.
 
And that’s what we need to replicate/emulate – how you are able to be 
creative – and start having ideas about – and handling – objects that you DON’T 
already know how to handle, that you DON’T have any set of logical or other 
propositions or commands for.
 
In this whole area, you can take some comfort from the fact that the entire field of AGI is making the same mistake as you: completely failing to understand that AGI is about creativity, about doing new things you don’t already know how to do, solving problems that you don’t already know how to solve. Instead you and other AGI-ers are addressing completely the wrong, purely narrow-AI challenge of how to solve problems that you do already know, logically, how to solve.
 
If you don’t have a proper AGI O.D., a creative O.D., all you can do is waste your life, digging the same old hole in completely the wrong place ever deeper.
 
 
 


 

From: Jim Bromer 
Sent: Saturday, May 11, 2013 1:26 PM
To: AGI 

Subject: RE: [agi] A General O.D. (Operational Definition) for all AGI projects
 

Mike Tintner said:

Your project must have an E.M. for how

BALL + BOX = BALLBOX

i.e. you have to show how with only standard knowledge of two objects, balls & boxes, you can a) generate and/or b) understand a new, third object, “ball-box”, that is derived from them by non-standard means. In this case, a BALLBOX is a box shaped like a ball rather than a cube.
------------------------------------------
 
I am only replying to this as a way to repeat one of my ideas that seems obvious or commonsensical to me.
 
It takes many 'statements' about a simple idea to understand it. So if you removed all knowledge except that knowledge that referred to a box or a ball, then the text-based program would not be able to figure out that a "ballbox" was referring to a box shaped like a ball rather than a cube. And in fact, I did not realize what Mike was talking about until he made the statement that "a ballbox is a box shaped like a ball rather than a cube." So no, an AGI program would not be able to figure that out without information beyond the heavily redacted information about balls and boxes. However, once the definition of a "ballbox" was made, as Mike made it for us, a text-only AGI program would be able to figure it out (just as I was able to figure it out once I read it) and use the term intelligibly. And, significantly, if the text-based AGI program had many statements about different things, it would be able to consider ideas like:
 
A box that can hold a ball. (That was my first guess.)
A sphere that can hold a box. (That is a simple rearrangement of terms.)
A box that was shaped like a ball.
A ball that was shaped like a box.
A metaphor for something else. For example, a square line on the ground where balls are put, or something (similar to a "batter's box").
 
It is easy to see that these could all be generated using computational methods of rational creativity if the program had general knowledge about many different things.
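The kind of combinatorial generation described here can be sketched in a few lines. This is only a toy illustration under assumed relation templates (containment and shape transfer), not a real knowledge base or anyone's actual AGI program; the metaphorical "batter's box" reading is deliberately out of its reach:

```python
from itertools import permutations

# Two assumed relation templates standing in for the many 'statements'
# a text-based program would hold about how objects relate. A real
# system would have a far larger and richer set of such statements.
RELATIONS = [
    "a {outer} that can hold a {inner}",   # containment reading
    "a {outer} shaped like a {inner}",     # shape-transfer reading
]

def candidate_readings(word_a, word_b):
    """Enumerate candidate interpretations of the compound
    'word_a + word_b' by applying each stored relation to the two
    terms in both orders (the 'simple rearrangement of terms')."""
    return [
        template.format(outer=outer, inner=inner)
        for outer, inner in permutations((word_a, word_b))
        for template in RELATIONS
    ]

for reading in candidate_readings("ball", "box"):
    print(reading)
```

Running it on "ball" and "box" yields the four literal readings from the list above; generating the metaphorical reading would require exactly the kind of wide general knowledge about many different things mentioned here.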
 
The real question is whether or not a text-based AGI program would ever be able to distinguish what kinds of things the words "box" or "ball" referred to without ever seeing one. I can say that there are human beings who were born blind but who can use references to things like "the view of the mountains in the distance" intelligibly. If the program used terms like this intelligibly, their use would always be removed slightly from our more familiar use of the terms. But so what? The text-based AGI model is just a step that is being made to try to discover *how thinking works* in general, rather than precise subprograms that concern the visual shapes of things (as in Mike's unconsciously cherry-picked example).
 
Jim Bromer


 



From: [email protected]
To: [email protected]
Subject: [agi] A General O.D. (Operational Definition) for all AGI projects
Date: Sat, 11 May 2013 10:25:31 +0100






What we’ve just seen with Jim is yet another example here of effective creative illiteracy: someone talking about their “creative” AGI project without any attempt at defining either its O.D. (the effect to be achieved) or an effective mechanism; without, to put that extremely crudely in common parlance, having any “idea”.
 
To repeat, this is appalling; it’s simply non-creative and a waste of space.

So to take further steps to eradicate this disease, let me put forward a general O.D. for AGI projects (and to some extent all culturally creative projects).
 
Your project must have an E.M. for how

BALL + BOX = BALLBOX

i.e. you have to show how with only standard knowledge of two objects, balls & boxes, you can a) generate and/or b) understand a new, third object, “ball-box”, that is derived from them by non-standard means. In this case, a BALLBOX is a box shaped like a ball rather than a cube.

(That is a more concrete way of saying: “you must be able to show how your project can think outside the box”.)
 
Another example would be: you must have an E.M. for how

CAT + DOG = CATOTAUR

a creature, similar to a minotaur, half cat, half dog.
 
Or, you must have an E.M. for how

2 + 2 = 5

again, your machine must, from knowledge of standard maths, be able to produce non-standard maths. (And for the benefit of “same old, same old” Matt, that does not mean googling an existing 2+2=5 “proof”; you have to generate/understand an altogether new one.)
 
Or, you must be able to show how

2 rocks + 2 rocks = 4 rocks.

You must be able to show how, having knowledge of how two rocks are laid, you can lay another two rocks on top of them. Laying rock walls is not a math operation like laying brick walls; each rock is individually formed and needs to be laid individually.

(Or your system must just be able to recognize/conceptualise ROCK, since all rock forms are individual and non-standard.)
 
In all of these cases, you combine two objects to produce a third object that has never, to your knowledge, been derived from them before.

You do “magic”: you put a rabbit in an empty hat and pull out a bird. You put a penis in a vagina and pull out a baby (“where did that come from?”).

It’s a waste of time to even ask Jim to produce an O.D., but anyone serious about AGI will want to produce an O.D. with an E.M.:

an explanation of a BALL BOX.
 
 
 
 
 


  
  


-------------------------------------------
AGI
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/21088071-f452e424
Modify Your Subscription: 
https://www.listbox.com/member/?member_id=21088071&id_secret=21088071-58d57657
Powered by Listbox: http://www.listbox.com
