[sqlalchemy] Re: Updating a column to an expression containing itself; & Inserting data by lists or tuples

2007-02-06 Thread Michael Bayer

its all good, just not a high priority for me :).  feel free to dig
in.  the insert/update code is already a little intense, since it
already has multiple ways to do the same thing; not to mention its
some of the first code i wrote in SA and feels a little old, so adding
features like these might not be so straightforward.

On Feb 6, 3:46 pm, "Rick Morrison" <[EMAIL PROTECTED]> wrote:
> Well, one good thing about the positional-style INSERT is that it just
> might be useful for implementing "INSERT INTO SELECT FROM", which
> would be a pretty useful addition to the SQL API layer.
>
> On 2/6/07, S.R. Larmes <[EMAIL PROTECTED]> wrote:
>
>
>
> > Michael Bayer wrote:
> > > On Feb 6, 1:20 pm, "S.R. Larmes" <[EMAIL PROTECTED]> wrote:
>
> > > > a) How do I do:
>
> > > > UPDATE sometable SET value = value * 0.9 + 1 [ + optional WHERE
> > > > clause ] ?
>
> > > > I've tried experimenting with .update(values=..) or stuff
> > > > like .execute(value=sometable.c.value*0.9+1) but it just tries to do
> > > > strange things. (SET value=sometable.value * %(sometable_value)s + %
> > > > (literal)s ??)
>
> > > that looks correct to me...all literal values are by default converted
> > > to bind params.   if you execute the statement the bind values will be
> > > set up correctly.
>
> > Ahh, got it to work now. It doesn't seem to work with
> > .execute(value=table.c.value*0.9) (gives a programming error), but that
> > makes sense to me. :) With .update(values={..}) it works fine! Sorry
> > about that.
>
> > Thanks for the swift reply by the way.
>
> > > > b) I have an array (or tuple) with values that I'd just like to fire
> > > > off to insert() which is compiled to work on one column (the other
> > > > columns have defaults or pre-set values)... To me, this one ought to
> > > > be obvious and intuitive, and maybe I'm just stupid or blind, but I
> > > > can't find a way.
>
> > > how did you compile an insert() to work on just one (unnamed) column ?
> > > like table.insert(values=['someval']) ?
>
> > No, I didn't even know about that! I just meant that all the other
> > columns had bound values in values={..} or as default values, so I
> > would expect execute() (etc.) to figure out their values
> > automatically without me having to mention those columns again.
>
> > > > (Alternatively, if I have an array such as [(1, 'hi', 'text'), (2,
> > > > 'hello', 'tuxt'), ...] where the columns match up to the columns in
> > > > the table (or a given insert())...)
>
> > > > .execute(*[dict(name=x) for x in array]) # to me, this is inelegant
> > > > and horribly roundabout
>
> > > well, SA is mostly name oriented at this time for several reasons:
>
> > > 1. there is no chance of ambiguity.  for example, if you write code
> > > that just says table.insert().execute(1,2,3,4,5), supposing the table
> > > is reflected from the local database - that code might not be portable
> > > if executed on a different database where the table was created with a
> > > different column ordering, or had columns added via ALTER, etc.
>
> > That's true, but I imagine there would be a way to specify the columns
> > (once, when getting the insert()), and then trust subsequent
> > execute()s
> > to permute the arguments into their right order...
>
> > > 2. it is more database-neutral to build off named parameters, since
> > > its easy to convert from named param to positional, hard to convert in
> > > the other direction (ordering may not be consistent depending on how a
> > > dialect compiles a statement).  as not all DBAPIs support positional
> > > bind parameters anyway, SA is going to create named bind parameters in
> > > any case so the "inelegant and roundabout" behavior will still occur,
> > > albeit behind the scenes, if support for positional execute()
> > > arguments were improved.
>
> > > since its *only* inserts that positional parameters even have a chance
> > > of working on all DBs, SA's parameter-processing logic was written
> > > around a "named parameter" concept since positional params are of very
> > > limited use.
>
> > But aren't the result sets from selects also positional (AS WELL as
> > name-centric):
>
> > .select().execute().fetchone() gives something which can be thought
> > of either as a hash or as a tuple à la (1, 'ah', 102), and I'm using
> > the tuple... It would thus be consistent to be able to use similar
> > constructs with inserts.
>
> > > 3. lets look at the proposed feature itself.  let me preface by
> > > saying, i will accept this feature into SA, and i am not opposed to
> > > it.  if an adequate patch is provided, i will commit it.  however, its
> > > not a priority for me to implement this myself.  this is because it
> > > would be complex to implement, and the current insert() method is one
> > > step and is a "single way to do it".  the method you are proposing
> > > requires two steps in most cases:
>
> > > i.e.
>
> > >table.insert().execute(*[dict(name=x) for x in array])
>
> > > specifies the 

[sqlalchemy] Re: Updating a column to an expression containing itself; & Inserting data by lists or tuples

2007-02-06 Thread Rick Morrison

Well, one good thing about the positional-style INSERT is that it just
might be useful for implementing "INSERT INTO SELECT FROM", which
would be a pretty useful addition to the SQL API layer.
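
For illustration, a minimal sketch of the kind of statement such a
construct would need to emit; the engine URL, table names and columns
here are made up, and at the time the practical fallback is a textual
statement:

    from sqlalchemy import create_engine, text

    # hypothetical setup: a database that already holds "items" and
    # "archive_items" tables with matching (id, name) columns
    engine = create_engine('sqlite:///example.db')

    # copy the rows entirely server-side, without fetching them into Python
    stmt = text(
        "INSERT INTO archive_items (id, name) "
        "SELECT id, name FROM items WHERE id > :min_id"
    )
    engine.execute(stmt, min_id=100)

(For reference, much later SQLAlchemy releases added an explicit
Insert.from_select() construct for exactly this kind of statement.)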



On 2/6/07, S.R. Larmes <[EMAIL PROTECTED]> wrote:
>
> Michael Bayer wrote:
> > On Feb 6, 1:20 pm, "S.R. Larmes" <[EMAIL PROTECTED]> wrote:
> > >
> > > a) How do I do:
> > >
> > > UPDATE sometable SET value = value * 0.9 + 1 [ + optional WHERE
> > > clause ] ?
> > >
> > > I've tried experimenting with .update(values=..) or stuff
> > > like .execute(value=sometable.c.value*0.9+1) but it just tries to do
> > > strange things. (SET value=sometable.value * %(sometable_value)s + %
> > > (literal)s ??)
> >
> > that looks correct to me...all literal values are by default converted
> > to bind params.   if you execute the statement the bind values will be
> > set up correctly.
>
> Ahh, got it to work now. It doesn't seem to work with
> .execute(value=table.c.value*0.9) (gives a programming error), but that
> makes sense to me. :) With .update(values={..}) it works fine! Sorry
> about that.
>
> Thanks for the swift reply by the way.
>
> > > b) I have an array (or tuple) with values that I'd just like to fire
> > > off to insert() which is compiled to work on one column (the other
> > > columns have defaults or pre-set values)... To me, this one ought to
> > > be obvious and intuitive, and maybe I'm just stupid or blind, but I
> > > can't find a way.
> >
> > how did you compile an insert() to work on just one (unnamed) column ?
> > like table.insert(values=['someval']) ?
>
> No, I didn't even know about that! I just meant that all the other
> columns had bound values in values={..} or as default values, so I
> would expect execute() (etc.) to figure out their values
> automatically without me having to mention those columns again.
>
> > > (Alternatively, if I have an array such as [(1, 'hi', 'text'), (2,
> > > 'hello', 'tuxt'), ...] where the columns match up to the columns in
> > > the table (or a given insert())...)
> > >
> > > .execute(*[dict(name=x) for x in array]) # to me, this is inelegant
> > > and horribly roundabout
> >
> > well, SA is mostly name oriented at this time for several reasons:
> >
> > 1. there is no chance of ambiguity.  for example, if you write code
> > that just says table.insert().execute(1,2,3,4,5), supposing the table
> > is reflected from the local database - that code might not be portable
> > if executed on a different database where the table was created with a
> > different column ordering, or had columns added via ALTER, etc.
>
> That's true, but I imagine there would be a way to specify the columns
> (once, when getting the insert()), and then trust subsequent
> execute()s
> to permute the arguments into their right order...
>
> > 2. it is more database-neutral to build off named parameters, since
> > its easy to convert from named param to positional, hard to convert in
> > the other direction (ordering may not be consistent depending on how a
> > dialect compiles a statement).  as not all DBAPIs support positional
> > bind parameters anyway, SA is going to create named bind parameters in
> > any case so the "inelegant and roundabout" behavior will still occur,
> > albeit behind the scenes, if support for positional execute()
> > arguments were improved.
> >
> > since its *only* inserts that positional parameters even have a chance
> > of working on all DBs, SA's parameter-processing logic was written
> > around a "named parameter" concept since positional params are of very
> > limited use.
>
> But aren't the result sets from selects also positional (AS WELL as
> name-centric):
>
> .select().execute().fetchone() gives something which can be thought
> of either as a hash or as a tuple à la (1, 'ah', 102), and I'm using
> the tuple... It would thus be consistent to be able to use similar
> constructs with inserts.
>
> > 3. lets look at the proposed feature itself.  let me preface by
> > saying, i will accept this feature into SA, and i am not opposed to
> > it.  if an adequate patch is provided, i will commit it.  however, its
> > not a priority for me to implement this myself.  this is because it
> > would be complex to implement, and the current insert() method is one
> > step and is a "single way to do it".  the method you are proposing
> > requires two steps in most cases:
> >
> > i.e.
> >
> >table.insert().execute(*[dict(name=x) for x in array])
> >
> > specifies the column names and their values in one step.  the proposed
> > method requires in most cases a separate column specification step,
> > currently the inelegant:
> >
> >   table.insert(values=['fake value 1', 'fake value2']).execute(*array)
> >
> > but if we added a column list argument, would look like:
> >
> >   insert([table.c.col1, table.c.col2]).execute(*array)
> >
> > which is still two steps.  of course if you want to insert the whole
>
> Two steps, but, I feel, more consistent and overall better design,
> w

[sqlalchemy] Re: Updating a column to an expression containing itself; & Inserting data by lists or tuples

2007-02-06 Thread S.R. Larmes

Michael Bayer wrote:
> On Feb 6, 1:20 pm, "S.R. Larmes" <[EMAIL PROTECTED]> wrote:
> >
> > a) How do I do:
> >
> > UPDATE sometable SET value = value * 0.9 + 1 [ + optional WHERE
> > clause ] ?
> >
> > I've tried experimenting with .update(values=..) or stuff
> > like .execute(value=sometable.c.value*0.9+1) but it just tries to do
> > strange things. (SET value=sometable.value * %(sometable_value)s + %
> > (literal)s ??)
>
> that looks correct to me...all literal values are by default converted
> to bind params.   if you execute the statement the bind values will be
> set up correctly.

Ahh, got it to work now. It doesn't seem to work with
.execute(value=table.c.value*0.9) (gives a programming error), but that
makes sense to me. :) With .update(values={..}) it works fine! Sorry
about that.

Thanks for the swift reply by the way.
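
For the record, a minimal sketch of the form that ended up working here
(sometable and its value column come from the original question; the id
column used in the WHERE clause, and the binding of the table to an
engine so that statement.execute() works, are assumed):

    # UPDATE sometable SET value = value * 0.9 + 1 WHERE id = 1
    stmt = sometable.update(
        sometable.c.id == 1,   # the optional WHERE clause
        values={sometable.c.value: sometable.c.value * 0.9 + 1},
    )
    stmt.execute()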

> > b) I have an array (or tuple) with values that I'd just like to fire
> > off to insert() which is compiled to work on one column (the other
> > columns have defaults or pre-set values)... To me, this one ought to
> > be obvious and intuitive, and maybe I'm just stupid or blind, but I
> > can't find a way.
>
> how did you compile an insert() to work on just one (unnamed) column ?
> like table.insert(values=['someval']) ?

No, I didn't even know about that! I just meant that all the other
columns had bound values in values={..} or as default values, so I
would expect execute() (etc.) to figure out their values automatically
without me having to mention those columns again.

> > (Alternatively, if I have an array such as [(1, 'hi', 'text'), (2,
> > 'hello', 'tuxt'), ...] where the columns match up to the columns in
> > the table (or a given insert())...)
> >
> > .execute(*[dict(name=x) for x in array]) # to me, this is inelegant
> > and horribly roundabout
>
> well, SA is mostly name oriented at this time for several reasons:
>
> 1. there is no chance of ambiguity.  for example, if you write code
> that just says table.insert().execute(1,2,3,4,5), supposing the table
> is reflected from the local database - that code might not be portable
> if executed on a different database where the table was created with a
> different column ordering, or had columns added via ALTER, etc.

That's true, but I imagine there would be a way to specify the columns
(once, when getting the insert()), and then trust subsequent
execute()s
to permute the arguments into their right order...

> 2. it is more database-neutral to build off named parameters, since
> its easy to convert from named param to positional, hard to convert in
> the other direction (ordering may not be consistent depending on how a
> dialect compiles a statement).  as not all DBAPIs support positional
> bind parameters anyway, SA is going to create named bind parameters in
> any case so the "inelegant and roundabout" behavior will still occur,
> albeit behind the scenes, if support for positional execute()
> arguments were improved.
>
> since its *only* inserts that positional parameters even have a chance
> of working on all DBs, SA's parameter-processing logic was written
> around a "named parameter" concept since positional params are of very
> limited use.

But aren't the result sets from selects also positional (AS WELL as
name-centric):

.select().execute().fetchone() gives something which can be thought
of either as a hash or as a tuple à la (1, 'ah', 102), and I'm using
the tuple... It would thus be consistent to be able to use similar
constructs with inserts.
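
A tiny sketch of that dual nature, using an illustrative bound table
with (id, name, score) columns:

    # a result row can be read positionally, like a tuple...
    row = table.select().execute().fetchone()
    as_tuple = (row[0], row[1], row[2])

    # ...or by column name, like a dictionary
    as_by_name = (row['id'], row['name'], row['score'])

    assert as_tuple == as_by_name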

> 3. lets look at the proposed feature itself.  let me preface by
> saying, i will accept this feature into SA, and i am not opposed to
> it.  if an adequate patch is provided, i will commit it.  however, its
> not a priority for me to implement this myself.  this is because it
> would be complex to implement, and the current insert() method is one
> step and is a "single way to do it".  the method you are proposing
> requires two steps in most cases:
>
> i.e.
>
>table.insert().execute(*[dict(name=x) for x in array])
>
> specifies the column names and their values in one step.  the proposed
> method requires in most cases a separate column specification step,
> currently the inelegant:
>
>   table.insert(values=['fake value 1', 'fake value2']).execute(*array)
>
> but if we added a column list argument, would look like:
>
>   insert([table.c.col1, table.c.col2]).execute(*array)
>
> which is still two steps.  of course if you want to insert the whole

Two steps, but, I feel, more consistent and overall a better design,
without always having to go via the tiny temporary dictionary objects.

Also, I don't think it should be necessary to unpack the array at all,
since we're using Python. If we wanted to enforce strict typing we
might be better off programming in a statically-typed environment
(no?). To me, the incredible dynamic powers of introspection and
polymorphism are half the point of using Python. If I pass an array
to insert().execute() it should be intuitive

[sqlalchemy] Re: Updating a column to an expression containing itself; & Inserting data by lists or tuples

2007-02-06 Thread Michael Bayer

On Feb 6, 1:20 pm, "S.R. Larmes" <[EMAIL PROTECTED]> wrote:
>
> a) How do I do:
>
> UPDATE sometable SET value = value * 0.9 + 1 [ + optional WHERE
> clause ] ?
>
> I've tried experimenting with .update(values=..) or stuff
> like .execute(value=sometable.c.value*0.9+1) but it just tries to do
> strange things. (SET value=sometable.value * %(sometable_value)s + %
> (literal)s ??)

that looks correct to me...all literal values are by default converted
to bind params.   if you execute the statement the bind values will be
set up correctly.
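
Concretely, reusing the sometable/value names from the question, a
sketch of what that compilation looks like (the rendering shown is
approximate):

    stmt = sometable.update(
        values={sometable.c.value: sometable.c.value * 0.9 + 1}
    )
    print(stmt)
    # renders roughly as:
    #   UPDATE sometable SET value=sometable.value * :sometable_value + :literal
    # a DBAPI such as psycopg2 spells those parameters %(sometable_value)s
    # and %(literal)s, which is the output quoted above; the literals 0.9
    # and 1 are carried as bind parameter values and sent at execute() time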

>
> b) I have an array (or tuple) with values that I'd just like to fire
> off to insert() which is compiled to work on one column (the other
> columns have defaults or pre-set values)... To me, this one ought to
> be obvious and intuitive, and maybe I'm just stupid or blind, but I
> can't find a way.

how did you compile an insert() to work on just one (unnamed) column ?
like table.insert(values=['someval']) ?

>
> (Alternatively, if I have an array such as [(1, 'hi', 'text'), (2,
> 'hello', 'tuxt'), ...] where the columns match up to the columns in
> the table (or a given insert())...)
>
> .execute(*[dict(name=x) for x in array]) # to me, this is inelegant
> and horribly roundabout

well, SA is mostly name oriented at this time for several reasons:

1. there is no chance of ambiguity.  for example, if you write code
that just says table.insert().execute(1,2,3,4,5), supposing the table
is reflected from the local database - that code might not be portable
if executed on a different database where the table was created with a
different column ordering, or had columns added via ALTER, etc.

2. it is more database-neutral to build off named parameters, since
its easy to convert from named param to positional, hard to convert in
the other direction (ordering may not be consistent depending on how a
dialect compiles a statement).  as not all DBAPIs support positional
bind parameters anyway, SA is going to create named bind parameters in
any case so the "inelegant and roundabout" behavior will still occur,
albeit behind the scenes, if support for positional execute()
arguments were improved.

since its *only* inserts that positional parameters even have a chance
of working on all DBs, SA's parameter-processing logic was written
around a "named parameter" concept since positional params are of very
limited use.
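
A toy illustration of why named-to-positional is the easy direction
(plain Python, no SA involved; the names and ordering are made up):

    # once a statement is compiled, the dialect knows the order in which
    # it emitted its placeholders, so named parameters collapse to a
    # tuple deterministically
    named_params = {'name': 'hi', 'value': 1}
    placeholder_order = ['value', 'name']   # chosen by the compiled statement

    positional_params = tuple(named_params[key] for key in placeholder_order)
    assert positional_params == (1, 'hi')

    # the reverse direction needs to know which position corresponds to
    # which column, and that ordering depends on how a particular dialect
    # happened to compile the statement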

3. lets look at the proposed feature itself.  let me preface by
saying, i will accept this feature into SA, and i am not opposed to
it.  if an adequate patch is provided, i will commit it.  however, its
not a priority for me to implement this myself.  this is because it
would be complex to implement, and the current insert() method is one
step and is a "single way to do it".  the method you are proposing
requires two steps in most cases:

i.e.

   table.insert().execute(*[dict(name=x) for x in array])

specifies the column names and their values in one step.  the proposed
method requires in most cases a separate column specification step,
currently the inelegant:

  table.insert(values=['fake value 1', 'fake value2']).execute(*array)

but if we added a column list argument, would look like:

  insert([table.c.col1, table.c.col2]).execute(*array)

which is still two steps.  of course if you want to insert the whole
row, then youd leave the table out...then you have some implicit
behavior going on which may be less than ideal.  also, we now have to
check for error conditions like this:

  insert([table.c.col1, table.c.col2]).execute(col1=5, col2=7, col3=15)

this probably breaks right now if you use the values clause in this
manner.

so..yeah...not against it as a feature, but to me its decidedly more
complex and ambiguous than the current method.
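
In the meantime, a small user-level helper can fake the tuple-oriented
style on top of the existing name-oriented executemany path; this is
just a sketch, and insert_rows is a hypothetical name, not anything in
SA:

    def insert_rows(table, columns, rows):
        # hypothetical helper, not part of SA: pair an explicit column
        # list with plain tuples, then reuse the existing name-oriented
        # executemany path
        names = [col.key for col in columns]
        params = [dict(zip(names, row)) for row in rows]
        return table.insert().execute(*params)

    # usage, assuming a bound `table` with (id, name, text) columns:
    #   insert_rows(table,
    #               [table.c.id, table.c.name, table.c.text],
    #               [(1, 'hi', 'text'), (2, 'hello', 'tuxt')])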

