On Fri, May 01, 2015 at 04:23:26AM -0700, Sanjay Minni wrote:
> however in your notes below you mentioned that you used a dump of large
> size. My question is when faced with such large row sets how do you limit
> the size of data read in the select statement .
This web page suggests how to deal w
Thanks Pierce - I got the drift of the loop and have actually tried it out
with the odbc<->sqlite driver.
however in your notes below you mentioned that you used a dump of large
size. My question is when faced with such large row sets how do you limit
the size of the data read in the select statement?
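One common way to bound how much a single SELECT reads is to page through the
table with LIMIT/OFFSET. A sketch of the loop (the #execute: / #rows selectors,
#processRow:, and the table name are placeholders, not any specific driver's API):

```smalltalk
"Page through a large table 500 rows at a time instead of selecting
 everything at once. #execute:, #rows, #processRow: and the table
 name 'mytable' are placeholders for your driver's actual API."
| pageSize offset page |
pageSize := 500.
offset := 0.
[ page := (db execute:
        'SELECT * FROM mytable LIMIT ', pageSize printString,
        ' OFFSET ', offset printString) rows.
  page do: [ :row | self processRow: row ].
  offset := offset + pageSize.
  page size = pageSize ] whileTrue.
```

For very large tables, keyset paging (WHERE id > :lastSeen ORDER BY id LIMIT n)
avoids the cost of an ever-growing OFFSET, since the database no longer has to
skip all the earlier rows on each page.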
Sergio -
Are you using DBX? There seem to be negative comments on DBX here:
http://forum.world.st/Oracle-on-Linux-td4816854.html#a4816876
For Windows, ODBC seems to be used successfully across many RDBMS engines;
see
http://forum.world.st/Question-regarding-crashes-using-OpenDBXDriver-td
On Thu, Apr 30, 2015 at 06:29:53AM -0700, Sanjay Minni wrote:
> the smalltalkhub page shows an example
> db execute: 'SELECT * FROM BLOGPOSTS;'.
> but how are the rows supposed to be accessed? In the tests I see they have
> to be walked through one by one, like a 3GL loop.
Sanjay,
Here's how you can do it. I haven't worked with this part of the code myself,
but I can show you how we did it.
> Sergio - which database driver are you using and what is the expression to
> read in multiple rows?
>
FreeTDS
This is how we performed a query:
*clientes*
| rows lista |
rows := (self executeSQL: 'SELECT * from
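The message is cut off above; filling in the rest of that pattern might look
like the following (the table name and the #rows selector are guesses based on
the method name and the snippet's own style):

```smalltalk
clientes
    "Run the query and copy each row into a plain OrderedCollection
     so the data survives after the driver's result set is released.
     The table name 'clientes' and the #rows selector are assumptions."
    | rows lista |
    rows := (self executeSQL: 'SELECT * FROM clientes') rows.
    lista := OrderedCollection new.
    rows do: [ :row | lista add: row copy ].
    ^ lista
```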
Sergio - which database driver are you using and what is the expression to
read in multiple rows
problem 1:
in PostgresV2 I used
res := postgresConnection execute: 'select * from df_co;'.
and then
res rows seems to return an array with the rows
I have not tried with a large result set say 1
I don't know if this is your case, but I remember our team having a problem
in SQL selects.
The rows it retrieved were very volatile; after the first read they
disappeared. So the first step was to save/copy them into a Smalltalk object
and then process them.
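With PostgresV2 that snapshot step can be a one-liner (a sketch; whether the
row objects actually need #copy, and the hypothetical #processRow: hook,
depend on your driver and code):

```smalltalk
"Copy the volatile rows into plain Smalltalk objects before the
 driver reuses its buffers, then process the stable snapshot."
| res snapshot |
res := postgresConnection execute: 'select * from df_co;'.
snapshot := res rows collect: [ :row | row copy ].
snapshot do: [ :row | self processRow: row ].
```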
(this was due to the memory of each rec
hi,
let me clarify ...
with a Postgres connection using Pharo4+PostgresV2 this works:
result := pgConnection execute: 'select * from mytable'.
result rows -> gives an ordered collection of the rows
but how do I get this in SQLite (Pharo4+NBSqlite3)?
result := nbsqliteConnection execute: 'sele
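I don't have NBSQLite3 at hand, but from its tests the result set appears to be
cursor-style rather than a ready-made collection: you pull rows with #next until
it answers nil. A sketch (selector names from memory, so check the NBSQLite3
tests before relying on them):

```smalltalk
"Walk an NBSQLite3 result set row by row, collecting into an
 OrderedCollection to mimic PostgresV2's 'result rows'."
| resultSet row rows |
rows := OrderedCollection new.
resultSet := nbsqliteConnection execute: 'SELECT * FROM mytable'.
[ (row := resultSet next) isNil ]
    whileFalse: [ rows add: row ].
resultSet close.
```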
Hi
what is the usual practice for reading rows from a database table?
- is a loop usually written to read the next n rows / until the end?
- does the arriving data have to be processed like a dictionary into a
collection where each object corresponds to a row and each key/value into
the objects met
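On that second point, the usual Smalltalk idiom is to #collect: each row (a
dictionary-like object) into a domain object. A sketch with a hypothetical
Person class and 'row at:' column access (all names here are illustrative):

```smalltalk
"Map each result row onto a domain object. Person, its accessors,
 the query, and the #at: row protocol are all illustrative."
| rows people |
rows := (db execute: 'SELECT name, email FROM people') rows.
people := rows collect: [ :row |
    Person new
        name: (row at: 'name');
        email: (row at: 'email');
        yourself ].
```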