On Thu, Apr 30, 2015 at 06:29:53AM -0700, Sanjay Minni wrote:
> the smalltalkhub page shows an example
> db execute: 'SELECT * FROM BLOGPOSTS;'.
> but how are the rows supposed to be accessed - in the tests i see they have
> to be walked thru one by one like a 3GL loop

Sanjay,

Here's how you can collect all rows:

  | db resultSet row coll |
  coll := OrderedCollection new.
  db := NBSQLite3Connection openOn: '/tmp/so.db'.
  [   resultSet := db execute: 'select * from posts limit 10'.
      "#next answers nil once the result set is exhausted"
      [ (row := resultSet next) notNil ] whileTrue: [
          coll add: row ]
  ] ensure: [ db close ]. "close the connection even if something goes wrong"
  coll inspect.

Each row is an NBSQLite3Row instance. Use #at: with the column name as the
key to access the data.

  coll fifth at: 'Title' 
  ==> 'How do I calculate someone''s age in C#?'

With the collection you still have to loop through the rows, no?
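
For what it's worth, once the rows are in the collection the usual Pharo
collection protocol applies. A quick sketch using the coll from above:

  (coll collect: [ :each | each at: 'Title' ]) inspect.

That still loops, just less visibly.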

I am using NBSQLite3 to play with existing data, and I haven't worked with
enough different types of data to attempt to generalize a looping construct. So
I've always done by-row processing. 
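
By-row processing looks much like the snippet above, minus the collection.
Here's a sketch against the same /tmp/so.db; the Transcript output is just a
stand-in for whatever per-row work you actually need:

  | db resultSet row |
  db := NBSQLite3Connection openOn: '/tmp/so.db'.
  [   resultSet := db execute: 'select * from posts limit 10'.
      [ (row := resultSet next) notNil ] whileTrue: [
          "per-row work goes here; printing the title is just a placeholder"
          Transcript show: (row at: 'Title'); cr ]
  ] ensure: [ db close ].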

Another reason is that, while my data sets aren't "Big Data", I am
aesthetically inclined against reading everything into RAM. :-) The above
example uses the Sep 2011 StackOverflow data dump. The SQLite data file
created from the dump, with full-text indexing, is 16 GB, and the posts table
has ~6.5 million rows.

HTH.

Pierce
