Use the LIMIT clause: keep track of your offset and re-run the query when you
need the next chunk of data.
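A minimal sketch of that pattern, using Python's sqlite3 module rather than the Perl DBI code from the original post (the LIMIT/OFFSET technique is the same either way; the table name, column names, and `fetch_chunks` helper are invented for illustration):

```python
import sqlite3

# Set up a throwaway table so the example is self-contained.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, val TEXT)")
conn.executemany("INSERT INTO t (val) VALUES (?)",
                 [("row%d" % i,) for i in range(7)])

def fetch_chunks(conn, table, chunk_size=2):
    """Yield rows a chunk at a time by re-running the query with LIMIT/OFFSET,
    so only chunk_size rows are ever held in memory at once."""
    offset = 0
    while True:
        rows = conn.execute(
            "SELECT * FROM %s LIMIT ? OFFSET ?" % table,
            (chunk_size, offset)).fetchall()
        if not rows:
            break
        yield rows
        offset += chunk_size

for chunk in fetch_chunks(conn, "t"):
    print(chunk)  # never more than 2 rows per chunk
```

Note that each re-run rescans the table from the start, so for large tables it is cheaper to page by a key instead (e.g. `WHERE id > ? ... LIMIT 2`).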


--- On Tue, 3/17/09, baxy77bax <b...@hi.htnet.hr> wrote:

> From: baxy77bax <b...@hi.htnet.hr>
> Subject: [sqlite]  control buffering of query results
> To: sqlite-users@sqlite.org
> Date: Tuesday, March 17, 2009, 6:44 AM
> hi 
> 
> i need help with this one.
> 
> i have this perl script that goes something like this:
> 
> my $fetchrow_stmt;
> 
> sub _fetchrow_stmt {
>       
>   my ($self,%arg) = @_;
>   my $stm = "select * from $arg{table}";
>   $fetchrow_stmt = $dbh->prepare($stm) || die
> $dbh->errstr;
>   $fetchrow_stmt->execute || die $dbh->errstr; 
>          
> }
>         
>  sub _return_row {
>               
> my ($self,%arg) = @_;
> return $fetchrow_stmt->fetchrow_arrayref();
> 
> 
>   }
>   
> sub _finish_stmt {
>       
>   my ($self,%arg) = @_;
>       
>  $fetchrow_stmt->finish();
>               
> }
> 
> the thing is that it's using memory like crazy, and
> the source of this
> behaviour (I think, not sure) is the buffering of query
> results from SQLite.
> so is there a way to limit that, so that the query results
> hold at most 2
> rows at a time (not the whole table)?
> 
> thanx
> 
> -- 
> View this message in context:
> http://www.nabble.com/control-buffering-of-query-results-tp22557409p22557409.html
> Sent from the SQLite mailing list archive at Nabble.com.
> 
> _______________________________________________
> sqlite-users mailing list
> sqlite-users@sqlite.org
> http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users
