Hi,

    I am using the Perl DBI module to fetch data from a database.

  

    My database table contains more than 70 million entries.

    I am fetching the data using the following steps:

            1. $dbh = DBI->connect(...);

            2. $sth = $dbh->prepare("SELECT * FROM $TABLE");
               # the table contains more than 70 million entries

            3. $sth->execute;

            4. $sth->fetchrow_array, to fetch the data row by row
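The steps above can be sketched as below. This is only a minimal sketch: the DSN, credentials, and table name are placeholders, and mysql_use_result is specific to the DBD::mysql driver (it streams rows from the server instead of buffering the whole result set in client memory) -- it applies only if the database is MySQL.

```perl
use strict;
use warnings;
use DBI;

# Hypothetical DSN and credentials -- substitute your own.
my $dbh = DBI->connect("dbi:mysql:database=mydb;host=localhost",
                       "user", "password",
                       { RaiseError => 1 });

# mysql_use_result => 1 (DBD::mysql only) makes the driver stream
# rows from the server rather than buffering all 70M rows locally.
my $sth = $dbh->prepare("SELECT * FROM my_table",
                        { mysql_use_result => 1 });
$sth->execute;

while ( my @row = $sth->fetchrow_array ) {
    # process one row at a time; memory stays roughly constant
}

$sth->finish;
$dbh->disconnect;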




My question is: because of the large data volume (>70 million entries),
could a memory error (out-of-memory) occur? I ask because at some
points I observed memory utilization above 70%.

 

Basically I want to know whether the fetch call retrieves one row at a
time from the database, or whether the whole result set is first loaded
into memory and rows are fetched from there.

Is there a better approach that uses less memory? I tried reading
entries in chunks of 10,000; memory usage is low, but it is slow...
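For reference, chunked reading can be done with DBI's own fetchall_arrayref, whose second argument limits the number of rows returned per call. The chunk size of 10,000 here just mirrors the one mentioned above; the `last unless @$rows` guard is needed because the final call can return an empty (but true) array reference.

```perl
# After $sth->execute, fetch in batches of 10,000 rows per call.
# $rows is a reference to an array of row arrayrefs.
while ( my $rows = $sth->fetchall_arrayref(undef, 10_000) ) {
    last unless @$rows;           # stop when no rows remain
    for my $row (@$rows) {
        # process $row (an arrayref of column values)
    }
}
```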

 Thanks for your help

Thanks,
N Ravi


