Marcello Giovagnoli writes:
> Hi Monty,
>
> > Why do you need to retrieve so many rows to the client? Can't you use
> > LIMIT to restrict the number of result rows?
>
> The point is that the program scans the entire table and processes each row
> in a different way, executing other queries inside the main SQLFetch() loop.
> Something like this:
>
> SQLExecute(hstmt1, query1);
> while (SQLFetchRow(hstmt1) == SQL_SUCCESS)
> {
>     ...................................
>     ...................................
>     SQLExecute(hstmt2, query2);
>     while (SQLFetchRow(hstmt2) == SQL_SUCCESS)
>     {
>         ......................
>     }
> }
>
> Perhaps the code should be rewritten to reflect the right way
> to write code that uses a database, but my source code doesn't
> use ODBC directly; it is embedded. I have something like this:
>
> dbSetKey(Db, "table1", "column1,column2");
> .............................................
> dbSetValue("table1.column1", value);
> ...........................................
> err = dbStart(Db, "table1");
> while (!err)
> {
>     .....................................
>     .....................................
>     err = dbStart(Db, "table2");
>     while (!err)
>     {
>         ....................................
>         dbNext(Db, "table2");
>     }
>     dbNext(Db, "table1");
> }
> dbStart() builds the appropriate query, with the WHERE condition and ORDER BY,
> using ODBC.
>
> Each running program has one connection to mysqld.
>
> There are up to 300 programs written in this way, and they
> work; I can't modify them in a short time.
>
> >
> > Are you sure you close the statement before doing another?
> > The reason this happens is that mysql_free_result() is not called
> > for the previous query, and there is still data on the connection
> > from the previous query.
> >
> > Executing mysql_free_result() will quickly discard all rows from the
> > server so that you can issue a new command against it.
> >
>
> Reading the manual :( it seems that I can do this ONLY using
> mysql_store_result(), because each connection can run
> only one query at a time. Is that correct?
>
> If so, I need some way to reduce client memory usage when
> processing big tables.
>
> I could implement something that builds a SELECT with LIMIT n1,n2 and
> re-runs the query when SQLFetchRow() fails.
>
> I don't know how other databases handle these cases.
>
> In my opinion, the right way to work around this problem would be
> to use a temporary file to store results for big tables, forcing this
> option with a parameter such as SQL_BIG_TABLES=1. In that case
> mysql_store_result() could store the result in a file, and I could issue
> SQLFreeResult() and build another query.
>
> What do you think?
>
>
> > Which version of MyODBC do you use?
>
> myodbc-2.50.22
>
> > If you have a very old MyODBC version which includes an old
> > mysqlclient lib, this may explain your problems; older
> > mysql_free_result() versions didn't read the old result from the
> > connection, and because of this one would get the error message you
> > got if one didn't read all rows from the result before calling
> > mysql_query() again.
>
>
> Thanks, Monty
>
> marcello
Hi!
The final solution for your problem is cursors, which are on the TODO for
our 4.* branch.
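For reference, the constraint that forces these workarounds (at most one unread result set per connection, until it is freed) can be modeled as a toy sketch. Conn here is a hypothetical stand-in, not the real libmysqlclient or ODBC API:

```cpp
#include <stdexcept>
#include <string>

// Toy model of the client-protocol rule this thread is about: a connection
// can have at most one unread result set; issuing a new query before the
// previous result is freed fails with "Commands out of sync".
class Conn {
public:
    Conn() : pending_(false) {}

    void query(const std::string& /*sql*/) {
        if (pending_)
            throw std::runtime_error("Commands out of sync");
        pending_ = true;            // server now has a result set to deliver
    }

    void free_result() { pending_ = false; }  // like mysql_free_result()
    bool pending() const { return pending_; }

private:
    bool pending_;
};
```

This is why the nested SQLFetchRow() loops break on one connection: the inner query is issued while the outer result is still pending.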
Meanwhile, you could cache the result set and use LIMIT n,m logic to
refresh the cache. I have some code written for mysqlgui that uses
that logic and connects to MySQL only when there are no rows left in the
cache.
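That LIMIT-refresh logic could be sketched roughly like this; RowCache and build_window_query are hypothetical names, and the row source is a stand-in for issuing the windowed query with mysql_query()/mysql_store_result() on a real connection:

```cpp
#include <cstdio>
#include <deque>
#include <functional>
#include <string>
#include <vector>

// Build "SELECT ... LIMIT offset,chunk" for the current window.
std::string build_window_query(const std::string& base, unsigned long offset,
                               unsigned long chunk) {
    char limit[64];
    std::snprintf(limit, sizeof(limit), " LIMIT %lu,%lu", offset, chunk);
    return base + limit;
}

// A cache refilled from the server in LIMIT-sized chunks; between refills
// the connection carries no pending result, so other queries can run on it.
class RowCache {
public:
    // A source returns up to `chunk` rows starting at `offset` (stand-in
    // for running the windowed query and storing its result).
    typedef std::function<std::vector<std::string>(unsigned long offset,
                                                   unsigned long chunk)> Source;

    RowCache(Source src, unsigned long chunk)
        : src_(src), chunk_(chunk), offset_(0) {}

    // Fetch the next row, refilling the cache when it runs dry.
    // Returns false once the source is exhausted.
    bool next(std::string& out) {
        if (cache_.empty()) {
            std::vector<std::string> rows = src_(offset_, chunk_);
            if (rows.empty()) return false;  // no more rows on the server
            offset_ += rows.size();          // advance the window
            cache_.assign(rows.begin(), rows.end());
        }
        out = cache_.front();
        cache_.pop_front();
        return true;
    }

private:
    Source src_;
    unsigned long chunk_;
    unsigned long offset_;
    std::deque<std::string> cache_;
};
```

Memory stays bounded by the chunk size rather than the full table, and because each windowed result is fully stored and freed before the next fetch, the nested per-row queries keep working on the same connection.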
Also, we have MySQL++ in the works, which uses the above logic but
stores the entire result set in an STL container with modest memory
usage.
Regards,
Sinisa
--
Sinisa Milivojevic <mailto:[EMAIL PROTECTED]>
MySQL AB, Developers Team
Larnaka, Cyprus
---------------------------------------------------------------------
Before posting, please check:
http://www.mysql.com/manual.php (the manual)
http://lists.mysql.com/ (the list archive)
To request this thread, e-mail <[EMAIL PROTECTED]>
To unsubscribe, e-mail <[EMAIL PROTECTED]>
Trouble unsubscribing? Try: http://lists.mysql.com/php/unsubscribe.php