On 02.07.2017 at 02:16, PICCORO McKAY Lenz wrote:
>> If you query such a lot of tuples into Sqlite you won't make it
>> better, I think. Also a collection seems to be not very fast.
>>
> Tested, and it is much slower... you are right: in-memory sqlite is
> faster, but passing the rows over to sqlite is still a slow process.
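The staging approach tested above (copying the ODBC result into an in-memory SQLite table) can be sketched as follows. This is a minimal illustration using Python's stdlib `sqlite3` as a stand-in for the Gambas/ODBC side; the table, column names, and row data are made up. The point is that a single batched `executemany` inside one transaction is far cheaper than 200,000 individual INSERT statements, which may be where the "slow process passed to sqlite" cost comes from.

```python
import sqlite3

# Stand-in for the 200,000-row ODBC result set discussed in the thread.
# Column names (id, name) are hypothetical.
rows = [(i, "name%d" % i) for i in range(200_000)]

db = sqlite3.connect(":memory:")  # in-memory staging database
db.execute("CREATE TABLE cube (id INTEGER, name TEXT)")

# One executemany inside a single transaction: the per-statement overhead
# is paid once, not 200,000 times.
db.executemany("INSERT INTO cube VALUES (?, ?)", rows)
db.commit()

count = db.execute("SELECT COUNT(*) FROM cube").fetchone()[0]
print(count)  # 200000
```

Once staged like this, the data supports counting and repeated scans, which the forward-only ODBC cursor does not.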
2017-07-01 10:59 GMT-04:30 Christof Thalhofer:
> From what database do you query "select * from table" with ODBC?
>
Sybase and Oracle. A proprietary ODBC module does the whole job very
well, but I need to use open source, and FreeTDS in combination with
Gambas lacks a …
On 01.07.2017 at 12:35, PICCORO McKAY Lenz wrote:
> hi Christof, the query is just "select * from table", but "table" is a
> "cube" of the data warehouse.. so i want to build something similar to
> BusinessObjects.. so getting 200,000 rows is no surprise on the desktop..
For a data warehouse …
2017-07-01 6:38 GMT-04:30 Fernando Cabral:
> I think you should be more specific. Instead of saying "the real
> problem is the lack of gambas to handle many DB features", let us
> know which those [lacking] features are.
>
Already explained, and bugs filed at …
2017-07-01 6:30 GMT-03:00 PICCORO McKAY Lenz:
> all of those questions are not relevant; the real problem is that
> gambas cannot handle many DB features over the ODBC connection..
>
I think you should be more specific. Instead of saying "the real problem
is the lack …
hi Christof, the query is just "select * from table", but "table" is a
"cube" of the data warehouse.. so i want to build something similar to
BusinessObjects.. so getting 200,000 rows is no surprise on the desktop..
The other problem that forces me to fetch so many rows is the lack of …
On 30.06.2017 at 00:57, PICCORO McKAY Lenz wrote:
> i'm talking about 200,000 rows in a result... the problem is that the
> odbc db object only supports a forward-only cursor..
Show us your query. What do you need 200,000 rows for? That's way too
much if you want to visualize anything.
Thanks in advance, adamnt42. I need to convert the result because of
missing odbc features...
2017-07-01 3:06 GMT-04:30 adamn...@gmail.com:
> Well, 30 minutes does sound very excessive. Are you certain that it's
> the "unmarshalling" that is taking the time and not the …
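The question above (is it the query itself or the row "unmarshalling" that costs 30 minutes?) can be answered by timing the two phases separately. A minimal sketch, again using stdlib `sqlite3` as a stand-in for the Gambas/ODBC driver; the table and sizes are illustrative, and on a real network database the `execute` phase would carry the server-side query cost:

```python
import sqlite3
import time

# Build an illustrative table to time against.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE t (a INTEGER)")
db.executemany("INSERT INTO t VALUES (?)", [(i,) for i in range(100_000)])
db.commit()

t0 = time.perf_counter()
cur = db.execute("SELECT * FROM t")   # statement/query phase
t1 = time.perf_counter()
rows = cur.fetchall()                 # row transfer and conversion phase
t2 = time.perf_counter()

print(f"execute: {t1 - t0:.4f}s, fetch: {t2 - t1:.4f}s, rows: {len(rows)}")
```

If the fetch phase dominates, the bottleneck really is copying/converting rows on the client, which matters on a 1 GB RAM single-core Atom; if the execute phase dominates, the server or driver is the place to look.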
On Fri, 30 Jun 2017 08:41:49 -0400
PICCORO McKAY Lenz wrote:
> it takes me more than 30 minutes, because i must process it on a
> low-end machine, not your 4-core, 16 GB RAM super power machine..
> i'm talking about a 1 GB RAM, single-core 1.6 GHz Atom cpu
>
> i need to convert from …
It takes me more than 30 minutes, because I must process it on a low-end
machine, not your 4-core, 16 GB RAM super power machine.. I'm talking
about a 1 GB RAM, single-core 1.6 GHz Atom CPU.
I need to convert from the Result/cursor to something else, because of
the odbc lack of cursor/count..
I'm thinking about using …
On Thu, 29 Jun 2017 18:57:29 -0400
PICCORO McKAY Lenz wrote:
> can i convert, directly or faster than copying each row, a Result from
> the database to a Collection or a Variant matrix?
>
> i'm talking about 200,000 rows in a result... the problem is that the
> odbc db …
Can I convert, directly or faster than copying each row, a Result from
the database to a Collection or a Variant matrix?
I'm talking about 200,000 rows in a result... the problem is that the
odbc db object only supports a forward-only cursor..
So with a matrix or a collection I can't emulate the …
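The conversion asked about here (materializing a forward-only cursor into a matrix so that a row count and random access become possible) can be sketched as a single batched pass. This is a hedged Python illustration with stdlib `sqlite3` standing in for the Gambas ODBC Result, and made-up table/column names; `fetchmany` copies rows batch by batch rather than one at a time, which is the cheap way to do the one allowed forward pass:

```python
import sqlite3

# Illustrative source table standing in for the ODBC "cube".
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE cube (id INTEGER, val TEXT)")
db.executemany("INSERT INTO cube VALUES (?, ?)",
               [(i, str(i)) for i in range(1000)])

# A forward-only cursor can only be walked once, so materialize it
# into a matrix (list of row tuples) in that single pass.
cur = db.execute("SELECT * FROM cube")
matrix = []
while True:
    batch = cur.fetchmany(256)   # copy rows in batches, not one by one
    if not batch:
        break
    matrix.extend(batch)

print(len(matrix))   # the row count the forward-only cursor could not give up front
print(matrix[500])   # random access, now possible after materializing
```

The trade-off is memory: 200,000 materialized rows must fit in RAM, which is exactly why the thread also considers staging into an in-memory SQLite table instead.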