I doubt that query compilation is the major cost here. The problem is that too
many records are being moved too often.
On Aug 15, 2011, at 10:23 PM, Lance Norskog goks...@gmail.com wrote:
The standard advice also applies: use stored procedures if you can. If
not, use
Yes, I also doubt that the cost of parsing a simple 'select a,b,c from x'
query matters compared to sending 80K records across the network.
On Tue, Aug 16, 2011 at 6:23 AM, Lance Norskog goks...@gmail.com wrote:
The standard advice also applies: use stored procedures if you can. If
not, use
Is there a way to selectively reload data from the database for a
user? That way, we wouldn't have to pull down 80K records on every
reload.
On Mon, Aug 15, 2011 at 1:59 PM, Sean Owen sro...@gmail.com wrote:
That's more reasonable. It sounds a bit long still but could believe
it is due to the
There isn't -- you could probably add that to your copy fairly easily. Just
clear the in-memory representation and reload what you want from the DB.
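The suggestion above can be sketched roughly as follows. The thread is about a JDBC-backed data model, but as a runnable illustration this uses Python's stdlib sqlite3; the table, column names, and the dict-shaped in-memory cache are all assumptions for the sketch, not the library's actual API.

```python
import sqlite3

# Toy stand-in for the database: a prefs table (names are hypothetical).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE prefs (user_id INTEGER, item_id INTEGER, pref REAL)")
conn.executemany(
    "INSERT INTO prefs VALUES (?, ?, ?)",
    [(1, 10, 3.0), (1, 11, 4.5), (2, 10, 2.0)],
)

cache = {}  # in-memory representation: user_id -> {item_id: pref}

def reload_user(user_id):
    """Clear and re-read just this user's rows, not the whole table."""
    rows = conn.execute(
        "SELECT item_id, pref FROM prefs WHERE user_id = ?", (user_id,)
    ).fetchall()
    cache[user_id] = dict(rows)

reload_user(1)
```

The point is only that the WHERE clause limits the refresh to one user's rows, so a per-user change never forces re-reading all 80K records.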
On Tue, Aug 16, 2011 at 7:34 PM, Salil Apte sa...@offlinelabs.com wrote:
Is there a way to selectively reload data from the database for a
user?
Something's very wrong there. 80K rows is tiny, and loads in a second
or so from a file. I think you want to figure out where the slow-down
is with some debugging, since I do not think it's the library. Is
something locking a table, excluding reads, for instance?
On Mon, Aug 15, 2011 at 8:02 PM,
I don't think it's the library either. 80K rows load very fast for us.
We did experience slow writes back to the database, but after we
disabled JDBC auto-commit, INSERTs got very fast again.
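The auto-commit fix described above amounts to wrapping the whole batch of INSERTs in a single transaction instead of committing per row (in JDBC, `connection.setAutoCommit(false)` followed by one `commit()`). A minimal sketch of the same pattern, using Python's stdlib sqlite3 with an illustrative table name:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ratings (user_id INTEGER, item_id INTEGER, pref REAL)")

rows = [(u, i, 1.0) for u in range(100) for i in range(10)]  # 1000 rows

# One transaction around the whole batch, committed once on exit --
# the equivalent of turning off auto-commit before a bulk INSERT.
with conn:
    conn.executemany("INSERT INTO ratings VALUES (?, ?, ?)", rows)

count = conn.execute("SELECT COUNT(*) FROM ratings").fetchone()[0]
```

With auto-commit on, each row would pay the cost of a full commit (typically a disk sync); batching them into one transaction pays that cost once.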
On Mon, Aug 15, 2011 at 3:50 PM, Sean Owen sro...@gmail.com wrote:
Something's very wrong there. 80K
Apologies, I typed that email without having had my coffee. I meant it
takes 10 seconds to reload.
On Mon, Aug 15, 2011 at 1:16 PM, Daniel Xiaodan Zhou
danith...@gmail.com wrote:
I don't think it's the library either. 80K rows load very fast for us.
We did experience slow writing back to the
That's more reasonable. It sounds a bit long still, but I could believe
it is due to the overhead of reading everything from the database.
It seems very expensive to reload everything on every user change --
that's not quite what it was meant for. Though I imagine you could get
away with it for
The standard advice also applies: use stored procedures if you can. If
not, use 'prepared statements', because they precompile the database
execution plan. Also, different DBs have their quirks with regard to
batching reads and returning them.
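The prepared-statement advice maps to JDBC's `PreparedStatement`; as a runnable sketch, Python's stdlib sqlite3 shows the same idea, since it reuses the compiled statement when the identical parameterized SQL string is executed again. The table and data here are illustrative assumptions.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE x (a INTEGER, b TEXT, c REAL)")
conn.executemany("INSERT INTO x VALUES (?, ?, ?)",
                 [(1, "one", 1.0), (2, "two", 2.0)])

# One parameterized statement, many executions: the SQL text is compiled
# once and reused, only the bound parameter changes per call.
sql = "SELECT a, b, c FROM x WHERE a = ?"
results = [conn.execute(sql, (a,)).fetchone() for a in (1, 2)]
```

Building a fresh SQL string per query (e.g. via string formatting) would defeat this reuse, as well as being an injection risk; the placeholder form is what gives the driver a cacheable plan.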
On Mon, Aug 15, 2011 at 1:59 PM, Sean Owen sro...@gmail.com