Thanks Prakhar for this information.
Jacques
On 20/07/2020 at 10:12, Prakhar Kumar wrote:
Hi Prakhar,
Glad to know that this implementation helped you. Thanks for sharing the
details :)
--
Thanks & Regards
Pawan Verma
Technical Consultant
*HotWax Systems*
*Enterprise open source experts*
http://www.hotwaxsystems.com
On Mon, Jul 20, 2020 at 1:42 PM Prakhar Kumar wrote:
Hello Pawan,
We were having a hard time dealing with large datasets in our client
project. We were streaming data from MySQL using the FetchSize and
EntityListIterator, which helped us up to some point, but we ultimately
struggled as the data grew further. This is where the batch iterator
helped.
Hi Chandan, Jacques,
Thanks for your feedback.
Yes, to solve the problem of heavy entity operations consuming all the
system memory, we have implemented EntityBatchIterator. It was originally
designed for such heavy entity operations.
--
Thanks & Regards
Pawan Verma
Technical Consultant
*HotWax Systems*
Hi,
I have not looked into the details, but Chandan's advice sounds wise
to me.
Jacques
On 27/06/2020 at 13:43, Chandan Khandelwal wrote:
Hello Pawan,
The approach looks good. My only suggestion is to use batch processing only
when we are dealing with a large data set, as this method takes longer than
the normal method, especially in a distributed environment, which may
negatively impact performance.
Kind Regards,
Thanks, Pritam and Scott, for the discussion.
I've created Jira OFBIZ-11789 for this improvement and also opened a PR
with the proposed changes.
Please review the PR and share your thoughts on it.
Thanks!
--
Thanks & Regards
Pawan Verma
Technical Consultant
*HotWax Systems*
Thanks, Scott, for your detailed explanation.
The solution looks good to me too. My confusion was about why we would
implement a new method if we could achieve the same thing using the current
EntityQuery methods.
+1 for adding queryBatchIterator() to EntityQuery.
Kind Regards,
--
Pritam Kute
Hi Pritam,
I'm not sure about PostgreSQL or Derby, but I know that with MySQL using a
cursor doesn't really work. You have to set the result set to
TYPE_FORWARD_ONLY and CONCUR_READ_ONLY and also set the fetch size to
Integer.MIN_VALUE. Only then will the driver stream the results, and even
then,
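For readers following along, the MySQL streaming setup Scott describes can be sketched in plain JDBC. This is a hedged illustration, not OFBiz code: the connection handling, table, and column names are hypothetical, but the statement flags and fetch size are the ones named above.

```java
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class MySqlStreaming {

    // Configure a statement so MySQL Connector/J streams rows one at a time
    // instead of buffering the entire result set in client memory.
    static Statement streamingStatement(Connection conn) throws SQLException {
        Statement stmt = conn.createStatement(
                ResultSet.TYPE_FORWARD_ONLY,  // no backward scrolling
                ResultSet.CONCUR_READ_ONLY);  // no updates through the result set
        // The special value Integer.MIN_VALUE signals the MySQL driver
        // to stream results row by row.
        stmt.setFetchSize(Integer.MIN_VALUE);
        return stmt;
    }

    // Example usage; the table and column names are made up.
    static void processLargeTable(Connection conn) throws SQLException {
        try (Statement stmt = streamingStatement(conn);
             ResultSet rs = stmt.executeQuery("SELECT id FROM big_table")) {
            while (rs.next()) {
                long id = rs.getLong("id"); // only one row held in memory at a time
            }
        }
    }
}
```

Note that while a streaming result set is open, the connection cannot execute other statements, which is one of the caveats with this approach.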
Hello Pawan,
I just had a look at the EntityQuery.queryIterator() method, and it looks
like we can achieve this by using the fetchSize(), forwardOnly(),
cursorScrollInsensitive(), cursorScrollSensitive(), and offset() methods in
the EntityQuery class. Let me know if I am missing anything.
It will be good if
Hello Devs,
While working with a large database, we have figured out that very large
queries consume all the memory and crash OFBiz (because queryIterator()
doesn't really work; it's no different from queryList()).
The EntityListIterator attempts to use a cursor to iterate over large
result sets, but in
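The batch approach discussed in this thread can be sketched generically. This is a hypothetical illustration, not the actual OFBiz EntityBatchIterator API: the idea is to page through the result set with repeated offset/limit queries so that only one batch is ever held in memory.

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.Iterator;
import java.util.List;
import java.util.NoSuchElementException;
import java.util.function.BiFunction;

public class BatchIteratorSketch {

    /** Iterates a large result set one batch at a time via (offset, limit) queries. */
    static class BatchIterator<T> implements Iterator<T> {
        private final BiFunction<Integer, Integer, List<T>> pageQuery; // (offset, limit) -> rows
        private final int batchSize;
        private int offset = 0;
        private Iterator<T> current = Collections.emptyIterator();
        private boolean exhausted = false;

        BatchIterator(BiFunction<Integer, Integer, List<T>> pageQuery, int batchSize) {
            this.pageQuery = pageQuery;
            this.batchSize = batchSize;
        }

        @Override
        public boolean hasNext() {
            if (current.hasNext()) return true;
            if (exhausted) return false;
            List<T> page = pageQuery.apply(offset, batchSize); // fetch the next batch
            offset += batchSize;
            if (page.size() < batchSize) exhausted = true;     // a short page is the last one
            current = page.iterator();
            return current.hasNext();
        }

        @Override
        public T next() {
            if (!hasNext()) throw new NoSuchElementException();
            return current.next();
        }
    }

    public static void main(String[] args) {
        // Stand-in for a database: the "query" pages through an in-memory list.
        List<Integer> data = new ArrayList<>();
        for (int i = 0; i < 10; i++) data.add(i);
        BatchIterator<Integer> it = new BatchIterator<>(
                (off, lim) -> data.subList(Math.min(off, data.size()),
                                           Math.min(off + lim, data.size())),
                4); // only 4 rows held at a time
        List<Integer> out = new ArrayList<>();
        while (it.hasNext()) out.add(it.next());
        System.out.println(out); // all 10 rows, fetched in batches of 4, 4, 2
    }
}
```

The trade-off Chandan raises applies here: each batch is a separate round-trip to the database, so this is slower than a single query and only pays off when the result set is too large to hold in memory.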