Re: Fetching large number of records

2016-12-12 Thread vkulichenko
Anil,

You're setting a limit of 100 on the result set rows, that's why you see only 100
rows:

statement.setMaxRows(100);
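
For reference, a minimal sketch of the same loop without the row cap (the connection URL is left as a placeholder, and the table/column names are just copied from your snippet). setMaxRows() caps the total number of rows a statement may ever return, while the standard JDBC setFetchSize() hint only tunes how many rows are pulled per page:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class FetchAllRows {
    public static void main(String[] args) throws Exception {
        // Placeholder URL: substitute your actual Ignite JDBC connection string.
        try (Connection conn = DriverManager.getConnection("<jdbc-url>")) {
            String sql = "select count from Test_counts where id is not null limit 1000";

            try (PreparedStatement st = conn.prepareStatement(sql)) {
                // No st.setMaxRows(100): that caps the total rows returned.
                // setFetchSize() is only a paging hint, not a cap.
                st.setFetchSize(200);

                try (ResultSet rs = st.executeQuery()) {
                    while (rs.next())
                        System.out.println(rs.getString("count"));
                }
            }
        }
    }
}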

-Val



--
View this message in context: 
http://apache-ignite-users.70518.x6.nabble.com/Fetching-large-number-of-records-tp9267p9491.html
Sent from the Apache Ignite Users mailing list archive at Nabble.com.


Re: Fetching large number of records

2016-12-09 Thread Anil
Hi Val,

It seems ResultSet#next() is not fetching the next page.

I tried the following snippet:

JdbcConnection conn = (JdbcConnection)DriverManager.getConnection("");

String sql = "select count from Test_counts where id is not null limit 1000";

PreparedStatement statement = conn.prepareStatement(sql);
statement.setMaxRows(100);

int i = 0;
ResultSet rs = statement.executeQuery();
while (rs.next() && i++ < 1000) {
    System.out.println(rs.getString("count"));
}

I see only 100 records, while 1000 records are expected.

Do you see any issue with the above snippet? Thanks.


Thanks


On 3 December 2016 at 08:49, Anil  wrote:

> Hi Val,
>
> Thanks for the clarification. I think I understand now and will give it a try.
>
> Thanks.
>
> On 2 December 2016 at 23:17, vkulichenko 
> wrote:
>
>> Anil,
>>
>> The JdbcQueryTask is executed each time the next page is needed. And the
>> number of rows returned by the task is limited by fetchSize:
>>
>> if (rows.size() == fetchSize) // If fetchSize is 0 then unlimited
>> break;
>>
>> The cursor is cached and reused there, so if this task is executed twice
>> for the same result set, it will not execute the query from scratch, but
>> will get the existing cursor and start iteration from where it finished on
>> the first invocation.
>>
>> I'm not completely sure that I correctly understand what you mean by
>> streaming here, but paging is definitely in place and that's how it works
>> now.
>>
>> -Val
>>
>>
>>
>> --
>> View this message in context: http://apache-ignite-users.705
>> 18.x6.nabble.com/Fetching-large-number-of-records-tp9267p9373.html
>> Sent from the Apache Ignite Users mailing list archive at Nabble.com.
>>
>
>


Re: Fetching large number of records

2016-12-02 Thread Anil
Hi Val,

Thanks for the clarification. I think I understand now and will give it a try.

Thanks.

On 2 December 2016 at 23:17, vkulichenko 
wrote:

> Anil,
>
> The JdbcQueryTask is executed each time the next page is needed. And the
> number of rows returned by the task is limited by fetchSize:
>
> if (rows.size() == fetchSize) // If fetchSize is 0 then unlimited
> break;
>
> The cursor is cached and reused there, so if this task is executed twice
> for the same result set, it will not execute the query from scratch, but
> will get the existing cursor and start iteration from where it finished on
> the first invocation.
>
> I'm not completely sure that I correctly understand what you mean by
> streaming here, but paging is definitely in place and that's how it works
> now.
>
> -Val
>
>
>
> --
> View this message in context: http://apache-ignite-users.
> 70518.x6.nabble.com/Fetching-large-number-of-records-tp9267p9373.html
> Sent from the Apache Ignite Users mailing list archive at Nabble.com.
>


Re: Fetching large number of records

2016-12-02 Thread vkulichenko
Anil,

The JdbcQueryTask is executed each time the next page is needed. And the
number of rows returned by the task is limited by fetchSize:

if (rows.size() == fetchSize) // If fetchSize is 0 then unlimited
break;

The cursor is cached and reused there, so if this task is executed twice for
the same result set, it will not execute the query from scratch, but will
get the existing cursor and start iteration from where it finished on the
first invocation.
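
To make that mechanism concrete, here is a small standalone sketch (plain Java with invented names, not the actual Ignite classes): the cursor is created once and cached, and each call drains at most fetchSize rows from it, so a second call resumes where the first one stopped.

import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

// Illustration only, not Ignite source: a cached cursor drained one page per call.
public class PagedCursorDemo {
    static Iterator<Integer> cachedCursor; // created on the first call, then reused

    static List<Integer> nextPage(int fetchSize) {
        if (cachedCursor == null) { // "execute the query" only once
            List<Integer> all = new ArrayList<>();
            for (int i = 0; i < 25; i++)
                all.add(i);
            cachedCursor = all.iterator();
        }

        List<Integer> rows = new ArrayList<>();
        while (cachedCursor.hasNext()) {
            rows.add(cachedCursor.next());
            if (rows.size() == fetchSize) // if fetchSize is 0 then unlimited
                break;
        }
        return rows; // at most one page is built per call
    }

    public static void main(String[] args) {
        List<Integer> page;
        while (!(page = nextPage(10)).isEmpty()) // each call resumes where the last stopped
            System.out.println("page: " + page);
    }
}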

I'm not completely sure that I correctly understand what you mean by
streaming here, but paging is definitely in place and that's how it works
now.

-Val



--
View this message in context: 
http://apache-ignite-users.70518.x6.nabble.com/Fetching-large-number-of-records-tp9267p9373.html
Sent from the Apache Ignite Users mailing list archive at Nabble.com.


Re: Fetching large number of records

2016-12-01 Thread Anil
Hi Val,

I think no. Correct me if I am wrong.

I was looking at the following code pieces:

1. JdbcResultSet#next()

try {
    JdbcQueryTask.QueryResult res =
        loc ? qryTask.call() :
            ignite.compute(ignite.cluster().forNodeId(nodeId)).call(qryTask);

    finished = res.isFinished();

    it = res.getRows().iterator();

    return next();
}

res.getRows() returns all the results, and all of them are in memory. Correct?

2. JdbcQueryTask#call

List<List<?>> rows = new ArrayList<>();

for (List<?> row : cursor) {
    List<Object> row0 = new ArrayList<>(row.size());

    for (Object val : row)
        row0.add(JdbcUtils.sqlType(val) ? val : val.toString());

    rows.add(row0);

    if (rows.size() == fetchSize) // If fetchSize is 0 then unlimited
        break;
}

All cursor rows are fetched and sent to #1.

The next() method is just an iterator over the available rows. Agree? This is
not streaming from server to client. Am I wrong?

Thanks

On 2 December 2016 at 04:11, vkulichenko 
wrote:

> Anil,
>
> While you iterate through the ResultSet on the client, it will fetch
> results
> from server in pages. Are you looking for something else?
>
> -Val
>
>
>
> --
> View this message in context: http://apache-ignite-users.
> 70518.x6.nabble.com/Fetching-large-number-of-records-tp9267p9342.html
> Sent from the Apache Ignite Users mailing list archive at Nabble.com.
>


Re: Fetching large number of records

2016-12-01 Thread vkulichenko
Anil,

While you iterate through the ResultSet on the client, it will fetch results
from server in pages. Are you looking for something else?

-Val



--
View this message in context: 
http://apache-ignite-users.70518.x6.nabble.com/Fetching-large-number-of-records-tp9267p9342.html
Sent from the Apache Ignite Users mailing list archive at Nabble.com.


Re: Fetching large number of records

2016-12-01 Thread Anil
Hi Val,

Yes, results are fetched in pages. But each query execution will return the
results of one page, correct?

I am checking whether there is a way to make the query parser stream directly
available to the client through the JDBC driver. Hope this is clear.

Thanks

On 1 December 2016 at 04:34, vkulichenko 
wrote:

> Anil,
>
> JDBC driver actually uses the same Ignite API under the hood, so the
> results
> are fetched in pages as well.
>
> As for the query parser question, I didn't quite understand what you mean.
> Can you please give more details?
>
> -Val
>
>
>
> --
> View this message in context: http://apache-ignite-users.
> 70518.x6.nabble.com/Fetching-large-number-of-records-tp9267p9313.html
> Sent from the Apache Ignite Users mailing list archive at Nabble.com.
>


Re: Fetching large number of records

2016-11-30 Thread vkulichenko
Anil,

JDBC driver actually uses the same Ignite API under the hood, so the results
are fetched in pages as well.

As for the query parser question, I didn't quite understand what you mean.
Can you please give more details?

-Val



--
View this message in context: 
http://apache-ignite-users.70518.x6.nabble.com/Fetching-large-number-of-records-tp9267p9313.html
Sent from the Apache Ignite Users mailing list archive at Nabble.com.


Re: Fetching large number of records

2016-11-29 Thread Anil
Thanks Val

Unfortunately, I have created a centralized Ignite cache cluster and use an
Ignite JDBC connection to fetch cache entries.

I could not find any method which uses the query parser in the JDBC
connection. Can we access the query parser using a JDBC connection?

Thanks

On 30 November 2016 at 00:02, vkulichenko 
wrote:

> Hi Anil,
>
> While you iterate through the QueryCursor, the results will be fetched in
> pages and you will have only one page at a time in local client memory. The
> page size can be controlled via Query.setPageSize() method.
>
> -Val
>
>
>
> --
> View this message in context: http://apache-ignite-users.
> 70518.x6.nabble.com/Fetching-large-number-of-records-tp9267p9272.html
> Sent from the Apache Ignite Users mailing list archive at Nabble.com.
>


Re: Fetching large number of records

2016-11-29 Thread vkulichenko
Hi Anil,

While you iterate through the QueryCursor, the results will be fetched in
pages and you will have only one page at a time in local client memory. The
page size can be controlled via Query.setPageSize() method.
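
For illustration, a minimal sketch of that pattern (the config file path, cache name, and query text are assumptions, not taken from this thread): only the current page of rows is held on the client while the cursor is iterated.

import java.util.List;

import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteCache;
import org.apache.ignite.Ignition;
import org.apache.ignite.cache.query.QueryCursor;
import org.apache.ignite.cache.query.SqlFieldsQuery;

public class CursorPagingExample {
    public static void main(String[] args) {
        // Assumptions: a client config file and a cache named "Test" already exist.
        try (Ignite ignite = Ignition.start("client-config.xml")) {
            IgniteCache<Object, Object> cache = ignite.cache("Test");

            SqlFieldsQuery qry = new SqlFieldsQuery(
                "select count from Test_counts where id is not null");
            qry.setPageSize(1024); // rows fetched from the servers per page

            try (QueryCursor<List<?>> cursor = cache.query(qry)) {
                for (List<?> row : cursor) // pages are fetched lazily as you iterate
                    System.out.println(row.get(0));
            }
        }
    }
}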

-Val



--
View this message in context: 
http://apache-ignite-users.70518.x6.nabble.com/Fetching-large-number-of-records-tp9267p9272.html
Sent from the Apache Ignite Users mailing list archive at Nabble.com.


Fetching large number of records

2016-11-29 Thread Anil
Hi,

I have to implement export functionality for the results (which can be a large
number of records) of an Ignite query.

Is there any way to get the query results in a streaming manner? Are there any
better ways to achieve this instead of invoking the query a number of times
with offset and skip?
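
For reference, a sketch of the approach the replies above converge on (the connection URL, query, and output file are placeholders): iterate the ResultSet once and write each row out as you go, relying on the driver's internal paging to keep client memory bounded, instead of re-running the query with offset/limit.

import java.io.PrintWriter;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class ExportExample {
    public static void main(String[] args) throws Exception {
        // Placeholders: substitute the real JDBC URL, query and output path.
        try (Connection conn = DriverManager.getConnection("<jdbc-url>");
             Statement st = conn.createStatement();
             ResultSet rs = st.executeQuery("select id, count from Test_counts");
             PrintWriter out = new PrintWriter(Files.newBufferedWriter(Paths.get("export.csv")))) {

            // Single pass: each row is written as it arrives, so only the
            // current page of results is held in client memory.
            while (rs.next())
                out.println(rs.getString("id") + "," + rs.getString("count"));
        }
    }
}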

Thanks.