Jame:

You control the number via settings in solrconfig.xml, so it's
up to you.
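
For reference, the relevant settings in solrconfig.xml look something like
the snippet below. The element names are the standard ones, but the values
here are illustrative only, not recommendations:

    <!-- Caches the ordered document ids for recent queries. -->
    <queryResultCache class="solr.LRUCache"
                      size="512"
                      initialSize="512"
                      autowarmCount="0"/>

    <!-- How many results to fetch and cache per query,
         e.g. a few pages' worth. -->
    <queryResultWindowSize>50</queryResultWindowSize>

    <!-- Upper bound on documents cached for a single query. -->
    <queryResultMaxDocsCached>200</queryResultMaxDocsCached>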

Jonathan:
Hmmm, that seems right; after all, the "deep paging" penalty is really
about keeping a large sorted array in memory... but at least you only
pay it once per 10,000 docs, rather than 100 times (assuming a page size
of 100)...
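
A minimal page-by-page loop with SolrJ would look something like the
sketch below (class names are from the 3.x-era client, and the URL and
query string are placeholders; adapt both to your own setup):

    import org.apache.solr.client.solrj.SolrQuery;
    import org.apache.solr.client.solrj.SolrServer;
    import org.apache.solr.client.solrj.impl.CommonsHttpSolrServer;
    import org.apache.solr.client.solrj.response.QueryResponse;
    import org.apache.solr.common.SolrDocumentList;

    public class PageThroughResults {
        public static void main(String[] args) throws Exception {
            // Placeholder URL -- point this at your own Solr instance.
            SolrServer server =
                new CommonsHttpSolrServer("http://localhost:8983/solr");

            int pageSize = 1000;  // "rows" per request
            int start = 0;
            long numFound;

            do {
                SolrQuery query = new SolrQuery("*:*"); // your real query here
                query.setStart(start);
                query.setRows(pageSize);

                QueryResponse rsp = server.query(query);
                SolrDocumentList page = rsp.getResults();
                numFound = page.getNumFound();

                // ... process this page of documents ...

                start += pageSize;  // advance "start" by the page size
            } while (start < numFound);
        }
    }

Each iteration re-executes the search unless the queryResultCache window
covers the requested range, which is where the settings above come in.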

Best
Erick

On Wed, Aug 10, 2011 at 10:58 AM, jame vaalet <jamevaa...@gmail.com> wrote:
> when you say queryResultCache, does it cache the first n results for only
> the last query, or for more than one query?
>
>
> On 10 August 2011 20:14, simon <mtnes...@gmail.com> wrote:
>
>> Worth remembering that there are some performance penalties with deep
>> paging if you use the page-by-page approach. It may not be too much of a
>> problem if you really are only looking to retrieve 10K docs.
>>
>> -Simon
>>
>> On Wed, Aug 10, 2011 at 10:32 AM, Erick Erickson
>> <erickerick...@gmail.com> wrote:
>> > Well, if you really want to, you can specify start=0 and rows=10000 and
>> > get them all back at once.
>> >
>> > You can do page-by-page by incrementing the "start" parameter as you
>> > indicated.
>> >
>> > You can keep from re-executing the search by setting your
>> > queryResultCache appropriately, but this affects all searches so it
>> > might be an issue.
>> >
>> > Best
>> > Erick
>> >
>> > On Wed, Aug 10, 2011 at 9:09 AM, jame vaalet <jamevaa...@gmail.com> wrote:
>> >> Hi,
>> >> I want to retrieve all the data from solr (say 10,000 ids) and my page
>> >> size is 1000.
>> >> How do I get the data back page by page? Do I have to increment the
>> >> "start" value by the page size each time, starting from 0, and iterate?
>> >> In that case, am I querying the index 10 times instead of once, or will
>> >> the results be cached somewhere after the first query for the subsequent
>> >> pages?
>> >>
>> >>
>> >> JAME VAALET
>> >>
>> >
>>
>
>
>
> --
>
> -JAME
>
