Hi Filip, Barry.

Thank you, Barry, for the great help here :).

Filip, Barry is right: grabbing all 4000 docs is inefficient at best, and 
won't scale if you ever grow to more docs.

The blob is definitely a good idea: it will be quick to retrieve and 
display. With a cron job that rebuilds it every day/week, or a bit of code 
that regenerates the blob whenever the index changes, you'll always have 
up-to-date info on your files.
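
Here's a minimal sketch of what that rebuild job could look like, assuming 
a first-generation Python runtime with the bundled GCS client library; the 
index name, bucket path, and route are placeholders for whatever you 
actually use:

import json

import cloudstorage as gcs  # GCS client library bundled with the SDK
import webapp2
from google.appengine.api import search

INDEX_NAME = 'documents'              # placeholder index name
BLOB_PATH = '/your-bucket/docs.json'  # placeholder /bucket/object path


def rebuild_blob():
    """Page through the whole index and dump it into one JSON blob."""
    index = search.Index(name=INDEX_NAME)
    docs = []
    start_id = None
    while True:
        # get_range pages by doc_id, which is cheaper than running a query.
        response = index.get_range(start_id=start_id,
                                   include_start_object=start_id is None,
                                   limit=200)
        if not response.results:
            break
        for doc in response.results:
            docs.append({f.name: unicode(f.value) for f in doc.fields})
        start_id = response.results[-1].doc_id

    # Overwrite the blob in place; the front end fetches it by URL.
    with gcs.open(BLOB_PATH, 'w', content_type='application/json') as f:
        f.write(json.dumps(docs))


class RebuildBlobHandler(webapp2.RequestHandler):
    def get(self):  # cron targets are hit with GET requests
        rebuild_blob()


app = webapp2.WSGIApplication([('/tasks/rebuild_blob', RebuildBlobHandler)])

Point a cron.yaml entry at /tasks/rebuild_blob for the daily/weekly 
refresh, or call deferred.defer(rebuild_blob) (from 
google.appengine.ext.deferred) right after the code that updates the index.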

Then, whenever the customer selects a doc from that display, the search 
index will do what it's designed for: give you quick results :).
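
For that per-selection lookup, a hypothetical helper (reusing the index 
name assumed in the sketch above):

from google.appengine.api import search

def fetch_document(doc_id):
    """Fetch one document from the index by its ID."""
    # Returns the Document, or None if the ID no longer exists.
    return search.Index(name='documents').get(doc_id)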

Cheers!

On Wednesday, July 1, 2015 at 9:08:51 AM UTC-4, barryhunter wrote:
>
>  
>
>> so now I do all the filtering on the client side with excellent 
>> performance. 
>>
>
> Maybe you could just retrieve all the documents in a low-priority 'batch' 
> process, and store the intermediate results as a blob of text. Perhaps 
> put it as a JSON file into Cloud Storage. 
>
> The front end can just access it directly by URL, and the backend can just 
> fire off a job to rebuild the file whenever the documents in the index 
> update. 
