Working on 1.
For 2, how do you "foresee" how much time and RAM the present query will 
take to serialize?
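
One possibility: count the rows, serialize a small sample, and extrapolate 
before committing to the full export. A minimal sketch against the web2py 
DAL, assuming a DAL instance 'db' and the grid's current 'query'; 
SAMPLE_SIZE and MAX_EXPORT_BYTES are illustrative values, not web2py 
settings:

    SAMPLE_SIZE = 100
    MAX_EXPORT_BYTES = 50 * 1024 * 1024  # arbitrary 50 MB ceiling

    def estimate_csv_bytes(db, query):
        """Extrapolate the CSV size of the full result set from a sample."""
        total = db(query).count()
        if total == 0:
            return 0
        sample = db(query).select(limitby=(0, min(SAMPLE_SIZE, total)))
        # str(rows) is web2py's CSV serialization of a Rows object
        return len(str(sample)) * total // len(sample)

If the estimate comes back above MAX_EXPORT_BYTES, the export could be 
refused or split into chunks, which is point 2 below.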

On Wednesday, November 7, 2012 11:25:29 AM UTC+1, Johann Spies wrote:
>
> On 7 November 2012 10:56, Niphlod <nip...@gmail.com> wrote:
>
>> Uhm, good point on "if you want the entire table, just remove the 
>> filters". How do you handle something that is impossible to handle 
>> (exporting a table with so many rows that it can't be done without 
>> timeouts or exhausting memory)? Just time out?
>>
>
> I had a situation on Webfaction where a customer downloaded a CSV file 
> (before the time of smartgrid) which put too much strain on the available 
> RAM, and the result was incorrect and inconsistent. In the end I had to 
> query the backend (PostgreSQL) directly over SSH to get the correct data.
>
> I suspect that one could apply something like 'limitby' to the CSV query 
> and split the download into several files if the result set exceeds a 
> certain ceiling (see the sketch after this message).
>
> I am working with datasets in which some tables contain millions of 
> records. 
>
> So there are two issues in this thread:
>
> 1.  The CSV-download buttons in smartgrid/grid should download the result 
> of the present query.
> 2.  Some safeguards should be built to prevent large datasets from 
> consuming too much memory or time.
>
> Regards
> Johann
> -- 
> Because experiencing your loyal love is better than life itself, 
> my lips will praise you.  (Psalm 63:3)
>
>
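
For the record, a minimal sketch of the 'limitby' idea quoted above: page 
through the result set in fixed-size windows so memory stays bounded. It 
assumes a web2py DAL instance 'db', a 'query', and a stable 'orderby' so 
the windows stay disjoint; CHUNK is an illustrative size, not a web2py 
default:

    CHUNK = 10000

    def csv_chunks(db, query, orderby):
        """Yield the result set as CSV text, one window at a time."""
        start = 0
        while True:
            rows = db(query).select(orderby=orderby,
                                    limitby=(start, start + CHUNK))
            if not rows:
                break
            text = str(rows)  # web2py serializes Rows to CSV, header included
            if start > 0:
                # drop the header line repeated on every chunk after the first
                text = text.split('\n', 1)[1]
            yield text
            start += CHUNK

Each window could be written to its own file, as Johann suggests, or the 
chunks concatenated into a single streamed response.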
