You should upload the file first, then run a scheduler task that reads it 
and stores the rows into the database in the background.
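
A minimal sketch of the background task (the table and column names here are hypothetical, and plain sqlite3 stands in for the web2py DAL): read the uploaded CSV in fixed-size batches and commit per batch, so memory stays flat and no single transaction runs too long.

```python
import csv
import sqlite3
from itertools import islice

BATCH_SIZE = 10_000  # rows inserted per transaction

def import_csv(db_path, csv_path):
    """Read the uploaded CSV in batches and bulk-insert each batch."""
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE IF NOT EXISTS records (name TEXT, value TEXT)")
    with open(csv_path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        while True:
            batch = list(islice(reader, BATCH_SIZE))
            if not batch:
                break
            conn.executemany(
                "INSERT INTO records (name, value) VALUES (?, ?)", batch)
            conn.commit()  # commit per batch, not per row
    conn.close()
```

In web2py you would queue this as a Scheduler task (scheduler.queue_task) from the upload controller, so the request returns immediately and the worker does the import.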

On Sunday, 5 October 2014 17:49:16 UTC-5, kenny c wrote:
>
> I've been trying to import a big CSV file into the database by running a 
> private script, but the system kills it for running too long.
>
> I am running PostgreSQL and I have also tried the COPY query. However, I 
> don't have an id column for over 10 million rows, and Excel just dies on 
> me if I try to input the id integers in the cells.
>
> Please let me know if there is a good way to import a big chunk of data 
> into the database.
>
> Thank you. 
>
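
On the missing id column: there is no need to generate ids in Excel. Declare the table with an auto-incrementing primary key and name only the data columns when loading, and the database fills the id itself. In PostgreSQL that means declaring the column SERIAL (or BIGSERIAL for 10M+ rows) and giving COPY an explicit column list, e.g. COPY records (name) FROM '/path/file.csv' WITH (FORMAT csv). A minimal sketch of the same idea, with sqlite3 standing in for PostgreSQL:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# INTEGER PRIMARY KEY auto-increments in SQLite;
# in PostgreSQL declare "id SERIAL PRIMARY KEY" instead.
conn.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, name TEXT)")
# Insert naming only the data columns -- the id is generated by the database.
conn.executemany("INSERT INTO records (name) VALUES (?)",
                 [("alice",), ("bob",)])
ids = [row[0] for row in conn.execute("SELECT id FROM records ORDER BY id")]
print(ids)  # the database assigned 1, 2 automatically
```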

-- 
Resources:
- http://web2py.com
- http://web2py.com/book (Documentation)
- http://github.com/web2py/web2py (Source code)
- https://code.google.com/p/web2py/issues/list (Report Issues)