Hello,
Although RAM may limit some operations, I truly doubt that it is actually RAM size that limits the maximum upload size. It would be incredibly foolish for a browser developer to 1) load all the data into RAM and 2) write the RAM contents to the socket. The proper technique, and I believe modern browser developers use it, is to stream from disk to the socket, perhaps through a small buffered reader.

Cheers,

- Markus Jelsma

Buyways B.V.              Technisch Architect
Friesestraatweg 215c      http://www.buyways.nl
9743 AD Groningen         Alg. 050-853 6600
KvK 01074105              Tel. 050-853 6620
Fax. 050-311 8124         Mob. 06-5025 8350
In: http://www.linkedin.com/in/markus17

On Tue, 2010-01-05 at 15:24 +0530, Agrawal, Vikas IN BLR SISL wrote:
> Hello Benoit,
>
> Thanks for your reply.
> I have 2 GB RAM. What is the minimum RAM required to upload such a big file?
>
> Regards,
> Vikas
>
> -----Original Message-----
> From: Benoit Chesneau [mailto:[email protected]]
> Sent: Tuesday, January 05, 2010 3:04 PM
> To: [email protected]
> Subject: Re: Large documents are not uploading.
>
> On Mon, Jan 4, 2010 at 1:32 PM, Agrawal, Vikas IN BLR SISL
> <[email protected]> wrote:
> > Hello All,
> >
> > I am new to CouchDB. I am trying to upload a .vdi file (Virtual Disk Image),
> > a disk image of a Sun VirtualBox machine. The size of my file is 3.5 GB.
> > When I try to upload this file the upload starts but never completes. An
> > 11 MB file works fine.
> >
> > I have also changed the values of max_document_size and
> > max_attachment_chunk_size in the default.ini file to 17 GB, but still no
> > progress.
> >
> > I am using CouchDB 0.10.0 on Windows.
>
> I don't think it's possible to upload a file of that size from the browser;
> you will be limited by RAM, I guess. Better to use a script. If you use
> couchdbkit you can stream the sending. But such an attachment size is a
> little big.
> It may be better to save it on the filesystem, or, if you want it in CouchDB,
> you can split the file into multiple chunks, save them in multiple docs, and
> then map them to retrieve the final file from CouchDB.
>
> - benoît
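Markus's point about streaming from disk to socket can be sketched in a few lines of Python. This is a hypothetical helper, not anything a browser actually ships: it copies a file to any writable sink (in practice, the body of an HTTP PUT) one small chunk at a time, so memory use stays constant whether the file is 11 MB or 3.5 GB. The function name and chunk size are my own illustration:

```python
def stream_file(path, sink, chunk_size=64 * 1024):
    """Copy a file to a writable sink in fixed-size chunks.

    Only one chunk (here 64 KiB by default) is ever held in memory,
    so a 3.5 GB upload needs no more RAM than an 11 MB one.
    Returns the total number of bytes written.
    """
    total = 0
    with open(path, "rb") as src:
        while True:
            chunk = src.read(chunk_size)
            if not chunk:
                break
            sink.write(chunk)
            total += len(chunk)
    return total
```

In a real upload the sink would be the request body of a PUT to the attachment URL (e.g. via `http.client.HTTPConnection` with chunked transfer encoding, or a client library that accepts a file object), rather than an in-memory buffer.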
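Benoit's split-into-chunks idea can also be sketched. The scheme below is one possible convention, not couchdbkit's API: each chunk becomes its own small document carrying a sequence number and a checksum, and the original file is rebuilt by fetching the docs and concatenating them in order. In real CouchDB usage the chunk bytes would be stored as an attachment rather than a document field, and the `_id` naming is purely illustrative:

```python
import hashlib

def split_into_docs(data, chunk_size=5 * 1024 * 1024):
    """Split a large blob into small per-chunk documents.

    Each doc records its position and a SHA-1 of its bytes so the
    original file can be reassembled in order and verified.
    """
    total = (len(data) + chunk_size - 1) // chunk_size
    docs = []
    for seq, offset in enumerate(range(0, len(data), chunk_size)):
        chunk = data[offset:offset + chunk_size]
        docs.append({
            "_id": "bigfile/%06d" % seq,   # illustrative id scheme
            "seq": seq,
            "total": total,
            "sha1": hashlib.sha1(chunk).hexdigest(),
            "data": chunk,  # in CouchDB this would be an attachment
        })
    return docs

def reassemble(docs):
    """Concatenate chunk docs back into the original bytes."""
    ordered = sorted(docs, key=lambda d: d["seq"])
    for d in ordered:
        # verify each chunk before joining
        assert hashlib.sha1(d["data"]).hexdigest() == d["sha1"]
    return b"".join(d["data"] for d in ordered)
```

Because each document stays small, every individual PUT is cheap, failed uploads can be retried per chunk, and the "map them to retrieve the final file" step is just a fetch-and-sort over the `seq` field.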
