Here is the situation. I am using a cluster on which MPI splits the processing across different processors. I was using PostgreSQL, but since the data was on a different server from the one running the code, the connection overhead to the PostgreSQL server was slowing down the simulation. So I switched to on-disk database software and chose SQLite. I first ran the code without MPI on the server, and it worked (compiled with g++). Then I switched to MPI (mpicc compiler), and it does not run even for one node (on which I am using one thread). It gives the error "database or disk is full". Further debugging revealed that sqlite3_step() was returning 1 (SQLITE_ERROR), which means "SQL error or missing database". Yet it was able to get the number of columns with the same database and SQL query. I don't understand why I am getting this error.
Gaurav gavyas wrote:
>
> It's working fine with the g++ compiler but not with mpicc!
>
> Dan Kennedy-4 wrote:
>>
>> On 11/22/2011 09:48 AM, gavyas wrote:
>>>
>>> I checked the code again and debugged it. The code gives the error when
>>> I pass the query "SELECT * FROM households ORDER BY zone_id LIMIT 10000
>>> OFFSET 0", but it runs successfully for LIMIT 5000. I don't understand
>>> whether there is an upper limit on LIMIT, but the table "households" has
>>> 10000 rows. The code didn't run for LIMIT 9999 either.
>>
>> This is a unix system, correct?
>>
>> It's worth trying 3.7.9 if you are using something older than
>> that. There have been a fix or two regarding interrupts during
>> write() system calls over the last few months.
>>
>> If you're already on 3.7.9, try adding some debugging code
>> to SQLite to print out errno and call perror() right before
>> the "return SQLITE_FULL;" line in function unixWrite().
>> Line 27722 of the 3.7.9 amalgamation on the website. Maybe
>> there is some other error code we need to retry writes following.
>>
>> Dan.
>>
>>> Simon Slavin-3 wrote:
>>>>
>>>> On 21 Nov 2011, at 11:09pm, gavyas wrote:
>>>>
>>>>> I am able to run the code successfully when I don't use parallel runs.
>>>>> It gives the error when I run the code in parallel.
>>>>
>>>> Ahha. That's a more useful diagnostic. If you haven't already, read
>>>> these:
>>>>
>>>> <http://www.sqlite.org/threadsafe.html>
>>>> <http://stackoverflow.com/questions/524797/python-sqlite-and-threading>
>>>>
>>>> but I can't comment on multi-threading from my own use. I hope someone
>>>> else can.
>>>>
>>>> Simon.
--
View this message in context: http://old.nabble.com/SQLite%3A-Database-or-disk-full-tp32871505p32872083.html
Sent from the SQLite mailing list archive at Nabble.com.
_______________________________________________
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users