Argh, sorry, I missed your LRP (Long Running Process).

On Tue, Feb 2, 2010 at 11:55 PM, Sylvain Pointeau <
sylvain.point...@gmail.com> wrote:

> Please just don't forget that SQLite locks the file on each update, so
> running multiple processes will not improve the speed at all; I think it
> will even be worse.
> The only improvement you can make is to group your updates into a single
> transaction, and to avoid running one process (sqlite3) for just one update.
>
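Just to make the "single transaction" point concrete, here is a rough sketch in
Python (the database file "example.db", the table "results" with its columns,
and the list of updates are all made up for illustration):

import sqlite3

# Illustrative only: pretend a table results(id INTEGER PRIMARY KEY, value TEXT)
# and a batch of (value, id) pairs to apply.
updates = [("done", 1), ("done", 2), ("done", 3)]

conn = sqlite3.connect("example.db")
with conn:  # one transaction: begins here, commits once at the end
    conn.executemany("UPDATE results SET value = ? WHERE id = ?", updates)
conn.close()

One connection, one transaction, one commit, instead of launching a separate
sqlite3 process per row.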
> On Tue, Feb 2, 2010 at 10:23 PM, Robert Citek <robert.ci...@gmail.com>wrote:
>
>> On Tue, Feb 2, 2010 at 3:13 PM, John Elrick <john.elr...@fenestra.com>
>> wrote:
>> > Robert Citek wrote:
>> >> Are there some white papers or examples of how to do updates in
>> >> parallel using sqlite?
>> >
>> > I could be misunderstanding your requirements, but this sounds a little
>> > like Map Reduce:
>> >
>> > http://labs.google.com/papers/mapreduce.html
>>
>> Not sure, but quite possibly.  I'm reading up more on MapReduce.
>>
>> > The only point I'd question is your assertion that you could speed up
>> > the overall time by running more than one long running process at the
>> > same time.  You *might* be able to do so up to the limit of the cores in
>> > the machine or by distributing the load over many machines, however, the
>> > implication to me of a long running process is something that is
>> > consuming large amounts of CPU time.
>>
>> What I mean is that the long_running_process (LRP) takes a long time to
>> run relative to updating a record in an SQLite database.  The LRP's
>> bottleneck could be CPU, I/O, network lag, or something else.  I'm
>> also assuming that if multiple LRPs are running, they will compete
>> only negligibly with each other for resources.  For example, if the
>> LRP is CPU-limited and the machine has only 4 cores, then there will
>> be at most 4 LRPs running at any given time.
>>
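That division of labor is easy to sketch in Python, as a rough illustration
only: do_long_running_process() below is a made-up stand-in for the real LRP,
and the database file, table, and columns are invented.  The workers run the
slow part in parallel, capped at the core count, while the parent applies all
updates serially in one SQLite transaction, so the writes never contend for
the lock.

import multiprocessing
import sqlite3

def do_long_running_process(record_id):
    # stand-in for the expensive CPU/I/O/network work
    return (record_id, "some result")

if __name__ == "__main__":
    ids = range(100)  # illustrative work items
    # at most 4 LRPs at a time, matching a 4-core machine
    with multiprocessing.Pool(processes=4) as pool:
        results = pool.map(do_long_running_process, ids)

    # all database writes happen here, serially, in a single transaction
    conn = sqlite3.connect("example.db")
    with conn:
        conn.executemany(
            "UPDATE results SET value = ? WHERE id = ?",
            [(value, record_id) for record_id, value in results],
        )
    conn.close()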
>> > It is possible that running
>> > multiple processes per processor could actually increase the total
>> > amount of time due to process swap overhead.
>>
>> The ideal would be a general framework that works on a single CPU, on
>> multiple CPUs/cores, and across multiple machines.
>>
>> A Google search for MapReduce led to this project:
>>
>> http://github.com/erikfrey/bashreduce
>>
>> I'll probably give that a try, if for no other reason than to get more
>> familiar with MapReduce.
>>
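The basic shape of map/reduce is small enough to sketch in a few lines of
Python; the toy word count below has nothing to do with bashreduce itself, it
just shows the pattern of a parallel map step followed by a serial reduce step.

import multiprocessing
from collections import defaultdict

def map_phase(line):
    # map step: emit (word, 1) for every word in one line of input
    return [(word, 1) for word in line.split()]

def reduce_phase(mapped):
    # reduce step: group by key and sum the counts
    counts = defaultdict(int)
    for pairs in mapped:
        for word, n in pairs:
            counts[word] += n
    return dict(counts)

if __name__ == "__main__":
    lines = ["the quick brown fox", "the lazy dog", "the end"]
    with multiprocessing.Pool() as pool:
        mapped = pool.map(map_phase, lines)  # map in parallel
    print(reduce_phase(mapped))              # reduce serially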
>> Thanks, John.
>>
>> Regards,
>> - Robert
>>
>
>
_______________________________________________
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users
