I know this is not an I/O-bound problem. I am creating heavy objects in the
child processes, adding these objects to a queue, and then retrieving them
in my main program from that queue.

You can test this with the sample code below:
import time
from multiprocessing import Process, Queue

class Data(object):
    def __init__(self):
        # a large list makes the object expensive to create and to pickle
        self.y = range(1, 1000000)

def getdata(queue):
    # runs in the child process; putting the object on the queue
    # pickles it and sends it back to the parent
    data = Data()
    queue.put(data)

if __name__ == '__main__':
    t1 = time.time()
    d1 = Data()
    d2 = Data()
    t2 = time.time()
    print "without multiprocessing, total time:", t2 - t1

    # multiprocessing version: reset the clock so the two runs are
    # timed independently
    t1 = time.time()
    queue = Queue()
    p1 = Process(target=getdata, args=(queue,))
    p2 = Process(target=getdata, args=(queue,))
    p1.start()
    p2.start()
    s1 = queue.get()   # blocks until a child has pickled and sent its object
    s2 = queue.get()
    p1.join()
    p2.join()
    t2 = time.time()
    print "with multiprocessing, total time:", t2 - t1
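To check whether pickling really is the cost, it may help to time a pickle
round-trip of one object directly, with no processes involved. This is a
rough sketch I am adding for illustration (not part of the original post);
it reuses the same shape of Data object as above and runs entirely in one
process:

```python
import pickle
import time

class Data(object):
    def __init__(self):
        # list(range(...)) so this is a real million-element list on
        # both Python 2 and Python 3
        self.y = list(range(1, 1000000))

t1 = time.time()
d = Data()
# dumps/loads is roughly what Queue.put/Queue.get do under the hood
blob = pickle.dumps(d, pickle.HIGHEST_PROTOCOL)
restored = pickle.loads(blob)
t2 = time.time()
print("create + pickle + unpickle: %.2f sec (%d bytes)" % (t2 - t1, len(blob)))
```

If this round-trip dominates the 2.5 sec creation time, then passing the
finished objects through a Queue cannot be faster than building them
sequentially, regardless of how many cores are available.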



-----Original Message-----
From: James Mills [mailto:prolo...@shortcircuit.net.au] 
Sent: Saturday, January 17, 2009 10:37 AM
To: gopal mishra
Cc: python-list@python.org
Subject: Re: problem in implementing multiprocessing

On Fri, Jan 16, 2009 at 7:16 PM, gopal mishra <gop...@infotechsw.com> wrote:
> I create two heavy objects sequentially, without using multiprocessing,
> and creation of the objects takes 2.5 sec. If I create these two objects
> in separate processes, the total time is 6.4 sec.
>
> I am thinking this is happening due to the pickling and unpickling of the
> objects. If I am right, what could be the solution?
>
> my system configuration:
> dual-core processor
> winXP
> python2.6.1

System specs in this case are irrelevant.

What you are experiencing is most likely an I/O-bound
problem - using multiprocessing will likely not help
you solve it any faster, because of your I/O constraint.

cheers
James

--
http://mail.python.org/mailman/listinfo/python-list
