New submission from Bob <for_elis...@yahoo.fr>:

Hi,

I've found a strange performance issue when comparing queue.Queue and 
multiprocessing.Queue in Python 2.6 and 3.1.

My program creates a queue and does 1 million put and get operations on either a 
small string or a "big" list.

My code: (This is the 3.1 version. Switch the module name from queue to Queue to 
run it on 2.6.)

####################################################################
import multiprocessing
import queue

def with_queue(queuetype, datatype):
    # Bounded queue of either flavour, depending on the argument.
    if queuetype == 'multi':
        q = multiprocessing.Queue(1000)
    else:
        q = queue.Queue(1000)

    # Payload: a short string, or a list of 1000 ints for the "big" case.
    if datatype == 'small':
        data = 'some data'
    else:
        data = list(range(1000))

    # 1 million put/get round-trips on the same queue.
    for d in range(1000000):
        q.put(data)
        q.get()
       
if __name__=='__main__':
    from timeit import Timer
    t1 = Timer("with_queue('simple','small')","from __main__ import with_queue")
    t2 = Timer("with_queue('simple','big')","from __main__ import with_queue")
    t3 = Timer("with_queue('multi','small')","from __main__ import with_queue")
    t4 = Timer("with_queue('multi','big')","from __main__ import with_queue")
   
    print ('Using queue.Queue with small data            : ',t1.timeit(1))
    print ('Using queue.Queue with huge data             : ',t2.timeit(1))
    print ('Using multiprocessing.Queue with small data  : ',t3.timeit(1))
    print ('Using multiprocessing.Queue with huge  data  : ',t4.timeit(1))
#####################################################################

And the results (on my Linux box):

python2.6 read_write.py
    Using queue.Queue with small data            :  10.31s
    Using queue.Queue with huge data             :  10.39s
    Using multiprocessing.Queue with small data  :  33.85s
    Using multiprocessing.Queue with huge  data  :  155.38s

python3.1 read_write.py
    Using queue.Queue with small data            :  10.68s
    Using queue.Queue with huge data             :  10.61s
    Using multiprocessing.Queue with small data  :  50.27s
    Using multiprocessing.Queue with huge  data  :  472.49s


As you can see, 3.1 is about 50% slower than 2.6 in the third test, but roughly 
three times slower in the fourth.
If I go further with even bigger data, 3.1 runs for hours ... and I have to 
kill it before any result appears.
Am I doing something wrong, or is there a known issue in 3.1 that can explain 
this?
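One guess (not verified) is that the payload size matters because 
multiprocessing.Queue serializes each item with pickle before sending it 
through a pipe, while queue.Queue just passes a reference. A minimal sketch to 
time only the pickle round-trip, assuming serialization is the dominant cost:

```python
import pickle
import timeit

# The "big" payload from the benchmark above.
data = list(range(1000))

# Each put/get on a multiprocessing.Queue roughly implies one
# dumps/loads round-trip; the small-string case pickles far fewer bytes.
t = timeit.timeit(lambda: pickle.loads(pickle.dumps(data)), number=100000)
print('100k pickle round-trips of the big list:', t, 's')
```

If that number is a large fraction of the gap between the small and big 
multiprocessing runs, the slowdown is mostly serialization rather than the 
queue machinery itself.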

Thanks !

Bob

----------
components: Extension Modules
messages: 107775
nosy: bob
priority: normal
severity: normal
status: open
title: Performance issue with multiprocessing queue (3.1 VS 2.6)
type: performance
versions: Python 2.6, Python 3.1

_______________________________________
Python tracker <rep...@bugs.python.org>
<http://bugs.python.org/issue8995>
_______________________________________