Alexander Belopolsky <belopol...@users.sourceforge.net> added the comment:

I have two problems with this proposal:

1. In constrained memory environments, creating a temporary internal copy of a 
large set may cause the difference operation to fail that would otherwise 
succeed.

2. The break-even point between extra lookups and a copy is likely to be 
different on different systems or even on the same system under different loads.

Programs that suffer from poor large_set.difference(small_set) performance can 
be rewritten as large_set_copy = large_set.copy(); 
large_set_copy.difference_update(small_set), or even simply as 
large_set.difference_update(small_set) if program logic allows it.
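A minimal sketch of that workaround (the set contents and variable names here are illustrative only):

```python
# Illustrative sets; in practice large_set would be much bigger.
large_set = set(range(1000))
small_set = set(range(100))

# Workaround 1: explicit copy, then in-place difference.
# This avoids difference()'s temporary-copy behavior while
# leaving the original set untouched.
large_set_copy = large_set.copy()
large_set_copy.difference_update(small_set)

# Workaround 2: if program logic allows mutating the original,
# skip the copy entirely and update in place.
large_set.difference_update(small_set)
```

difference_update() iterates over the smaller argument and discards matching elements in place, so no temporary set is allocated.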

----------

_______________________________________
Python tracker <rep...@bugs.python.org>
<http://bugs.python.org/issue8685>
_______________________________________