Antoine Pitrou <pit...@free.fr> added the comment:

> 1. In constrained memory environments, creating a temporary internal
> copy of a large set may cause the difference operation to fail where
> it would otherwise succeed.
It's a space/time tradeoff. There's nothing wrong with that. (Do note
that hash tables themselves take much more space than the "equivalent"
list.)

> 2. The break-even point between extra lookups and a copy is likely to
> be different on different systems, or even on the same system under
> different loads.

So what? It's just a matter of choosing reasonable settings. There are
other optimization heuristics in the interpreter. The optimization here
looks ok to me.

----------
nosy: +pitrou

_______________________________________
Python tracker <rep...@bugs.python.org>
<http://bugs.python.org/issue8685>
_______________________________________
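For readers following along: the tradeoff under discussion can be sketched in pure Python. This is a hypothetical illustration, not CPython's actual C implementation; the function names and the `ratio` threshold are made up for the example. Set difference can either probe the right operand for each element of the left one (no temporary copy), or copy the left set and discard members of the right one (extra memory, but potentially fewer probes when the right operand is small).

```python
def difference_by_lookup(a, b):
    # No temporary copy beyond the result itself:
    # one membership probe into b per element of a.
    return {x for x in a if x not in b}

def difference_by_copy_discard(a, b):
    # Copies a first, then discards members of b; trades
    # temporary memory for iterating over the smaller operand
    # when b is much smaller than a.
    result = set(a)
    for x in b:
        result.discard(x)
    return result

def difference(a, b, ratio=4):
    # Heuristic dispatch: the "reasonable settings" in question.
    # The ratio of 4 is purely illustrative; the real break-even
    # point depends on the system and workload, as noted above.
    if len(b) * ratio < len(a):
        return difference_by_copy_discard(a, b)
    return difference_by_lookup(a, b)
```

Both strategies return the same result; only the constant factors and the peak memory use differ, which is why the choice comes down to a tunable heuristic rather than a single always-best algorithm.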