On Sun, 16 Aug 2009 05:55:48 -0400, Chris Rebert wrote:

> On Sun, Aug 16, 2009 at 5:47 AM, Terry <terry.yin...@gmail.com> wrote:
>> Hi,
>>
>> Is there a simple way (the pythonic way) to flatten a list of list?
>> rather than my current solution:
>>
>> new_list = []
>> for l in list_of_list:
>>     new_list.extend(l)
>>
>> or,
>>
>> new_list = reduce(lambda x, y: x.extend(y), list_of_list)
>
> # only marginally better:
> from operator import add
> new_list = reduce(add, list_of_list)
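[Editorial note: for reference, the `reduce`/`extend` version quoted above actually raises a TypeError, because `list.extend` returns None; the usual linear-time idioms for flattening one level of nesting are `itertools.chain.from_iterable` or a nested list comprehension. A minimal sketch, written to run on modern Python:]

```python
from itertools import chain

def flatten(list_of_lists):
    """Flatten one level of nesting in time linear in the total
    number of elements: the output list is built in a single pass,
    with no repeated copying of intermediate lists."""
    return list(chain.from_iterable(list_of_lists))

# Equivalent nested list comprehension:
#   [item for sublist in list_of_lists for item in sublist]

print(flatten([[1, 2], [3], [4, 5, 6]]))  # [1, 2, 3, 4, 5, 6]
```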
Surely that's going to be O(N**2)? I'd predict that would perform even
worse than O(N**2) string concatenation, because a string of size N has
to allocate space for N bytes, but a list of size N allocates space for
N pointers (each 4 or 8 bytes, depending on the system), rounded up to
the nearest power of two. Also, CPython can optimize string
concatenation to O(N) under some circumstances, but not lists.

>>> from timeit import Timer
>>> setup = """\
... L = [ ([None]*5000) for x in xrange(%d) ]
... from operator import add
... """
>>> Timer("reduce(add, L)", setup % 4).repeat(number=1000)
[0.53808808326721191, 0.51404905319213867, 0.51157188415527344]
>>> Timer("reduce(add, L)", setup % 8).repeat(number=1000)
[2.1178171634674072, 3.8830602169036865, 4.72245192527771]
>>> Timer("reduce(add, L)", setup % 16).repeat(number=1000)
[17.190337896347046, 21.572744131088257, 21.584265947341919]

Looks like O(N**2) behaviour to me.

-- 
Steven
-- 
http://mail.python.org/mailman/listinfo/python-list