On 01:18 am, pavlovevide...@gmail.com wrote:
On Nov 7, 5:05 pm, sturlamolden <sturlamol...@yahoo.no> wrote:
On 7 Nov, 03:46, gil_johnson <gil_john...@earthlink.net> wrote:

> I don't have the code with me, but for huge arrays, I have used
> something like:

> >>> arr = [initializer]
> >>> for i in range(N):
> ...     arr.extend(arr)

> This doubles the array every time through the loop, and you can add
> the powers of 2 to get the desired result.
> Gil

You should really use append instead of extend. The above code is
O(N**2); with append it becomes O(N) on average.
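(A sketch of that alternative, growing the list one element at a time;
the size and element value here are arbitrary stand-ins for whatever
the real code uses:)

    N = 2 ** 20          # target size, picked arbitrarily
    initializer = 0      # stand-in for the real element value
    arr = []
    for _ in range(N):
        arr.append(initializer)   # append is amortized O(1) per call in CPython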

I think the top one is O(N log N), and I'm suspicious that it's even
possible to grow a list in less than O(N log N) time without knowing
the final size in advance.  Citation?  Furthermore, why would it matter
to use extend instead of append; I'd assume extend uses the same
growth algorithm.  (Although in this case, since the extend doubles the
size of the list, it most likely reallocates after each call.)
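(Not a citation, but both growth strategies are easy to time directly;
a rough sketch, with helper names and sizes of my own choosing:)

    import time

    def by_append(n):
        arr = []
        for _ in range(n):
            arr.append(None)
        return arr

    def by_doubling(n):               # assumes n is a power of two
        arr = [None]
        while len(arr) < n:
            arr.extend(arr)           # length doubles, so about log2(n) passes
        return arr

    for n in (2 ** 18, 2 ** 19, 2 ** 20):
        for grow in (by_append, by_doubling):
            start = time.time()
            grow(n)
            print("%s  n=%d  %.4fs" % (grow.__name__, n, time.time() - start))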

[None]*N is linear time and is better than growing the list using
append or extend.
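(For completeness, that preallocation pattern looks like this when the
final size is known up front; the fill expression is just a placeholder:)

    N = 2 ** 20
    arr = [None] * N          # one linear-time allocation up front
    for i in range(N):
        arr[i] = i * i        # fill in place; assigning never resizes the list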

The Wikipedia page at http://en.wikipedia.org/wiki/Amortized_analysis conveniently uses exactly this example to explain the concept of amortized costs.
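(Rough arithmetic behind that, assuming the exact-doubling pattern from
the quoted code rather than CPython's internal over-allocation: reaching
n elements costs 1 + 2 + 4 + ... + n/2 element copies in total, i.e. just
under n, which is where "amortized O(1) per element" comes from. A quick
check:)

    n = 2 ** 20
    copies = sum(2 ** k for k in range(20))   # work done by each doubling pass
    print("%d element copies to reach %d elements" % (copies, n))
    # total copy work is about n, hence O(n) overall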

Jean-Paul
