The limits for the large dataset mean that you should be prepared for on the 
order of 15 million iterations within the 15-second limit. The problem involves 
lookup tables, and hence frequent memory access, and some benchmarks report 
that a Python implementation can be as much as a hundred times slower than a 
C++ implementation in such cases.
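
For a rough sense of scale, here is a minimal pure-Python timing sketch (the 
table contents and loop body are arbitrary placeholders I made up, not the 
actual problem) that times roughly 15 million lookup-table accesses against a 
15-second budget:

import time

# Arbitrary placeholder lookup table (not the real problem data).
table = [i * i % 997 for i in range(1000)]

start = time.perf_counter()
acc = 0
for i in range(15_000_000):
    acc += table[i % 1000]  # frequent memory access through the table
elapsed = time.perf_counter() - start

print("15M table lookups took %.2fs (acc=%d)" % (elapsed, acc))

On a typical machine the interpreted loop alone may already consume several of 
the 15 seconds, which is the point of the comparison with C++.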
If I could code in Python (and I definitely must learn it!), I would still be 
afraid of the above numbers. Of course, for the above reasoning, the help of 
Capt. Hindsight is beneficial too :)
As the Code Jam team say themselves in the FAQ: «Part of the contest is knowing 
which tool to use for which job, just as in everyday software engineering.»
