On Thu, Jan 15, 2015 at 9:02 PM, CL Talk <clt...@gmail.com> wrote:
> Hello - I have a Python program that reads data from data files, does
> manipulation on the data, and writes back to output files. I am
> developing this program on a MacBook with Python 2.7.6.
>
> This program takes close to 10 minutes on my machine to run. But the
> issue is that it does not take the same amount of time on almost
> similar data files.
>
> I am trying to see how I can time the whole program as well as the
> various functions that are part of the program.
>
> Say that I have 5 functions in this program; how can I time the whole
> program and time each individual function for every run?
The use of the profiler, as the others have discussed, is the right approach here. It's hard to say what's really going on without good timing data, and the profiler should help. Once you get that data, please feel free to share it on the mailing list if you can; folks here can help analyze what you're seeing too. Also, please put your source code somewhere for folks to inspect.

Finally, if you can say how large your input and output are, that would also be helpful. If we're talking about megabytes, then since you're seeing minutes of runtime, I'd expect the algorithms used are the prime culprits and good targets for optimization. But if we're talking about terabytes, maybe your program is fine, and you just have a lot of data to chug through. :P

Good luck!
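In case a concrete starting point helps, here's a minimal sketch of one way to do it with cProfile plus a small timing decorator. The function names (read_files, process_data, main) and the stats file name (run.prof) are just placeholders standing in for your actual five functions, so adapt as needed. You can also profile the script without touching the code at all by running it under the profiler from a terminal:

    python -m cProfile -s cumulative your_program.py

import cProfile
import pstats
import time
from functools import wraps


def timed(func):
    """Print how long each call to the decorated function takes."""
    @wraps(func)
    def wrapper(*args, **kwargs):
        start = time.time()
        try:
            return func(*args, **kwargs)
        finally:
            print("%s took %.2f seconds" % (func.__name__, time.time() - start))
    return wrapper


@timed
def read_files():
    pass  # placeholder: your real file-reading code goes here


@timed
def process_data():
    pass  # placeholder: your real data manipulation goes here


def main():
    read_files()
    process_data()


if __name__ == "__main__":
    # Profile the whole run, dump the raw stats to run.prof, then print
    # the 20 most expensive entries sorted by cumulative time.
    cProfile.run("main()", "run.prof")
    pstats.Stats("run.prof").sort_stats("cumulative").print_stats(20)

The decorator gives you a quick wall-clock number per function for every run, while the pstats report shows where the time is actually going (look at the "cumulative" column). Comparing those reports across two of your "almost similar" data files should point at which function's time is blowing up.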