Hi all,
Does Hadoop provide a way for users to see the time spent on
computation (the map/reduce functions) and the time spent on different kinds of
overhead (such as startup, sorting, disk I/O, etc.) separately?
Thanks~~
Best regards,
--
Wei
Hi all,
I am a new user of Hadoop and have some questions about it.
1) About setting the number of maps/reduces: running Hadoop on an 8-node
cluster, I set mapred.map.tasks to 64 and
mapred.tasktracker.map.tasks.maximum to 8, but by examining the counter
"Launched map tasks" from the output
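
For reference, the two settings described above would typically be placed in mapred-site.xml like this (values taken from the message; this is just a sketch of that configuration, not a recommendation):

```xml
<configuration>
  <!-- Hint for the total number of map tasks for a job (64, as set above).
       This is only a suggestion to the framework. -->
  <property>
    <name>mapred.map.tasks</name>
    <value>64</value>
  </property>
  <!-- Maximum number of map tasks run concurrently by each TaskTracker
       (8 per node, as set above). -->
  <property>
    <name>mapred.tasktracker.map.tasks.maximum</name>
    <value>8</value>
  </property>
</configuration>
```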