Hi guys,
I have a situation in which I have a machine with 4 processors and 5 containers. Does that mean I can have only 4 mappers running in parallel at a time?
And if the number of mappers does not depend on the number of containers on a machine, then what is the use of the container concept?
sorry
It depends on the memory settings as well, i.e. how much of the node's resources you want to assign to each container. YARN will then run as many mappers in parallel as possible.
See this:
http://hortonworks.com/blog/how-to-plan-and-configure-yarn-in-hdp-2-0/
It cannot run more mappers (tasks) in parallel than the underlying cores available. Likewise, it cannot run multiple mappers in parallel if each mapper's (task's) memory requirement is greater than the allocated and available container size configured on each node.
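The constraint above can be sketched numerically. This is a rough illustration, not YARN's actual scheduler logic, and the node specs below are made-up examples:

```python
# Sketch: how many containers (and thus map tasks) can run in parallel
# on one node. A container needs both a core and its configured memory,
# so the node's capacity is the tighter of the two limits.
def max_parallel_containers(node_cores, node_memory_mb, container_memory_mb):
    by_cpu = node_cores
    by_memory = node_memory_mb // container_memory_mb
    return min(by_cpu, by_memory)

# 4 cores, 8 GB for containers, 2 GB per container:
# memory allows 4, CPU allows 4 -> 4 in parallel
print(max_parallel_containers(4, 8192, 2048))

# Same node with 1 GB containers:
# memory would allow 8, but the 4 cores still cap it at 4
print(max_parallel_containers(4, 8192, 1024))
```

So even with 5 containers configured, a 4-core node in this model runs at most 4 of them at once.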
The links that I provided
I have one more doubt. I was reading this:
http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.0.6.0/bk_installing_manually_book/content/rpm-chap1-11.html
There is one property:
mapreduce.map.memory.mb = 2*1024 MB
mapreduce.reduce.memory.mb = 2 * 2 = 4*1024 MB
What are these?
Explanation here.
http://stackoverflow.com/questions/24070557/what-is-the-relation-between-mapreduce-map-memory-mb-and-mapred-map-child-jav
https://support.pivotal.io/hc/en-us/articles/201462036-Mapreduce-YARN-Memory-Parameters
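For reference, those two properties are set in mapred-site.xml. A minimal fragment using the values quoted above (2*1024 = 2048 MB for maps, 4*1024 = 4096 MB for reduces) would look something like this; the exact values depend on your cluster sizing:

```xml
<!-- mapred-site.xml: per-task container sizes (values from the quoted docs) -->
<property>
  <name>mapreduce.map.memory.mb</name>
  <value>2048</value>
</property>
<property>
  <name>mapreduce.reduce.memory.mb</name>
  <value>4096</value>
</property>
```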
It is still not clear to me.
Let's suppose the block size of my HDFS is 128 MB, so every mapper will process only 128 MB of data.
Then what is the meaning of setting the property mapreduce.map.memory.mb? That is already known from the block size, so why this property?
On Wednesday 15 October 2014
The data that each map task will process is different from the memory the task itself might require, depending upon whatever processing you plan to do in the task.
A very trivial example: let us say your map gets 128 MB of input data, but your task logic is such that it creates lots of String objects.
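A hedged sketch of that idea, in Python rather than Java (the numbers are illustrative, not from Hadoop): splitting input text into many small string objects can use far more memory than the raw bytes, because each object carries its own overhead.

```python
# Input size is not the same as the memory a task needs.
import sys

text = "word " * 1000           # ~5 KB of raw "input"
words = text.split()            # 1000 separate string objects

raw_size = len(text)
object_size = sum(sys.getsizeof(w) for w in words)

# Per-object overhead dominates: the objects take far more memory
# than the raw input bytes they came from.
print(object_size > raw_size)
```

The same effect, scaled up to a 128 MB split, is why the task's memory limit must be configured separately from the input split size.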
Thanks for the reply.
I have one more doubt: are there three kinds of containers with different memory sizes in Hadoop 2?
1. normal container
2. map task container
3. reduce task container
On Wednesday 15 October 2014 07:33 PM, Shahab Yunus wrote: