On Monday 20 April 2015 03:18 PM, Archit Thakur wrote:
Many similar problems have been shared and resolved by users on this
same list; I have been part of those discussions before. Search for
those threads, try the suggested fixes, and let us know if you still face problems.
Thanks and Regards,
Archit Thakur.
On Mon, Apr 20, 2015 at 3:05 PM, madhvi <madhvi.gu...@orkash.com
<mailto:madhvi.gu...@orkash.com>> wrote:
On Monday 20 April 2015 02:52 PM, SURAJ SHETH wrote:
Hi Madhvi,
I think the memory requested by your job, i.e. 2.0 GB is higher
than what is available.
Please request for 256 MB explicitly while creating Spark Context
and try again.
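One way to make the request explicit (a sketch only; exact file locations depend on your install, and the 256m value is just the figure suggested above) is to set the standard memory properties in conf/spark-defaults.conf so every context picks them up:

```properties
# conf/spark-defaults.conf (sketch; adjust values to your cluster)
spark.driver.memory    256m
spark.executor.memory  256m
```

Note that spark.driver.memory must be set before the driver JVM starts, so in yarn-client mode it should come from spark-defaults.conf or the --driver-memory flag, not from code inside the application.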
Thanks and Regards,
Suraj Sheth
I tried the same, but still no luck :|
Madhvi
Hi,
It's still not working, and I can't figure out where I am going
wrong. Following are the configurations in my spark-env.sh file:
export HADOOP_CONF_DIR=/root/Documents/hadoop/etc/hadoop
export SPARK_EXECUTOR_MEMORY=256m
export SPARK_DRIVER_MEMORY=256m
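One likely reason YARN still reports a larger request than 256m: YARN adds a per-container memory overhead (in Spark 1.x, max(384 MB, ~7% of the requested memory)) and then rounds the total up to a multiple of yarn.scheduler.minimum-allocation-mb (commonly 1024 MB). A rough sketch of that arithmetic, assuming those defaults (check your cluster's actual settings):

```python
import math

def yarn_container_mb(requested_mb, min_allocation_mb=1024):
    """Estimate the container size YARN grants for a Spark memory request.
    Assumes Spark 1.x defaults: overhead = max(384 MB, 7% of the request),
    and YARN rounding to a multiple of yarn.scheduler.minimum-allocation-mb."""
    overhead = max(384, int(requested_mb * 0.07))
    total = requested_mb + overhead
    # YARN rounds each allocation up to the next multiple of the minimum
    return int(math.ceil(total / float(min_allocation_mb)) * min_allocation_mb)

print(yarn_container_mb(256))   # 256 + 384 = 640, rounded up to 1024
print(yarn_container_mb(1024))  # 1024 + 384 = 1408, rounded up to 2048
```

Under these assumptions a 1024m driver request becomes a 2048 MB container, which would match the "2.0 GB" figure mentioned earlier, and even a 256m request still needs a 1 GB container. If yarn.nodemanager.resource.memory-mb or yarn.scheduler.maximum-allocation-mb is below that, the application will fail or hang waiting for resources.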
I am running this command in the shell:
./bin/spark-submit --class Spark.testSpark.JavaWordCount \
  --master yarn-client --num-executors 2 \
  --driver-memory 256m --executor-memory 256m \
  --executor-cores 1 lib/Untitled.jar
Madhvi