Hi,

When I try to execute my program as

spark-submit --master yarn --class com.mytestpack.analysis.SparkTest sparktest-1.jar

I get the error below:
java.lang.IllegalArgumentException: Required executor memory (1024+384 MB) is above the max threshold (1024 MB) of this cluster!
        at org.apache.spark.deploy.yarn.ClientBase$class.verifyClusterResources(ClientBase.scala:71)
        at org.apache.spark.deploy.yarn.Client.verifyClusterResources(Client.scala:35)
        at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:77)
        at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:57)
        at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:140)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:335)
        at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)

I am new to the Hadoop environment. Could someone please tell me how and where I need to set the memory, or which configuration to change? Thanks in advance.
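For reference, my reading of the message is that the 1024+384 MB is the requested executor memory plus YARN's memory overhead, and their sum exceeds the cluster's per-container ceiling (yarn.scheduler.maximum-allocation-mb) of 1024 MB. I suspect either of the following would avoid the error, but I have not verified this on my cluster, so please treat it as a sketch:

```shell
# Option 1: request a smaller executor so memory + overhead fits under the
# 1024 MB YARN ceiling (values here are illustrative, not tested):
spark-submit --master yarn \
  --class com.mytestpack.analysis.SparkTest \
  --executor-memory 512m \
  sparktest-1.jar

# Option 2: raise YARN's per-container maximum in yarn-site.xml on the
# ResourceManager and restart YARN:
#   <property>
#     <name>yarn.scheduler.maximum-allocation-mb</name>
#     <value>2048</value>
#   </property>
```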

--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/getting-error-when-submit-spark-with-master-as-yarn-tp21542.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
