Try this

./bin/spark-submit -v --master yarn-cluster \
  --jars ./libs/mysql-connector-java-5.1.17.jar,./libs/log4j-1.2.17.jar \
  --files datasource.properties,log4j.properties \
  --num-executors 1 --driver-memory 4g \
  --driver-java-options "-XX:MaxPermSize=1G" \
  --executor-memory 2g --executor-cores 1 \
  --class com.test.spark.jobs.AggregationJob sparkagg.jar
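
The important addition is --driver-java-options "-XX:MaxPermSize=1G". A bare
java.lang.OutOfMemoryError like the one in your driver log is often PermGen
exhaustion rather than heap exhaustion (Spark SQL and JDBC drivers load a lot
of classes on Java 7), and --driver-memory only raises the heap. If that turns
out to be the cause, the equivalent setting can also live in
spark-defaults.conf so you don't have to pass it on every submit; the 1G value
is just a starting point, not a measured requirement:

spark.driver.extraJavaOptions    -XX:MaxPermSize=1G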


I noticed that you're using mysql-connector-java-5.1.17.jar. Are you running
Spark SQL (Hive queries from Spark)?
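
One more note on your second container's stack trace: the "Log directory ...
already exists" IOException is a symptom, not the root cause. After the first
driver attempt died with the OOM, YARN started a second application attempt,
which then tripped over the event log directory the first attempt had already
created. Fixing the OOM should make that retry (and the error) disappear; if a
stale directory ever blocks a resubmit, it can be removed by hand. A sketch,
assuming the event log path from your stack trace:

hdfs dfs -rm -r hdfs://mycom.hostname.com:8020/user/spark/applicationHistory/application_1428938273236_0006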


On Mon, Apr 13, 2015 at 3:53 PM, sachin Singh <sachin.sha...@gmail.com>
wrote:

> Hi,
> When I submit a Spark job with --master yarn-cluster using the
> command/options below, I get a driver memory error:
>
> spark-submit \
>   --jars ./libs/mysql-connector-java-5.1.17.jar,./libs/log4j-1.2.17.jar \
>   --files datasource.properties,log4j.properties \
>   --master yarn-cluster --num-executors 1 \
>   --driver-memory 2g --executor-memory 512m \
>   --class com.test.spark.jobs.AggregationJob sparkagg.jar
>
> Exceptions from the YARN application log are as follows:
> Container: container_1428938273236_0006_01_000001 on mycom.hostname.com_8041
>
> =============================================================================
> LogType: stderr
> LogLength: 128
> Log Contents:
> Exception in thread "Driver"
> Exception: java.lang.OutOfMemoryError thrown from the UncaughtExceptionHandler in thread "Driver"
>
> LogType: stdout
> LogLength: 40
>
>
> Container: container_1428938273236_0006_02_000001 on mycom.hostname.com_8041
>
> =============================================================================
> LogType: stderr
> LogLength: 1365
> Log Contents:
> java.io.IOException: Log directory hdfs://mycom.hostname.com:8020/user/spark/applicationHistory/application_1428938273236_0006 already exists!
>         at org.apache.spark.util.FileLogger.createLogDir(FileLogger.scala:129)
>         at org.apache.spark.util.FileLogger.start(FileLogger.scala:115)
>         at org.apache.spark.scheduler.EventLoggingListener.start(EventLoggingListener.scala:74)
>         at org.apache.spark.SparkContext.<init>(SparkContext.scala:353)
>         at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
>
> LogType: stdout
> LogLength: 40
>
>
> Please help, this is urgent for me.


-- 
Deepak
