Hi Ankit,

Here is my solution for this:

1) Download the latest Spark, 1.5.1. (I just copied the following link from spark.apache.org; if it doesn't work, grab a fresh one from the website.)

    wget http://d3kbcqa49mib13.cloudfront.net/spark-1.5.1-bin-hadoop2.6.tgz
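The download step, together with unpacking and renaming the directory, can be sketched roughly as follows (the target directory name spark151 is just an example; adjust paths to your environment):

```shell
# Download the Spark 1.5.1 pre-built tarball (link taken from spark.apache.org).
SPARK_TGZ=spark-1.5.1-bin-hadoop2.6.tgz
wget "http://d3kbcqa49mib13.cloudfront.net/${SPARK_TGZ}"

# Unpack it and rename the resulting directory to something short.
tar -xzf "${SPARK_TGZ}"
mv spark-1.5.1-bin-hadoop2.6 spark151
```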
2) Unpack the tarball and rename/move the resulting directory as you wish.

3) To run a Spark job, we first need to set HADOOP_CONF_DIR, YARN_CONF_DIR and SPARK_HOME so that spark-submit picks up the cluster configuration. The following is the bash script I use to run the jar:

    export HADOOP_CONF_DIR=/etc/hadoop/conf
    export YARN_CONF_DIR=/etc/spark/conf.cloudera.spark_on_yarn/yarn-conf
    echo $YARN_CONF_DIR
    export SPARK_HOME=/hadoop/user/ooxpdeva/spark151
    echo $SPARK_HOME

    $SPARK_HOME/bin/spark-submit --verbose \
        --class anubhav.Main \
        --master yarn-client \
        --num-executors 7 \
        --driver-memory 6g \
        --executor-memory 6g \
        --executor-cores 8 \
        --queue deva \
        --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=/hadoop/user/deva/log4j.properties" \
        /hadoop/user/deva/anubhav.jar arg1 arg2

Hope this helps.

Regards,
Anubhav

On Thu, Oct 22, 2015 at 11:39 AM, Ankit (JIRA) <j...@apache.org> wrote:
> Ankit shared an issue with you
> -------------------------------
>
> Documentation for remote Spark submit for R scripts from 1.5 on CDH 5.4
> -----------------------------------------------------------------------
>
>                 Key: SPARK-11213
>                 URL: https://issues.apache.org/jira/browse/SPARK-11213
>             Project: Spark
>          Issue Type: Bug
>            Reporter: Ankit
>
> Hello guys,
>
> We have Cloudera Distribution 5.4 and it ships Spark 1.3.
>
> Issue: our data scientists work with R scripts, so I was looking for a
> way to submit an R script, via Oozie or a local spark-submit, to a remote
> YARN resource manager. Can anyone share the steps to do this? It is
> really difficult to guess them.
>
> Thanks in advance
>
> --
> This message was sent by Atlassian JIRA
> (v6.3.4#6332)
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
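P.S. Since the original question was about R scripts: from Spark 1.4 onward, spark-submit accepts a file ending in .R directly and runs it through SparkR, so the same setup works for the R case. A hedged sketch (the script path /hadoop/user/deva/analysis.R is hypothetical; the conf-dir and queue values mirror the ones above):

```shell
# Point at the cluster configuration and the freshly unpacked Spark 1.5.1.
export HADOOP_CONF_DIR=/etc/hadoop/conf
export YARN_CONF_DIR=/etc/spark/conf.cloudera.spark_on_yarn/yarn-conf
export SPARK_HOME=/hadoop/user/ooxpdeva/spark151

# spark-submit detects the .R extension and launches the script via SparkR.
$SPARK_HOME/bin/spark-submit \
    --master yarn-client \
    --queue deva \
    /hadoop/user/deva/analysis.R arg1 arg2
```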