Re: [jira] Ankit shared "SPARK-11213: Documentation for remote spark Submit for R Scripts from 1.5 on CDH 5.4" with you

2015-10-22 Thread Anubhav Agarwal
$YARN_CONF_DIR export SPARK_HOME=/hadoop/user/ooxpdeva/spark151 echo $SPARK_HOME $SPARK_HOME/bin/spark-submit --verbose --class anubhav.Main --master yarn-client --num-executors 7 --driver-memory 6g --executor-memory 6g --executor-cores 8 --queue deva --conf "spark.executor.extraJavaOp

Re: spark-submit hive connection through spark Initial job has not accepted any resources

2015-10-10 Thread Yana Kadiyska
new JavaSparkContext(sctx); > HiveContext hiveCtx=new HiveContext(ctx.sc()); > DataFrame df= hiveCtx.sql("show tables"); > System.out.println(">>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>

Re: spark-submit hive connection through spark Initial job has not accepted any resources

2015-10-09 Thread vinayak
rting "+sctx.isLocal()); JavaSparkContext ctx=new JavaSparkContext(sctx); HiveContext hiveCtx=new HiveContext(ctx.sc()); DataFrame df= hiveCtx.sql("show tables"); System.out.println(">>>>>>>>&

spark-submit hive connection through spark Initial job has not accepted any resources

2015-10-09 Thread vinayak
Hi, I am able to fetch data, create tables, and put data into Hive from the spark shell (scala command line), but when I write Java code to do the same and submit it through spark-submit I get *"Initial job has not accepted any resources; check your cluster UI to ensure that wo

Re: spark-submit --packages using different resolver

2015-10-06 Thread Jerry Lam
or this. > > Best Regards, > > Jerry > > On Sat, Oct 3, 2015 at 12:50 PM, Burak Yavuz <brk...@gmail.com> wrote: > >> Hi Jerry, >> >> The --packages feature doesn't support private repositories right now. >> However, in the case of s3, maybe it might wor

Re: spark-submit --packages using different resolver

2015-10-06 Thread Jerry Lam
. Could you please try using > the --repositories flag and provide the address: > `$ spark-submit --packages my:awesome:package --repositories > s3n://$aws_ak:$aws_sak@bucket/path/to/repo` > > If that doesn't work, could you please file a JIRA? > > Best, > Burak > > >

Re: spark-submit --packages using different resolver

2015-10-03 Thread Burak Yavuz
Hi Jerry, The --packages feature doesn't support private repositories right now. However, in the case of s3, maybe it might work. Could you please try using the --repositories flag and provide the address: `$ spark-submit --packages my:awesome:package --repositories s3n://$aws_ak:$aws_sak@bucket
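
Spelled out in full, the command suggested above would look like this (a sketch only; "my:awesome:package" and the $aws_ak/$aws_sak credential placeholders come from the thread, app.jar is hypothetical):

    $ spark-submit \
        --packages my:awesome:package \
        --repositories s3n://$aws_ak:$aws_sak@bucket/path/to/repo \
        app.jar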

python version in spark-submit

2015-10-01 Thread roy
Hi, We have python2.6 (the default) on the cluster and have also installed python2.7. I was looking for a way to set the python version in spark-submit. Does anyone know how to do this? Thanks -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/python-version-in-spark
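
One approach available in Spark 1.x (an assumption here, not confirmed in this thread): point the PYSPARK_PYTHON environment variable at the desired interpreter before submitting; the interpreter path and my_script.py are illustrative:

    $ export PYSPARK_PYTHON=/usr/bin/python2.7
    $ bin/spark-submit my_script.py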

Re: python version in spark-submit

2015-10-01 Thread Ted Yu
> python2.7. > > I was looking for a way to set the python version in spark-submit. > > Does anyone know how to do this? > > Thanks > > > > -- > View this message in context: > http://apache-spark-user-list.1001560.n3.nabble.com/python-version-in-spark-submit-tp24902.ht

spark-submit --packages using different resolver

2015-10-01 Thread Jerry Lam
Hi spark users and developers, I'm trying to use spark-submit --packages against a private s3 repository. With sbt, I'm using fm-sbt-s3-resolver with proper aws s3 credentials. I wonder how I can add this resolver to spark-submit so that --packages can resolve dependencies from the private repo

Re: spark-submit classloader issue...

2015-09-28 Thread Aniket Bhatnagar
ndings:* When I run my program as a *java application within eclipse > everything works fine*. But when I am running the program using > *spark-submit* I am getting following error: > > URL content Could not initialize class > org.apache.http.conn.ssl.SSLConnectionSocketFact

spark-submit classloader issue...

2015-09-28 Thread Rachana Srivastava
Hello all, Goal: I want to use APIs from the HttpClient library 4.4.1. I am using the maven shade plugin to generate the JAR. Findings: When I run my program as a java application within eclipse everything works fine. But when I run the program using spark-submit I get the following error

Re: --class has to be always specified in spark-submit either it is defined in jar manifest?

2015-09-25 Thread Petr Novak
Setting it programmatically doesn't work either: sparkConf.setIfMissing("class", "...Main") In my current setting, moving main to another package requires propagating the change to deploy scripts. Doesn't matter, I will find some other way. Petr On Fri, Sep 25, 2015 at 4:40 PM, Petr Novak

--class has to be always specified in spark-submit either it is defined in jar manifest?

2015-09-25 Thread Petr Novak
Otherwise it seems it tries to load from a checkpoint which I have deleted and cannot be found. Or it should work and I have something else wrong. The documentation doesn't mention the jar manifest option, so I assume it doesn't work this way. Many thanks, Petr

Re: --class has to be always specified in spark-submit either it is defined in jar manifest?

2015-09-25 Thread Petr Novak
I'm sorry. Both approaches actually work. It was something else wrong with my cluster. Petr On Fri, Sep 25, 2015 at 4:53 PM, Petr Novak wrote: > Setting it programmatically doesn't work either: > sparkConf.setIfMissing("class", "...Main") > > In my current setting moving
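
For reference, the manifest route confirmed above would look roughly like this (a sketch; the class name and jar name are illustrative):

    # MANIFEST.MF inside app.jar contains:
    #   Main-Class: com.example.Main
    $ spark-submit --master spark://host:7077 app.jar    # no --class flag needed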

Zeppelin on Yarn : org.apache.spark.SparkException: Detected yarn-cluster mode, but isn't running on a cluster. Deployment to YARN is not supported directly by SparkContext. Please use spark-submit.

2015-09-18 Thread shahab
spark-submit. Anyone know what the solution to this is? best, /Shahab

spark-submit chronos issue

2015-09-16 Thread Saurabh Malviya (samalviy)
Hi, I am facing a strange issue while using chronos: the job is not able to find the Main class when invoking spark-submit through chronos. The issue I identified is the "colon" in the task name. Env - Chronos scheduled job on mesos /tmp/mesos/slaves/20150911-070325-218147008-5050-30275-S4/

Error - Calling a package (com.databricks:spark-csv_2.10:1.0.3) with spark-submit

2015-09-11 Thread Subhajit Purkayastha
by data source as a DataFrame, use the header for column names val df = sqlContext.load("com.databricks.spark.csv", Map("path" -> "sfpd.csv", "header" -> "true")) Now, I want to do the above as part of my package using spark-submit sp

Re: spark-submit not using conf/spark-defaults.conf

2015-09-03 Thread Davies Liu
gt; > "spark-submit.py -v test.py" > > I see that my "spark.files" default option has been replaced with > "spark.files test.py", basically spark-submit is overwriting > spark.files with the name of the script. > > Is this a bug or is ther

Re: spark-submit not using conf/spark-defaults.conf

2015-09-03 Thread Axel Dahl
t more investigation, shows that: > > > > if I have configured spark-defaults.conf with: > > > > "spark.files library.py" > > > > then if I call > > > > "spark-submit.py -v test.py" > > > > I see that my

spark-submit not using conf/spark-defaults.conf

2015-09-02 Thread Axel Dahl
in my spark-defaults.conf I have: spark.files file1.zip, file2.py spark.master spark://master.domain.com:7077 If I execute: bin/pyspark I can see it adding the files correctly. However if I execute bin/spark-submit test.py where test.py relies on the file1.zip, I get

Re: spark-submit not using conf/spark-defaults.conf

2015-09-02 Thread Davies Liu
; If I execute: > bin/pyspark > > I can see it adding the files correctly. > > However if I execute > > bin/spark-submit test.py > > where test.py relies on the file1.zip, I get and error. > > If I i instead execute > > bin/spark-submit --py-files file1.zip tes
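
The fix that surfaces in this thread is to pass the Python dependency explicitly instead of relying on spark.files:

    $ bin/spark-submit --py-files file1.zip test.py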

Re: spark-submit not using conf/spark-defaults.conf

2015-09-02 Thread Axel Dahl
So a bit more investigation, shows that: if I have configured spark-defaults.conf with: "spark.files library.py" then if I call "spark-submit.py -v test.py" I see that my "spark.files" default option has been replaced with "spark.files

Re: spark-submit issue

2015-08-31 Thread Igor Berman
ing Spark 1.3.1 ) Create a > command > string that uses "spark-submit" in it ( with my Class file etc ), and i > store this string in a temp file somewhere as a shell script Using > Runtime.exec, i execute this script and wait for its completion, using > pro

Re: spark-submit issue

2015-08-31 Thread Akhil Das
> thx for the inputs Igor, I am actually building an Analytics layer ('As > a service' model, using Spark as the backend engine) and hence I am > implementing it this way... Initially, I was opening the spark-context in > the JVM that I had spawned (without even using Spark-submit) and add

Re: spark-submit issue

2015-08-31 Thread Pranay Tonpay
thx for the inputs Igor, I am actually building an Analytics layer ('As a service' model, using Spark as the backend engine) and hence I am implementing it this way... Initially, I was opening the spark-context in the JVM that I had spawned (without even using Spark-submit) and adding all

Re: spark-submit issue

2015-08-31 Thread Pranay Tonpay
s immediately exits . From: Igor Berman <igor.ber...@gmail.com> Sent: Monday, August 31, 2015 12:41 PM To: Pranay Tonpay Cc: user Subject: Re: spark-submit issue might be you need to drain stdout/stderr of subprocess...otherwise subprocess can deadlock http://stackoverflow.com/quest

Re: spark-submit issue

2015-08-31 Thread Igor Berman
1. Think once again about whether you want to call spark-submit this way... I'm not sure why you do it, but please consider just opening the spark context inside your jvm (you need to add the spark jars to the classpath..) 2. use https://commons.apache.org/proper/commons-exec/ with PumpStreamHandler On 31 August 2015

Re: spark-submit issue

2015-08-31 Thread Pranay Tonpay
, 2015 11:02 AM To: Pranay Tonpay Cc: user@spark.apache.org Subject: Re: spark-submit issue You can also add a System.exit(0) after the sc.stop. On 30 Aug 2015 23:55, "Pranay Tonpay" <pranay.ton...@impetus.co.in> wrote: yes, the contex

Re: spark-submit issue

2015-08-31 Thread Pranay Tonpay
5 9:18 PM To: Pranay Tonpay Cc: Igor Berman; user@spark.apache.org Subject: Re: spark-submit issue Can you not use the spark jobserver instead? Just submit your job to the job server who already has the sparkcontext initialized in it, it would make it much easier i think. Thanks Best Regards

Re: spark-submit issue

2015-08-30 Thread Ted Yu
. -- *From:* Akhil Das ak...@sigmoidanalytics.com *Sent:* Sunday, August 30, 2015 9:03 AM *To:* Pranay Tonpay *Cc:* user@spark.apache.org *Subject:* Re: spark-submit issue Did you try putting a sc.stop at the end of your pipeline? Thanks Best Regards On Thu, Aug 27, 2015 at 6:41 PM

Re: spark-submit issue

2015-08-30 Thread Pranay Tonpay
yes, the context is being closed at the end. From: Akhil Das ak...@sigmoidanalytics.com Sent: Sunday, August 30, 2015 9:03 AM To: Pranay Tonpay Cc: user@spark.apache.org Subject: Re: spark-submit issue Did you try putting a sc.stop at the end of your pipeline

Re: spark-submit issue

2015-08-30 Thread Akhil Das
*To:* Pranay Tonpay *Cc:* user@spark.apache.org *Subject:* Re: spark-submit issue Did you try putting a sc.stop at the end of your pipeline? Thanks Best Regards On Thu, Aug 27, 2015 at 6:41 PM, pranay pranay.ton...@impetus.co.in wrote: I have a java program that does this - (using Spark 1.3.1

Re: spark-submit issue

2015-08-29 Thread Akhil Das
Did you try putting a sc.stop at the end of your pipeline? Thanks Best Regards On Thu, Aug 27, 2015 at 6:41 PM, pranay pranay.ton...@impetus.co.in wrote: I have a java program that does this - (using Spark 1.3.1 ) Create a command string that uses spark-submit in it ( with my Class file etc

spark-submit issue

2015-08-27 Thread pranay
I have a java program that does this - (using Spark 1.3.1 ) Create a command string that uses spark-submit in it ( with my Class file etc ), and I store this string in a temp file somewhere as a shell script. Using Runtime.exec, I execute this script and wait for its completion, using

Re: Exclude slf4j-log4j12 from the classpath via spark-submit

2015-08-25 Thread Utkarsh Sengar
This worked for me locally: spark-1.4.1-bin-hadoop2.4/bin/spark-submit --conf spark.executor.extraClassPath=/.m2/repository/ch/qos/logback/logback-core/1.1.2/logback-core-1.1.2.jar:/.m2/repository/ch/qos/logback/logback-classic/1.1.2/logback-classic-1.1.2.jar --conf spark.driver.extraClassPath

Re: Exclude slf4j-log4j12 from the classpath via spark-submit

2015-08-25 Thread Marcelo Vanzin
On Tue, Aug 25, 2015 at 10:48 AM, Utkarsh Sengar utkarsh2...@gmail.com wrote: Now I am going to try it out on our mesos cluster. I assumed spark.executor.extraClassPath takes csv as jars the way --jars takes it but it should be : separated like a regular classpath jar. Ah, yes, those options
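
The distinction, spelled out with illustrative paths:

    # --jars takes a comma-separated list of jars:
    $ spark-submit --jars /path/a.jar,/path/b.jar app.jar

    # the extraClassPath options are regular ':'-separated classpaths:
    $ spark-submit \
        --conf spark.driver.extraClassPath=/path/a.jar:/path/b.jar \
        --conf spark.executor.extraClassPath=/path/a.jar:/path/b.jar \
        app.jar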

Re: Exclude slf4j-log4j12 from the classpath via spark-submit

2015-08-25 Thread Utkarsh Sengar
So do I need to manually copy these 2 jars on my spark executors? On Tue, Aug 25, 2015 at 10:51 AM, Marcelo Vanzin van...@cloudera.com wrote: On Tue, Aug 25, 2015 at 10:48 AM, Utkarsh Sengar utkarsh2...@gmail.com wrote: Now I am going to try it out on our mesos cluster. I assumed

Re: Exclude slf4j-log4j12 from the classpath via spark-submit

2015-08-25 Thread Marcelo Vanzin
On Tue, Aug 25, 2015 at 1:50 PM, Utkarsh Sengar utkarsh2...@gmail.com wrote: So do I need to manually copy these 2 jars on my spark executors? Yes. I can think of a way to work around that if you're using YARN, but not with other cluster managers. On Tue, Aug 25, 2015 at 10:51 AM, Marcelo

Re: Exclude slf4j-log4j12 from the classpath via spark-submit

2015-08-25 Thread Utkarsh Sengar
Looks like I'm stuck then, I am using mesos. Adding these 2 jars to all executors might be a problem for me, so I will probably try to remove the dependency on the otj-logging lib and just use log4j. On Tue, Aug 25, 2015 at 2:15 PM, Marcelo Vanzin van...@cloudera.com wrote: On Tue, Aug 25, 2015

Exclude slf4j-log4j12 from the classpath via spark-submit

2015-08-24 Thread Utkarsh Sengar
<artifactId>slf4j-log4j12</artifactId> </exclusion> </exclusions> </dependency> Now, when I run my job from Intellij (which sets the classpath), things work perfectly. But when I run my job via spark-submit: ~/spark-1.4.1-bin-hadoop2.4/bin/spark-submit --class

Re: Exclude slf4j-log4j12 from the classpath via spark-submit

2015-08-24 Thread Marcelo Vanzin
<groupId>org.slf4j</groupId> <artifactId>slf4j-log4j12</artifactId> </exclusion> </exclusions> </dependency> Now, when I run my job from Intellij (which sets the classpath), things work perfectly. But when I run my job via spark-submit: ~/spark

Re: Exclude slf4j-log4j12 from the classpath via spark-submit

2015-08-24 Thread Utkarsh Sengar
<artifactId>slf4j-log4j12</artifactId> </exclusion> </exclusions> </dependency> The SparkRunner class works fine (from IntelliJ) but when I build a jar and submit it to spark-submit, I get this error: Caused by: java.lang.ClassCastException

Re: Exclude slf4j-log4j12 from the classpath via spark-submit

2015-08-24 Thread Utkarsh Sengar
</dependency> And no exclusions from my logging lib. And I submit this task: spark-1.4.1-bin-hadoop2.4/bin/spark-submit --class runner.SparkRunner --conf spark.driver.extraClassPath=/.m2/repository/ch/qos/logback/logback-classic/1.1.2/logback-classic-1.1.2.jar --conf spark.executor.extraClassPath=/.m2

Re: Exclude slf4j-log4j12 from the classpath via spark-submit

2015-08-24 Thread Marcelo Vanzin
Hi Utkarsh, A quick look at slf4j's source shows it loads the first StaticLoggerBinder in your classpath. How are you adding the logback jar file to spark-submit? If you use spark.driver.extraClassPath and spark.executor.extraClassPath to add the jar, it should take precedence over the log4j

Re: Exclude slf4j-log4j12 from the classpath via spark-submit

2015-08-24 Thread Utkarsh Sengar
git:(bulkrunner) ✗ spark-1.4.1-bin-hadoop2.4/bin/spark-submit --class runner.SparkRunner --jars /.m2/repository/ch/qos/logback/logback-classic/1.1.2/logback-classic-1.1.2.jar,/.m2/repository/ch/qos/logback/logback-core/1.1.2/logback-core-1.1.2.jar --conf spark.executor.userClassPathFirst=true --conf

Re: Exclude slf4j-log4j12 from the classpath via spark-submit

2015-08-24 Thread Marcelo Vanzin
On Mon, Aug 24, 2015 at 3:58 PM, Utkarsh Sengar utkarsh2...@gmail.com wrote: That didn't work since extraClassPath flag was still appending the jars at the end, so its still picking the slf4j jar provided by spark. Out of curiosity, how did you verify this? The extraClassPath options are

Re: Exclude slf4j-log4j12 from the classpath via spark-submit

2015-08-24 Thread Utkarsh Sengar
submit this task: spark-1.4.1-bin-hadoop2.4/bin/spark-submit --class runner.SparkRunner --conf spark.driver.extraClassPath=/.m2/repository/ch/qos/logback/logback-classic/1.1.2/logback-classic-1.1.2.jar --conf spark.executor.extraClassPath=/.m2/repository/ch/qos/logback/logback-classic/1.1.2/logback

Run scala code with spark submit

2015-08-20 Thread MasterSergius
Is there any possibility to run a standalone scala program via spark submit? Or do I always have to put it in some package and build it with maven (or sbt)? What if I have just a simple program, like the example word counter? Could anyone please show it on this simple test file Greeting.scala

Re: Run scala code with spark submit

2015-08-20 Thread Dean Wampler
I haven't tried it, but scala-shell should work if you give it a scala script file, since it's basically a wrapper around the Scala REPL. dean On Thursday, August 20, 2015, MasterSergius master.serg...@gmail.com wrote: Is there any possibility to run standalone scala program via spark submit
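
Presumably spark-shell is meant; its underlying Scala REPL accepts -i to load a script on startup, so something like this should work (an untested sketch):

    $ bin/spark-shell -i Greeting.scala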

Re: dse spark-submit multiple jars issue

2015-08-18 Thread Andrew Or
Hi Satish, The problem is that `--jars` accepts a comma-delimited list of jars! E.g. spark-submit ... --jars lib1.jar,lib2.jar,lib3.jar main.jar where main.jar is your main application jar (the one that starts a SparkContext), and lib*.jar refer to additional libraries that your main
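
Reassembled with the paths visible in this thread (the cassandra-connector jar's full name is truncated in the archive, so it is omitted from this sketch):

    $ dse spark-submit --master spark://10.246.43.15:7077 --class HelloWorld \
        --jars /home/missingmerch/postgresql-9.4-1201.jdbc41.jar,/home/missingmerch/dse.jar \
        /home/missingmerch/etl-0.0.1-SNAPSHOT.jar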

Re: dse spark-submit multiple jars issue

2015-08-13 Thread Javier Domingo Cansino
Please notice the 'jars: null'. I don't know why you put ///, but I would propose you just use normal absolute paths. dse spark-submit --master spark://10.246.43.15:7077 --class HelloWorld --jars /home/missingmerch/postgresql-9.4-1201.jdbc41.jar /home/missingmerch/dse.jar /home/missingmerch

dse spark-submit multiple jars issue

2015-08-11 Thread satish chandra j
*HI,* Please let me know if I am missing anything in the command below *Command:* dse spark-submit --master spark://10.246.43.15:7077 --class HelloWorld --jars ///home/missingmerch/postgresql-9.4-1201.jdbc41.jar ///home/missingmerch/dse.jar ///home/missingmerch/spark-cassandra-connector

Re: dse spark-submit multiple jars issue

2015-08-11 Thread Javier Domingo Cansino
11, 2015 at 2:44 PM, satish chandra j jsatishchan...@gmail.com wrote: HI , I have used --jars option as well, please find the command below *Command:* dse spark-submit --master spark://10.246.43.15:7077 --class HelloWorld *--jars* ///home/missingmerch/postgresql-9.4-1201.jdbc41.jar ///home

Re: dse spark-submit multiple jars issue

2015-08-11 Thread Javier Domingo Cansino
,* Please let me know if i am missing anything in the command below *Command:* dse spark-submit --master spark://10.246.43.15:7077 --class HelloWorld --jars ///home/missingmerch/postgresql-9.4-1201.jdbc41.jar ///home/missingmerch/dse.jar ///home/missingmerch/spark-cassandra-connector-java_2.10

Re: dse spark-submit multiple jars issue

2015-08-11 Thread satish chandra j
HI, Please find the log details below: dse spark-submit --verbose --master local --class HelloWorld etl-0.0.1-SNAPSHOT.jar --jars file:/home/missingmerch/postgresql-9.4-1201.jdbc41.jar file:/home/missingmerch/dse.jar file:/home/missingmerch/postgresql-9.4-1201.jdbc41.jar Using properties file

Re: Spark-submit not finding main class and the error reflects different path to jar file than specified

2015-08-09 Thread Akhil Das
command line to spark-submit: bin/spark-submit --verbose --master local[2]--class org.yardstick.spark.SparkCoreRDDBenchmark /shared/ysgood/target/yardstick-spark-uber-0.0.1.jar Here is the output: NOTE: SPARK_PREPEND_CLASSES is set, placing locally compiled Spark classes ahead of assembly

Re: Spark-submit fails when jar is in HDFS

2015-08-09 Thread Akhil Das
Did you try this way? /usr/local/spark/bin/spark-submit --master mesos://mesos.master:5050 --conf spark.mesos.executor.docker.image=docker.repo/spark:latest --class org.apache.spark.examples.SparkPi *--jars hdfs://hdfs1/tmp/spark-* *examples-1.4.1-hadoop2.6.0-**cdh5.4.4.jar* 100 Thanks Best

Re: Spark-submit fails when jar is in HDFS

2015-08-09 Thread Dean Wampler
/deanwampler http://polyglotprogramming.com On Sun, Aug 9, 2015 at 4:30 AM, Akhil Das ak...@sigmoidanalytics.com wrote: Did you try this way? /usr/local/spark/bin/spark-submit --master mesos://mesos.master:5050 --conf spark.mesos.executor.docker.image=docker.repo/spark:latest --class

Re: Spark-submit fails when jar is in HDFS

2015-08-09 Thread Alan Braithwaite
Did you try this way? /usr/local/spark/bin/spark-submit --master mesos://mesos.master:5050 --conf spark.mesos.executor.docker.image=docker.repo/spark:latest --class org.apache.spark.examples.SparkPi --jars hdfs://hdfs1/tmp/spark-examples-1.4.1-hadoop2.6.0-cdh5.4.4.jar 100 I did, and got

Spark-submit fails when jar is in HDFS

2015-08-06 Thread abraithwaite
-applications.html [2] http://apache-spark-user-list.1001560.n3.nabble.com/Spark-submit-not-working-when-application-jar-is-in-hdfs-td21840.html -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Spark-submit-fails-when-jar-is-in-HDFS-tp24163.html Sent from

Spark-submit not finding main class and the error reflects different path to jar file than specified

2015-08-06 Thread Stephen Boesch
Given the following command line to spark-submit: bin/spark-submit --verbose --master local[2]--class org.yardstick.spark.SparkCoreRDDBenchmark /shared/ysgood/target/yardstick-spark-uber-0.0.1.jar Here is the output: NOTE: SPARK_PREPEND_CLASSES is set, placing locally compiled Spark classes

Re: Spark-Submit error

2015-08-03 Thread satish chandra j
Hi Guru, I am executing this on a DataStax Enterprise Spark node and the ~/.dserc file exists, which contains the Cassandra credentials, but I am still getting the error. Below is the given command: dse spark-submit --master spark://10.246.43.15:7077 --class HelloWorld --jars ///home/missingmerch/postgresql-9.4

Re: Spark-Submit error

2015-08-03 Thread Guru Medasani
spark-submit spark error exception in thread main java.io.ioexception: Invalid Request Exception(Why you have not logged in) Note: submitting datastax spark node please let me know if anybody have a solutions for this issue Regards, Saish Chandra

Re: Spark-Submit error

2015-08-03 Thread Guru Medasani
this on DataStax Enterprise Spark node and ~/.dserc file exists which consists Cassandra credentials but still getting the error Below is the given command dse spark-submit --master spark://10.246.43.15:7077 http://10.246.43.15:7077/ --class HelloWorld --jars ///home/missingmerch

org.apache.spark.SparkException: Detected yarn-cluster mode, but isn't running on a cluster. Deployment to YARN is not supported directly by SparkContext. Please use spark-submit

2015-08-03 Thread Rajeshkumar J
running on a cluster. Deployment to YARN is not supported directly by SparkContext. Please use spark-submit

Fwd: org.apache.spark.SparkException: Detected yarn-cluster mode, but isn't running on a cluster. Deployment to YARN is not supported directly by SparkContext. Please use spark-submit

2015-08-03 Thread Rajeshkumar J
running on a cluster. Deployment to YARN is not supported directly by SparkContext. Please use spark-submit. This is the java code I tried on a single node cluster: SparkConf sparkConf = new SparkConf().setAppName("Hive").setMaster("local").setSparkHome(path); JavaSparkContext ctx = new

Spark-Submit error

2015-07-31 Thread satish chandra j
HI, I have submitted a Spark Job with the options jars, class, and master as *local*, but I am getting the error below: *dse spark-submit spark error exception in thread main java.io.ioexception: Invalid Request Exception(Why you have not logged in)* *Note: submitting from a datastax spark node* please let me

Re: How to set log level in spark-submit ?

2015-07-30 Thread Dean Wampler
: I saw such example in docs: --conf spark.driver.extraJavaOptions=-Dlog4j.configuration= file://$path_to_file but, unfortunately, it does not work for me. On 30.07.2015 05:12, canan chen wrote: Yes, that should work. What I mean is is there any option in spark-submit command that I can

Re: How to set log level in spark-submit ?

2015-07-30 Thread Alexander Krasheninnikov
I saw such example in docs: --conf spark.driver.extraJavaOptions=-Dlog4j.configuration=file://$path_to_file but, unfortunately, it does not work for me. On 30.07.2015 05:12, canan chen wrote: Yes, that should work. What I mean is is there any option in spark-submit command that I can specify

Re: Authentication Support with spark-submit cluster mode

2015-07-29 Thread Zhan Zhang
If you run it on yarn with a kerberos setup, you authenticate yourself by kinit before launching the job. Thanks. Zhan Zhang On Jul 28, 2015, at 8:51 PM, Anh Hong hongnhat...@yahoo.com.INVALID wrote: Hi, I'd like to remotely run spark-submit from a local
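
A minimal sketch of that flow (the principal is illustrative):

    $ kinit user@EXAMPLE.REALM        # obtain a Kerberos ticket first
    $ spark-submit --master yarn-client my_job.py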

How to set log level in spark-submit ?

2015-07-29 Thread canan chen
Anyone know how to set log level in spark-submit ? Thanks

Re: How to set log level in spark-submit ?

2015-07-29 Thread Jonathan Coveney
Put a log4j.properties file in conf/. You can copy log4j.properties.template as a good base. On Wednesday, July 29, 2015, canan chen ccn...@gmail.com wrote: Anyone know how to set log level in spark-submit ? Thanks
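
Concretely (the WARN level is just an example):

    $ cp conf/log4j.properties.template conf/log4j.properties
    # then edit the root logger line, e.g.:
    #   log4j.rootCategory=WARN, console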

Re: Authentication Support with spark-submit cluster mode

2015-07-29 Thread Anh Hong
Hi Zhan, I'm running a Standalone Spark cluster and execute spark-submit from a local host outside the cluster. Besides kerberos, do you know of any other existing method? Is there any JIRA opened on this enhancement request? Regards, Anh. On Wednesday, July 29, 2015 4:15 PM, Zhan Zhang zzh

Re: How to set log level in spark-submit ?

2015-07-29 Thread canan chen
Yes, that should work. What I mean is: is there any option in the spark-submit command that I can specify for the log level? On Thu, Jul 30, 2015 at 10:05 AM, Jonathan Coveney jcove...@gmail.com wrote: Put a log4j.properties file in conf/. You can copy log4j.properties.template as a good base El

Authentication Support with spark-submit cluster mode

2015-07-28 Thread Anh Hong
Hi, I'd like to remotely run spark-submit from a local machine to submit a job to a spark cluster (cluster mode). What method do I use to authenticate myself to the cluster? E.g. how do I pass a user id, password, or private key to the cluster? Any help is appreciated.

Does spark-submit support file transfering from local to cluster?

2015-07-28 Thread Anh Hong
Hi, I'm using spark-submit cluster mode to submit a job from a local machine to a spark cluster. There are input files, output files, and job log files that I need to transfer in and out between the local machine and the spark cluster. Any recommended methods for transferring files? Is there any future

Re: 1.4.0 classpath issue with spark-submit

2015-07-25 Thread Michal Haris
-shell for exploration and I have a runner class that executes some tasks with spark-submit. I used to run against 1.4.0-SNAPSHOT. Since then 1.4.0 and 1.4.1 were released so I tried to switch to the official release. Now, when I run the program as a shell, everything works but when I try

Re: spark-submit and spark-shell behaviors mismatch.

2015-07-24 Thread Yana Kadiyska
that is pretty odd -- toMap not being there would be from scala...but what is even weirder is that toMap is positively executed on the driver machine, which is the same when you do spark-shell and spark-submit...does it work if you run with --master local[*]? Also, you can try to put a set -x

Re: spark-submit and spark-shell behaviors mismatch.

2015-07-23 Thread Dan Dong
The problem should be toMap, as I tested that val maps2=maps.collect runs ok. When I run spark-shell, I run with --master mesos://cluster-1:5050 parameter which is the same with spark-submit. Confused here. 2015-07-22 20:01 GMT-05:00 Yana Kadiyska yana.kadiy...@gmail.com: Is it complaining

Re: 1.4.0 classpath issue with spark-submit

2015-07-23 Thread Akhil Das
for exploration and I have a runner class that executes some tasks with spark-submit. I used to run against 1.4.0-SNAPSHOT. Since then 1.4.0 and 1.4.1 were released so I tried to switch to the official release. Now, when I run the program as a shell, everything works but when I try to run

spark-submit and spark-shell behaviors mismatch.

2015-07-22 Thread Dan Dong
Hi, I have a simple test spark program as below. The strange thing is that it runs well under spark-shell but gets a runtime error of java.lang.NoSuchMethodError: in spark-submit, which indicates that the line val maps2=maps.collect.toMap has a problem. But why the compilation has

Re: spark-submit and spark-shell behaviors mismatch.

2015-07-22 Thread Yana Kadiyska
Is it complaining about collect or toMap? In either case this error is indicative of an old version usually -- any chance you have an old installation of Spark somehow? Or scala? You can try running spark-submit with --verbose. Also, when you say it runs with spark-shell do you run spark shell

1.4.0 classpath issue with spark-submit

2015-07-21 Thread Michal Haris
I have a spark program that uses dataframes to query hive and I run it both as a spark-shell for exploration and I have a runner class that executes some tasks with spark-submit. I used to run against 1.4.0-SNAPSHOT. Since then 1.4.0 and 1.4.1 were released so I tried to switch to the official

Re: spark-submit can not resolve spark-hive_2.10

2015-07-15 Thread Hao Ren
Thanks for the reply. Actually, I don't think excluding spark-hive from spark-submit --packages is a good idea. I don't want to recompile spark by assembly for my cluster every time a new spark release is out. I prefer using the binary version of spark and then adding some jars for job execution

spark submit configuration on yarn

2015-07-14 Thread Pa Rö
/bin/spark-submit \ --class mgm.tp.bigdata.ma_spark.SparkMain \ --master yarn-cluster \ --executor-memory 9G \ --total-executor-cores 16 \ ma-spark.jar \ 1000 maybe my configuration is not optimal? best regards, paul
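
One thing worth noting (an observation, not from the thread itself): --total-executor-cores applies to standalone and Mesos masters; on YARN, cores are requested per executor, e.g. (the counts below are illustrative):

    $ ./bin/spark-submit \
        --class mgm.tp.bigdata.ma_spark.SparkMain \
        --master yarn-cluster \
        --num-executors 4 \
        --executor-cores 4 \
        --executor-memory 9G \
        ma-spark.jar 1000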

spark-submit

2015-07-10 Thread AshutoshRaghuvanshi
when I do run this command: ashutosh@pas-lab-server7:~/spark-1.4.0$ ./bin/spark-submit \ --class org.apache.spark.graphx.lib.Analytics \ --master spark://172.17.27.12:7077 \ assembly/target/scala-2.10/spark-assembly-1.4.0-hadoop2.2.0.jar \ pagerank soc-LiveJournal1.txt --numEPart=100 --nverts

Re: spark-submit

2015-07-10 Thread Andrew Or
do run this command: ashutosh@pas-lab-server7:~/spark-1.4.0$ ./bin/spark-submit \ --class org.apache.spark.graphx.lib.Analytics \ --master spark://172.17.27.12:7077 \ assembly/target/scala-2.10/spark-assembly-1.4.0-hadoop2.2.0.jar \ pagerank soc-LiveJournal1.txt --numEPart=100 --nverts

Re: Remote spark-submit not working with YARN

2015-07-09 Thread Juan Gordon
...@gmail.com wrote: I'm trying to submit a spark job from a different server outside of my Spark Cluster (running spark 1.4.0, hadoop 2.4.0 and YARN) using the spark-submit script : spark/bin/spark-submit --master yarn-client --executor-memory 4G myjobScript.py The thing is that my application

Remote spark-submit not working with YARN

2015-07-08 Thread jegordon
I'm trying to submit a spark job from a different server outside of my Spark Cluster (running spark 1.4.0, hadoop 2.4.0 and YARN) using the spark-submit script : spark/bin/spark-submit --master yarn-client --executor-memory 4G myjobScript.py The thing is that my application never passes from
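
An application that stays in the ACCEPTED state usually means YARN cannot allocate the requested containers; a quick test (a suggestion, not from the visible replies) is to shrink the resource request and see whether the job then runs:

    $ spark/bin/spark-submit --master yarn-client --executor-memory 1G myjobScript.py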

Re: Remote spark-submit not working with YARN

2015-07-08 Thread Sandy Ryza
a different server outside of my Spark Cluster (running spark 1.4.0, hadoop 2.4.0 and YARN) using the spark-submit script : spark/bin/spark-submit --master yarn-client --executor-memory 4G myjobScript.py The thing is that my application never passes from the accepted state, it is stuck on it : 15/07/08

spark-submit can not resolve spark-hive_2.10

2015-07-07 Thread Hao Ren
I want to add spark-hive as a dependency to submit my job, but it seems that spark-submit can not resolve it. $ ./bin/spark-submit \ → --packages org.apache.spark:spark-hive_2.10:1.4.0,org.postgresql:postgresql:9.3-1103-jdbc3,joda-time:joda-time:2.8.1 \ → --class

Re: spark-submit can not resolve spark-hive_2.10

2015-07-07 Thread Burak Yavuz
, but it seems that spark-submit can not resolve it. $ ./bin/spark-submit \ → --packages org.apache.spark:spark-hive_2.10:1.4.0,org.postgresql:postgresql:9.3-1103-jdbc3,joda-time:joda-time:2.8.1 \ → --class fr.leboncoin.etl.jobs.dwh.AdStateTraceDWHTransform \ → --master spark://localhost:7077 \ Ivy

RE: spark-submit in deployment mode with the --jars option

2015-06-29 Thread Hisham Mohamed
) ... 9 more From: Akhil Das [ak...@sigmoidanalytics.com] Sent: 29 June 2015 09:43 To: Hisham Mohamed Cc: user@spark.apache.org Subject: Re: spark-submit in deployment mode with the --jars option Can you paste the stacktrace? Looks like you are missing few

Re: Spark-Submit / Spark-Shell Error Standalone cluster

2015-06-28 Thread Tomas Hudik
. Have you got the rights to execute it? niedz., 28.06.2015 o 04:53 użytkownik Ashish Soni asoni.le...@gmail.com napisał: Not sure what is the issue but when i run the spark-submit or spark-shell i am getting below error /usr/bin/spark-class: line 24: /usr/bin/load-spark-env.sh: No such file

Re: Spark-Submit / Spark-Shell Error Standalone cluster

2015-06-28 Thread Wojciech Pituła
I assume that /usr/bin/load-spark-env.sh exists. Have you got the rights to execute it? niedz., 28.06.2015 o 04:53 użytkownik Ashish Soni asoni.le...@gmail.com napisał: Not sure what is the issue but when i run the spark-submit or spark-shell i am getting below error /usr/bin/spark-class

spark-submit in deployment mode with the --jars option

2015-06-28 Thread hishamm
Hi, I want to deploy my application on a standalone cluster. Spark submit acts in a strange way. When I deploy the application in *client* mode, everything works well and my application can see the additional jar files. Here is the command: spark-submit --master spark://1.2.3.4:7077 --deploy
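
In cluster mode the driver runs inside the cluster, so --jars paths that exist only on the submitting machine will not resolve there; globally visible locations such as HDFS avoid this (a sketch with hypothetical paths):

    $ spark-submit --master spark://1.2.3.4:7077 --deploy-mode cluster \
        --jars hdfs:///libs/extra1.jar,hdfs:///libs/extra2.jar \
        hdfs:///apps/app.jar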

Spark-Submit / Spark-Shell Error Standalone cluster

2015-06-27 Thread Ashish Soni
Not sure what the issue is, but when I run spark-submit or spark-shell I get the error below: /usr/bin/spark-class: line 24: /usr/bin/load-spark-env.sh: No such file or directory Can someone please help? Thanks,

Re: Problem Run Spark Example HBase Code Using Spark-Submit

2015-06-26 Thread Akhil Das
Try adding them to the SPARK_CLASSPATH in your conf/spark-env.sh file. Thanks Best Regards On Thu, Jun 25, 2015 at 9:31 PM, Bin Wang binwang...@gmail.com wrote: I am trying to run the Spark example code HBaseTest from command line using spark-submit instead run-example, in that case, I can
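
For example (the parcel path and jar version below are assumptions about a typical CDH 5.4 layout):

    # conf/spark-env.sh
    export SPARK_CLASSPATH=/opt/cloudera/parcels/CDH/jars/htrace-core-3.1.0-incubating.jar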

Problem Run Spark Example HBase Code Using Spark-Submit

2015-06-25 Thread Bin Wang
I am trying to run the Spark example code HBaseTest from the command line using spark-submit instead of run-example, so that I can learn more about how to run spark code in general. However, it told me CLASS_NOT_FOUND about htrace since I am using CDH5.4. I successfully located the htrace jar file but I
