Hi Satish,
The problem is that `--jars` accepts a comma-delimited list of jars! E.g.
spark-submit ... --jars lib1.jar,lib2.jar,lib3.jar main.jar
where main.jar is your main application jar (the one that starts a
SparkContext), and lib*.jar refer to additional libraries that your main
application depends on.
Please notice the 'jars: null' in your log.
I don't know why you put ///, but I would suggest using plain
absolute paths:
dse spark-submit --master spark://10.246.43.15:7077 --class HelloWorld
--jars /home/missingmerch/postgresql-9.4-1201.jdbc41.jar
/home/missingmerch/dse.jar
*Hi,*
Please let me know if I am missing anything in the command below.
*Command:*
dse spark-submit --master spark://10.246.43.15:7077 --class HelloWorld
--jars ///home/missingmerch/postgresql-9.4-1201.jdbc41.jar
///home/missingmerch/dse.jar
Use --verbose; it might give you some insight into what's happening.
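As a rough sketch (reusing the host, class, and jar paths already quoted in this thread), you could re-run the same submit with --verbose and filter for the parsed "jars" line; if --jars is being picked up correctly it will show your paths there rather than null:

```shell
# Hypothetical check: --verbose prints spark-submit's parsed arguments,
# so grepping for "jars" shows whether the --jars list was recognized.
dse spark-submit --verbose \
  --master spark://10.246.43.15:7077 \
  --class HelloWorld \
  --jars /home/missingmerch/postgresql-9.4-1201.jdbc41.jar \
  /home/missingmerch/dse.jar 2>&1 | grep -i jars
```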
Javier Domingo Cansino
Research & Development Engineer, Fon
I have no real idea (I'm not a Java user), but have you tried the --jars
option?
http://spark.apache.org/docs/latest/submitting-applications.html#advanced-dependency-management
AFAIK, you are currently passing the jar names as arguments to the
invoked class instead of submitting the jars themselves.
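To sketch what that means with the command quoted later in this thread (same paths, purely illustrative): everything after the application jar is handed to your main() as a program argument, so --jars and the extra paths placed there are never shipped by Spark. Options, including a comma-separated --jars list, have to come before the application jar:

```shell
# Wrong (what the log suggests): --jars and the extra jars appear AFTER
# the application jar, so they become arguments to HelloWorld's main():
#   dse spark-submit --master local --class HelloWorld \
#       etl-0.0.1-SNAPSHOT.jar --jars file:/home/missingmerch/...
#
# Expected form: all spark-submit options first, then the application jar;
# anything after the jar is passed to the application itself.
dse spark-submit \
  --master spark://10.246.43.15:7077 \
  --class HelloWorld \
  --jars file:/home/missingmerch/postgresql-9.4-1201.jdbc41.jar \
  etl-0.0.1-SNAPSHOT.jar
```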
Hi,
Please find the log details below:
dse spark-submit --verbose --master local --class HelloWorld
etl-0.0.1-SNAPSHOT.jar --jars
file:/home/missingmerch/postgresql-9.4-1201.jdbc41.jar
file:/home/missingmerch/dse.jar
file:/home/missingmerch/postgresql-9.4-1201.jdbc41.jar
Using properties file: