Here is a script I use to submit a directory of jar files. It assumes the
jar files are in target/dependency/ or lib/:

DRIVER_PATH=
DEPEND_PATH=
if [ -d lib ]; then
  DRIVER_PATH=lib
  DEPEND_PATH=lib
else
  DRIVER_PATH=target
  DEPEND_PATH=target/dependency
fi
DEPEND_JARS=log4j.properties
# Join every jar under $DEPEND_PATH into one comma-separated list.
for f in "$DEPEND_PATH"/*.jar; do
  DEPEND_JARS=$DEPEND_JARS,$f
done
/myjar.jar at http://10.61.187.176:57956/jars/filesplitter_2.10-1.0.jar with timestamp 1419032530459
14/12/19 23:42:10 INFO AppClient$ClientActor: Connecting to master spark://ec2-54-90-85-197.compute-1.amazonaws.com:7077...
Exception in thread "main" java.sql.SQLException: No suitable driver found
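For what it's worth, this error usually means that java.sql.DriverManager on the driver side cannot see the MySQL connector jar. In Spark 1.x, --jars ships jars to the executors but does not put them on the driver's own classpath, so the connector commonly needs to be passed via --driver-class-path as well. A minimal sketch, where the connector jar path/version and the application jar name are my assumptions, not something stated in this thread:

```shell
# Hypothetical paths: adjust to your connector version and app jar.
MYSQL_JAR=lib/mysql-connector-java-5.1.34-bin.jar
APP_JAR=target/filesplitter_2.10-1.0.jar
MASTER=spark://ec2-54-90-85-197.compute-1.amazonaws.com:7077

# --jars ships the connector to executors; --driver-class-path makes
# DriverManager on the driver able to find it as well.
SUBMIT_CMD="spark-submit --master $MASTER --jars $MYSQL_JAR --driver-class-path $MYSQL_JAR $APP_JAR"

# Echo instead of executing, since no cluster is available here.
echo "$SUBMIT_CMD"
```

Another workaround people used at the time was calling Class.forName("com.mysql.jdbc.Driver") in the application before opening the connection, to force driver registration.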
Hi All,
I tried to build combined.jar in a shell script. It works when I use
spark-shell, but with spark-submit I hit the same issue.
Help is highly appreciated.
Thanks
-D
One more question: how would I submit additional jars with spark-submit? I
used the --jars option, but it does not seem to work as described earlier.
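In case it helps: --jars takes a single comma-separated list, not a colon-separated classpath and not a repeated flag. A small sketch of joining a directory of jars into that form (the directory and jar names below are placeholders):

```shell
# Placeholder directory with two dummy jars, just to demonstrate the join.
DIR=$(mktemp -d)
touch "$DIR/a.jar" "$DIR/b.jar"

# Append each jar, adding a comma only after the first entry.
JARS=""
for f in "$DIR"/*.jar; do
  JARS=${JARS:+$JARS,}$f
done

# Prints the two jar paths separated by a comma.
echo "$JARS"
```

The result is then passed as a single argument: spark-submit --jars "$JARS" ...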
Thanks for the help,
-D
java.sql.SQLException: No suitable driver found
for jdbc:mysql://192.168.20.45:3306/abcdb?user=root&password=admin
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/java-sql-SQLException-No-suitable-driver-found-tp20792.html
Sent from the Apache Spark User List mailing list