Hi Jeff and Satish,

I have modified the script as suggested (--jars now comes before the application jar) and executed it. Please find the command below:

./spark-submit --master local --class test.Main \
  --jars /home/user/download/jar/ojdbc7.jar \
  /home//test/target/spark16-0.0.1-SNAPSHOT.jar
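
In case it is relevant, the next variant I plan to try also puts the driver jar on the driver's classpath, since my understanding is that --jars alone may not make the jar visible to java.sql.DriverManager in the driver process:

./spark-submit --master local --class test.Main \
  --driver-class-path /home/user/download/jar/ojdbc7.jar \
  --jars /home/user/download/jar/ojdbc7.jar \
  /home//test/target/spark16-0.0.1-SNAPSHOT.jar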

With the command above, I am still getting the same exception:


Exception in thread "main" java.sql.SQLException: No suitable driver found for jdbc:oracle:thin:@xxxx:1521:xxx
    at java.sql.DriverManager.getConnection(DriverManager.java:596)
    at java.sql.DriverManager.getConnection(DriverManager.java:187)
    at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$$anonfun$getConnector$1.apply(JDBCRDD.scala:188)
    at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$$anonfun$getConnector$1.apply(JDBCRDD.scala:181)
    at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:121)
    at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation.<init>(JDBCRelation.scala:91)
    at org.apache.spark.sql.execution.datasources.jdbc.DefaultSource.createRelation(DefaultSource.scala:60)
    at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:125)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:114)
    at com.cisco.ss.etl.utils.ETLHelper$class.getData(ETLHelper.scala:22)
    at com.cisco.ss.etl.Main$.getData(Main.scala:9)
    at com.cisco.ss.etl.Main$delayedInit$body.apply(Main.scala:13)
    at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
    at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
    at scala.App$$anonfun$main$1.apply(App.scala:71)
    at scala.App$$anonfun$main$1.apply(App.scala:71)
    at scala.collection.immutable.List.foreach(List.scala:318)
    at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:32)
    at scala.App$class.main(App.scala:71)
    at com.cisco.ss.etl.Main$.main(Main.scala:9)
    at com.cisco.ss.etl.Main.main(Main.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
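
For reference, the JDBC read in the job looks roughly like the sketch below (simplified: the object name JdbcReadSketch, the URL, and the table name are placeholders, not the real code). As I understand it, setting the "driver" option explicitly registers the Oracle driver class with DriverManager, which may also work around this error:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Minimal sketch against the Spark 1.6 API; connection details are placeholders.
object JdbcReadSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("jdbc-read-sketch"))
    val sqlContext = new SQLContext(sc)

    val df = sqlContext.read
      .format("jdbc")
      .option("url", "jdbc:oracle:thin:@xxxx:1521:xxx") // placeholder, as in the trace
      .option("dbtable", "MY_TABLE")                    // placeholder table name
      // Naming the driver class explicitly registers it with DriverManager,
      // which can avoid "No suitable driver found" when the jar is on the classpath.
      .option("driver", "oracle.jdbc.OracleDriver")
      .load()

    df.show()
  }
}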


Regards,
Rajesh

On Mon, Dec 21, 2015 at 7:18 PM, satish chandra j <jsatishchan...@gmail.com>
wrote:

> Hi Rajesh,
> Could you please try giving your command as mentioned below:
>
> ./spark-submit --master local --class <classname> --jars <jdbc-jar> <sparkjob-jar>
>
> Regards,
> Satish Chandra
>
> On Mon, Dec 21, 2015 at 6:45 PM, Madabhattula Rajesh Kumar <
> mrajaf...@gmail.com> wrote:
>
>> Hi,
>>
>> How do I add dependent jars to the spark-submit command (for example, the Oracle JDBC driver)? Could you please help me resolve this issue?
>>
>> I have a standalone cluster: one master and one slave.
>>
>> I have used the command below, but it is not working:
>>
>> ./spark-submit --master local --class test.Main \
>>   /test/target/spark16-0.0.1-SNAPSHOT.jar \
>>   --jars /home/user/download/jar/ojdbc7.jar
>>
>> *I'm getting the exception below:*
>>
>> Exception in thread "main" java.sql.SQLException: No suitable driver found for jdbc:oracle:thin:@xxxx:1521:xxx
>>     at java.sql.DriverManager.getConnection(DriverManager.java:596)
>>     at java.sql.DriverManager.getConnection(DriverManager.java:187)
>>     at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$$anonfun$getConnector$1.apply(JDBCRDD.scala:188)
>>     at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$$anonfun$getConnector$1.apply(JDBCRDD.scala:181)
>>     at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:121)
>>     at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation.<init>(JDBCRelation.scala:91)
>>     at org.apache.spark.sql.execution.datasources.jdbc.DefaultSource.createRelation(DefaultSource.scala:60)
>>     at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:125)
>>     at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:114)
>>     at com.cisco.ss.etl.utils.ETLHelper$class.getData(ETLHelper.scala:22)
>>     at com.cisco.ss.etl.Main$.getData(Main.scala:9)
>>     at com.cisco.ss.etl.Main$delayedInit$body.apply(Main.scala:13)
>>     at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
>>     at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
>>     at scala.App$$anonfun$main$1.apply(App.scala:71)
>>     at scala.App$$anonfun$main$1.apply(App.scala:71)
>>     at scala.collection.immutable.List.foreach(List.scala:318)
>>     at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:32)
>>     at scala.App$class.main(App.scala:71)
>>     at com.cisco.ss.etl.Main$.main(Main.scala:9)
>>     at com.cisco.ss.etl.Main.main(Main.scala)
>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>     at java.lang.reflect.Method.invoke(Method.java:606)
>>     at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
>>     at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
>>     at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
>>     at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
>>     at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>>
>> Regards,
>> Rajesh
>>
>
>
