Re: spark-submit for dependent jars

2015-12-21 Thread Shixiong Zhu
Looks like you need to add a "driver" option to your code, such as:

sqlContext.read.format("jdbc").options(
Map("url" -> "jdbc:oracle:thin:@:1521:xxx",
  "driver" -> "oracle.jdbc.driver.OracleDriver",
  "dbtable" -> "your_table_name")).load()

Best Regards,
Shixiong Zhu

2015-12-21 6:03 GMT-08:00 Jeff Zhang <zjf...@gmail.com>:

> Please make sure this is the correct jdbc url (note that the host portion
> before the port appears to be empty):
> jdbc:oracle:thin:@:1521:xxx
>
>
>
> On Mon, Dec 21, 2015 at 9:54 PM, Madabhattula Rajesh Kumar <
> mrajaf...@gmail.com> wrote:
>
>> Hi Jeff and Satish,
>>
>> I have modified the script and executed it. Please find the command below:
>>
>> ./spark-submit --master local  --class test.Main --jars
>> /home/user/download/jar/ojdbc7.jar
>> /home//test/target/spark16-0.0.1-SNAPSHOT.jar
>>
>> I'm still getting the same exception.
>>
>>
>>
>> Exception in thread "main" java.sql.SQLException: No suitable driver
>> found for jdbc:oracle:thin:@:1521:xxx
>> at java.sql.DriverManager.getConnection(DriverManager.java:596)
>> at java.sql.DriverManager.getConnection(DriverManager.java:187)
>> at
>> org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$$anonfun$getConnector$1.apply(JDBCRDD.scala:188)
>> at
>> org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$$anonfun$getConnector$1.apply(JDBCRDD.scala:181)
>> at
>> org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:121)
>> at
>> org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation.<init>(JDBCRelation.scala:91)
>> at
>> org.apache.spark.sql.execution.datasources.jdbc.DefaultSource.createRelation(DefaultSource.scala:60)
>> at
>> org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:125)
>> at
>> org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:114)
>> at com.cisco.ss.etl.utils.ETLHelper$class.getData(ETLHelper.scala:22)
>> at com.cisco.ss.etl.Main$.getData(Main.scala:9)
>> at com.cisco.ss.etl.Main$delayedInit$body.apply(Main.scala:13)
>> at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
>> at
>> scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
>> at scala.App$$anonfun$main$1.apply(App.scala:71)
>> at scala.App$$anonfun$main$1.apply(App.scala:71)
>> at scala.collection.immutable.List.foreach(List.scala:318)
>> at
>> scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:32)
>> at scala.App$class.main(App.scala:71)
>> at com.cisco.ss.etl.Main$.main(Main.scala:9)
>> at com.cisco.ss.etl.Main.main(Main.scala)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>> at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> at java.lang.reflect.Method.invoke(Method.java:606)
>> at
>> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
>> at
>> org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
>> at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
>> at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
>> at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>>
>>
>> Regards,
>> Rajesh
>>
>> On Mon, Dec 21, 2015 at 7:18 PM, satish chandra j <
>> jsatishchan...@gmail.com> wrote:
>>
>>> Hi Rajesh,
>>> Could you please try your command as shown below:
>>>
>>> ./spark-submit --master local --class <main-class> --jars <dependent-jars>
>>> <application-jar>
>>>
>>> Regards,
>>> Satish Chandra
>>>
>>> On Mon, Dec 21, 2015 at 6:45 PM, Madabhattula Rajesh Kumar <
>>> mrajaf...@gmail.com> wrote:
>>>
>>>> Hi,
>>>>
>>>> How do I add dependent jars to the spark-submit command (for example, the
>>>> Oracle JDBC jar)? Could you please help me resolve this issue.
>>>>
>>>> I have a standalone cluster. One Master and One slave.
>>>>
>>>> I have used the command below, but it is not working:
>>>>
>>>> ./spark-submit --master local  --class test.Main
>>>> /test/target/spark16-0.0.1-SNAPSHOT.jar --jars
>>>> /home/user

Re: spark-submit for dependent jars

2015-12-21 Thread Madabhattula Rajesh Kumar
Hi Jeff and Satish,

I have modified the script and executed it. Please find the command below:

./spark-submit --master local  --class test.Main --jars
/home/user/download/jar/ojdbc7.jar
/home//test/target/spark16-0.0.1-SNAPSHOT.jar

I'm still getting the same exception.


Exception in thread "main" java.sql.SQLException: No suitable driver found
for jdbc:oracle:thin:@:1521:xxx
at java.sql.DriverManager.getConnection(DriverManager.java:596)
at java.sql.DriverManager.getConnection(DriverManager.java:187)
at
org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$$anonfun$getConnector$1.apply(JDBCRDD.scala:188)
at
org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$$anonfun$getConnector$1.apply(JDBCRDD.scala:181)
at
org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:121)
at
org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation.<init>(JDBCRelation.scala:91)
at
org.apache.spark.sql.execution.datasources.jdbc.DefaultSource.createRelation(DefaultSource.scala:60)
at
org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:125)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:114)
at com.cisco.ss.etl.utils.ETLHelper$class.getData(ETLHelper.scala:22)
at com.cisco.ss.etl.Main$.getData(Main.scala:9)
at com.cisco.ss.etl.Main$delayedInit$body.apply(Main.scala:13)
at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
at
scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$$anonfun$main$1.apply(App.scala:71)
at scala.App$$anonfun$main$1.apply(App.scala:71)
at scala.collection.immutable.List.foreach(List.scala:318)
at
scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:32)
at scala.App$class.main(App.scala:71)
at com.cisco.ss.etl.Main$.main(Main.scala:9)
at com.cisco.ss.etl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
at
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)


Regards,
Rajesh

On Mon, Dec 21, 2015 at 7:18 PM, satish chandra j <jsatishchan...@gmail.com>
wrote:

> Hi Rajesh,
> Could you please try your command as shown below:
>
> ./spark-submit --master local --class <main-class> --jars <dependent-jars>
> <application-jar>
>
> Regards,
> Satish Chandra
>
> On Mon, Dec 21, 2015 at 6:45 PM, Madabhattula Rajesh Kumar <
> mrajaf...@gmail.com> wrote:
>
>> Hi,
>>
>> How do I add dependent jars to the spark-submit command (for example, the
>> Oracle JDBC jar)? Could you please help me resolve this issue.
>>
>> I have a standalone cluster. One Master and One slave.
>>
>> I have used the command below, but it is not working:
>>
>> ./spark-submit --master local  --class test.Main
>> /test/target/spark16-0.0.1-SNAPSHOT.jar --jars
>> /home/user/download/jar/ojdbc7.jar
>>
>> *I'm getting the exception below:*
>>
>> Exception in thread "main" java.sql.SQLException: No suitable driver
>> found for jdbc:oracle:thin:@:1521:xxx
>> at java.sql.DriverManager.getConnection(DriverManager.java:596)
>> at java.sql.DriverManager.getConnection(DriverManager.java:187)
>> at
>> org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$$anonfun$getConnector$1.apply(JDBCRDD.scala:188)
>> at
>> org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$$anonfun$getConnector$1.apply(JDBCRDD.scala:181)
>> at
>> org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:121)
>> at
>> org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation.<init>(JDBCRelation.scala:91)
>> at
>> org.apache.spark.sql.execution.datasources.jdbc.DefaultSource.createRelation(DefaultSource.scala:60)
>> at
>> org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:125)
>> at
>> org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:114)
>> at com.cisco.ss.etl.utils.ETLHelper$class.getData(ETLHelper.scala:22)
>> at com.cisco.ss.etl.Main$.getData(Main.scala:9)
>> at com.cisco.ss.etl.Main$delayedInit$body.apply(Main.scala:13)
>> at scala.Function0$class.apply

Re: spark-submit for dependent jars

2015-12-21 Thread Jeff Zhang
Put /test/target/spark16-0.0.1-SNAPSHOT.jar as the last argument

./spark-submit --master local  --class test.Main  --jars
/home/user/download/jar/ojdbc7.jar /test/target/spark16-0.0.1-SNAPSHOT.jar
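
If reordering alone does not fix it, it may also help to put the driver jar on
the driver's classpath explicitly (a common workaround for "No suitable driver
found", since --jars distributes the jar to the executors but does not always
cover the driver side):

./spark-submit --master local --class test.Main \
  --jars /home/user/download/jar/ojdbc7.jar \
  --driver-class-path /home/user/download/jar/ojdbc7.jar \
  /test/target/spark16-0.0.1-SNAPSHOT.jar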

On Mon, Dec 21, 2015 at 9:15 PM, Madabhattula Rajesh Kumar <
mrajaf...@gmail.com> wrote:

> Hi,
>
> How do I add dependent jars to the spark-submit command (for example, the
> Oracle JDBC jar)? Could you please help me resolve this issue.
>
> I have a standalone cluster. One Master and One slave.
>
> I have used the command below, but it is not working:
>
> ./spark-submit --master local  --class test.Main
> /test/target/spark16-0.0.1-SNAPSHOT.jar --jars
> /home/user/download/jar/ojdbc7.jar
>
> *I'm getting the exception below:*
>
> Exception in thread "main" java.sql.SQLException: No suitable driver found
> for jdbc:oracle:thin:@:1521:xxx
> at java.sql.DriverManager.getConnection(DriverManager.java:596)
> at java.sql.DriverManager.getConnection(DriverManager.java:187)
> at
> org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$$anonfun$getConnector$1.apply(JDBCRDD.scala:188)
> at
> org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$$anonfun$getConnector$1.apply(JDBCRDD.scala:181)
> at
> org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:121)
> at
> org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation.<init>(JDBCRelation.scala:91)
> at
> org.apache.spark.sql.execution.datasources.jdbc.DefaultSource.createRelation(DefaultSource.scala:60)
> at
> org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:125)
> at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:114)
> at com.cisco.ss.etl.utils.ETLHelper$class.getData(ETLHelper.scala:22)
> at com.cisco.ss.etl.Main$.getData(Main.scala:9)
> at com.cisco.ss.etl.Main$delayedInit$body.apply(Main.scala:13)
> at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
> at
> scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
> at scala.App$$anonfun$main$1.apply(App.scala:71)
> at scala.App$$anonfun$main$1.apply(App.scala:71)
> at scala.collection.immutable.List.foreach(List.scala:318)
> at
> scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:32)
> at scala.App$class.main(App.scala:71)
> at com.cisco.ss.etl.Main$.main(Main.scala:9)
> at com.cisco.ss.etl.Main.main(Main.scala)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at
> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
> at
> org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
> at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
> at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
> at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>
> Regards,
> Rajesh
>



-- 
Best Regards

Jeff Zhang


Re: spark-submit for dependent jars

2015-12-21 Thread satish chandra j
Hi Rajesh,
Could you please try your command as shown below:

./spark-submit --master local --class <main-class> --jars <dependent-jars>
<application-jar>
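
With the jar paths from your earlier mail, that would be, for example:

./spark-submit --master local --class test.Main --jars /home/user/download/jar/ojdbc7.jar /test/target/spark16-0.0.1-SNAPSHOT.jar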

Regards,
Satish Chandra

On Mon, Dec 21, 2015 at 6:45 PM, Madabhattula Rajesh Kumar <
mrajaf...@gmail.com> wrote:

> Hi,
>
> How do I add dependent jars to the spark-submit command (for example, the
> Oracle JDBC jar)? Could you please help me resolve this issue.
>
> I have a standalone cluster. One Master and One slave.
>
> I have used the command below, but it is not working:
>
> ./spark-submit --master local  --class test.Main
> /test/target/spark16-0.0.1-SNAPSHOT.jar --jars
> /home/user/download/jar/ojdbc7.jar
>
> *I'm getting the exception below:*
>
> Exception in thread "main" java.sql.SQLException: No suitable driver found
> for jdbc:oracle:thin:@:1521:xxx
> at java.sql.DriverManager.getConnection(DriverManager.java:596)
> at java.sql.DriverManager.getConnection(DriverManager.java:187)
> at
> org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$$anonfun$getConnector$1.apply(JDBCRDD.scala:188)
> at
> org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$$anonfun$getConnector$1.apply(JDBCRDD.scala:181)
> at
> org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:121)
> at
> org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation.<init>(JDBCRelation.scala:91)
> at
> org.apache.spark.sql.execution.datasources.jdbc.DefaultSource.createRelation(DefaultSource.scala:60)
> at
> org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:125)
> at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:114)
> at com.cisco.ss.etl.utils.ETLHelper$class.getData(ETLHelper.scala:22)
> at com.cisco.ss.etl.Main$.getData(Main.scala:9)
> at com.cisco.ss.etl.Main$delayedInit$body.apply(Main.scala:13)
> at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
> at
> scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
> at scala.App$$anonfun$main$1.apply(App.scala:71)
> at scala.App$$anonfun$main$1.apply(App.scala:71)
> at scala.collection.immutable.List.foreach(List.scala:318)
> at
> scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:32)
> at scala.App$class.main(App.scala:71)
> at com.cisco.ss.etl.Main$.main(Main.scala:9)
> at com.cisco.ss.etl.Main.main(Main.scala)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at
> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
> at
> org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
> at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
> at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
> at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>
> Regards,
> Rajesh
>


Re: spark-submit for dependent jars

2015-12-21 Thread Jeff Zhang
Please make sure this is the correct jdbc url (note that the host portion
before the port appears to be empty):
jdbc:oracle:thin:@:1521:xxx
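
For comparison, a well-formed Oracle thin-driver URL follows the pattern
jdbc:oracle:thin:@<host>:<port>:<sid>, e.g. (host and SID below are made up):

jdbc:oracle:thin:@dbhost.example.com:1521:ORCL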



On Mon, Dec 21, 2015 at 9:54 PM, Madabhattula Rajesh Kumar <
mrajaf...@gmail.com> wrote:

> Hi Jeff and Satish,
>
> I have modified the script and executed it. Please find the command below:
>
> ./spark-submit --master local  --class test.Main --jars
> /home/user/download/jar/ojdbc7.jar
> /home//test/target/spark16-0.0.1-SNAPSHOT.jar
>
> I'm still getting the same exception.
>
>
>
> Exception in thread "main" java.sql.SQLException: No suitable driver found
> for jdbc:oracle:thin:@:1521:xxx
> at java.sql.DriverManager.getConnection(DriverManager.java:596)
> at java.sql.DriverManager.getConnection(DriverManager.java:187)
> at
> org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$$anonfun$getConnector$1.apply(JDBCRDD.scala:188)
> at
> org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$$anonfun$getConnector$1.apply(JDBCRDD.scala:181)
> at
> org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:121)
> at
> org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation.<init>(JDBCRelation.scala:91)
> at
> org.apache.spark.sql.execution.datasources.jdbc.DefaultSource.createRelation(DefaultSource.scala:60)
> at
> org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:125)
> at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:114)
> at com.cisco.ss.etl.utils.ETLHelper$class.getData(ETLHelper.scala:22)
> at com.cisco.ss.etl.Main$.getData(Main.scala:9)
> at com.cisco.ss.etl.Main$delayedInit$body.apply(Main.scala:13)
> at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
> at
> scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
> at scala.App$$anonfun$main$1.apply(App.scala:71)
> at scala.App$$anonfun$main$1.apply(App.scala:71)
> at scala.collection.immutable.List.foreach(List.scala:318)
> at
> scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:32)
> at scala.App$class.main(App.scala:71)
> at com.cisco.ss.etl.Main$.main(Main.scala:9)
> at com.cisco.ss.etl.Main.main(Main.scala)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at
> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
> at
> org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
> at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
> at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
> at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>
>
> Regards,
> Rajesh
>
> On Mon, Dec 21, 2015 at 7:18 PM, satish chandra j <
> jsatishchan...@gmail.com> wrote:
>
>> Hi Rajesh,
>> Could you please try your command as shown below:
>>
>> ./spark-submit --master local --class <main-class> --jars <dependent-jars>
>> <application-jar>
>>
>> Regards,
>> Satish Chandra
>>
>> On Mon, Dec 21, 2015 at 6:45 PM, Madabhattula Rajesh Kumar <
>> mrajaf...@gmail.com> wrote:
>>
>>> Hi,
>>>
>>> How do I add dependent jars to the spark-submit command (for example, the
>>> Oracle JDBC jar)? Could you please help me resolve this issue.
>>>
>>> I have a standalone cluster. One Master and One slave.
>>>
>>> I have used the command below, but it is not working:
>>>
>>> ./spark-submit --master local  --class test.Main
>>> /test/target/spark16-0.0.1-SNAPSHOT.jar --jars
>>> /home/user/download/jar/ojdbc7.jar
>>>
>>> *I'm getting the exception below:*
>>>
>>> Exception in thread "main" java.sql.SQLException: No suitable driver
>>> found for jdbc:oracle:thin:@:1521:xxx
>>> at java.sql.DriverManager.getConnection(DriverManager.java:596)
>>> at java.sql.DriverManager.getConnection(DriverManager.java:187)
>>> at
>>> org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$$anonfun$getConnector$1.apply(JDBCRDD.scala:188)
>>> at
>>> org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$$anonfun$getConnector$1.apply(JDBCRDD.scala:181)
>>> at
>>> org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:121)
>>> at
>>> org.apache.spark.sql.execution.datasources.jdbc.JDBCRelat

spark-submit for dependent jars

2015-12-21 Thread Madabhattula Rajesh Kumar
Hi,

How do I add dependent jars to the spark-submit command (for example, the
Oracle JDBC jar)? Could you please help me resolve this issue.

I have a standalone cluster. One Master and One slave.

I have used the command below, but it is not working:

./spark-submit --master local  --class test.Main
/test/target/spark16-0.0.1-SNAPSHOT.jar --jars
/home/user/download/jar/ojdbc7.jar

*I'm getting the exception below:*

Exception in thread "main" java.sql.SQLException: No suitable driver found
for jdbc:oracle:thin:@:1521:xxx
at java.sql.DriverManager.getConnection(DriverManager.java:596)
at java.sql.DriverManager.getConnection(DriverManager.java:187)
at
org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$$anonfun$getConnector$1.apply(JDBCRDD.scala:188)
at
org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$$anonfun$getConnector$1.apply(JDBCRDD.scala:181)
at
org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:121)
at
org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation.<init>(JDBCRelation.scala:91)
at
org.apache.spark.sql.execution.datasources.jdbc.DefaultSource.createRelation(DefaultSource.scala:60)
at
org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:125)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:114)
at com.cisco.ss.etl.utils.ETLHelper$class.getData(ETLHelper.scala:22)
at com.cisco.ss.etl.Main$.getData(Main.scala:9)
at com.cisco.ss.etl.Main$delayedInit$body.apply(Main.scala:13)
at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
at
scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$$anonfun$main$1.apply(App.scala:71)
at scala.App$$anonfun$main$1.apply(App.scala:71)
at scala.collection.immutable.List.foreach(List.scala:318)
at
scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:32)
at scala.App$class.main(App.scala:71)
at com.cisco.ss.etl.Main$.main(Main.scala:9)
at com.cisco.ss.etl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
at
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Regards,
Rajesh