Hi Mich,
Have you set SPARK_CLASSPATH in spark-env.sh?
Thanks,
Divya
On 27 December 2016 at 17:33, Mich Talebzadeh wrote:
, December 27, 2016 2:59 PM
To: Mich Talebzadeh; Deepak Sharma
Cc: user@spark
Subject: Re: Location for the additional jar files in Spark
Hi all,
I have the same problem with spark 2.0.2.
Best regards,
On Tue, Dec 27, 2016, 9:40 AM Mich Talebzadeh
wrote:
Thanks Deepak
but get the same error unfortunately
ADD_JARS="/home/hduser/jars/ojdbc6.jar" spark-shell
Spark context Web UI available at http://50.140.197.217:4041
Spark context available as 'sc' (master = local[*], app id =
local-1482842478988).
Spark session available as 'spark'.
How about this:
ADD_JARS="/home/hduser/jars/ojdbc6.jar" spark-shell
Thanks
Deepak
On Tue, Dec 27, 2016 at 5:04 PM, Mich Talebzadeh
wrote:
Ok I tried this but no luck
spark-shell --jars /home/hduser/jars/ojdbc6.jar
Spark context Web UI available at http://50.140.197.217:4041
Spark context available as 'sc' (master = local[*], app id =
local-1482838526271).
Spark session available as 'spark'.
I meant ADD_JARS, as you said --jars is not working for you with spark-shell.
Thanks
Deepak
On Tue, Dec 27, 2016 at 4:51 PM, Mich Talebzadeh
wrote:
Ok just to be clear do you mean
ADD_JARS="~/jars/ojdbc6.jar" spark-shell
or
spark-shell --jars $ADD_JARS
Thanks
Dr Mich Talebzadeh
LinkedIn: https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
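The two spellings in the question above behave quite differently at the shell level: `ADD_JARS="…" spark-shell` sets an environment variable for that single invocation only, while `--jars` is an argument parsed by spark-shell itself. (ADD_JARS was a convenience of older spark-shell releases and, as far as I can tell, is no longer honoured by Spark 2, which would explain the failures reported elsewhere in the thread.) A minimal, Spark-free sketch of the env-var form, using a stand-in command instead of spark-shell:

```shell
# `VAR=value cmd` exports VAR into cmd's environment only; the parent
# shell never sees it.
ADD_JARS="/home/hduser/jars/ojdbc6.jar" sh -c 'echo "child sees: $ADD_JARS"'
# → child sees: /home/hduser/jars/ojdbc6.jar
echo "parent sees: [$ADD_JARS]"
# → parent sees: []   (assuming ADD_JARS is not already exported)
```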
It works for me with spark 1.6 (--jars)
Please try this:
ADD_JARS="<>" spark-shell
Thanks
Deepak
On Tue, Dec 27, 2016 at 3:49 PM, Mich Talebzadeh
wrote:
Thanks.
The problem is that with spark-shell, --jars does not work! This is Spark 2
accessing Oracle 12c.
spark-shell --jars /home/hduser/jars/ojdbc6.jar
It comes back with
java.sql.SQLException: No suitable driver
unfortunately
and spark-shell uses spark-submit under the bonnet if you look at
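The `No suitable driver` message generally means java.sql.DriverManager could not match the JDBC URL to a driver registered on the driver JVM's own classpath, which `--jars` alone does not always guarantee in local mode. A sketch of a combined invocation commonly suggested for this case (same jar path as above; not verified against this particular cluster):

```shell
# Put ojdbc6.jar both on the driver's own classpath and in the list of
# jars distributed to executors.
spark-shell \
  --jars /home/hduser/jars/ojdbc6.jar \
  --driver-class-path /home/hduser/jars/ojdbc6.jar
```

Inside the shell, naming the driver class explicitly, e.g. `.option("driver", "oracle.jdbc.OracleDriver")` on the JDBC reader, also sidesteps DriverManager's auto-detection; treat that class name as an assumption to check against the ojdbc6.jar actually in use.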
Hi Mich
You can copy the jar to a shared location and use the --jars command line
argument of spark-submit.
Whoever needs access to this jar can refer to the shared path and
access it using the --jars argument.
Thanks
Deepak
On Tue, Dec 27, 2016 at 3:03 PM, Mich Talebzadeh
I take it you don't want to use the --jars option, to avoid moving them every
time?
On Tue, 27 Dec 2016, 10:33 Mich Talebzadeh wrote:
When one runs in Local mode (one JVM) on an edge host (the host the user
accesses the cluster from), it is possible to put an additional jar file, say
for accessing Oracle RDBMS tables, in $SPARK_CLASSPATH. This works:
export SPARK_CLASSPATH=~/user_jars/ojdbc6.jar
Normally a group of users can have read access to
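Worth noting: SPARK_CLASSPATH has been reported as deprecated since the Spark 1.x line, with `spark.driver.extraClassPath` and `spark.executor.extraClassPath` as the suggested replacements. A persistent, per-edge-node alternative to the export above might look like this in conf/spark-defaults.conf (same example path as in this thread, not a universal default):

```
# conf/spark-defaults.conf
spark.driver.extraClassPath    /home/hduser/jars/ojdbc6.jar
spark.executor.extraClassPath  /home/hduser/jars/ojdbc6.jar
```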