Hi,
The suggestion in the thread below, making yarn-client mode work by adding
the JDBC driver JAR to spark.{driver,executor}.extraClassPath, works fine.
http://mail-archives.us.apache.org/mod_mbox/spark-user/201504.mbox/%3CCAAOnQ7vHeBwDU2_EYeMuQLyVZ77+N_jDGuinxOB=sff2lkc...@mail.gmail.com%3E
No, those have to be local paths.
On Thu, Apr 23, 2015 at 6:53 PM, Night Wolf wrote:
> Thanks Marcelo, can this be a path on HDFS?
Thanks Marcelo, can this be a path on HDFS?
On Fri, Apr 24, 2015 at 11:52 AM, Marcelo Vanzin wrote:
> You'd have to use spark.{driver,executor}.extraClassPath to modify the
> system class loader. But that also means you have to manually
> distribute the jar to the nodes in your cluster, into a common
> location.
You'd have to use spark.{driver,executor}.extraClassPath to modify the
system class loader. But that also means you have to manually
distribute the jar to the nodes in your cluster, into a common
location.
On Thu, Apr 23, 2015 at 6:38 PM, Night Wolf wrote:
> Hi guys,
>
> Having a problem building a DataFrame in Spark SQL from a JDBC data source
> when running with --master yarn-client and adding the JDBC driver JAR with
> --jars. If I run with a local[*] master all works fine.
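In concrete terms, Marcelo's suggestion might look like the invocation below. This is a hedged sketch, not the poster's actual command: the JAR path is made up, and the JAR would first have to be copied to that same local path on the driver machine and on every node in the cluster.

```shell
# Hypothetical sketch: /opt/jdbc/mysql-jdbc.jar must already exist at this
# local filesystem path on the driver machine and on every YARN node
# (HDFS paths are not accepted here, as noted above).
./bin/spark-shell --master yarn-client \
  --driver-class-path /opt/jdbc/mysql-jdbc.jar \
  --conf spark.executor.extraClassPath=/opt/jdbc/mysql-jdbc.jar
```

(--driver-class-path is spark-submit's command-line equivalent of spark.driver.extraClassPath; in client mode it is applied before the driver JVM starts, which is what makes the driver's system class loader pick it up.)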
Hi guys,
Having a problem building a DataFrame in Spark SQL from a JDBC data source
when running with --master yarn-client and adding the JDBC driver JAR with
--jars. If I run with a local[*] master all works fine.
./bin/spark-shell --jars /tmp/libs/mysql-jdbc.jar --master yarn-client
sqlContext.lo
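For context on the failure mode: in yarn-client mode, --jars makes the JAR available to the application class loader on the executors, but JDBC's DriverManager typically only sees drivers on the system class loader, which is why this invocation fails until extraClassPath is set. The session above is truncated, but it was presumably loading the JDBC source with the Spark 1.3 data-source API; a hedged sketch, with the URL, table, and driver class all made up:

```shell
# Launch as in the original message (same JAR path as above):
./bin/spark-shell --jars /tmp/libs/mysql-jdbc.jar --master yarn-client

# Then, at the Scala prompt (Spark 1.3 API; connection details are
# hypothetical):
#   val df = sqlContext.load("jdbc", Map(
#     "url"     -> "jdbc:mysql://dbhost:3306/mydb",
#     "dbtable" -> "mytable",
#     "driver"  -> "com.mysql.jdbc.Driver"))
#   df.count()
```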