On Mon, Dec 29, 2014 at 7:39 PM, Jeremy Freeman
<freeman.jer...@gmail.com> wrote:
> Hi Stephen, it should be enough to include
>
>> --jars /path/to/file.jar
>
> in the command line call to either pyspark or spark-submit, as in
>
>> spark-submit --master local --jars /path/to/file.jar myfile.py

Unfortunately, you also need '--driver-class-path /path/to/file.jar'
to make it accessible in the driver. (This may be fixed in 1.3.)
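
For example, combining the two flags (same local master and placeholder
paths as in the command above):

  spark-submit --master local \
    --jars /path/to/file.jar \
    --driver-class-path /path/to/file.jar \
    myfile.py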

> and you can check the bottom of the Web UI’s “Environment” tab to make sure
> the jar gets on your classpath. Let me know if you still see errors related
> to this.
>
> — Jeremy
>
> -------------------------
> jeremyfreeman.net
> @thefreemanlab
>
> On Dec 29, 2014, at 7:55 PM, Stephen Boesch <java...@gmail.com> wrote:
>
>> What is the recommended way to do this?  We have some native database
>> client libraries for which we are adding pyspark bindings.
>>
>> The pyspark script invokes spark-submit.   Do we add our libraries to
>> the SPARK_SUBMIT_LIBRARY_PATH?
>>
>> This issue relates back to an error we have been seeing, "Py4JError: Trying
>> to call a package" - the suspicion being that the third-party libraries may
>> not be available on the JVM side.
>
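
Regarding the "Py4JError: Trying to call a package" error mentioned above: a
quick sanity check from the driver is to try resolving one of the jar's
classes through the py4j gateway. The sketch below uses a hypothetical class
name, com.example.NativeClient, as a stand-in for the actual library class,
and assumes the script is launched with the --jars / --driver-class-path
flags shown earlier.

  from pyspark import SparkConf, SparkContext

  # Assumes this script is submitted via spark-submit with both
  # --jars and --driver-class-path pointing at the library jar.
  conf = SparkConf().setMaster("local").setAppName("jar-classpath-check")
  sc = SparkContext(conf=conf)

  try:
      # com.example.NativeClient is a hypothetical stand-in for a class
      # shipped in the jar (assumed to have a no-arg constructor). If the
      # jar is missing from the driver classpath, py4j cannot resolve the
      # name as a class and raises "Py4JError: Trying to call a package".
      client = sc._jvm.com.example.NativeClient()
      print("driver JVM can see the library classes")
  except Exception as e:
      print("library classes not visible to the driver JVM: %s" % e)
  finally:
      sc.stop()

If the except branch fires even with --jars set, that points at the missing
--driver-class-path flag rather than a problem in the bindings themselves.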
