Try this using the environment variable

SPARK_CLASSPATH

in $SPARK_HOME/conf

cp spark-env.sh.template spark-env.sh

Then edit that file and set

export SPARK_CLASSPATH=<full path to the jar file>

Start spark-shell and see if it finds it.
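
For example (a sketch only -- the jar path /usr/local/spark/lib/spark-csv_2.11-1.4.0.jar
is an assumption, point it at wherever the jar actually lives):

export SPARK_CLASSPATH=/usr/local/spark/lib/spark-csv_2.11-1.4.0.jar

Then a quick check from spark-shell that the classes are visible (the jar's
own dependencies, e.g. commons-csv, may also need to be on the classpath):

scala> import com.databricks.spark.csv._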


HTH



Dr Mich Talebzadeh



LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com



On 21 April 2016 at 15:25, Marco Mistroni <mmistr...@gmail.com> wrote:

> Thanks Mich, but I seem to remember modifying a config file so that I don't
> need to specify the --packages option every time I start the shell.
> Kr
> On 21 Apr 2016 3:20 pm, "Mich Talebzadeh" <mich.talebza...@gmail.com>
> wrote:
>
>> For spark-shell this will work:
>>
>> $SPARK_HOME/bin/spark-shell --packages com.databricks:spark-csv_2.11:1.3.0
>>
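>> For example (a sketch -- the CSV path /tmp/sample.csv is made up), once the
>> shell is up, the package and its dependencies are resolved from Maven and
>> can be used straight away:
>>
>> scala> val df = sqlContext.read.format("com.databricks.spark.csv").option("header", "true").load("/tmp/sample.csv")
>>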
>> HTH
>>
>> Dr Mich Talebzadeh
>>
>>
>>
>> LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>>
>>
>>
>> http://talebzadehmich.wordpress.com
>>
>>
>>
>> On 21 April 2016 at 15:13, Marco Mistroni <mmistr...@gmail.com> wrote:
>>
>>> Hi all,
>>> I need to use spark-csv in my Spark instance, and I want to avoid
>>> launching spark-shell by passing the package name every time.
>>> I seem to remember that I need to amend a file in the /conf directory to
>>> include e.g.
>>> spark.packages  com.databricks:spark-csv_2.11:1.4.0 ....
>>>
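>>> i.e. something along these lines in conf/spark-defaults.conf (a sketch
>>> from memory -- the exact property name may differ; on recent versions I
>>> believe it is spark.jars.packages):
>>>
>>> spark.jars.packages  com.databricks:spark-csv_2.11:1.4.0
>>>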
>>> but I cannot find any docs telling me which config file I have to
>>> modify.
>>>
>>> Can anyone assist?
>>> Kr
>>>  Marco
>>>
>>
>>
