export PYSPARK_PYTHON=python3
export PYSPARK_DRIVER_PYTHON=python3

IPYTHON_OPTS=notebook $SPARK_ROOT/bin/pyspark $extraPkgs --conf
spark.cassandra.connection.host=ec2-54-153-102-232.us-west-1.compute.amazonaws.com $*

From: Russell Jurney <russell.jur...@gmail.com>
Date: Sunday, March 27, 2016 at 7:22 PM
To: "user @spark" <user@spark.apache.org>
Subject: --packages configuration equivalent item name?
I run PySpark with CSV support like so:

IPYTHON=1 pyspark --packages com.databricks:spark-csv_2.10:1.4.0

I don't want to type this --packages argument each time. Is there a config
item for --packages? I can't find one in the reference at
http://spark.apache.org/docs/latest/configuration.html
If
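(For what it's worth: Spark does ship a property that mirrors --packages,
spark.jars.packages, which takes the same comma-separated Maven coordinates
and can be set in conf/spark-defaults.conf. Whether your Spark version honors
it may vary, so treat this as a sketch rather than a confirmed answer:)

```
# conf/spark-defaults.conf
# spark.jars.packages is the config-file counterpart of the --packages flag;
# it accepts the same groupId:artifactId:version Maven coordinates.
spark.jars.packages    com.databricks:spark-csv_2.10:1.4.0
```

With that line in spark-defaults.conf, launching plain `pyspark` should pull
in spark-csv without repeating the --packages argument on every invocation.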