Thanks moon,

I set spark.files in SPARK_HOME/conf/spark-defaults.conf

and when I run the spark/bin/pyspark shell it finds and adds those files, but
when I run spark/bin/spark-submit it doesn't add them. spark-submit does
read spark-defaults.conf (it does find the spark.master entry), but for some
reason it ignores the spark.files directive. Very strange, since the pyspark
shell loads the files properly.
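
For reference, the relevant entries in my SPARK_HOME/conf/spark-defaults.conf
look roughly like this (the master URL below is just a placeholder; the file
paths are the same ones I previously passed via ZEPPELIN_JAVA_OPTS):

    # spark-defaults.conf (sketch; master URL is illustrative)
    spark.master  spark://master-host:7077
    spark.files   /home/hduser/lib/sparklib.zip,/home/hduser/lib/service.cfg,/home/hduser/lib/helper.py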



On Tue, Sep 1, 2015 at 11:25 PM, moon soo Lee <m...@apache.org> wrote:

> Hi,
>
> I think the changes come from
> https://github.com/apache/incubator-zeppelin/pull/244.
>
> https://github.com/apache/incubator-zeppelin/pull/270 is not yet merged,
> but I suggest trying it. It uses spark-submit if you have SPARK_HOME
> defined. You'll just need to define your spark.files in
> SPARK_HOME/conf/spark-defaults.conf, without adding them to
> ZEPPELIN_JAVA_OPTS.
>
> Thanks,
> moon
>
>
> On Tue, Sep 1, 2015 at 10:52 PM Axel Dahl <a...@whisperstream.com> wrote:
>
>> I downloaded and compiled the latest Zeppelin.
>>
>> In my conf/zeppelin-env.sh file I have the following line:
>>
>> export
>> ZEPPELIN_JAVA_OPTS="-Dspark.files=/home/hduser/lib/sparklib.zip,/home/hduser/lib/service.cfg,/home/hduser/lib/helper.py"
>>
>> This used to work, but when I inspect the folder using
>> SparkFiles.getRootDirectory(), it doesn't show any of the files in the
>> folder.
>>
>> I have checked that all the files are accessible at the specified paths.
>> There's nothing in the logs to indicate that "ZEPPELIN_JAVA_OPTS" was
>> read, but other entries (e.g. SPARK_HOME) are being read.
>>
>> Did this change from previous versions?
>>
>> -Axel
>>
>>
>>