Hi devs,

I need your help verifying one mode of SparkInterpreter. I saw the message
below on the Spark website about yarn mode:

```
To make Spark runtime jars accessible from YARN side, you can specify
spark.yarn.archive or spark.yarn.jars. For details please refer to Spark
Properties
<http://spark.apache.org/docs/latest/running-on-yarn.html#spark-properties>.
If neither spark.yarn.archive nor spark.yarn.jars is specified, Spark will
create a zip file with all jars under $SPARK_HOME/jars and upload it to the
distributed cache.
```

It means that if you use the internal Spark, you cannot run yarn mode on the
current master. Can anyone test it and let me know the result?
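For anyone testing, a minimal sketch of the workaround the Spark docs describe: pre-upload the Spark runtime jars to HDFS and set spark.yarn.jars so YARN can find them without a full local Spark distribution. The HDFS path here is an assumption for illustration, not something from this thread.

```shell
# Assumed HDFS path; adjust for your cluster.
hdfs dfs -mkdir -p /spark/jars
hdfs dfs -put "$SPARK_HOME"/jars/*.jar /spark/jars/

# Point Spark at the uploaded jars (spark-defaults.conf):
echo "spark.yarn.jars hdfs:///spark/jars/*.jar" >> "$SPARK_HOME"/conf/spark-defaults.conf
```

With spark.yarn.jars set, Spark skips zipping and uploading $SPARK_HOME/jars on every submit, which is also a useful check that the property is being picked up.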

Thanks in advance,
Jongyoul

-- 
이종열, Jongyoul Lee, 李宗烈
http://madeng.net
