Hi guys,
Thanks for responding.
Re SPARK_CLASSPATH (Daoyuan): I think you are right. We tried it, and the
warning we got said exactly that.
Re SparkConf (Daoyuan): We need the custom jar in the driver code itself, so I
don’t see how that would work.
Re EMR -u (Sonal): The documentation says that
Hi Gerhard,
I just stumbled upon some documentation on EMR (link below). It seems there is
a -u option to add jars in S3 to your classpath; have you tried that?
http://docs.aws.amazon.com/ElasticMapReduce/latest/DeveloperGuide/emr-spark-configure.html
Best Regards,
Sonal
Founder, Nube
Hi Gerhard,
How does EMR set its conf for Spark? I think if you set both SPARK_CLASSPATH and
spark.driver.extraClassPath, Spark will ignore SPARK_CLASSPATH.
I think you can do this by reading the configuration from SparkConf, then adding
your custom settings to the corresponding key, and using the
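Daoyuan's suggestion above, appending to the value EMR already set rather than overwriting it, can be sketched in plain Python. This is only a minimal sketch: the helper name `merge_extra_classpath` and the jar path in the comment are hypothetical, and note that in YARN client mode `spark.driver.extraClassPath` must still reach the driver JVM before it starts (e.g. via spark-submit), so this alone may not solve the original problem.

```python
def merge_extra_classpath(existing, custom_jar, sep=":"):
    """Append custom_jar to an existing extraClassPath value.

    `existing` is whatever the current spark.driver.extraClassPath
    setting holds (may be None or empty); the separator is ":" on Linux.
    """
    if not existing:
        return custom_jar
    entries = existing.split(sep)
    if custom_jar in entries:
        return existing  # already on the classpath, keep as-is
    return existing + sep + custom_jar

# With pyspark available, usage would look roughly like
# (paths hypothetical):
#   conf = SparkConf()
#   merged = merge_extra_classpath(
#       conf.get("spark.driver.extraClassPath", ""),
#       "/home/hadoop/custom.jar")
#   conf.set("spark.driver.extraClassPath", merged)
```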
We're running Spark 1.6.0 on EMR, in YARN client mode. We run Python code, but
we want to add a custom jar file to the driver's classpath.
When running on a local one-node standalone cluster, we just use
spark.driver.extraClassPath and everything works:
spark-submit --conf