Hi everyone,

My question is specific to running Spark 1.4.1 on emr-4.0.0.

Spark is installed in /usr/lib/spark
its conf folder is symlinked to /etc/spark/conf
spark-shell is at /usr/bin/spark-shell

I noticed that when I run spark-shell it does not read the files in
/etc/spark/conf (e.g. spark-env.sh and the log4j configuration).

To work around this I currently have to add /etc/spark/conf to
SPARK_CLASSPATH before launching the shell:

export SPARK_CLASSPATH=/etc/spark/conf
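
For now I persist that workaround in a login profile script, together with
SPARK_CONF_DIR, which (if I read bin/load-spark-env.sh correctly) is the
variable Spark's launch scripts use to locate spark-env.sh and the conf
directory. The path /etc/profile.d/spark.sh is just where I chose to put
it, not something EMR creates:

# /etc/profile.d/spark.sh -- illustrative location
# Point Spark's launch scripts at the EMR conf directory, so spark-env.sh
# is sourced and log4j.properties ends up on the classpath.
export SPARK_CONF_DIR=/etc/spark/conf
# The workaround from above; note SPARK_CLASSPATH is deprecated in 1.4
# in favour of --driver-class-path / spark.driver.extraClassPath.
export SPARK_CLASSPATH=/etc/spark/conf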

How can I configure Spark on emr-4.0.0 to avoid the manual step of adding
/etc/spark/conf to SPARK_CLASSPATH?
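
One idea I have not verified yet: emr-4.0.0 supports configuration
classifications at cluster creation, so perhaps a spark-defaults
classification could put the conf directory on the driver classpath up
front. An untested sketch (the instance settings are placeholders, and
since EMR already populates spark.driver.extraClassPath this would
override the default value rather than append to it):

# untested: create a cluster whose spark-defaults.conf already puts
# /etc/spark/conf on the driver classpath
aws emr create-cluster \
  --release-label emr-4.0.0 \
  --applications Name=Spark \
  --use-default-roles \
  --instance-type m3.xlarge \
  --instance-count 3 \
  --configurations '[
    {
      "Classification": "spark-defaults",
      "Properties": {
        "spark.driver.extraClassPath": "/etc/spark/conf"
      }
    }
  ]'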

Alex
