Re: Spark SQL configuration
*Sent:* Sunday, October 26, 2014 9:08 PM
*To:* Pagliari, Roberto
*Cc:* u...@spark.incubator.apache.org
*Subject:* Re: Spark SQL configuration

You can set `HADOOP_CONF_DIR=your_hadoop_conf_path` in `conf/spark-env.sh` to enable Spark to:
1. connect to your YARN cluster
2. use HDFS as the default filesystem
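As a minimal sketch of the suggestion above: add an export line to `conf/spark-env.sh` pointing at the directory that holds your cluster's `core-site.xml` and `yarn-site.xml`. The path `/etc/hadoop/conf` below is an assumption for illustration; use whatever path your distribution actually uses.

```shell
# conf/spark-env.sh (sketch)
# Point Spark at the Hadoop client configuration directory so it can
# find the YARN ResourceManager and the default (HDFS) filesystem.
# NOTE: /etc/hadoop/conf is an assumed path -- adjust for your install.
export HADOOP_CONF_DIR=/etc/hadoop/conf
```

Every machine that launches Spark jobs needs this set (or the equivalent in its environment), since Spark reads the Hadoop configuration at startup.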
Spark SQL configuration
I'm a newbie with Spark. After installing it on all the machines I want to use, do I need to tell it about the Hadoop configuration, or will it be able to find it on its own? Thank you,