Which variable is it that you don't understand?
Here's a minimalistic spark-env.sh of mine.
export SPARK_MASTER_IP=192.168.10.28   # address the standalone master binds to
export HADOOP_CONF_DIR=/home/akhld/sigmoid/localcluster/hadoop/conf   # only needed if you talk to HDFS
export HADOOP_HOME=/home/akhld/sigmoid/localcluster/hadoop/
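
With that file in conf/, a hedged sketch of bringing the cluster up (7077 is the
standalone default port; on older 1.x releases start-slave.sh also expects a
worker number as the first argument, e.g. ./sbin/start-slave.sh 1 <master-url>):

# on the master machine (conf/spark-env.sh is sourced automatically)
./sbin/start-master.sh
# on each worker machine, pointing at the master's URL
./sbin/start-slave.sh spark://192.168.10.28:7077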
Thanks
Best Regards
Not needed. You can directly follow the Spark standalone installation
instructions. However, you might need sbt to package your application into a jar.
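
To illustrate the sbt step, a hedged sketch (the class name and jar path below
are placeholders; the actual ones depend on your build.sbt):

# package your sources into a jar (run from the project root)
sbt package
# hand the jar to your standalone master; adjust class and path to your build
./bin/spark-submit --master spark://192.168.10.28:7077 \
  --class your.main.Class target/scala-2.10/yourapp_2.10-1.0.jar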
On Fri, Jan 23, 2015 at 11:54 AM, riginos wrote:
> Do I need to manually install and configure Hadoop before doing anything
> with Spark standalone?
You can do ./sbin/start-slave.sh spark://IP:PORT. I believe you're missing the
master URL (start-slave.sh takes it as an argument; there is no --master flag on
that script). It's a good idea to pass exactly the Spark master's endpoint as
shown on your UI at http://localhost:8080. That should do it. If it's still not
working, you can look at the Worker logs.
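
For completeness, a hedged sketch (spark://IP:PORT is whatever your master UI
reports; the log filename pattern is the usual SPARK_HOME/logs default, so
adjust if you've changed SPARK_LOG_DIR):

# start the worker against the exact URL shown on the master UI
./sbin/start-slave.sh spark://IP:PORT
# if the worker still doesn't appear on the UI, inspect its log
tail -n 50 logs/spark-*-org.apache.spark.deploy.worker.Worker-*.out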