The Spark cluster has no memory allocated.
Memory: 0.0 B Total, 0.0 B Used
From: Surendran Duraisamy 2013ht12...@wilp.bits-pilani.ac.in
To: user@spark.apache.org
Sent: Sunday, February 22, 2015 6:00 AM
Subject: Running Example Spark Program
Hello All,
I am new to Apache Spark, I am trying to run JavaKMeans.java from Spark
Examples in my Ubuntu System.
I downloaded spark-1.2.1-bin-hadoop2.4.tgz from
http://www.apache.org/dyn/closer.cgi/spark/spark-1.2.1/spark-1.2.1-bin-hadoop2.4.tgz
and started sbin/start-master.sh
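A side note on the "0.0 B Total" memory reading: sbin/start-master.sh starts only the master process, and until a worker registers, the master UI reports no memory. One way to attach a worker, assuming the default master URL spark://localhost:7077 (shown at the top of the master's web UI):

```shell
# Start the standalone master (web UI on port 8080 by default).
./sbin/start-master.sh

# Attach a worker to the master. spark://localhost:7077 is the
# default standalone master URL, used here as an example.
./bin/spark-class org.apache.spark.deploy.worker.Worker spark://localhost:7077
```

Once the worker registers, the Memory line on the UI should show the worker's resources instead of 0.0 B.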
After starting
If you would like a more detailed walkthrough, I wrote one recently:
https://dataissexy.wordpress.com/2015/02/03/apache-spark-standalone-clusters-bigdata-hadoop-spark/
Regards
Jason Bell
On 22 Feb 2015 14:16, VISHNU SUBRAMANIAN johnfedrickena...@gmail.com
wrote:
Try restarting your Spark cluster:
./sbin/stop-all.sh
./sbin/start-all.sh
Thanks,
Vishnu
On Sun, Feb 22, 2015 at 7:30 PM, Surendran Duraisamy
2013ht12...@wilp.bits-pilani.ac.in wrote:
Hello All,
I am new to Apache Spark, I am trying to run JavaKMeans.java from Spark
Examples in my Ubuntu System.
Thank you, Jason.
Got the program working after setting SPARK_WORKER_CORES and SPARK_WORKER_MEMORY.
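These are typically exported in conf/spark-env.sh before starting the worker; the values below are only examples to tune for the machine:

```shell
# conf/spark-env.sh -- example values, adjust to your hardware
export SPARK_WORKER_CORES=2    # cores each worker offers to applications
export SPARK_WORKER_MEMORY=2g  # memory each worker offers (the UI's "Memory ... Total")
```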
While running the program from Eclipse, I got a strange ClassNotFoundException.
In JavaKMeans.java, ParsePoint is a static inner class, and when running the
program I got a ClassNotFoundException for ParsePoint.
I have
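A common cause of this ClassNotFoundException is that the application classes are never shipped to the executors when the program is launched straight from the IDE. One fix is to submit a packaged jar with spark-submit instead; a sketch, assuming the examples jar bundled with the 1.2.1 binary distribution (jar name, paths, and arguments are illustrative):

```shell
# Submit the bundled examples jar so JavaKMeans and its nested
# ParsePoint class (compiled as JavaKMeans$ParsePoint) are shipped
# to the executors along with the job.
./bin/spark-submit \
  --class org.apache.spark.examples.mllib.JavaKMeans \
  --master spark://localhost:7077 \
  lib/spark-examples-1.2.1-hadoop2.4.0.jar \
  data/mllib/kmeans_data.txt 2 5
```

When launching from Eclipse instead, the equivalent is to build the project into a jar and list it via SparkConf.setJars(...) so the executors can load the inner class.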