Try restarting your Spark cluster:
./sbin/stop-all.sh
./sbin/start-all.sh
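
start-all.sh launches the master plus the workers listed in conf/slaves (a single worker on localhost if that file is absent). If the web UI at http://localhost:8080/ still shows 0 workers and 0 cores after the restart, you can also register a worker by hand against the master URL shown on that page. A rough sketch, assuming the spark://vm:7077 URL from your status output below:

./sbin/start-master.sh
# connect one worker to the running master
./bin/spark-class org.apache.spark.deploy.worker.Worker spark://vm:7077

Once a worker has registered, the UI should report non-zero cores and memory, and the "Initial job has not accepted any resources" error should stop.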

Thanks,
Vishnu

On Sun, Feb 22, 2015 at 7:30 PM, Surendran Duraisamy <
2013ht12...@wilp.bits-pilani.ac.in> wrote:

>  Hello All,
>
> I am new to Apache Spark, and I am trying to run JavaKMeans.java from the
> Spark examples on my Ubuntu system.
>
> I downloaded spark-1.2.1-bin-hadoop2.4.tgz
> <http://www.apache.org/dyn/closer.cgi/spark/spark-1.2.1/spark-1.2.1-bin-hadoop2.4.tgz>
> and started sbin/start-master.sh
>
> After starting Spark, I accessed http://localhost:8080/ to check the
> status of my Spark instance, and it shows the following.
>
>
>    - *URL:* spark://vm:7077
>    - *Workers:* 0
>    - *Cores:* 0 Total, 0 Used
>    - *Memory:* 0.0 B Total, 0.0 B Used
>    - *Applications:* 0 Running, 4 Completed
>    - *Drivers:* 0 Running, 0 Completed
>    - *Status:* ALIVE
>
> The number of cores is 0 and the memory is 0.0 B. I think that because of
> this I am getting the following error when I try to run JavaKMeans.java:
>
> "Initial job has not accepted any resources; check your cluster UI to
> ensure that workers are registered and have sufficient memory"
>
> Am I missing any configuration before running sbin/start-master.sh?
>  Regards,
> Surendran
>
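
For reference, once a worker shows up in the UI, the example is usually launched with spark-submit against the standalone master. This is a rough sketch only; the example class, jar name, and sample data path below are assumptions based on the 1.2.1 binary layout, so adjust them to what is actually in your download:

./bin/spark-submit --class org.apache.spark.examples.mllib.JavaKMeans \
  --master spark://vm:7077 \
  lib/spark-examples-1.2.1-hadoop2.4.0.jar \
  data/mllib/kmeans_data.txt 2 5

The trailing arguments are, if I recall the example's usage string correctly, the input file, the number of clusters, and the number of iterations.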
