Re: Standalone cluster not using multiple workers for single application

2015-11-03 Thread Jeff Jones
With the default configuration SparkTC won’t run on my cluster. The log has: 15/11/03 17:50:13 WARN TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources. With the Spark UI Completed Applications
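That TaskSchedulerImpl warning typically means the application is requesting more cores or executor memory than any registered worker can offer, so no executor is ever granted. A minimal sketch of one common remedy, assuming a hypothetical master URL of spark://master-host:7077 and a hypothetical examples-jar path (both would need to match the actual deployment):

```shell
# Sketch, not the poster's exact command. Explicitly capping executor memory
# and total cores below what each worker advertises is a common way to clear
# the "Initial job has not accepted any resources" warning on standalone mode.
./bin/spark-submit \
  --master spark://master-host:7077 \
  --class org.apache.spark.examples.SparkTC \
  --executor-memory 1g \
  --total-executor-cores 8 \
  lib/spark-examples-1.4.1-hadoop2.6.0.jar
```

The Workers table on the master's web UI shows the memory and cores each worker actually advertises, which is what the requested values must fit within.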

Standalone cluster not using multiple workers for single application

2015-11-02 Thread Jeff Jones
I’ve got a series of applications using a single standalone Spark cluster (v1.4.1). The cluster has 1 master and 4 workers (4 CPUs per worker node). I am using the start-slave.sh script to launch the worker process on each node and I can see the nodes were successfully registered using the
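For context, a sketch of the worker-launch step described above, assuming a hypothetical master URL of spark://master-host:7077 (the exact arguments to start-slave.sh vary across Spark 1.x releases; some also take a worker instance number first):

```shell
# Run on each of the 4 worker nodes to register it with the standalone master.
./sbin/start-slave.sh spark://master-host:7077

# Optionally cap what the worker advertises to the master
# (defaults are all cores and total RAM minus 1 GB):
SPARK_WORKER_CORES=4 SPARK_WORKER_MEMORY=8g \
  ./sbin/start-slave.sh spark://master-host:7077
```

Successful registration shows up both in the worker log and in the Workers table on the master's web UI (port 8080 by default).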

Re: Standalone cluster not using multiple workers for single application

2015-11-02 Thread Jean-Baptiste Onofré
Hi Jeff, it may depend on your application code. To verify your setup and whether you are able to scale across multiple workers, you can try running the SparkTC example, for instance (it should use all workers). Regards JB On 11/02/2015 08:56 PM, Jeff Jones wrote: I’ve got a series of applications
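A sketch of the verification step JB suggests, assuming a hypothetical master URL of spark://master-host:7077; in Spark 1.x the run-example script picks the master up from the MASTER environment variable:

```shell
# Run the SparkTC (transitive closure) example against the standalone cluster.
# If the cluster is healthy, its tasks should spread across all 4 workers.
MASTER=spark://master-host:7077 ./bin/run-example SparkTC
```

The point of the test: if SparkTC occupies all four workers, the deployment is fine and the scaling problem lies in the application itself (e.g. too few partitions, or an explicit cores cap); if SparkTC also sticks to one worker, the issue is in the cluster setup.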