What's in your SparkIsAwesome class? Just make sure that you are giving
enough partitions to Spark to evenly distribute the job throughout the
cluster.
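As a rough illustration of why the partition count matters (a plain-Python sketch, not Spark code): Spark turns each partition into one task, and hash partitioning spreads records across however many partitions you configure, so with too few partitions most workers sit idle.

```python
# Plain-Python sketch (not Spark's implementation): distribute records
# across partitions by hash(key) mod numPartitions, similar in spirit
# to Spark's HashPartitioner.
def partition_counts(keys, num_partitions):
    counts = [0] * num_partitions
    for k in keys:
        counts[hash(k) % num_partitions] += 1
    return counts

keys = [f"record-{i}" for i in range(1000)]

# With 1 partition, everything lands in one bucket: one task, one busy
# worker, and the rest of the cluster idle.
print(partition_counts(keys, 1))  # [1000]

# With 10 partitions, the same records can be spread across executors
# (exact counts vary because Python randomizes string hashing per run).
print(partition_counts(keys, 10))
```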
Try submitting the job this way:
~/spark/bin/spark-submit --executor-cores 10 --executor-memory 5G \
  --driver-memory 5G --class com.example.SparkIsAwesome \
  spark://ec2-54-215-112-121.us-west-1.compute.amazonaws.com:7077

It's the public URL to your master.

Andy

> From: Kyohey Hamaguchi
> Date: Friday, December 4, 2015 at 11:28 AM
> To: "user @spark"
> Subject: Not all workers seem to run in a standalone cluster setup by
> spark-ec2 script
> Hi,
>
> I have set up a Spark standalone cluster, which involves 5 workers,
> using the spark-ec2 script.
>
> After submitting my Spark application, I had noticed that just one
> worker seemed to run the application.

Quick question: Are you processing gzipped files by any chance? It's a
common stumbling block people hit.
See: http://stackoverflow.com/q/27531816/877069
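For context (my reading of the linked answer, not something stated in this thread): gzip is a single compressed stream, so Spark cannot split one .gz file into multiple partitions; the whole file becomes a single task on a single worker. The usual workaround is to store the input as several smaller gzip files, each of which becomes its own partition. A plain-Python sketch of that idea:

```python
import gzip
import os
import tempfile

# Sketch of the workaround for unsplittable gzip input: instead of one
# big .gz file (one partition, one task, one busy worker), write several
# smaller .gz part files so each can be read as its own partition/task.
lines = [f"line-{i}\n" for i in range(100)]
outdir = tempfile.mkdtemp()

num_parts = 4
for p in range(num_parts):
    chunk = lines[p::num_parts]  # round-robin split of the input
    with gzip.open(os.path.join(outdir, f"part-{p:05d}.gz"), "wt") as f:
        f.writelines(chunk)

parts = sorted(os.listdir(outdir))
print(parts)  # four part files, each independently decompressible

# Verify nothing was lost in the split.
recovered = []
for name in parts:
    with gzip.open(os.path.join(outdir, name), "rt") as f:
        recovered.extend(f)
assert sorted(recovered) == sorted(lines)
```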
Nick
On Fri, Dec 4, 2015 at 2:28 PM Kyohey Hamaguchi wrote:
> Hi,
>
> I have set up a Spark standalone cluster, which involves 5 workers,
> using the spark-ec2 script.
Hi,

I have set up a Spark standalone cluster, which involves 5 workers,
using the spark-ec2 script.

After submitting my Spark application, I noticed that just one
worker seemed to run the application and the other 4 workers were doing
nothing. I confirmed this by checking CPU and memory usage on t