import org.apache.spark.{SparkConf, SparkContext}
import com.typesafe.scalalogging.Logger  // assuming scala-logging; adjust if another Logger is in use

object SparkPi {
  private val logger = Logger(this.getClass)

  val sparkConf = new SparkConf()
    .setAppName("Spark Pi")
    .setMaster("spark://10.100.103.192:7077")

  lazy val sc = new SparkContext(sparkConf)

  // Ship the application assembly jar to the executors.
  sc.addJar("/Users/yfang/workspace/mcs/target/scala-2.11/root-assembly-0.1.0.jar")

  def main(args: Array[String]): Unit = {
    val x = 1 to 4
    val a = sc.parallelize(x)
    val mean = a.mean()
    println(mean)
  }
}


spark://10.100.103.192:7077 is a remote standalone cluster I created on
AWS, and I ran the application locally from IntelliJ.
I can see the job being submitted, but the calculation never finishes.
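
Does the conf also need the driver host and the application jar set explicitly
when the driver runs outside the cluster? A rough sketch of what I mean
(LAPTOP_PUBLIC_IP is just a placeholder, not my real address):

  import org.apache.spark.SparkConf

  // Sketch only: with the driver on a laptop and the workers on EC2, the
  // executors have to be able to connect back to the driver.
  // "LAPTOP_PUBLIC_IP" stands in for an address the workers can route to.
  val confSketch = new SparkConf()
    .setAppName("Spark Pi")
    .setMaster("spark://10.100.103.192:7077")
    .set("spark.driver.host", "LAPTOP_PUBLIC_IP")
    .setJars(Seq("/Users/yfang/workspace/mcs/target/scala-2.11/root-assembly-0.1.0.jar"))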

The log shows:
15:34:21.674 [Timer-0] WARN org.apache.spark.scheduler.TaskSchedulerImpl -
Initial job has not accepted any resources; check your cluster UI to ensure
that workers are registered and have sufficient resources

Any help will be highly appreciated!

Thanks!

Yuan

