So I need to reconfigure my SparkContext this way:

    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf()
      .setMaster("local")
      .setAppName("CountingSheep")
      .set("spark.executor.memory", "1g")
      .set("spark.akka.frameSize", "20")  // frame size in MB
    val sc = new SparkContext(conf)
And then start a new cluster with the setup scripts from Spark 1.0.1. Is this the right approach?
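For what it's worth, here is a minimal self-contained sketch (assuming Spark 1.x, where SparkContext.getConf returns a copy of the effective configuration) that reads the settings back after the context starts, to confirm the larger frame size actually took effect. The object name ConfCheck is just for illustration:

    import org.apache.spark.{SparkConf, SparkContext}

    object ConfCheck {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setMaster("local")
          .setAppName("CountingSheep")
          .set("spark.executor.memory", "1g")
          .set("spark.akka.frameSize", "20")  // frame size in MB

        val sc = new SparkContext(conf)

        // Read the settings back from the running context to verify
        // they were applied and not overridden elsewhere.
        println(sc.getConf.get("spark.akka.frameSize"))   // expect: 20
        println(sc.getConf.get("spark.executor.memory"))  // expect: 1g

        sc.stop()
      }
    }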