Thank you for your help, Akhil! We found that connecting to the remote
Spark cluster from our laptop no longer works starting from version 1.2.0
and beyond (v1.1.1 and below are fine), although it still works if the
client also runs on the cluster. Not sure if this is related
Are you submitting your application from a local machine to a remote host?
If you want to run the Spark application from a remote machine, then you have
to at least set the following configurations properly (a minimal sketch follows the list).
- *spark.driver.host* - points to the ip/host from which you are submitting
the job (make sure
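For illustration only, a minimal sketch of such a configuration when submitting from a laptop to a remote standalone cluster could look like the following; the master URL, IP address, and port are assumptions, not values taken from this thread:

import org.apache.spark.{SparkConf, SparkContext}

// All values below are placeholders: use your own laptop IP and cluster master URL.
val conf = new SparkConf()
  .setAppName("RemoteSubmitExample")
  .setMaster("spark://cluster-master:7077")   // remote Spark master (assumed)
  .set("spark.driver.host", "192.168.1.50")   // ip/host of the machine you submit from
  .set("spark.driver.port", "51000")          // fixing the driver port helps if a firewall is in the way
val sc = new SparkContext(conf)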
Hi Akhil,
Thank you for your help. I just found that the problem is related to my
local Spark application: I ran it in IntelliJ and didn't reload the
project after recompiling the jar via Maven. If I don't reload, it
uses some locally cached data to run the application, which leads to
Could you tell me exactly what you did to change the version of Spark?
Can you fire up a spark-shell, run the following line, and see what happens:
sc.parallelize(1 to 1).collect()
Thanks
Best Regards
On Mon, Mar 16, 2015 at 11:13 PM, Eason Hu eas...@gmail.com wrote:
Hi Akhil,
Yes, I did change
Hi Akhil,
sc.parallelize(1 to 1).collect() in the Spark shell on Spark v1.2.0
runs fine. However, if I do the following remotely, it throws an
exception:
val sc : SparkContext = new SparkContext(conf)
val NUM_SAMPLES = 10
val count = sc.parallelize(1 to NUM_SAMPLES).map { i =>
val x
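For context, the truncated snippet appears to be the Pi-estimation example from the Spark website; a self-contained completion of it (my reconstruction, not the exact code from the original message) would be:

import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf().setAppName("PiEstimate")
val sc: SparkContext = new SparkContext(conf)

val NUM_SAMPLES = 10
// Sample random points in the unit square and count how many fall inside the unit circle.
val count = sc.parallelize(1 to NUM_SAMPLES).map { i =>
  val x = math.random
  val y = math.random
  if (x * x + y * y < 1) 1 else 0
}.reduce(_ + _)
println("Pi is roughly " + 4.0 * count / NUM_SAMPLES)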
Hi Akhil,
Yes, I did change both versions on the project and the cluster. Any clues?
Even the sample code from Spark website failed to work.
Thanks,
Eason
On Sun, Mar 15, 2015 at 11:56 PM, Akhil Das ak...@sigmoidanalytics.com
wrote:
Did you change both the versions? The one in your build file of your
project and the spark version of your cluster?
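As an illustration of keeping the two in sync, an sbt dependency for a 1.2.0 cluster could look like the lines below; the exact version and Scala version are assumptions based on the upgrade discussed in this thread:

// build.sbt (sketch): the Spark version here must match the version running on the cluster.
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0"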
Thanks
Best Regards
On Sat, Mar 14, 2015 at 6:47 AM, EH eas...@gmail.com wrote:
Hi all,
I've been using Spark 1.1.0 for a while, and now would like to upgrade to
Spark 1.1.1 or above. However, it throws the following errors:
18:05:31.522 [sparkDriver-akka.actor.default-dispatcher-3] ERROR
TaskSchedulerImpl - Lost executor 37 on hcompute001: remote Akka client disassociated