the cluster)
- *spark.driver.port* - set it to a port number that is accessible from
the Spark cluster.
You can look at more configuration options here:
http://spark.apache.org/docs/latest/configuration.html#networking
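For a driver that runs outside the cluster, those properties can also be set programmatically when the context is created. A minimal sketch, assuming a standalone cluster; the master URL, host address, and port below are placeholders, not values from this thread:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Sketch only: every network value here is a placeholder.
val conf = new SparkConf()
  .setAppName("RemoteDriverTest")
  .setMaster("spark://master-host:7077")   // your cluster's master URL
  .set("spark.driver.host", "10.0.0.5")    // driver address reachable from the cluster
  .set("spark.driver.port", "51000")       // driver port open to the cluster
val sc = new SparkContext(conf)
```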
Thanks
Best Regards
On Fri, Mar 20, 2015 at 4:02 AM, Eason Hu eas...@gmail.com wrote:
That looks like a version incompatibility; can you double-check your versions?
On 18 Mar 2015 06:08, Eason Hu eas...@gmail.com wrote:
Hi Akhil,
sc.parallelize(1 to 1).collect() in the Spark shell on Spark v1.2.0
runs fine. However, if I do the following remotely, it throws an
exception:
val sc
of spark?
Can you fire up a spark-shell, write this line, and see what happens:
sc.parallelize(1 to 1).collect()
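Since the suspicion in this thread is a version mismatch, the shell can also report which Spark version it is running, which you can compare against what your standalone project compiles against. The output line below is an assumption based on the v1.2.0 shell mentioned earlier in the thread:

```scala
scala> sc.version
res0: String = 1.2.0   // should match the spark-core version your project builds against
```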
Thanks
Best Regards
On Mon, Mar 16, 2015 at 11:13 PM, Eason Hu eas...@gmail.com wrote:
Hi Akhil,
Yes, I did change both versions on the project and the cluster. Any clues?
Even the sample code from the Spark website failed to work.
Thanks,
Eason
On Sun, Mar 15, 2015 at 11:56 PM, Akhil Das ak...@sigmoidanalytics.com
wrote:
Did you change both versions? The one in your build
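One common way to keep the project and cluster in sync is to pin the exact Spark version in the build definition. A hypothetical build.sbt fragment, assuming an sbt project and the v1.2.0 cluster mentioned earlier in the thread:

```scala
// Hypothetical build.sbt fragment; "1.2.0" is taken from the shell version
// quoted earlier in this thread. Substitute your cluster's actual version.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0" % "provided"
```

Marking the dependency as "provided" keeps the cluster's own Spark jars authoritative at runtime, which avoids shipping a second, possibly mismatched, copy from the application jar.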