> - *spark.driver.host* - set it to the address of the machine running the
> driver (make sure you are able to ping this from the cluster).
>
> - *spark.driver.port* - set it to a port number that is accessible from
> the Spark cluster.
>
> You can find more configuration options here:
> <http://spark.apache.org/docs/latest/configuration.html#networking>
>
>
> Thanks
>
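As a follow-up to the driver settings quoted above, here is a minimal sketch of setting them programmatically; the host and port values are placeholders, not values from this thread:

    import org.apache.spark.{SparkConf, SparkContext}

    // Pin the driver's host and port so executors on the cluster
    // can open connections back to the driver.
    val conf = new SparkConf()
      .setAppName("driver-networking-sketch")
      .set("spark.driver.host", "10.0.0.5") // placeholder: an address the cluster can ping
      .set("spark.driver.port", "51000")    // placeholder: a port reachable from the cluster
    val sc = new SparkContext(conf)

The same two properties can also be passed to spark-submit with --conf spark.driver.host=... and --conf spark.driver.port=... .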
serialVersionUID = -7366074099953117729
>
> Version incompatibility, can you double check your version?
> On 18 Mar 2015 06:08, "Eason Hu" wrote:
>
>> Hi Akhil,
>>
>> sc.parallelize(1 to 1).collect() in the Spark shell on Spark v1.2.0
>> runs fine. Howeve
on of spark?
>
> Can you fire up a spark-shell and write this line and see what happens:
>
> sc.parallelize(1 to 1).collect()
>
>
> Thanks
> Best Regards
>
> On Mon, Mar 16, 2015 at 11:13 PM, Eason Hu wrote:
>
>> Hi Akhil,
>>
>> Yes, I did change both versions on the project and the cluster. Any clues?
Hi Akhil,
Yes, I did change both versions on the project and the cluster. Any clues?
Even the sample code from the Spark website failed to work.
Thanks,
Eason
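The thread does not say which sample was used; a minimal standalone job in the spirit of the Spark quick start might look like the sketch below (the object name, app name, and master URL are placeholders). If it is compiled against a different spark-core version than the one running on the cluster, even a job this small can fail:

    import org.apache.spark.{SparkConf, SparkContext}

    // Minimal standalone job: parallelize a tiny range and collect it back.
    object SimpleJobSketch {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("simple-job-sketch")
          .setMaster("spark://master-host:7077") // placeholder cluster URL
        val sc = new SparkContext(conf)
        println(sc.parallelize(1 to 1).collect().mkString(","))
        sc.stop()
      }
    }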
On Sun, Mar 15, 2015 at 11:56 PM, Akhil Das
wrote:
> Did you change both the versions? The one in the build file of your
> project and the one on the cluster?
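One way to double-check this is in the project's build definition: the spark-core version declared there should match the Spark version installed on the cluster (1.2.0 was mentioned earlier in the thread). A sketch for an sbt build, with the Scala version as an assumption:

    // build.sbt -- keep the Spark version in sync with the cluster (1.2.0 here)
    scalaVersion := "2.10.4"

    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0" % "provided"

Marking spark-core as "provided" keeps a second, possibly different, copy of Spark out of the application's assembly jar, which is one common source of the serialVersionUID mismatch seen earlier in the thread.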