Hi Ayan,

Thank you for the response. I tried to connect to the same standalone Spark
master through spark-shell and it works as intended.

On the shell, I tried ./spark-shell --master spark://host:7077
The connection was established and it printed an info line on the console:
'Spark context available as 'sc' (master = spark://host:7077)'

But the issue occurs when trying to connect to the same standalone Spark
master through Scala code in IntelliJ.

   val conf = new SparkConf()
     .setAppName("scala spark")
     .setMaster("spark://host:7077")


When setMaster is "local" it just works, but when connecting to the Spark
standalone master (passing the Spark master URL), it reports the above error.
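For reference, here is the snippet above expanded into a minimal self-contained sketch (assuming "host" is a placeholder for the actual master hostname, and that the project's spark-core dependency matches the cluster's build, Spark 2.0.0 / Scala 2.11 here; a mismatch between client and cluster versions is one known cause of InvalidClassException during RPC deserialization):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object ScalaSpark {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("scala spark")
      .setMaster("spark://host:7077") // standalone master URL, no http://
    val sc = new SparkContext(conf)

    // Trivial job to confirm the connection works end to end.
    val total = sc.parallelize(1 to 10).sum()
    println(s"sum = $total")

    sc.stop()
  }
}
```

In sbt terms the assumption above would be something like `libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.0"` with `scalaVersion := "2.11.8"`, so that the classes serialized by the IntelliJ-launched driver match those expected by the 2.0.0 master.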




On Mon, Sep 26, 2016 at 11:23 PM, ayan guha <guha.a...@gmail.com> wrote:

> can you run spark-shell and try what you are trying? It is probably
> intellij issue
>
> On Tue, Sep 27, 2016 at 3:59 PM, Reth RM <reth.ik...@gmail.com> wrote:
>
>> Hi,
>>
>>  I have an issue connecting to the Spark master; I am receiving a RuntimeException:
>> java.io.InvalidClassException: org.apache.spark.rpc.netty.RequestMessage.
>>
>> I followed the steps mentioned below. Can you please point out where I am
>> going wrong?
>>
>> 1. Downloaded Spark (version spark-2.0.0-bin-hadoop2.7)
>> 2. Have Scala installed (version 2.11.8)
>> 3. Navigated to /spark-2.0.0-bin-hadoop2.7/sbin
>> 4. ./start-master.sh
>> 5. ./start-slave.sh spark://host:7077
>> 6. Intellij has simple 2 lines code for scala as it is here
>> <https://ideone.com/i6mq3t>
>>
>> Error
>> https://jpst.it/NOUE
>>
>>
>>
>>
>
>
> --
> Best Regards,
> Ayan Guha
>
