Have you checked the version of the Spark library referenced in the IntelliJ project and compared it with the version of the binary distribution?
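To illustrate the version check: the sbt dependencies would need to line up with the spark-2.0.0-bin-hadoop2.7 distribution mentioned in the thread. A sketch, assuming an sbt build (Spark 2.0.0 artifacts are published for Scala 2.11):

```scala
// build.sbt -- a sketch; the version string must match the binary
// distribution running the master (here spark-2.0.0-bin-hadoop2.7),
// and scalaVersion must match the Scala line Spark was built against.
scalaVersion := "2.11.8"

// "provided" if you submit with spark-submit; plain dependency if you
// run the driver directly from IntelliJ against a standalone master.
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.0"
```

A mismatch between the client-side jars and the cluster binaries is a common cause of InvalidClassException on the RPC layer, since serialized message classes differ between versions.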
On Tue, 2016-09-27 at 09:13 -0700, Reth RM wrote:
> Hi Ayan,
>
> Thank you for the response. I tried to connect to the same standalone
> Spark master through spark-shell and it works as intended.
>
> On the shell, I tried ./spark-shell --master spark://host:7077
> The connection was established and it wrote an info message on the
> console: 'Spark context available as 'sc' (master = spark://host:7077)'.
>
> But it is an issue when trying to connect to the same standalone Spark
> master through IntelliJ Scala code:
>
> val conf = new SparkConf()
>   .setAppName("scala spark")
>   .setMaster("spark://host:7077")
>
> When setMaster is local, it just works, but when trying to connect to
> the standalone Spark master (passing the master URL), it reports the
> error above.
>
> On Mon, Sep 26, 2016 at 11:23 PM, ayan guha <guha.a...@gmail.com> wrote:
> Can you run spark-shell and try what you are trying? It is probably an
> IntelliJ issue.
>
> On Tue, Sep 27, 2016 at 3:59 PM, Reth RM <reth.ik...@gmail.com> wrote:
> Hi,
>
> I have an issue connecting to the Spark master, receiving a
> RuntimeException: java.io.InvalidClassException:
> org.apache.spark.rpc.netty.RequestMessage.
>
> I followed the steps mentioned below. Can you please point me to where
> I am going wrong?
>
> 1. Downloaded Spark (version spark-2.0.0-bin-hadoop2.7)
> 2. Have Scala installed (version 2.11.8)
> 3. Navigated to /spark-2.0.0-bin-hadoop2.7/sbin
> 4. ./start-master.sh
> 5. ./start-slave.sh spark://http://host:7077/
> 6. IntelliJ has a simple 2-line Scala program, as it is here
>
> Error: https://jpst.it/NOUE
>
> --
> Best Regards,
> Ayan Guha

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscr...@spark.apache.org
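One detail worth flagging in step 5 above: the worker is started with `spark://http://host:7077/`, but a standalone master URL should have the bare form `spark://host:port`, with no embedded `http://` scheme or trailing slash. A small self-contained Scala sketch with a hypothetical helper (not part of Spark's API) that checks this shape:

```scala
object MasterUrlCheck {
  // Hypothetical helper (not a Spark API): validates that a master URL
  // has the bare spark://host:port form expected by start-slave.sh and
  // SparkConf.setMaster for a standalone cluster.
  def isValidStandaloneUrl(url: String): Boolean =
    url.matches("spark://[^/:]+:\\d+")

  def main(args: Array[String]): Unit = {
    println(isValidStandaloneUrl("spark://host:7077"))         // true
    println(isValidStandaloneUrl("spark://http://host:7077/")) // false: extra scheme and trailing slash
  }
}
```

If the worker never registered because of the malformed URL, the shell and IntelliJ runs may also be hitting different masters or versions, compounding the InvalidClassException.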