You need to start the master and worker processes before connecting to them.
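
With .master("local[1]") Spark runs the driver and executor inside a
single local JVM, so no external master is needed; that is why the local
run throws no exception.

To bring up a standalone cluster inside the VM (standalone mode and the
install path are assumptions on my part; adjust to your setup), roughly:

  # on the VM, from the Spark installation directory
  ./sbin/start-master.sh
  # start a worker and point it at the master
  ./sbin/start-slave.sh spark://VM_IPAddress:7077

Then open the master web UI at http://VM_IPAddress:8080 and copy the
spark:// URL shown there into setMaster exactly as displayed. Also make
sure port 7077 on the VM is reachable from the machine running IntelliJ
(firewall/port forwarding), and that the master is not bound to
localhost only (see SPARK_MASTER_HOST in conf/spark-env.sh).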

On Wed, Mar 8, 2017 at 3:33 PM, Mina Aslani <aslanim...@gmail.com> wrote:

> Hi,
>
> I am writing a Spark Transformer in IntelliJ in Java and trying to connect
> to Spark in a VM using setMaster. I get "Failed to connect to master
> ..."
>
> I get:
>
> 17/03/07 16:20:55 WARN StandaloneAppClient$ClientEndpoint: Failed to connect to master VM_IPAddress:7077
> org.apache.spark.SparkException: Exception thrown in awaitResult
>     at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:77)
>     at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:75)
>     at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36)
>     at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59)
>
> SparkSession spark = SparkSession
>       .builder()
>       .appName("Java Spark SQL")
>       //.master("local[1]")
>       .master("spark://VM_IPAddress:7077")
>       .getOrCreate();
>
> Dataset<String> lines = spark
>       .readStream()
>       .format("kafka")
>       .option("kafka.bootstrap.servers", brokers)
>       .option("subscribe", topic)
>       .load()
>       .selectExpr("CAST(value AS STRING)")
>       .as(Encoders.STRING());
>
>
>
> I get the same error when I try master("spark://spark-master:7077").
>
> However, with .master("local[1]") no exception is thrown.
>
> My Kafka broker is in the same VM, and being new to Spark I am still
> trying to understand:
>
> - Why do I get the above exception, and how can I fix it (connect to
> Spark in the VM and read from Kafka in the VM)?
>
> - Why is no exception thrown when using "local[1]", and how do I set
> up reading from Kafka in the VM?
>
> - How do I stream from Kafka (the data in the topic is in JSON format)?
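
For the JSON question: one common approach is from_json from
org.apache.spark.sql.functions (available since Spark 2.1). This is a
minimal sketch only -- the schema fields below (name, value) are
placeholders I made up, not your actual topic layout:

  import org.apache.spark.sql.Dataset;
  import org.apache.spark.sql.Row;
  import org.apache.spark.sql.types.DataTypes;
  import org.apache.spark.sql.types.StructType;
  import static org.apache.spark.sql.functions.col;
  import static org.apache.spark.sql.functions.from_json;

  // Placeholder schema -- replace with the real fields of your JSON messages.
  StructType schema = new StructType()
      .add("name", DataTypes.StringType)
      .add("value", DataTypes.DoubleType);

  // 'lines' is the Dataset<String> from your snippet; its single column is named "value".
  Dataset<Row> parsed = lines
      .select(from_json(col("value"), schema).as("data"))
      .select("data.*");  // flatten the parsed struct into top-level columns

  // Write to the console first to verify the parsing, then swap in a real sink.
  parsed.writeStream()
      .format("console")
      .outputMode("append")
      .start()
      .awaitTermination();
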
> Your input is appreciated!
>
> Best regards,
> Mina
>


-- 
Best Regards,
Ayan Guha
