Hi all, I am unable to run a Spark Streaming job in my Hadoop cluster; it is behaving unexpectedly. When I submit the job, it fails with a socket exception from HDFS. If I run the same job a second or third time, it runs for a while and then stops.
I am confused. Is there any configuration in yarn-site.xml that is specific to Spark? Please advise.
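In case it helps narrow things down, here is the kind of yarn-site.xml memory configuration I have been looking at — these are example values only, not my actual cluster settings, and I am unsure whether they matter for Spark:

```xml
<!-- Example values only; my real cluster settings may differ. -->
<configuration>
  <!-- Total memory YARN may allocate to containers on each NodeManager -->
  <property>
    <name>yarn.nodemanager.resource.memory-mb</name>
    <value>8192</value>
  </property>
  <!-- Largest single container YARN will grant; Spark executor memory
       (plus overhead) must fit under this, or containers get killed -->
  <property>
    <name>yarn.scheduler.maximum-allocation-mb</name>
    <value>8192</value>
  </property>
</configuration>
```

Should any of these be tuned differently for a long-running Spark Streaming job?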