Make sure your firewall isn't blocking the requests. Since the output directory is being created, the NameNode is reachable; the timeout is more likely on the block write, so check that the DataNode transfer port (50010 by default on HDP) is open from the machine running the driver.
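A quick way to check this is to probe the ports from the driver machine. This is a minimal sketch, not Spark code — it assumes the default HDP ports (8020 for the NameNode RPC, 50010 for DataNode transfer) and the `ambarimaster` hostname from your URI; substitute your cluster's actual values:

```python
import socket

def port_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers connection refused, timeouts, and unresolvable hostnames.
        return False

# Assumed host/ports: 8020 is the default NameNode RPC port and 50010 the
# default DataNode transfer port on HDP; check both from the Spark driver.
for port in (8020, 50010):
    print(port, port_reachable("ambarimaster", port))
```

If the NameNode port is reachable but 50010 is not, you will see exactly this symptom: the directory appears but the write times out.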

Thanks
Best Regards

On Sat, Oct 24, 2015 at 5:04 PM, Amit Singh Hora <hora.a...@gmail.com>
wrote:

> Hi All,
>
> I am trying to write an RDD as a sequence file to my Hadoop cluster, but I
> keep getting connection timeouts. I can ping the Hadoop cluster, and the
> directory gets created with the file name I specify, so I believe I am
> missing some configuration. Kindly help me.
>
> import org.apache.hadoop.io.{IntWritable, Text}
> import org.apache.spark.{SparkConf, SparkContext}
>
> object WriteSequenceFileDemo {
>   def main(args: Array[String]): Unit = {
>     val sparkConf = new SparkConf().setAppName("sequenceFile").setMaster("local[2]")
>     val sparkContext = new SparkContext(sparkConf)
>
>     val data = Array((1, "a"), (2, "b"), (3, "c"))
>     val dataRDD = sparkContext.parallelize(data, 2)
>     val hadoopRDD = dataRDD.map(x => (new IntWritable(x._1), new Text(x._2)))
>
>     // saveAsSequenceFile is available directly via the implicit conversion
>     // on pair RDDs of Writables; the explicit call below is equivalent.
>     // hadoopRDD.saveAsSequenceFile("hdfs://ambarimaster/tmp/Demosequencefile")
>
>     SparkContext.rddToSequenceFileRDDFunctions(hadoopRDD)
>       .saveAsSequenceFile("hdfs://ambarimaster/tmp/Demosequencefile")
>   }
> }
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Unable-to-use-saveAsSequenceFile-tp25190.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
>
