This piece of code

saveAsHadoopFile[TextOutputFormat[NullWritable, Text]]("hdfs://masteripaddress:9000/root/test-app/test1/")

saves the RDD into HDFS, and yes, you can see the files yourself with the
hadoop command (hadoop fs -ls /root/test-app/test1), provided you are logged
in to the cluster. If the command is not found on your PATH (e.g. "hadoop:
command not found"), invoke it through its full path instead:
$HADOOP_HOME/bin/hadoop fs -ls /root/test-app/test1
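
For reference, a minimal end-to-end sketch of that call (the variable names, the sample data, and the use of a (NullWritable, Text) pair RDD here are illustrative assumptions; `sc` is assumed to be an existing SparkContext on the cluster):

```scala
import org.apache.hadoop.io.{NullWritable, Text}
import org.apache.hadoop.mapred.TextOutputFormat
import org.apache.spark.rdd.RDD

// Assumed setup: `sc` is an existing SparkContext.
// Build a (NullWritable, Text) pair RDD so the key/value types match
// the TextOutputFormat type parameters in the call above.
val lines: RDD[String] = sc.parallelize(Seq("line1", "line2", "line3"))
val pairs = lines.map(l => (NullWritable.get(), new Text(l)))

// Each partition is written as a part-NNNNN file under the target
// HDFS directory, plus a _SUCCESS marker on successful completion.
pairs.saveAsHadoopFile[TextOutputFormat[NullWritable, Text]](
  "hdfs://masteripaddress:9000/root/test-app/test1/")
```

Note that saveAsHadoopFile is only available on pair RDDs (via PairRDDFunctions); for a plain RDD[String] you could also just call saveAsTextFile with the same path, which wraps the same machinery.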



Thanks
Best Regards


On Thu, Jul 24, 2014 at 4:34 PM, lmk <lakshmi.muralikrish...@gmail.com>
wrote:

> Hi Akhil,
> I am sure that the RDD that I saved is not empty. I have tested it using
> take.
> But is there no way that I can see this saved physically like we do in the
> normal context? Can't I view this folder as I am already logged into the
> cluster?
> And, should I run hadoop fs -ls
> hdfs://masteripaddress:9000/root/test-app/test1/
> after I login to the cluster?
>
> Thanks,
> lmk
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/save-to-HDFS-tp10578p10581.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
