Hi Devs, I need to read a JSON file from HDFS and turn it into a Scala string. I have dug around for documentation on how to do this and found this:
http://stackoverflow.com/questions/30445263/how-to-read-whole-file-in-one-string ("How to read whole file in one string" - Stack Overflow)

The following two lines of code don't seem to do the job:

    val rdd = sc.wholeTextFiles("hdfs://nameservice1/user/me/test.txt")
    rdd.collect.foreach(t => println(t._2))

I have tried to assign the result of the second line to a Scala string, but that doesn't seem to work. I would really appreciate some insights into how to do this. Thanks in advance.
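
P.S. For reference, this is roughly what I am trying to end up with (a sketch only, assuming a SparkContext `sc` is in scope, e.g. in spark-shell, that test.txt is the only file at that path and fits in driver memory, and where `jsonString` is just my placeholder name):

    // wholeTextFiles returns an RDD of (path, content) pairs, one pair per file
    val rdd = sc.wholeTextFiles("hdfs://nameservice1/user/me/test.txt")

    // take the content of the first (and only) file as a single Scala String
    val jsonString: String = rdd.first()._2

    println(jsonString)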