The built-in Spark JSON functionality cannot read a normal top-level JSON array. The format it expects is newline-delimited JSON: individual JSON objects without any outer array syntax, one complete object per line of the input file.
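For your file (quoted below), that means the input would have to look something like this, with no surrounding [ ] and no commas between records (DATA arrays shortened here for readability):

    {"IFAM":"EQR","KTM":1430006400000,"COL":21,"DATA":[{"MLrate":"30","Nrout":"0","up":null,"Crate":"2"}]}
    {"IFAM":"EQR","KTM":1430006400000,"COL":22,"DATA":[{"MLrate":"30","Nrout":"0","up":null,"Crate":"2"}]}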
AFAIK your options are to read the JSON in the driver and parallelize it out to the workers, or to fix your input file to match the spec. For one-off conversions I usually use a combination of jq and regex replaces to get the source file into the right format. (Rough sketches of both approaches follow the quoted message below.)

________________________________________
From: SparknewUser [melanie.galloi...@gmail.com]
Sent: Wednesday, July 29, 2015 6:37 AM
To: user@spark.apache.org
Subject: How to read a Json file with a specific format?

I'm trying to read a Json file which is like:

    [
      {"IFAM":"EQR","KTM":1430006400000,"COL":21,"DATA":[
        {"MLrate":"30","Nrout":"0","up":null,"Crate":"2"},
        {"MLrate":"30","Nrout":"0","up":null,"Crate":"2"},
        {"MLrate":"30","Nrout":"0","up":null,"Crate":"2"},
        {"MLrate":"30","Nrout":"0","up":null,"Crate":"2"},
        {"MLrate":"30","Nrout":"0","up":null,"Crate":"2"},
        {"MLrate":"30","Nrout":"0","up":null,"Crate":"2"}
      ]},
      {"IFAM":"EQR","KTM":1430006400000,"COL":22,"DATA":[
        {"MLrate":"30","Nrout":"0","up":null,"Crate":"2"},
        {"MLrate":"30","Nrout":"0","up":null,"Crate":"2"},
        {"MLrate":"30","Nrout":"0","up":null,"Crate":"2"},
        {"MLrate":"30","Nrout":"0","up":null,"Crate":"2"},
        {"MLrate":"30","Nrout":"0","up":null,"Crate":"2"},
        {"MLrate":"30","Nrout":"0","up":null,"Crate":"2"}
      ]}
    ]

I've tried the command:

    val df = sqlContext.read.json("namefile")
    df.show()

But this does not work: my columns are not recognized...
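To make the two suggestions above concrete: for the conversion route, something like jq -c '.[]' namefile > namefile.jsonl emits one compact object per line, and then sqlContext.read.json("namefile.jsonl") should work as-is. For the driver-side route, here is a rough, untested sketch; it assumes a small input file, the same "namefile" path as in your example, and an existing sc/sqlContext, and it uses the json4s library that ships with Spark:

    import org.json4s._
    import org.json4s.jackson.JsonMethods._

    // read the whole JSON array on the driver -- only sensible for small files
    val raw = scala.io.Source.fromFile("namefile").mkString

    // split the top-level array into one serialized JSON string per record
    val records = parse(raw).children.map(rec => compact(render(rec)))

    // ship the per-record strings to the workers and let Spark infer the schema
    val df = sqlContext.read.json(sc.parallelize(records))
    df.printSchema()
    df.show()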