I am very new to Spark.

I have a very basic question. I read a file into a Spark RDD in which each
line is a JSON object. I want to apply groupBy-like transformations, so I
want to turn each JSON line into a PairRDD. Is there a straightforward way
to do this in Java?

My json is like this:

{
        "tmpl": "p",
        "bw": "874",
        "aver": {"cnac": "US", "t1": "2"}
}

Currently, the approach I am trying is to split each line by "," first and
then by ":". Is there any more straightforward way to do this?

My current code:

// split each JSON line into comma-separated fragments
val pairs = setECrecords.flatMap(x => x.split(","))
pairs.foreach(println)

// then split each fragment once on ":" to get a (key, value) tuple
val pairsAsTuple = pairs.map { x =>
  val parts = x.split(":", 2)
  if (parts.length > 1) (parts(0), parts(1)) else (parts(0), x)
}




