Filtering an RDD depending upon a list of values in Spark

2015-09-09 Thread prachicsa


I want to apply a filter based on a list of values in Spark. This is how I get
the list:

DataFrame df = sqlContext.read().json("../sample.json");

df.groupBy("token").count().show();

Row[] Tokens = df.select("token").collect();
for (int i = 0; i < Tokens.length; i++) {
    System.out.println(Tokens[i].get(0)); // need to apply the filter for Tokens[i].get(0)
}

The RDD on which I want to apply the filter is this:

JavaRDD<String> file = context.textFile(args[0]);

I figured out a way to write the filter in Java:

private static final Function<String, Boolean> Filter =
    new Function<String, Boolean>() {
        @Override
        public Boolean call(String s) {
            return s.contains("Set");
        }
    };

How do I go about it?
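
A minimal sketch of one way this could look, written in Scala for brevity (the
Java API offers the same collect/filter/contains calls). It assumes sc,
sqlContext and args are set up as in the snippets above, and that the collected
token list is small enough to hold on the driver:

val df = sqlContext.read.json("../sample.json")
val tokens = df.select("token").collect().map(_.get(0).toString)

val file = sc.textFile(args(0))

// keep a line if it contains any of the collected token values
val filtered = file.filter(line => tokens.exists(t => line.contains(t)))

If the token list is large, broadcasting it with sc.broadcast(tokens) avoids
shipping a copy of the list with every task.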




--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Filtering-an-rdd-depending-upon-a-list-of-values-in-Spark-tp24631.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.




Loading JSON data into a pair RDD in Spark using Java

2015-09-09 Thread prachicsa


I am very new to Spark.

I have a very basic question. I read a file into a Spark RDD in which each line
is a JSON object. I want to apply groupBy-like transformations, so I want to
transform each JSON line into a pair RDD. Is there a straightforward way to
do this in Java?

My JSON looks like this:

{
    "tmpl": "p",
    "bw": "874",
    "aver": {"cnac": "US", "t1": "2"}
}

Currently, I am trying to split by "," first and then by ":". Is there any more
straightforward way to do this?

My current code:

val pairs = setECrecords.flatMap(x => x.split(","))
pairs.foreach(println)

val pairsastuple = pairs.map(x =>
  if (x.split("=").length > 1) (x.split("=")(0), x.split("=")(1))
  else (x.split("=")(0), x))
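
For what it is worth, a minimal sketch of one alternative (names are
illustrative): parse each line with the JSON parser in the Scala standard
library instead of splitting on delimiters. It assumes setECrecords is an
RDD[String] with one JSON object per line and that every value can be treated
as a string:

import scala.util.parsing.json.JSON

val pairs = setECrecords.flatMap { line =>
  JSON.parseFull(line) match {
    case Some(fields: Map[String, Any] @unchecked) =>
      // top-level fields become (key, value) pairs; nested objects become their toString
      fields.toList.map { case (k, v) => (k, v.toString) }
    case _ =>
      Nil // skip lines that are not valid JSON
  }
}

pairs.groupByKey().collect().foreach(println)

Alternatively, since sqlContext.read().json(...) already parses a file of JSON
lines into a DataFrame, groupBy can be applied directly to its columns without
building a pair RDD at all.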





--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Loading-json-data-into-Pair-RDD-in-Spark-using-java-tp24624.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.




Filtering records for all values of an array in Spark

2015-09-09 Thread prachicsa


I am very new to Spark.

I have a very basic question. I have an array of values:

listofECtokens: Array[String] = Array(EC-17A5206955089011B, EC-17A5206955089011A)

I want to filter an RDD for all of these token values. I tried the following
way:

val ECtokens = for (token <- listofECtokens)
  rddAll.filter(line => line.contains(token))

Output:

ECtokens: Unit = ()

I got Unit back even though there are records containing these tokens. What am
I doing wrong?
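
In case it helps, a guess at what is going on (a sketch, not a definitive
answer): a for comprehension without yield is just a loop, so the whole
expression evaluates to Unit, and each rddAll.filter(...) inside it builds an
RDD that is immediately thrown away. Filtering once with a predicate that
checks every token avoids both problems, assuming rddAll is an RDD[String]:

val listofECtokens = Array("EC-17A5206955089011B", "EC-17A5206955089011A")

// keep a line if it contains at least one of the tokens
val ECtokens = rddAll.filter(line =>
  listofECtokens.exists(token => line.contains(token)))

ECtokens.count() // should be non-zero if matching records really exist

If a separate filtered RDD per token is actually wanted, adding yield
(for (token <- listofECtokens) yield rddAll.filter(...)) returns an array of
RDDs instead of Unit.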




--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Filtering-records-for-all-values-of-an-array-in-Spark-tp24618.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
