Re: How to write a MapReduce program in Spark using Java on a user-defined JavaPairRDD?

2015-07-07 Thread Feynman Liang
Hi MIssie,

In the Java API, you should consider:

1. RDD.map (https://spark.apache.org/docs/latest/api/java/org/apache/spark/rdd/RDD.html#map(scala.Function1,%20scala.reflect.ClassTag)) to transform the text
2. RDD.sortBy
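The two steps above could be sketched as follows in the Java API (using JavaRDD, the Java-friendly wrapper around RDD). This is a minimal, self-contained sketch run in local mode; the class name, the upper-casing transform, and the sample input are all hypothetical, not from the thread:

    import java.util.Arrays;
    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;

    public class MapSortExample {
        // Hypothetical text transformation applied by map():
        // trims whitespace and upper-cases the line.
        static String transform(String line) {
            return line.trim().toUpperCase();
        }

        public static void main(String[] args) {
            SparkConf conf = new SparkConf()
                    .setAppName("MapSortExample")
                    .setMaster("local[2]");   // local mode, for illustration only
            try (JavaSparkContext sc = new JavaSparkContext(conf)) {
                JavaRDD<String> lines = sc.parallelize(Arrays.asList(" b ", "a", "c"));
                JavaRDD<String> sorted = lines
                        .map(MapSortExample::transform)   // 1. transform the text
                        .sortBy(s -> s, true, 1);         // 2. sort ascending, 1 partition
                System.out.println(sorted.collect());
            }
        }
    }

JavaRDD.sortBy takes the key-extraction function, an ascending flag, and a partition count; here the string itself is used as the sort key.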

How to write mapreduce programming in spark by using java on user-defined javaPairRDD?

2015-07-07 Thread 付雅丹
Hi, everyone!

I've got (key, value) pairs in the form <LongWritable, Text>, where I used the following code:

    SparkConf conf = new SparkConf().setAppName("MapReduceFileInput");
    JavaSparkContext sc = new JavaSparkContext(conf);
    Configuration confHadoop = new Configuration();
    JavaPairRDD<LongWritable, Text>
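The message is cut off after `JavaPairRDD<LongWritable, Text>`, but a common way to obtain such an RDD is JavaSparkContext.newAPIHadoopFile with Hadoop's TextInputFormat. Below is a hedged sketch along those lines; the input path is hypothetical, and the mapToPair step (converting Writables to plain Java types, since Writables are not java.io.Serializable and break shuffles and collect()) is an assumption about what the program would need next, not the original poster's code:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaPairRDD;
    import org.apache.spark.api.java.JavaSparkContext;
    import scala.Tuple2;

    public class MapReduceFileInput {
        public static void main(String[] args) {
            SparkConf conf = new SparkConf().setAppName("MapReduceFileInput");
            JavaSparkContext sc = new JavaSparkContext(conf);
            Configuration confHadoop = new Configuration();

            // Read <LongWritable, Text> pairs via the new Hadoop MapReduce API.
            JavaPairRDD<LongWritable, Text> source = sc.newAPIHadoopFile(
                    "hdfs:///path/to/input",   // hypothetical input path
                    TextInputFormat.class, LongWritable.class, Text.class,
                    confHadoop);

            // Convert Writables to serializable Java types before any
            // shuffle (e.g. sortByKey) or collect().
            JavaPairRDD<Long, String> pairs = source.mapToPair(
                    t -> new Tuple2<>(t._1().get(), t._2().toString()));

            System.out.println(pairs.count());
            sc.stop();
        }
    }

From here, mapToPair / reduceByKey on the converted `pairs` RDD plays the role of the map and reduce phases.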