Hi, everyone!

I've got key/value pairs in the form of LongWritable, Text, for which I used
the following code:

SparkConf conf = new SparkConf().setAppName("MapReduceFileInput");
JavaSparkContext sc = new JavaSparkContext(conf);
Configuration confHadoop = new Configuration();
JavaPairRDD<LongWritable, Text> distFile = sc.newAPIHadoopFile("/sigmoid",
        DataInputFormat.class, LongWritable.class, Text.class, confHadoop);
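One caveat worth keeping in mind with an RDD of Writables: LongWritable and Text are mutable and not java.io.Serializable, so it is common to convert them to plain Java types right after loading, before any shuffle or collect. A minimal sketch of that conversion (class name, variable names, and the parallelizePairs stand-in for newAPIHadoopFile are illustrative, not from the original code):

```java
import java.util.Arrays;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaSparkContext;
import scala.Tuple2;

public class WritableConversion {

    // Hadoop Writables are mutable and not java.io.Serializable,
    // so convert each pair to plain Java types right after loading.
    static Tuple2<Long, String> toPlain(Tuple2<LongWritable, Text> p) {
        return new Tuple2<>(p._1().get(), p._2().toString());
    }

    public static void main(String[] args) {
        // local[*] runs the example without a cluster.
        SparkConf conf = new SparkConf()
                .setAppName("WritableConversion").setMaster("local[*]");
        try (JavaSparkContext sc = new JavaSparkContext(conf)) {
            // Stand-in for the JavaPairRDD<LongWritable, Text>
            // that newAPIHadoopFile would return.
            JavaPairRDD<LongWritable, Text> lines = sc.parallelizePairs(Arrays.asList(
                    new Tuple2<>(new LongWritable(0), new Text("first line")),
                    new Tuple2<>(new LongWritable(11), new Text("second line"))));

            JavaPairRDD<Long, String> plain = lines.mapToPair(WritableConversion::toPlain);
            System.out.println(plain.collect());
        }
    }
}
```

After this conversion the pairs survive serialization without surprises from Writable reuse.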
Thanks
Best Regards
On Tue, Jun 23, 2015 at 6:58 AM, 付雅丹 yadanfu1...@gmail.com wrote:
Hello, everyone! I'm new to Spark. I have already written programs in
Hadoop 2.5.2, where I defined my own InputFormat and OutputFormat. Now I
want to port my code to Spark using Java. The first problem I
encountered is how to transform a big txt file in local storage into an
RDD, which is
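For the common case where the file can be read line by line (setting the custom InputFormat aside for a moment), sc.textFile is the usual starting point for turning a local text file into an RDD. A minimal sketch, with the class name and helper method being illustrative and the temp file standing in for the big txt file:

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class LocalTextToRDD {

    // Load a local text file as an RDD of lines; the "file://" scheme
    // forces the local filesystem even when HDFS is the configured default.
    static JavaRDD<String> load(JavaSparkContext sc, String localPath) {
        return sc.textFile("file://" + localPath);
    }

    public static void main(String[] args) throws Exception {
        SparkConf conf = new SparkConf()
                .setAppName("LocalTextToRDD").setMaster("local[*]");
        try (JavaSparkContext sc = new JavaSparkContext(conf)) {
            // Small stand-in for the big txt file from the question.
            Path tmp = Files.createTempFile("demo", ".txt");
            Files.write(tmp, List.of("line one", "line two", "line three"));

            JavaRDD<String> lines = load(sc, tmp.toAbsolutePath().toString());
            System.out.println(lines.count()); // one element per line
        }
    }
}
```

A custom InputFormat can still be plugged in via newAPIHadoopFile with a file:// path if the records are not plain lines.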