I am trying to create a new JavaPairRDD from data in an HDFS file. My code is:
sparkContext = new JavaSparkContext("yarn-client", "SumFramesPerTimeUnit", sparkConf);
JavaPairRDD<LongWritable, BytesWritable> inputRDD =
        sparkContext.newAPIHadoopFile(fileFilter, FixedLengthInputFormat.class,
                LongWritable.class, BytesWritable.class, config);
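For reference, the complete setup looks roughly like the sketch below (the record length, path, and class name are placeholders, not my real values). Note that FixedLengthInputFormat requires the record length to be set on the Hadoop Configuration before the file is read:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.io.BytesWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.mapreduce.lib.input.FixedLengthInputFormat;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class SumFramesPerTimeUnit {
    public static void main(String[] args) {
        SparkConf sparkConf = new SparkConf();
        JavaSparkContext sparkContext =
                new JavaSparkContext("yarn-client", "SumFramesPerTimeUnit", sparkConf);

        // FixedLengthInputFormat reads fixed-size binary records; the record
        // length must be set on the Hadoop Configuration. 1024 is a placeholder
        // for the real frame size.
        Configuration config = new Configuration();
        FixedLengthInputFormat.setRecordLength(config, 1024);

        String fileFilter = "hdfs:///path/to/frames"; // placeholder path
        JavaPairRDD<LongWritable, BytesWritable> inputRDD =
                sparkContext.newAPIHadoopFile(fileFilter, FixedLengthInputFormat.class,
                        LongWritable.class, BytesWritable.class, config);

        System.out.println("records: " + inputRDD.count());
        sparkContext.stop();
    }
}
```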
However, when I run the job I get the following error:
com.fasterxml.jackson.databind.JsonMappingException: Infinite recursion (StackOverflowError) (through reference chain: scala.collection.convert.IterableWrapper[0]->org.apache.spark.rdd.RDDOperationScope["allScopes"]->scala.collection.convert.IterableWrapper[0]->org.apache.spark.rdd.RDDOperationScope["allScopes"]->...)
    at com.fasterxml.jackson.databind.ser.std.BeanSerializerBase.serializeFields(BeanSerializerBase.java:680)
    at com.fasterxml.jackson.databind.ser.BeanSerializer.serialize(BeanSerializer.java:156)
    at com.fasterxml.jackson.databind.ser.std.CollectionSerializer.serializeContents(CollectionSerializer.java:132)
    at com.fasterxml.jackson.module.scala.ser.IterableSerializer.serializeContents(IterableSerializerModule.scala:30)
    at com.fasterxml.jackson.module.scala.ser.IterableSerializer.serializeContents(IterableSerializerModule.scala:16)
    at com.fasterxml.jackson.databind.ser.std.AsArraySerializerBase.serialize(AsArraySerializerBase.java:185)
    at com.fasterxml.jackson.databind.ser.BeanPropertyWriter.serializeAsField(BeanPropertyWriter.java:575)
    at com.fasterxml.jackson.databind.ser.std.BeanSerializerBase.serializeFields(BeanSerializerBase.java:666)
    at com.fasterxml.jackson.databind.ser.BeanSerializer.serialize(BeanSerializer.java:156)
Any thoughts as to what may be going wrong?
David
David R Robison
Senior Systems Engineer
O. +1 512 247 3700
M. +1 757 286 0022
[email protected]
www.psgglobal.net
Prometheus Security Group Global, Inc.
3019 Alvin Devane Boulevard
Building 4, Suite 450
Austin, TX 78741