leventov opened a new pull request #6027: Make Parser.parseToMap() return a mutable Map
URL: https://github.com/apache/incubator-druid/pull/6027
 
 
   I encountered this error somewhere inside Druid/Spark/Kryo/Jackson interop:
   ```
   java.lang.UnsupportedOperationException
        at io.druid.java.util.common.parsers.ObjectFlatteners$1$1.put(ObjectFlatteners.java:121) ~[java-util-0.12.1-rc3-SNAPSHOT.jar:0.12.1-rc3-SNAPSHOT]
        at io.druid.java.util.common.parsers.ObjectFlatteners$1$1.put(ObjectFlatteners.java:77) ~[java-util-0.12.1-rc3-SNAPSHOT.jar:0.12.1-rc3-SNAPSHOT]
        at com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:162) ~[kryo-shaded-3.0.3.jar:?]
        at com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:39) ~[kryo-shaded-3.0.3.jar:?]
        at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:793) ~[kryo-shaded-3.0.3.jar:?]
        at com.twitter.chill.Tuple2Serializer.read(TupleSerializers.scala:42) ~[chill_2.10-0.8.0.jar:0.8.0]
        at com.twitter.chill.Tuple2Serializer.read(TupleSerializers.scala:33) ~[chill_2.10-0.8.0.jar:0.8.0]
        at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:793) ~[kryo-shaded-3.0.3.jar:?]
        at org.apache.spark.serializer.KryoDeserializationStream.readObject(KryoSerializer.scala:244) ~[spark-core_2.10-2.1.0.jar:2.1.0]
        at org.apache.spark.serializer.DeserializationStream.readKey(Serializer.scala:157) ~[spark-core_2.10-2.1.0.jar:2.1.0]
        at org.apache.spark.serializer.DeserializationStream$$anon$2.getNext(Serializer.scala:189) ~[spark-core_2.10-2.1.0.jar:2.1.0]
        at org.apache.spark.serializer.DeserializationStream$$anon$2.getNext(Serializer.scala:186) ~[spark-core_2.10-2.1.0.jar:2.1.0]
        at org.apache.spark.util.NextIterator.hasNext(NextIterator.scala:73) ~[spark-core_2.10-2.1.0.jar:2.1.0]
        at scala.collection.Iterator$$anon$13.hasNext(Iterator.scala:371) ~[scala-library-2.10.6.jar:?]
        at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327) ~[scala-library-2.10.6.jar:?]
        at org.apache.spark.util.CompletionIterator.hasNext(CompletionIterator.scala:32) ~[spark-core_2.10-2.1.0.jar:2.1.0]
        at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:39) ~[spark-core_2.10-2.1.0.jar:2.1.0]
        at scala.collection.Iterator$GroupedIterator.fill(Iterator.scala:966) ~[scala-library-2.10.6.jar:?]
        at scala.collection.Iterator$GroupedIterator.hasNext(Iterator.scala:972) ~[scala-library-2.10.6.jar:?]
        at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327) ~[scala-library-2.10.6.jar:?]
        at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327) ~[scala-library-2.10.6.jar:?]
        at scala.collection.Iterator$class.foreach(Iterator.scala:727) ~[scala-library-2.10.6.jar:?]
        at scala.collection.AbstractIterator.foreach(Iterator.scala:1157) ~[scala-library-2.10.6.jar:?]
        at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:48) ~[scala-library-2.10.6.jar:?]
        at scala.collection.mutable.ListBuffer.$plus$plus$eq(ListBuffer.scala:176) ~[scala-library-2.10.6.jar:?]
        at scala.collection.mutable.ListBuffer.$plus$plus$eq(ListBuffer.scala:45) ~[scala-library-2.10.6.jar:?]
        at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:273) ~[scala-library-2.10.6.jar:?]
        at scala.collection.AbstractIterator.to(Iterator.scala:1157) ~[scala-library-2.10.6.jar:?]
        at scala.collection.TraversableOnce$class.toList(TraversableOnce.scala:257) ~[scala-library-2.10.6.jar:?]
        at scala.collection.AbstractIterator.toList(Iterator.scala:1157) ~[scala-library-2.10.6.jar:?]
        at io.druid.indexer.spark.SparkDruidIndexer$$anonfun$14.apply(SparkDruidIndexer.scala:341) ~[classes/:?]
        at io.druid.indexer.spark.SparkDruidIndexer$$anonfun$14.apply(SparkDruidIndexer.scala:239) ~[classes/:?]
        at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsWithIndex$1$$anonfun$apply$26.apply(RDD.scala:843) ~[spark-core_2.10-2.1.0.jar:2.1.0]
        at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsWithIndex$1$$anonfun$apply$26.apply(RDD.scala:843) ~[spark-core_2.10-2.1.0.jar:2.1.0]
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38) ~[spark-core_2.10-2.1.0.jar:2.1.0]
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323) ~[spark-core_2.10-2.1.0.jar:2.1.0]
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:287) ~[spark-core_2.10-2.1.0.jar:2.1.0]
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87) ~[spark-core_2.10-2.1.0.jar:2.1.0]
        at org.apache.spark.scheduler.Task.run(Task.scala:99) ~[spark-core_2.10-2.1.0.jar:2.1.0]
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:282) [spark-core_2.10-2.1.0.jar:2.1.0]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_162]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_162]
        at java.lang.Thread.run(Thread.java:748) [?:1.8.0_162]
   ```
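
   For context, here is a minimal, self-contained sketch of the failure mode (the class and method names are hypothetical stand-ins, not Druid code). The map that the flattener returned supported lookups but rejected `put()`, and per the Kryo source linked below, `MapSerializer.read()` rebuilds a map during deserialization by calling `put()` on it:
   ```
   import java.util.AbstractMap;
   import java.util.HashMap;
   import java.util.Map;
   import java.util.Set;

   // Hypothetical stand-in for the kind of map the flattener used to return:
   // lookups work, but the map is effectively read-only.
   public class ReadOnlyParseResultDemo
   {
     static Map<String, Object> parseToMapLike()
     {
       final Map<String, Object> delegate = new HashMap<>();
       delegate.put("dim", "value");

       // AbstractMap inherits a put() that throws UnsupportedOperationException,
       // so no override is needed to reproduce the failure.
       return new AbstractMap<String, Object>()
       {
         @Override
         public Set<Entry<String, Object>> entrySet()
         {
           return delegate.entrySet();
         }
       };
     }

     public static void main(String[] args)
     {
       Map<String, Object> row = parseToMapLike();
       System.out.println(row.get("dim")); // reads are fine
       row.put("newDim", "x"); // throws UnsupportedOperationException, as in the trace above
     }
   }
   ```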
   
   The relevant kryo source is [here](https://github.com/EsotericSoftware/kryo/blob/135a31bfd4c40cd7e059ca2a5269c35a97c0b08d/src/com/esotericsoftware/kryo/serializers/MapSerializer.java#L123-L165), if anybody is interested.
   
   This PR fixes the error by making `Parser`s return mutable maps.
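
   As a rough sketch of the direction of the fix (illustrative names, not the actual diff): build the flattened row in an ordinary mutable map and return it directly, instead of wrapping it in a read-only view:
   ```
   import java.util.LinkedHashMap;
   import java.util.Map;

   // Illustrative only: a plain LinkedHashMap preserves field order and lets
   // downstream consumers, such as Kryo's MapSerializer, call put() freely.
   public class MutableParseResultSketch
   {
     static Map<String, Object> parseToMap(Map<String, Object> flattenedFields)
     {
       return new LinkedHashMap<>(flattenedFields); // mutable copy, not a read-only view
     }

     public static void main(String[] args)
     {
       Map<String, Object> source = new LinkedHashMap<>();
       source.put("dim", "value");

       Map<String, Object> row = parseToMap(source);
       row.put("newDim", "x"); // now succeeds
       System.out.println(row); // {dim=value, newDim=x}
     }
   }
   ```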
