*Facing a casting issue while working with a Spark UDF*
UDF1 mode1 = new UDF1<WrappedArray<Map<Double, Integer>>, String>() {
    @Override
    public String call(WrappedArray<Map<Double, Integer>> maps) throws Exception {
        List<Map<Double, Integer>> lis =
            (List<Map<Double, Integer>>) JavaConverters.seqAsJavaListConverter(maps).asJava();
        java.util.Map<Double, Integer> a = lis.stream()
            .flatMap(map -> map.entrySet().stream())
            .collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue));
        System.out.println(a.get(key));
        return "";
    }
};

*error:*

Caused by: java.lang.ClassCastException: scala.collection.immutable.Map$Map1 cannot be cast to java.util.Map
    at java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:269)

--
Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/
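A likely cause of the exception above: JavaConverters.seqAsJavaListConverter only wraps the outer Seq, so each element of the resulting list is still a scala.collection.immutable.Map. The unchecked cast to List<java.util.Map<...>> compiles, and the failure only surfaces when the stream first touches an element and tries to call entrySet() on it. A plausible fix is to convert each element individually, e.g. with JavaConverters.mapAsJavaMapConverter(scalaMap).asJava(), before streaming. Once the elements really are java.util.Map instances, the flatten-and-merge step is plain Java; a minimal, self-contained sketch of that step (inputs here are illustrative, not from the original post):

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class MergeMaps {
    public static void main(String[] args) {
        // Hypothetical input: each element already converted to a java.util.Map,
        // e.g. via JavaConverters.mapAsJavaMapConverter(scalaMap).asJava()
        // inside the UDF body.
        List<Map<Double, Integer>> lis = List.of(
            Map.of(1.0, 10, 2.0, 20),
            Map.of(2.0, 200, 3.0, 30));

        // Flatten and merge. Note: the two-argument Collectors.toMap throws
        // IllegalStateException on duplicate keys, so a merge function is
        // supplied here (keep the last value seen).
        Map<Double, Integer> a = lis.stream()
            .flatMap(map -> map.entrySet().stream())
            .collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue,
                                      (first, second) -> second));

        System.out.println(a.get(2.0)); // duplicate key 2.0 resolved by the merge function
    }
}
```

The merge function matters for this UDF: if two maps in the array ever share a key, the original two-argument Collectors.toMap would fail with IllegalStateException even after the casting issue is fixed.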