I'm not sure why this would have changed from 1.4.1 to 1.5.1, but I have seen similar exceptions in my code. It seems that values with SQL type "ArrayType" are stored internally as an instance of the Scala "WrappedArray" class (regardless of whether it was originally an instance of a Scala "List"). To deal with this, I would just change List to Seq when calling "getAs".
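A minimal sketch (no Spark required, and the column name below is just illustrative) of why the cast blows up: an array-backed Seq, which is what Spark hands back for ArrayType columns, is a Seq but not a List, so `asInstanceOf[List[...]]` fails at runtime while `Seq` works.

```scala
object CastDemo {
  def main(args: Array[String]): Unit = {
    // Stand-in for what Row.getAs returns for an ArrayType column:
    // an array wrapped as a Seq (WrappedArray in Spark 1.5.x), not a List.
    val stored: Any = Array("a", "b", "c").toSeq

    // Casting to Seq succeeds
    val asSeq = stored.asInstanceOf[Seq[String]]
    println(asSeq.mkString(","))

    // Casting to List throws, mirroring the reported exception
    try {
      stored.asInstanceOf[List[String]]
      println("no exception")
    } catch {
      case _: ClassCastException => println("ClassCastException")
    }
  }
}
```

Run it and you'll see the Seq cast print the elements while the List cast lands in the catch block, which is exactly the ClassCastException pattern from the original report.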
TL;DR:

    val risk_items = r.getAs[Seq[Map[String, String]]]("risk_items")

-Nick

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/ClassCastException-when-use-spark1-5-1-tp25006p25036.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.