[ https://issues.apache.org/jira/browse/SPARK-32829?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Manjay Kumar updated SPARK-32829:
---------------------------------
    Description:

Caused by: java.lang.ClassCastException: org.apache.spark.sql.catalyst.expressions.MutableDouble cannot be cast to org.apache.spark.sql.catalyst.expressions.MutableLong

I am trying to parse a Parquet file into a Dataset using a case class with encoders.

This is a well-known error, but I don't see a resolution. Could you please help with this?

[https://stackoverflow.com/questions/51300978/spark-error-reading-parquet?rq=1]

Similar to: https://issues.apache.org/jira/browse/SPARK-17477


  was:
Caused by: java.lang.ClassCastException: org.apache.spark.sql.catalyst.expressions.MutableDouble cannot be cast to org.apache.spark.sql.catalyst.expressions.MutableLong

This is a well-known error, but I don't see a resolution. Could you please help with this?

[https://stackoverflow.com/questions/51300978/spark-error-reading-parquet?rq=1]


> while parsing the data from parquet to case class
> -------------------------------------------------
>
>                 Key: SPARK-32829
>                 URL: https://issues.apache.org/jira/browse/SPARK-32829
>             Project: Spark
>          Issue Type: Question
>          Components: Spark Core
>    Affects Versions: 2.4.3
>            Reporter: Manjay Kumar
>            Priority: Minor
>
> Caused by: java.lang.ClassCastException:
> org.apache.spark.sql.catalyst.expressions.MutableDouble cannot be cast to
> org.apache.spark.sql.catalyst.expressions.MutableLong
>
> I am trying to parse a Parquet file into a Dataset using a case class with encoders.
>
> This is a well-known error, but I don't see a resolution. Could you please help with this?
>
> [https://stackoverflow.com/questions/51300978/spark-error-reading-parquet?rq=1]
>
> Similar to: https://issues.apache.org/jira/browse/SPARK-17477
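Below is a minimal sketch of how this kind of mismatch typically arises and one way to work around it, assuming (as the linked reports suggest) that the exception comes from a Parquet column whose stored type does not agree with the case class field type, e.g. a column stored as double while the case class declares Long. The case class Record, the columns id/amount, and the path /tmp/records_parquet are all hypothetical names used only for illustration; the point is simply that the Dataset encoder's field types must match the column types Spark reads from the file.

{code:scala}
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col

// Hypothetical case class: "amount" is declared as Long, but the Parquet
// data written below stores it as a double -- this is the kind of type
// mismatch behind the MutableDouble / MutableLong cast error.
case class Record(id: Long, amount: Long)

object ParquetCaseClassMismatch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("parquet-case-class-mismatch")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // Write a small Parquet file whose "amount" column is DoubleType.
    Seq((1L, 10.5), (2L, 20.0))
      .toDF("id", "amount")
      .write.mode("overwrite")
      .parquet("/tmp/records_parquet")   // hypothetical path

    val df = spark.read.parquet("/tmp/records_parquet")

    // df.as[Record] would fail here, because "amount" is double in the file
    // but Long in the case class. Aligning the types first avoids the error:
    val ds = df
      .withColumn("amount", col("amount").cast("long"))
      .as[Record]

    ds.show()
    spark.stop()
  }
}
{code}

If the values really are fractional, changing the case class field to Double so it matches the file is the safer fix, since cast("long") truncates. And if the same column ends up with different types in different Parquet files under one directory, the files (or the table schema) have to be reconciled before a single case class can describe the data.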