[ https://issues.apache.org/jira/browse/SPARK-26045?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Dongjoon Hyun resolved SPARK-26045.
-----------------------------------
    Resolution: Fixed
      Assignee: Sean Owen
 Fix Version/s: 3.0.0
                2.4.4

This is resolved via https://github.com/apache/spark/pull/24680

> Error in the Spark 2.4 release package with the spark-avro_2.11 dependency
> ---------------------------------------------------------------------------
>
>                 Key: SPARK-26045
>                 URL: https://issues.apache.org/jira/browse/SPARK-26045
>             Project: Spark
>          Issue Type: Bug
>          Components: Build
>    Affects Versions: 2.4.0
>         Environment: 4.15.0-38-generic #41-Ubuntu SMP Wed Oct 10 10:59:38 UTC 2018 x86_64 x86_64 x86_64 GNU/Linux
>            Reporter: Oscar garcía
>            Assignee: Sean Owen
>            Priority: Major
>             Fix For: 2.4.4, 3.0.0
>
>   Original Estimate: 2h
>  Remaining Estimate: 2h
>
> Hello, I have been having problems with the latest Spark 2.4 release: the Avro file read feature does not seem to be working. I fixed it locally by building the source code and replacing the *avro-1.8.2.jar* under *$SPARK_HOME*/jars/.
> With the default Spark 2.4 release, when I try to read an Avro file, Spark raises the following exception:
> {code:java}
> spark-shell --packages org.apache.spark:spark-avro_2.11:2.4.0
> scala> spark.read.format("avro").load("file.avro")
> java.lang.NoSuchMethodError: org.apache.avro.Schema.getLogicalType()Lorg/apache/avro/LogicalType;
>   at org.apache.spark.sql.avro.SchemaConverters$.toSqlTypeHelper(SchemaConverters.scala:51)
>   at org.apache.spark.sql.avro.SchemaConverters$.toSqlTypeHelper(SchemaConverters.scala:105)
> {code}
> Checksum: spark-2.4.0-bin-without-hadoop.tgz: 7670E29B 59EAE7A8 5DBC9350 085DD1E0 F056CA13 11365306 7A6A32E9 B607C68E A8DAA666 EF053350 008D0254 318B70FB DE8A8B97 6586CA19 D65BA2B3 FD7F919E



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
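
A minimal diagnostic sketch, assuming a spark-shell session launched as in the report above (this snippet is not part of the original report): it prints which jar is actually providing org.apache.avro.Schema on the driver classpath and whether Schema.getLogicalType() is present. That method exists only in Avro 1.8 and later, so a "false" result points at an older avro jar shadowing the avro-1.8.2.jar that the reporter had to replace under $SPARK_HOME/jars/.

{code:java}
// Diagnostic sketch, assuming it is pasted into the spark-shell session above.
// Locate the jar that is actually providing org.apache.avro.Schema at runtime.
scala> classOf[org.apache.avro.Schema].getProtectionDomain.getCodeSource.getLocation

// Check whether Schema.getLogicalType() is available; it was added in Avro 1.8,
// so 'false' here means an older avro jar (e.g. 1.7.x) is earlier on the classpath.
scala> classOf[org.apache.avro.Schema].getMethods.exists(_.getName == "getLogicalType")
{code}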