jacob-talroo commented on issue #1137: URL: https://github.com/apache/sedona/issues/1137#issuecomment-2103468942
On Databricks, I too am getting `NoSuchMethodError: 'java.util.List com.uber.h3core.H3Core.polygonToCells(java.util.List, java.util.List, int)'`. This is strange, since I am on a [Databricks cluster that should NOT support H3](https://docs.databricks.com/en/sql/language-manual/sql-ref-h3-geospatial-functions.html): it runs neither a SQL warehouse nor Photon.

> The main reason is that we shaded the uber-h3 jar into `sedona-spark-shaded` which leads to conflicts. Another alternative to fix this is that: use `sedona-spark` jar which does not shade anything, and manually download all dependency jars of Sedona: https://github.com/apache/sedona/blob/master/pom.xml#L139

I think the issue is that h3 is _not_ actually shaded currently: Sedona still references the `com.uber` package. If it were shaded, wouldn't the classes live under a different (relocated) package name? The current "shaded" JAR might really just be an [uber JAR](https://maven.apache.org/plugins/maven-shade-plugin/examples/includes-excludes.html) (not to be confused with the company behind H3). To properly shade, I think we need some [relocations](https://maven.apache.org/plugins/maven-shade-plugin/examples/class-relocation.html).
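For illustration, a relocation like the one I mean might look something like this in the shade plugin configuration. This is only a sketch, not Sedona's actual pom; the `shadedPattern` package name is my invention:

```xml
<!-- Hypothetical sketch: relocate the H3 classes inside the shaded jar
     so they cannot clash with the copy of H3 that Databricks ships. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <relocations>
          <relocation>
            <!-- Bytecode references to com.uber.h3core.* would be
                 rewritten to the shaded package below. -->
            <pattern>com.uber.h3core</pattern>
            <shadedPattern>org.apache.sedona.shaded.com.uber.h3core</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```

With a relocation in place, the `NoSuchMethodError` should go away even on clusters that bundle their own (older or different) H3, because the shaded Sedona jar would no longer load classes from the `com.uber` namespace at all.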