[ https://issues.apache.org/jira/browse/SPARK-30132?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17005841#comment-17005841 ]

Sean R. Owen commented on SPARK-30132:
--------------------------------------

https://github.com/scala/bug/issues/11840

> Scala 2.13 compile errors from Hadoop LocalFileSystem subclasses
> ----------------------------------------------------------------
>
>                 Key: SPARK-30132
>                 URL: https://issues.apache.org/jira/browse/SPARK-30132
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Spark Core
>    Affects Versions: 3.0.0
>            Reporter: Sean R. Owen
>            Priority: Minor
>
> A few classes in our test code extend Hadoop's LocalFileSystem. Scala 2.13 
> reports a compile error here - not in the Spark code itself, but because it 
> claims the Hadoop code illegally overrides appendFile() with slightly 
> different generic types in its return value. That code is evidently valid 
> Java, and it doesn't actually define any generic types, so I wonder whether 
> this is a scalac bug.
> So far I don't see a workaround for this.
> This only affects the Hadoop 3.2 build, since the method in question is new 
> in Hadoop 3. (There is another instance of a similar problem that affects 
> Hadoop 2, but I can see a tiny hack workaround for that one.)
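
For context, a minimal sketch of the kind of subclass that hits this, assuming
a Hadoop 3.2 dependency and the Scala 2.13 compiler; the class name below is
hypothetical, not the actual Spark test code:

  import org.apache.hadoop.fs.LocalFileSystem

  // Reportedly even a bare subclass is enough to trigger the error: scalac
  // 2.13 re-checks the appendFile() override inherited from Hadoop's
  // FileSystem hierarchy and rejects its generic return type, although this
  // Scala code declares no generics of its own.
  class ExampleLocalFileSystem extends LocalFileSystem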


