[ 
https://issues.apache.org/jira/browse/SPARK-35758?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Hyukjin Kwon resolved SPARK-35758.
----------------------------------
    Fix Version/s: 3.1.3
                   3.2.0
       Resolution: Fixed

Issue resolved by pull request 32917
[https://github.com/apache/spark/pull/32917]

> Update the document about building Spark with Hadoop for Hadoop 2.x and 3.x
> ---------------------------------------------------------------------------
>
>                 Key: SPARK-35758
>                 URL: https://issues.apache.org/jira/browse/SPARK-35758
>             Project: Spark
>          Issue Type: Bug
>          Components: docs
>    Affects Versions: 3.1.1, 3.1.2, 3.2.0
>            Reporter: Ferran Puig-Calvache
>            Assignee: Kousuke Saruta
>            Priority: Trivial
>             Fix For: 3.2.0, 3.1.3
>
>
> When trying to follow the steps in the documentation for building Spark with a 
> specific Hadoop version, the build fails with the following compiler errors.
> {code:java}
>  [INFO] Compiling 560 Scala sources and 99 Java sources to 
> /Users/puigcalvachef/Documents/os/spark/core/target/scala-2.12/classes ...
> [ERROR] [Error] 
> /Users/puigcalvachef/Documents/os/spark/core/src/main/scala/org/apache/spark/ui/HttpSecurityFilter.scala:107:
>  type mismatch;
>  found   : K where type K
>  required: String
> [ERROR] [Error] 
> /Users/puigcalvachef/Documents/os/spark/core/src/main/scala/org/apache/spark/ui/HttpSecurityFilter.scala:107:
>  value map is not a member of V
> [ERROR] [Error] 
> /Users/puigcalvachef/Documents/os/spark/core/src/main/scala/org/apache/spark/ui/HttpSecurityFilter.scala:107:
>  missing argument list for method stripXSS in class XssSafeRequest
> Unapplied methods are only converted to functions when a function type is 
> expected.
> You can make this conversion explicit by writing `stripXSS _` or 
> `stripXSS(_)` instead of `stripXSS`.
> [ERROR] [Error] 
> /Users/puigcalvachef/Documents/os/spark/core/src/main/scala/org/apache/spark/ui/PagedTable.scala:307:
>  value startsWith is not a member of K
> [ERROR] four errors found
> {code}
> The minimal way to reproduce it is to build just the spark-core module 
> using:
> {code:bash}
> ./build/mvn -Dhadoop.version=2.7.4 -pl :spark-core_2.12 -DskipTests clean install
> {code}
> After some testing, it seems that something changed between 3.1.1-rc1 and 
> 3.1.1-rc2 that made this build start failing.
> I tried a few Hadoop versions that fail: 2.7.4, 2.8.1, and 2.8.5.
> The build succeeded when using Hadoop 3.0.0.
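For reference, the documentation change resolving this issue concerns selecting the matching Maven profile when building against Hadoop 2.x. A sketch of such an invocation, assuming the {{hadoop-2.7}} profile defined in the Spark 3.x build (the exact profile and flags should be confirmed against the updated Building Spark docs):

{code:bash}
# Illustrative only: build spark-core against Hadoop 2.7.4, enabling the
# hadoop-2.7 profile so the Hadoop 2.x dependency set is used instead of
# the default Hadoop 3.x one.
./build/mvn -Phadoop-2.7 -Dhadoop.version=2.7.4 -pl :spark-core_2.12 -DskipTests clean install
{code}

Setting {{-Dhadoop.version}} alone overrides the version number but not the profile-driven dependency differences, which is consistent with the type-mismatch errors reported above.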



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
