[ https://issues.apache.org/jira/browse/SPARK-35758?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17363307#comment-17363307 ]

Kousuke Saruta commented on SPARK-35758:
----------------------------------------

[~ferranjr]
Could you try building with the -Phadoop-2.7 profile? I can build successfully with the following command.
{code}
build/mvn -Phadoop-2.7 -Dhadoop.version=2.7.4 -pl :spark-core_2.12 -DskipTests compile
{code}
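
For context on the quoted errors below: messages like {{found: K where type K / required: String}} and {{value startsWith is not a member of K}} are the shape scalac produces when Scala code iterates over a raw (ungenerified) {{java.util.Map}} coming from a Java API, so the key and value types are inferred as abstract {{K}} and {{V}} rather than {{String}}. The sketch below is a hypothetical plain-Java illustration of that class of error, not a confirmed diagnosis of this particular dependency mismatch; the class and method names are invented for the example.
{code:java}
import java.util.HashMap;
import java.util.Map;

public class RawMapDemo {
    // A generified API: callers statically know the keys are Strings.
    static Map<String, String[]> typedParams() {
        Map<String, String[]> m = new HashMap<>();
        m.put("order", new String[] {"asc"});
        return m;
    }

    // A raw-typed API, as an older Java library might expose it.
    // Callers lose the key/value type information entirely.
    @SuppressWarnings("rawtypes")
    static Map rawParams() {
        return typedParams();
    }

    public static void main(String[] args) {
        // Compiles fine: the key is statically a String.
        for (String key : typedParams().keySet()) {
            System.out.println(key.startsWith("o"));
        }
        // With the raw Map, the keys are only Objects; calling
        // key.startsWith(...) here would not compile, which mirrors
        // "value startsWith is not a member of K" on the Scala side.
        for (Object key : rawParams().keySet()) {
            System.out.println(key);
        }
    }
}
{code}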

> Spark Core doesn't build when selecting -Dhadoop.version=2.x from Spark 3.1.1
> -----------------------------------------------------------------------------
>
>                 Key: SPARK-35758
>                 URL: https://issues.apache.org/jira/browse/SPARK-35758
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 3.1.1, 3.1.2
>            Reporter: Ferran Puig-Calvache
>            Priority: Trivial
>
> When following the documentation steps for building Spark against a 
> specific Hadoop version, the build fails with the following compiler errors.
> {code:java}
>  [INFO] Compiling 560 Scala sources and 99 Java sources to 
> /Users/puigcalvachef/Documents/os/spark/core/target/scala-2.12/classes ...
> [ERROR] [Error] 
> /Users/puigcalvachef/Documents/os/spark/core/src/main/scala/org/apache/spark/ui/HttpSecurityFilter.scala:107:
>  type mismatch;
>  found   : K where type K
>  required: String
> [ERROR] [Error] 
> /Users/puigcalvachef/Documents/os/spark/core/src/main/scala/org/apache/spark/ui/HttpSecurityFilter.scala:107:
>  value map is not a member of V
> [ERROR] [Error] 
> /Users/puigcalvachef/Documents/os/spark/core/src/main/scala/org/apache/spark/ui/HttpSecurityFilter.scala:107:
>  missing argument list for method stripXSS in class XssSafeRequest
> Unapplied methods are only converted to functions when a function type is 
> expected.
> You can make this conversion explicit by writing `stripXSS _` or 
> `stripXSS(_)` instead of `stripXSS`.
> [ERROR] [Error] 
> /Users/puigcalvachef/Documents/os/spark/core/src/main/scala/org/apache/spark/ui/PagedTable.scala:307:
>  value startsWith is not a member of K
> [ERROR] four errors found
> {code}
> The minimal reproduction is to build only the Spark Core module with:
> {code:bash}
> ./build/mvn -Dhadoop.version=2.7.4 -pl :spark-core_2.12 -DskipTests clean install
> {code}
> After some testing, it appears that something changed between 3.1.1-rc1 and 
> 3.1.1-rc2 that caused the build to start failing.
> I tried several Hadoop versions that fail: 2.7.4, 2.8.1, and 2.8.5.
> The build succeeds with Hadoop 3.0.0.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
