Re: compile spark 3.1.1 error

2021-07-15 Thread jiahong li
Currently, no solution found.

Dereck Li
Apache Spark Contributor
Continuing Learner
@Hangzhou, China


On Tue, May 11, 2021 at 8:01 AM jason_xu wrote:

> Hi Jiahong, I got the same failure building Spark 3.1.1 with Hadoop 2.8.5.
> Any chance you found a solution?


Re: compile spark 3.1.1 error

2021-05-10 Thread jason_xu
Hi Jiahong, I got the same failure building Spark 3.1.1 with Hadoop 2.8.5.
Any chance you found a solution?






Fwd: compile spark 3.1.1 error

2021-03-12 Thread Attila Zsolt Piros
Hi!

The Zinc cache can be cleaned by the shutdown command I showed you earlier:
./build/zinc-0.3.15/bin/zinc -shutdown

But as I just saw in SPARK-34539
(https://issues.apache.org/jira/browse/SPARK-34539), it is absolutely not
needed anymore, as the standalone Zinc is not used by the compiler plugin
since v3.0.0.

Regarding make-distribution.sh with -Phadoop-2.7, I can confirm it is
working on my machine:

$ SPARK_HOME=$PWD ./dev/make-distribution.sh --name custom-spark --pip \
    --tgz -Phive -Phive-thriftserver -Pyarn -Phadoop-2.7 -DskipTests
...
[INFO] Reactor Summary for Spark Project Parent POM 3.1.1:
[INFO]
[INFO] Spark Project Parent POM .................... SUCCESS [  2.251 s]
[INFO] Spark Project Tags .......................... SUCCESS [  4.513 s]
[INFO] Spark Project Sketch ........................ SUCCESS [  4.878 s]
[INFO] Spark Project Local DB ...................... SUCCESS [  1.259 s]
[INFO] Spark Project Networking .................... SUCCESS [  3.173 s]
[INFO] Spark Project Shuffle Streaming Service ..... SUCCESS [  1.364 s]
[INFO] Spark Project Unsafe ........................ SUCCESS [  6.672 s]
[INFO] Spark Project Launcher ...................... SUCCESS [  1.782 s]
[INFO] Spark Project Core .......................... SUCCESS [01:48 min]
[INFO] Spark Project ML Local Library .............. SUCCESS [ 33.861 s]
[INFO] Spark Project GraphX ........................ SUCCESS [ 30.114 s]
[INFO] Spark Project Streaming ..................... SUCCESS [ 42.267 s]
[INFO] Spark Project Catalyst ...................... SUCCESS [02:15 min]
[INFO] Spark Project SQL ........................... SUCCESS [03:05 min]
[INFO] Spark Project ML Library .................... SUCCESS [02:16 min]
[INFO] Spark Project Tools ......................... SUCCESS [  7.109 s]
[INFO] Spark Project Hive .......................... SUCCESS [01:20 min]
[INFO] Spark Project REPL .......................... SUCCESS [ 20.758 s]
[INFO] Spark Project YARN Shuffle Service .......... SUCCESS [ 10.377 s]
[INFO] Spark Project YARN .......................... SUCCESS [ 47.571 s]
[INFO] Spark Project Hive Thrift Server ............ SUCCESS [ 36.327 s]
[INFO] Spark Project Assembly ...................... SUCCESS [  4.618 s]
[INFO] Kafka 0.10+ Token Provider for Streaming .... SUCCESS [ 18.895 s]
[INFO] Spark Integration for Kafka 0.10 ............ SUCCESS [ 28.380 s]
[INFO] Kafka 0.10+ Source for Structured Streaming . SUCCESS [ 02:00 h]
[INFO] Spark Project Examples ...................... SUCCESS [26:28 min]
[INFO] Spark Integration for Kafka 0.10 Assembly ... SUCCESS [  4.884 s]
[INFO] Spark Avro .................................. SUCCESS [ 35.016 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  02:43 h
[INFO] Finished at: 2021-03-11T08:12:02+01:00
[INFO] ------------------------------------------------------------------------
...
Creating tar archive
removing 'pyspark-3.1.1' (and everything under it)
+ popd
+ '[' false == true ']'
+ echo 'Skipping building R source package'
Skipping building R source package
+ mkdir /Users/attilazsoltpiros/git/attilapiros/spark/dist/conf
+ cp
/Users/attilazsoltpiros/git/attilapiros/spark/conf/fairscheduler.xml.template
/Users/attilazsoltpiros/git/attilapiros/spark/conf/log4j.properties.template
/Users/attilazsoltpiros/git/attilapiros/spark/conf/metrics.properties.template
/Users/attilazsoltpiros/git/attilapiros/spark/conf/spark-defaults.conf.template
/Users/attilazsoltpiros/git/attilapiros/spark/conf/spark-env.sh.template
/Users/attilazsoltpiros/git/attilapiros/spark/conf/workers.template
/Users/attilazsoltpiros/git/attilapiros/spark/dist/conf
+ cp /Users/attilazsoltpiros/git/attilapiros/spark/README.md
/Users/attilazsoltpiros/git/attilapiros/spark/dist
+ cp -r /Users/attilazsoltpiros/git/attilapiros/spark/bin
/Users/attilazsoltpiros/git/attilapiros/spark/dist
+ cp -r /Users/attilazsoltpiros/git/attilapiros/spark/python
/Users/attilazsoltpiros/git/attilapiros/spark/dist
+ '[' true == true ']'
+ rm -f
/Users/attilazsoltpiros/git/attilapiros/spark/dist/python/dist/pyspark-3.1.1.tar.gz
/Users/attilazsoltpiros/git/attilapiros/spark/dist/python/dist/pyspark-3.2.0.dev0.tar.gz
+ cp -r /Users/attilazsoltpiros/git/attilapiros/spark/sbin
/Users/attilazsoltpiros/git/attilapiros/spark/dist
+ '[' -d /Users/attilazsoltpiros/git/attilapiros/spark/R/lib/SparkR ']'
+ mkdir -p /Users/attilazsoltpiros/git/attilapiros/spark/dist/R/lib
+ cp -r /Users/attilazsoltpiros/git/attilapiros/spark/R/lib/SparkR
/Users/attilazsoltpiros/git/attilapiros/spark/dist/R/lib
+ cp /Users/attilazsoltpiros/git/attilapiros/spark/R/lib/sparkr.zip
/Users/attilazsoltpiros/git/attilapiros/spark/dist/R/lib
+ '[' true == true ']'
+ 

Re: compile spark 3.1.1 error

2021-03-10 Thread jiahong li
Maybe it is caused by my environment.

On Thu, Mar 11, 2021 at 11:14 AM jiahong li wrote:

> It's not the cause: when I set -Phadoop-2.7 instead of
> -Dhadoop.version=2.6.0-cdh5.13.1, the same errors come out.


Re: compile spark 3.1.1 error

2021-03-10 Thread jiahong li
It's not the cause: when I set -Phadoop-2.7 instead of
-Dhadoop.version=2.6.0-cdh5.13.1, the same errors come out.

On Wed, Mar 10, 2021 at 8:56 PM Attila Zsolt Piros wrote:

> I see; this must be because of the Hadoop version you are selecting with
> "-Dhadoop.version=2.6.0-cdh5.13.1". Spark 3.1.1 only supports hadoop-2.7
> and hadoop-3.2; at least these two can be given via profiles: -Phadoop-2.7
> and -Phadoop-3.2 (the default).


Re: compile spark 3.1.1 error

2021-03-10 Thread Attila Zsolt Piros
I see; this must be because of the Hadoop version you are selecting with
"-Dhadoop.version=2.6.0-cdh5.13.1". Spark 3.1.1 only supports hadoop-2.7
and hadoop-3.2; at least these two can be given via profiles: -Phadoop-2.7
and -Phadoop-3.2 (the default).
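As a minimal sketch (reusing exactly the flags from your command, with only
the unsupported -Dhadoop.version swapped for the profile), the corrected
invocation would be:

./dev/make-distribution.sh --name custom-spark --pip --tgz \
    -Phive -Phive-thriftserver -Pyarn -Phadoop-2.7 -DskipTests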


On Wed, Mar 10, 2021 at 12:26 PM jiahong li  wrote:

> I use ./build/mvn to compile. After executing
> ./build/zinc-0.3.15/bin/zinc -shutdown
> I run a command like this:
> ./dev/make-distribution.sh --name custom-spark --pip --tgz -Phive
> -Phive-thriftserver -Pyarn -Dhadoop.version=2.6.0-cdh5.13.1 -DskipTests
> and the same error appears. Running ps -ef | grep zinc shows nothing
> containing zinc.
>


Re: compile spark 3.1.1 error

2021-03-10 Thread jiahong li
I use ./build/mvn to compile. After executing
./build/zinc-0.3.15/bin/zinc -shutdown
I run a command like this:
./dev/make-distribution.sh --name custom-spark --pip --tgz -Phive
-Phive-thriftserver -Pyarn -Dhadoop.version=2.6.0-cdh5.13.1 -DskipTests
and the same error appears. Running ps -ef | grep zinc shows nothing
containing zinc.

On Wed, Mar 10, 2021 at 6:55 PM Attila Zsolt Piros wrote:

> Hi!
>
> Are you compiling Spark itself?
> Do you use "./build/mvn" from the project root?
> If you compiled another version of Spark before with a different Scala
> version, then Zinc/Nailgun could have cached the old classes, which can
> cause troubles like this. In that case this could help:
>
> ./build/zinc-0.3.15/bin/zinc -shutdown
>
> Best Regards,
> Attila


Re: compile spark 3.1.1 error

2021-03-10 Thread Attila Zsolt Piros
Hi!

Are you compiling Spark itself?
Do you use "./build/mvn" from the project root?
If you compiled another version of Spark before with a different Scala
version, then Zinc/Nailgun could have cached the old classes, which can
cause troubles like this. In that case this could help:

./build/zinc-0.3.15/bin/zinc -shutdown
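A minimal end-to-end sketch of that reset, run from the project root (the
ps check mirrors the one used elsewhere in this thread; the clean/package
goals are only illustrative):

./build/zinc-0.3.15/bin/zinc -shutdown   # stop any standalone Zinc/Nailgun server
ps -ef | grep zinc                       # verify no zinc process remains
./build/mvn clean package -DskipTests    # rebuild against a fresh class cache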

Best Regards,
Attila

On Wed, Mar 10, 2021 at 11:27 AM jiahong li  wrote:

> Hi everybody, when I compile Spark 3.1.1 from tag v3.1.1, I encounter
> errors like this:


compile spark 3.1.1 error

2021-03-10 Thread jiahong li
Hi everybody, when I compile Spark 3.1.1 from tag v3.1.1, I encounter
errors like this:

[INFO] --- scala-maven-plugin:4.3.0:compile (scala-compile-first) @ spark-core_2.12 ---
[INFO] Using incremental compilation using Mixed compile order
[INFO] Compiler bridge file: .sbt/1.0/zinc/org.scala-sbt/org.scala-sbt-compiler-bridge_2.12-1.3.1-bin_2.12.10__52.0-1.3.1_20191012T045515.jar
[INFO] compiler plugin: BasicArtifact(com.github.ghik,silencer-plugin_2.12.10,1.6.0,null)
[INFO] Compiling 560 Scala sources and 99 Java sources to git/spark/core/target/scala-2.12/classes ...
[ERROR] [Error] git/spark/core/src/main/scala/org/apache/spark/ui/HttpSecurityFilter.scala:107: type mismatch;
 found   : K where type K
 required: String
[ERROR] [Error] git/spark/core/src/main/scala/org/apache/spark/ui/HttpSecurityFilter.scala:107: value map is not a member of V
[ERROR] [Error] git/spark/core/src/main/scala/org/apache/spark/ui/HttpSecurityFilter.scala:107: missing argument list for method stripXSS in class XssSafeRequest
Unapplied methods are only converted to functions when a function type is expected.
You can make this conversion explicit by writing `stripXSS _` or `stripXSS(_)` instead of `stripXSS`.
[ERROR] [Error] git/spark/core/src/main/scala/org/apache/spark/ui/PagedTable.scala:307: value startsWith is not a member of K
[ERROR] [Error] git/spark/core/src/main/scala/org/apache/spark/util/Utils.scala:580: value toLowerCase is not a member of object org.apache.hadoop.util.StringUtils
[ERROR] 5 errors found

Has anybody encountered errors like this?
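One way to check the last error directly, since it says
org.apache.hadoop.util.StringUtils has no toLowerCase member: inspect the
hadoop-common jar the build resolved. A minimal sketch (the local Maven
repository path below is illustrative):

javap -classpath ~/.m2/repository/org/apache/hadoop/hadoop-common/2.6.0-cdh5.13.1/hadoop-common-2.6.0-cdh5.13.1.jar \
    org.apache.hadoop.util.StringUtils | grep toLowerCase
# empty output means the method is missing, matching the Utils.scala:580 error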