Currently, no solution found!
Dereck Li
Apache Spark Contributor
Continuing Learner
@Hangzhou,China
On Tue, May 11, 2021 at 8:01 AM, jason_xu wrote:
> Hi Jiahong, I got the same failure building Spark 3.1.1 with Hadoop
> 2.8.5.
> Any chance you found a solution?
>
>
>
> --
> Sent from: http://apache-spark-u
Hi everyone,
when I compile together with Hadoop version 2.6.0-cdh5.13.1, the compile command is
./dev/make-distribution.sh --name 2.6.0-cdh5.13.1 --pip --tgz -Phive
-Phive-thriftserver -Pyarn -Dhadoop.version=2.6.0-cdh5.13.1,
and an error like this occurs:
[INFO] --- scala-maven-plugin:4.3.0:compile
Maybe my environment is the cause.
On Thu, Mar 11, 2021 at 11:14 AM, jiahong li wrote:
> It is not the cause; when I set -Phadoop-2.7 instead of
> -Dhadoop.version=2.6.0-cdh5.13.1, the same errors come out.
>
> On Wed, Mar 10, 2021 at 8:56 PM, Attila Zsolt Piros wrote:
>
>> I see, this must be because of h
Spark 3.1.1 only supports hadoop-2.7 and hadoop-3.2; at least these two can
> be given via profiles: -Phadoop-2.7 and -Phadoop-3.2 (the default).
>
>
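For reference, a supported invocation sketched as a dry run (the `--name` value and the `echo` wrapper are illustrative, not from the thread):

```shell
# Dry-run sketch of a Spark 3.1.1 distribution build using a supported
# Hadoop profile (-Phadoop-2.7 here; -Phadoop-3.2 is the default).
HADOOP_PROFILE="-Phadoop-2.7"
BUILD_CMD="./dev/make-distribution.sh --name hadoop-2.7 --pip --tgz \
-Phive -Phive-thriftserver -Pyarn ${HADOOP_PROFILE}"
# Echo instead of executing: the real build needs a full Spark source tree.
echo "${BUILD_CMD}"
```

The point is that the Hadoop line is selected with a `-P` profile, not with a bare `-Dhadoop.version=` pointing at an unsupported release.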
> On Wed, Mar 10, 2021 at 12:26 PM jiahong li
> wrote:
>
>> I use ./build/mvn to compile, and after executing the command
>>
sion was different, then zinc/nailgun could have cached the old classes, which
> can cause similar troubles.
> In that case this could help:
>
> ./build/zinc-0.3.15/bin/zinc -shutdown
>
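The recovery step above as a dry-run sketch (the trailing `clean` is my addition, not part of the original advice; the zinc path matches the one bundled in the Spark 3.1.1 tree):

```shell
# Dry-run sketch: shut down any lingering zinc/nailgun server so stale
# cached classes are discarded, then clean before recompiling.
STEPS="./build/zinc-0.3.15/bin/zinc -shutdown
./build/mvn clean"
# Echoed rather than executed; run these from the Spark source root.
echo "$STEPS"
```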
> Best Regards,
> Attila
>
> On Wed, Mar 10, 2021 at 11:27 AM jiahong li
> wrote:
>
Hi everybody, when I compile Spark 3.1.1 from tag v3.1.1, I encounter an error
like this:
[INFO] --- scala-maven-plugin:4.3.0:compile (scala-compile-first) @
spark-core_2.12 ---
[INFO] Using incremental compilation using Mixed compile order
[INFO] Compiler bridge file:
.sbt/1.0/zinc/org.scala-sbt/org.s
> On 03/10/2021 10:56, jiahong li
> wrote:
>
> Hi, sorry to bother you. In Spark 3.0.1, hive-1.2 is supported, but in Spark
> 3.1.x the hive-1.2 Maven profile is removed. Does that mean hive-1.2 is not
> supported in Spark 3.1.x? How can I support hive-1.2 in Spark 3.1.x, and is
> there any related JIRA? Can anyone help me?
>
>
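One possible direction (an assumption on my part, not confirmed in this thread): build with the default hive-2.3 profile, and point the resulting Spark at the older metastore through the documented `spark.sql.hive.metastore.version` / `spark.sql.hive.metastore.jars` settings. A minimal sketch writing such a conf fragment (the file name is illustrative):

```shell
# Sketch: conf fragment telling a hive-2.3-built Spark 3.1.x to talk to a
# Hive 1.2 metastore, with isolated metastore jars fetched from Maven.
CONF=spark-defaults.conf.fragment
cat > "$CONF" <<'EOF'
spark.sql.hive.metastore.version  1.2.2
spark.sql.hive.metastore.jars     maven
EOF
cat "$CONF"
```

This only covers metastore compatibility, not compiling Spark itself against Hive 1.2 libraries.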