Hi Sean:
I have tried to use the following script to build a package but have run into
a problem (I am building a Spark package for Hive on Spark, so I use
hadoop2-without-hive):
./dev/make-distribution.sh --name hadoop2-without-hive --tgz -Pscala-2.12 \
  -Phadoop-2.7 -Pyarn -Pparquet-provided -Dhadoop.version=2.7.3
The problem is:
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal
on project spark-sketch_2.11: Could not resolve dependencies for project
org.apache.spark:spark-sketch_2.11:jar:2.3.0-SNAPSHOT: The following
artifacts could not be resolved:
org.apache.spark:spark-tags_2.12:jar:2.3.0-SNAPSHOT,
org.apache.spark:spark-tags_2.12:jar:tests:2.3.0-SNAPSHOT: Failure to find
org.apache.spark:spark-tags_2.12:jar:2.3.0-SNAPSHOT in
https://repository.apache.org/snapshots was cached in the local repository
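My understanding (please correct me if I am wrong) is that the mismatch comes
from the way the module poms are written: dependencies pick up the Scala
suffix from the scala.binary.version property, while each module's own
artifactId is hardcoded. A sketch of the pattern, paraphrased from the poms:

  <!-- common/sketch/pom.xml: the dependency suffix follows the active Scala profile -->
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-tags_${scala.binary.version}</artifactId>
    <version>${project.version}</version>
  </dependency>

  <!-- common/tags/pom.xml: the module's own artifactId is hardcoded -->
  <artifactId>spark-tags_2.11</artifactId>

So with -Pscala-2.12 active, Maven resolves spark-tags_2.12, which is never
built or published anywhere.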
I then found that the artifactId in $SPARK_SOURCE/common/tags/pom.xml is
spark-tags_2.11. After I changed $SPARK_SOURCE/common/tags/pom.xml as follows,
the problem with org.apache.spark:spark-tags_2.12:jar:2.3.0-SNAPSHOT seems to
go away:
git diff common/tags/pom.xml
diff --git a/common/tags/pom.xml b/common/tags/pom.xml
index f7e586e..5f48105 100644
--- a/common/tags/pom.xml
+++ b/common/tags/pom.xml
@@ -26,7 +26,8 @@
<relativePath>../../pom.xml</relativePath>
</parent>
- <artifactId>spark-tags_2.11</artifactId>
+ <!--<artifactId>spark-tags_2.11</artifactId>-->
+ <artifactId>spark-tags_2.12</artifactId>
<packaging>jar</packaging>
<name>Spark Project Tags</name>
<url>http://spark.apache.org/</url>
My question is:
1. Should I change $SPARK_SOURCE/common/tags/pom.xml manually to
spark-tags_2.12? If so, I guess I also need to change spark-streaming and the
other modules. (Or is there a script that does this, as sketched below?)
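On that point, the source tree seems to ship a helper,
dev/change-scala-version.sh, that rewrites the _2.11 suffix in every module
pom at once. A sketch of how I would use it, assuming this snapshot's script
accepts 2.12:

  ./dev/change-scala-version.sh 2.12
  ./dev/make-distribution.sh --name hadoop2-without-hive --tgz -Pscala-2.12 \
    -Phadoop-2.7 -Pyarn -Pparquet-provided -Dhadoop.version=2.7.3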
I would appreciate some feedback from you.
Best Regards
Kelly Zhang/Zhang,Liyun
From: Sean Owen [mailto:[email protected]]
Sent: Tuesday, November 28, 2017 9:52 PM
To: Ofir Manor <[email protected]>
Cc: Zhang, Liyun <[email protected]>; dev <[email protected]>
Subject: Re: Does anyone know how to build spark with scala12.4?
The Scala 2.12 profile mostly works, but not all tests pass. Use -Pscala-2.12
on the command line to build.
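For example, a minimal invocation might look like this (a sketch; it assumes
the module poms have first been switched to the _2.12 suffix, e.g. via
dev/change-scala-version.sh):

  ./dev/change-scala-version.sh 2.12
  ./build/mvn -Pscala-2.12 -DskipTests clean package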
On Tue, Nov 28, 2017 at 5:36 AM Ofir Manor <[email protected]> wrote:
Hi,
As far as I know, Spark does not support Scala 2.12 yet.
There is ongoing work to refactor/fix the Spark source code to support Scala
2.12 - look for multiple emails on this list in recent months from Sean Owen
about his progress.
Once Spark supports Scala 2.12, I think the next target would be JDK 9 support.
Ofir Manor
Co-Founder & CTO | Equalum
Mobile: +972-54-7801286 | Email: [email protected]
On Tue, Nov 28, 2017 at 9:20 AM, Zhang, Liyun <[email protected]> wrote:
Hi all:
Does anyone know how to build Spark with Scala 2.12.4? I want to test whether
Spark can work on JDK 9 or not; Scala 2.12.4 supports JDK 9. Has anyone tried
to build Spark with Scala 2.12.4, or compiled it successfully with JDK 9? I
would appreciate some feedback from you.
Best Regards
Kelly Zhang/Zhang,Liyun