Re: maven doesn't build dependencies with Scala 2.11

2015-02-05 Thread Ted Yu
Now that Kafka 0.8.2.0 has been released, adding the external/kafka module
works.

FYI
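
For reference, with 0.8.2.0 out, the module's Kafka dependency can resolve
to a published 2.11 artifact, roughly like this (a sketch based on the
coordinates in the resolution error quoted below; the exact version in the
Spark POM may differ):

  <dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka_2.11</artifactId>
    <version>0.8.2.0</version>
  </dependency>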


Re: maven doesn't build dependencies with Scala 2.11

2015-01-18 Thread Ted Yu
bq. there was no 2.11 Kafka available

That's right. Adding the external/kafka module resulted in:

[ERROR] Failed to execute goal on project spark-streaming-kafka_2.11:
Could not resolve dependencies for project
org.apache.spark:spark-streaming-kafka_2.11:jar:1.3.0-SNAPSHOT: Could not
find artifact org.apache.kafka:kafka_2.11:jar:0.8.0 in central
(https://repo1.maven.org/maven2) -> [Help 1]

Cheers
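
For what it's worth, the _2.11 suffix in the missing artifact comes from
the POM appending the Scala binary version to the artifactId, so switching
to 2.11 changes which Kafka jar Maven looks for. A sketch of the pattern
(not the exact Spark POM):

  <dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka_${scala.binary.version}</artifactId>
    <version>0.8.0</version>
  </dependency>

With scala.binary.version set to 2.11, this asks for kafka_2.11:0.8.0,
which was never published to Central.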


Re: maven doesn't build dependencies with Scala 2.11

2015-01-18 Thread Sean Owen
I could be wrong, but I thought this was on purpose. At the time it
was set up, there was no 2.11 Kafka available, or one of its
dependencies wouldn't work with 2.11?

But I'm not sure what the OP means by "maven doesn't build Spark's
dependencies", because Ted indicates it does, and of course you can see
that these artifacts are published.




Re: maven doesn't build dependencies with Scala 2.11

2015-01-17 Thread Ted Yu
There are 3 jars under the lib_managed/jars directory both with and
without the -Dscala-2.11 flag.

The difference between the scala-2.10 and scala-2.11 profiles is that the
scala-2.10 profile has the following:
  <modules>
    <module>external/kafka</module>
  </modules>

FYI
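
For context, the profiles are selected by the -Dscala-2.11 property on the
command line; the parent POM activates one or the other depending on
whether that property is set. A simplified sketch of the pattern (not the
full Spark POM):

  <profile>
    <id>scala-2.11</id>
    <activation>
      <property>
        <name>scala-2.11</name>
      </property>
    </activation>
    <properties>
      <scala.binary.version>2.11</scala.binary.version>
    </properties>
  </profile>

Since only the scala-2.10 profile lists external/kafka under <modules>,
that module is simply skipped when building for 2.11.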


Re: maven doesn't build dependencies with Scala 2.11

2015-01-17 Thread Ted Yu
I did the following:

  dev/change-version-to-2.11.sh
  mvn -DHADOOP_PROFILE=hadoop-2.4 -Pyarn,hive -Phadoop-2.4 -Dscala-2.11 -DskipTests clean package

And the mvn command passed.

Did you see any cross-compilation errors?

Cheers

BTW, the two links you mentioned are consistent in terms of building for
Scala 2.11.
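
For anyone wondering, dev/change-version-to-2.11.sh essentially rewrites
the Scala suffix on the artifactIds across the POMs, along these lines (a
simplified sketch, not the script verbatim):

  # rewrite _2.10 artifact suffixes to _2.11 in every pom.xml
  find . -name pom.xml -not -path '*/target/*' \
    -exec sed -i -e 's/\(artifactId.*\)_2\.10/\1_2.11/g' {} \;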

On Sat, Jan 17, 2015 at 3:43 PM, Walrus theCat walrusthe...@gmail.com
wrote:

 Hi,

 When I run this:

 dev/change-version-to-2.11.sh
 mvn -Pyarn -Phadoop-2.4 -Dscala-2.11 -DskipTests clean package

 as per here
 (https://spark.apache.org/docs/latest/building-spark.html#building-for-scala-211),
 maven doesn't build Spark's dependencies.

 Only when I run:

 dev/change-version-to-2.11.sh
 sbt/sbt -Pyarn -Phadoop-2.4 -Dscala-2.11 -DskipTests clean package

 as gathered from here
 (https://github.com/ScrapCodes/spark-1/blob/patch-3/docs/building-spark.md),
 do I get Spark's dependencies built without any cross-compilation errors.

 *Questions*:

 - How can I make maven do this?

 - How can I specify the use of Scala 2.11 in my own .pom files?

 Thanks
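
On the second question: depending on the 2.11 builds from your own POM is
a matter of using the _2.11-suffixed artifactIds that Spark publishes. A
minimal sketch (the version shown is illustrative):

  <properties>
    <scala.binary.version>2.11</scala.binary.version>
  </properties>

  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_${scala.binary.version}</artifactId>
    <version>1.2.0</version>
  </dependency>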