Now that Kafka 0.8.2.0 has been released, adding the external/kafka module
works.
FYI
On Sun, Jan 18, 2015 at 7:36 PM, Ted Yu yuzhih...@gmail.com wrote:
bq. there was no 2.11 Kafka available
That's right. Adding external/kafka module resulted in:
[ERROR] Failed to execute goal on project spark-streaming-kafka_2.11: Could
not resolve dependencies for project
org.apache.spark:spark-streaming-kafka_2.11:jar:1.3.0-SNAPSHOT: Could not
find artifact
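The artifact name in the error is cut off above, but given the fix noted at the top of the thread, the unresolvable dependency was presumably the Scala 2.11 build of Kafka itself. As a rough sketch only (the version and property name are assumptions about how the pom is parameterized, not taken from this thread), the module would need something like:

```
<!-- Illustrative only: the Kafka dependency that had no Scala 2.11
     artifact before Kafka 0.8.2.0 shipped. Version and property name
     here are assumptions, not copied from Spark's actual pom. -->
<dependency>
  <groupId>org.apache.kafka</groupId>
  <artifactId>kafka_${scala.binary.version}</artifactId>
  <version>0.8.2.0</version>
</dependency>
```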
I could be wrong, but I thought this was on purpose. At the time it
was set up, there was no 2.11 Kafka available, or one of its
dependencies wouldn't work with 2.11?
But I'm not sure what the OP means by "maven doesn't build Spark's
dependencies", because Ted indicates it does, and of course you
Hi,
When I run this:
dev/change-version-to-2.11.sh
mvn -Pyarn -Phadoop-2.4 -Dscala-2.11 -DskipTests clean package
as described here:
https://spark.apache.org/docs/latest/building-spark.html#building-for-scala-211,
maven doesn't build Spark's dependencies.
Only when I run:
There are 3 jars under the lib_managed/jars directory both with and
without the -Dscala-2.11 flag.
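For what it's worth, dev/change-version-to-2.11.sh essentially rewrites the Scala binary-version suffix in the poms before the 2.11 build. A minimal self-contained sketch of that rewrite (the file path and artifactId below are stand-ins for the demo, not Spark's real pom.xml):

```shell
# Sketch of what dev/change-version-to-2.11.sh does to the poms:
# rewrite the Scala binary-version suffix from _2.10 to _2.11.
# The pom fragment below is a stand-in, not Spark's real pom.xml.
cat > /tmp/demo-pom.xml <<'EOF'
<artifactId>spark-streaming-kafka_2.10</artifactId>
EOF
# Print the rewritten line (the real script edits the files in place)
sed 's/_2\.10/_2.11/g' /tmp/demo-pom.xml
```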
The difference between the scala-2.10 and scala-2.11 profiles is that the
scala-2.10 profile has the following:

<modules>
  <module>external/kafka</module>
</modules>
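That profile difference can be made visible with a quick grep; the pom fragment below is a simplified stand-in for Spark's real pom.xml, written out locally so the commands are self-contained:

```shell
# Simplified stand-in for the profiles section of Spark's pom.xml:
# scala-2.10 lists the external/kafka module, scala-2.11 does not.
cat > /tmp/profiles.xml <<'EOF'
<profiles>
  <profile>
    <id>scala-2.10</id>
    <modules>
      <module>external/kafka</module>
    </modules>
  </profile>
  <profile>
    <id>scala-2.11</id>
  </profile>
</profiles>
EOF
# Show each profile id alongside any modules it declares
grep -E '<id>|<module>' /tmp/profiles.xml
```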
FYI
On Sat, Jan 17, 2015 at 4:07 PM, Ted Yu
I did the following:
dev/change-version-to-2.11.sh
mvn -DHADOOP_PROFILE=hadoop-2.4 -Pyarn,hive -Phadoop-2.4
-Dscala-2.11 -DskipTests clean package
And the mvn command passed.
Did you see any cross-compilation errors?
Cheers
BTW the two links you mentioned are consistent in terms of