I'm trying to build Spark 1.4.1 against CDH 5.3.2. I created a profile
called cdh5.3.2 in the parent pom (spark_parent.pom), made some changes
under sql/hive/v0.13.1, and the build finished successfully.
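For reference, the profile I added looks roughly like the sketch below. Spark's parent pom drives the `hadoop-client` dependency through the `hadoop.version` property, which defaults to 2.2.0 in the 1.x line; the `yarn.version` property shown here is an assumption based on the same convention:

```xml
<!-- Sketch of the CDH profile in Spark's parent pom. The property names
     follow Spark's build conventions: hadoop.version controls the
     hadoop-client artifact version pulled in by the build. -->
<profile>
  <id>cdh5.3.2</id>
  <properties>
    <hadoop.version>2.5.0-cdh5.3.2</hadoop.version>
    <yarn.version>2.5.0-cdh5.3.2</yarn.version>
  </properties>
</profile>
```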

Here is my problem:

   - If I run `mvn -Pcdh5.3.2,yarn,hive install`, the artifacts are
   installed into my local repo.
   - I expected the `hadoop-client` version to be
   `hadoop-client-2.5.0-cdh5.3.2`, but it is actually `hadoop-client-2.2.0`.

If I instead add a dependency on `spark-sql-1.2.0-cdh5.3.2`, the resolved
version is `hadoop-client-2.5.0-cdh5.3.2`.
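To see which `hadoop-client` version Maven actually resolves with the profiles active, the `dependency:tree` goal is useful (the `-Dincludes` filter narrows the output to the artifact in question):

```shell
# Print the resolved hadoop-client version for the active profiles.
mvn -Pcdh5.3.2,yarn,hive dependency:tree \
    -Dincludes=org.apache.hadoop:hadoop-client
```

If this still shows 2.2.0, the profile's property override is not taking effect (e.g. the profile isn't activating, or another profile sets `hadoop.version` later).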

What's the trick behind this?
