Github user srowen commented on the pull request: https://github.com/apache/spark/pull/5786#issuecomment-101284746

@pwendell I don't think the previous update was wrong, certainly not for development. It was insufficient for creating a Hadoop 2.2 assembly from defaults, but that's not how Hadoop 2.2 assemblies are created. In that sense, this change is not required for the 1.4 release to be as correct as ever. Still, the idea is that it would be better to make the defaults fully consistent, as if they were ready for a Hadoop 2.2 assembly.

I think the cat is out of the bag on #5027; I believe 1.3 was accidentally released with, effectively, this change. So I don't think we should undo it, certainly not if it's solving more problems than it causes. (This is not at all about building for CDH.)

This doesn't remove any profiles, in order to reduce impact on build scripts -- otherwise `-Phadoop-2.2` would start being an error. However, it must add a `hadoop-1` profile to allow selecting the Hadoop 1.x settings. This "profile" has always silently existed as the unofficial collection of defaults. Adding it does indeed require a developer change, but only for those who need to build for Hadoop 1.x explicitly; it at least makes that choice explicit.

The cleanup is appealing, of course. I would campaign modestly for introducing this in 1.4. If the above hasn't changed your mind here, though, then let's just do nothing for 1.4 and put this into master for 1.5. By that point I think the case will be stronger still, and the small subset of people who need to build for Hadoop 1.x will have had time to get used to the change.
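For illustration, an explicit `hadoop-1` profile that captures the formerly implicit defaults might look roughly like the following POM fragment. This is a hedged sketch only: the property names and version number here are assumptions for illustration, not necessarily what the Spark build actually uses.

```xml
<!-- Hypothetical sketch of an explicit hadoop-1 profile; the property
     names and the version value are illustrative assumptions, not a
     copy of Spark's actual POM. -->
<profile>
  <id>hadoop-1</id>
  <properties>
    <!-- Pin the Hadoop 1.x line that used to be the silent default. -->
    <hadoop.version>1.0.4</hadoop.version>
  </properties>
</profile>
```

With something like this in place, a Hadoop 1.x build becomes an explicit choice (e.g. `mvn -Phadoop-1 package`), while the unqualified default build targets the Hadoop 2.x settings.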