Github user srowen commented on the pull request:

    https://github.com/apache/spark/pull/5786#issuecomment-101438765
  
    `spark-shell` works locally for me. You're right, this may not work on a 
Hadoop cluster running version X, but haven't we always generally needed to 
build Spark against Hadoop X to avoid this? I get it, though: maybe the 
mismatched Hadoop client libs don't work, whereas a consistent Hadoop 1.x 
client lib set did, even against a different cluster version.
    
    Fair point and all that, but this isn't the right way to build Spark 
anyway, and I'm afraid this change was effectively already released. I'm 
narrowly arguing against undoing the `hadoop.version=2.2.0` change, and also 
asserting that the 1.4 release artifacts will be fine.
    
    And then saying we should fix forward the rest of this for 1.5, if not 1.4.
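
    For context, "building Spark for Hadoop X" here means passing the target 
Hadoop version to Maven at build time. A sketch of what that looked like for 
Hadoop 2.2 in the Spark 1.x era (profile and property names are taken from the 
Spark 1.x building docs and may differ in other releases):

    ```shell
    # Illustrative only: build Spark against Hadoop 2.2.0 client libraries.
    # -Phadoop-2.2 selects the Hadoop 2.2 build profile, and
    # -Dhadoop.version pins the exact dependency version it resolves.
    build/mvn -Pyarn -Phadoop-2.2 -Dhadoop.version=2.2.0 -DskipTests clean package
    ```

    The resulting assembly then bundles a consistent set of Hadoop 2.2 client 
jars, which is the consistency the comment above is concerned about.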

