Github user nkronenfeld commented on the pull request:

    https://github.com/apache/spark/pull/5565#issuecomment-94554558
  
    @srowen For what it's worth, I'm sure that part works out; the issue is whether the surrounding application structure can fit into that paradigm.
    
    Just to be clear, I didn't really expect this to ever get merged in as is 
:-)  I am hoping for a few things from this PR:
    
    1. I wanted a list of the points at which the various APIs differ and need to be unified
    2. I wanted to know what sort of compatibility was needed (code-compatible, which this is with one exception, vs. binary-compatible, which it definitely isn't)
    3. I was hoping to foster in the community some sense that this was a goal 
we could work towards
      * So that when we got to the next point where we could make compatibility 
changes, we would be ready to do so
      * And to try to prevent further changes from moving the APIs even farther apart.
    4. And, lastly, I was hoping that someone whose Scala was better than mine might have some ideas on how to do this without as many API changes (for instance, modifying this so that we could have all 4 versions of reduceByKey, with the bad one deprecated, and still have the compiler infer types correctly -- see the sketch below)
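
    Roughly what I mean by "all 4 versions of reduceByKey" is sketched below -- illustrative signatures only, not the exact Spark ones, and which ordering ends up deprecated (here the partitioner-first one, purely for illustration) is itself part of the discussion:

        import org.apache.spark.Partitioner

        // Illustrative stand-in for the pair-RDD / pair-DStream APIs; not the real signatures.
        trait PairOps[K, V] {
          // The function-first overloads:
          def reduceByKey(func: (V, V) => V): PairOps[K, V]
          def reduceByKey(func: (V, V) => V, numPartitions: Int): PairOps[K, V]
          def reduceByKey(func: (V, V) => V, partitioner: Partitioner): PairOps[K, V]

          // The partitioner-first ordering, kept for code compatibility but deprecated in this sketch:
          @deprecated("use reduceByKey(func, partitioner) instead", "x.y")
          def reduceByKey(partitioner: Partitioner, func: (V, V) => V): PairOps[K, V]
        }

    The open question is whether, with both Partitioner-taking orderings present, a call like pairs.reduceByKey(_ + _, partitioner) still lets the compiler infer the closure's parameter types, or whether overload resolution forces callers to write them out.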
    


