GitHub user rmetzger commented on the pull request:

    https://github.com/apache/incubator-flink/pull/252#issuecomment-66112318
  
    The reason I added the dependency on `hadoop-common` is that I need it to instantiate `NullWritable`.
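
    A minimal sketch of what that instantiation looks like (the class name `NullWritableProbe` is just for this example); it only compiles and runs with `hadoop-common` on the classpath:

    ```java
    import org.apache.hadoop.io.NullWritable;

    public class NullWritableProbe {
        public static void main(String[] args) {
            // new NullWritable() does not compile: the constructor is private.
            // The singleton instance is obtained through the factory method.
            NullWritable nw = NullWritable.get();
            System.out.println(nw);
        }
    }
    ```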
    
    Before that, we only needed the `Writable` interface, which doesn't require any other classes from Hadoop. `NullWritable`, however, depends on a lot of classes. I first tried copy-pasting the required classes from Hadoop, but after the 5th file or so I gave up; I don't know how many files it would have been in the end.
    But shipping copied Hadoop code quickly becomes dangerous anyway, due to incompatible versions.
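
    For reference, the `Writable` interface itself is self-contained. Simplified (Javadoc and annotations omitted), it only touches `java.io` types, which is why that single interface could be used without pulling in the rest of Hadoop:

    ```java
    package org.apache.hadoop.io;

    import java.io.DataInput;
    import java.io.DataOutput;
    import java.io.IOException;

    // Serialization contract for Hadoop types: write the fields to a
    // DataOutput, read them back from a DataInput.
    public interface Writable {
        void write(DataOutput out) throws IOException;
        void readFields(DataInput in) throws IOException;
    }
    ```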
    
    Also, at the end of the day, people are going to have the Hadoop jars on their classpath anyway, because flink-runtime depends on them.
    
    The only argument left is probably the collection-based execution mode. I think that one only requires flink-core and flink-java. But if somebody has an issue with the Hadoop dependency, they can exclude it.
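
    For example, a minimal sketch of such an exclusion in a Maven POM (the `flink-java` coordinates and the version property are illustrative):

    ```xml
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-java</artifactId>
        <version>${flink.version}</version>
        <exclusions>
            <!-- drop the transitive Hadoop dependency -->
            <exclusion>
                <groupId>org.apache.hadoop</groupId>
                <artifactId>hadoop-common</artifactId>
            </exclusion>
        </exclusions>
    </dependency>
    ```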

