But as you can see from the test results on that PR, there's stuff in
hadoop-hdfs and hadoop-mapreduce which expects those codecs to always be
on the classpath.

It sounds like an incompatible change to hadoop-common rather than
a test-scope issue in HDFS and MapReduce.
How about making hadoop-compression a compile-scope dependency of
hadoop-common first, then removing it in the next major release (4.0.0)?
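A minimal sketch of what that could look like in hadoop-common's pom
(the exact coordinates and the version property are my assumptions, not
taken from the PR):

    <!-- hadoop-common/pom.xml: compile scope keeps the codecs on the
         classpath of downstream modules until we drop this in 4.0.0 -->
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-compression</artifactId>
      <version>${project.version}</version>
      <scope>compile</scope>
    </dependency>

That way nothing changes for 3.x users, and the removal becomes a
clearly flagged incompatible change in 4.0.0.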

Thanks,
Masatake Iwasaki

On 2021/01/12 22:58, Steve Loughran wrote:
Following on from discussions in a previous PR, Liang-Chi Hsieh has
created a PR to add a new hadoop-compression module, with the goal being
that "new compression codecs and their dependencies can go in here
rather than hadoop-common"

https://github.com/apache/hadoop/pull/2611

I think this is the right thing to do, as it keeps those dependencies
out of hadoop-common and isolates the changes.
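For illustration, under that layout a new codec's third-party
dependency would be declared in the new module rather than in
hadoop-common (the artifact below is a made-up placeholder, not
something from the PR):

    <!-- hadoop-compression/pom.xml: hypothetical third-party codec
         dependency; "some-codec-lib" is a placeholder -->
    <dependency>
      <groupId>io.example</groupId>
      <artifactId>some-codec-lib</artifactId>
      <version>1.0.0</version>
    </dependency>

so hadoop-common's dependency tree stays unchanged as codecs are added.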

But as you can see from the test results on that PR, there's stuff in
hadoop-hdfs and hadoop-mapreduce which expects those codecs to always be
on the classpath.

What to do?

1. We add the new pom as a dependency of the hadoop-hdfs server modules
and their tests (but not hdfs-client), and of the MR tests (see the
sketch after this list).
2. We leave the old codecs in hadoop-common, and only the recent codecs
go into the new module.
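A sketch of option 1 on the HDFS side, assuming test scope is enough
there (any server-side production use would need compile or runtime
scope instead):

    <!-- hadoop-hdfs/pom.xml (and similarly the MR test modules):
         pull the codecs in for tests only, keeping hdfs-client clean -->
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-compression</artifactId>
      <version>${project.version}</version>
      <scope>test</scope>
    </dependency>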

Suggestions?

