+1 on cleaning up hadoop 2.x from the repository. 

It would also help get rid of the many CVE-prone and mingled jars in the 
assembly. Also, as far as I can tell, no module other than `spark/hbase-spark` 
even supports `hadoop.profile`; Hadoop 2.x is hardcoded everywhere else. 
This is quite troublesome when building for Hadoop 3 only. As a hack, in our 
organisation we are forced to build with the `hadoop-two.version` flag set to 
the same value as `hadoop-three.version`.
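
For illustration, the hack described above looks roughly like this (the exact Hadoop release and profile id are assumptions for the example, not taken from this thread):

```shell
# Hypothetical workaround: force both Hadoop version properties to the same
# Hadoop 3 release, so modules that hardcode hadoop-two.version still end up
# resolving Hadoop 3 artifacts. Version numbers here are examples only.
mvn clean install \
  -Dhadoop.profile=3.0 \
  -Dhadoop-two.version=3.3.6 \
  -Dhadoop-three.version=3.3.6
```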

If we really do want to keep Hadoop 2 support, maybe we should add 
`hadoop.profile` handling across all modules instead.
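
A sketch of what per-module profile handling could look like, assuming a profile keyed on the same `hadoop.profile` property that `spark/hbase-spark` already uses (the profile id and property names below are assumptions, not an existing pom):

```xml
<!-- Hypothetical pom.xml fragment: pick the Hadoop 3 line when
     -Dhadoop.profile=3.0 is passed, instead of hardcoding Hadoop 2.x. -->
<profiles>
  <profile>
    <id>hadoop-3.0</id>
    <activation>
      <property>
        <name>hadoop.profile</name>
        <value>3.0</value>
      </property>
    </activation>
    <properties>
      <hadoop.version>${hadoop-three.version}</hadoop.version>
    </properties>
  </profile>
</profiles>
```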

Regards,
Nihal

On 2025/05/09 05:53:30 Istvan Toth wrote:
> Hi!
> 
> I was trying to clean up the Hadoop 2 profile in connectors (but failed
> because of some kind of maven-flatten-plugin issues):
> 
> It would make our life much easier to drop Hadoop2 support from the repo.
> 
> For spark, Hadoop2/3 compatibility should be taken care of using the
> appropriate hadoop-mapreduce-shaded, so I don't think this would hurt
> Hadoop2 compatibility, but we could clean up the test dependencies.
> 
> For Kafka, this would result in including newer and less CVE-riddled
> artifacts in the distro.
> 
> WDYT ?
> 
> Istvan
> 
