Hi! I was trying to clean up the Hadoop 2 profile in connectors, but failed because of some maven-flatten-plugin issues.
It would make our lives much easier to drop Hadoop 2 support from the repo. For Spark, Hadoop 2/3 compatibility should already be taken care of by the appropriate hadoop-mapreduce-shaded artifact, so I don't think this would hurt Hadoop 2 compatibility, and we could clean up the test dependencies. For Kafka, this would let us include newer, less CVE-riddled artifacts in the distro. WDYT?

Istvan