The different protobuf classes are not generated on-the-fly in the ARM build, so it's likely that the checked-in source code is out of date.

https://github.com/apache/hadoop/blob/branch-3.4.3/hadoop-common-project/hadoop-common/src/main/arm-java/org/apache/hadoop/ipc/protobuf/ProtobufRpcEngineProtos.java

Thanks,
Cheng Pan

On Mar 8, 2026, at 22:58, Steve Loughran <[email protected]> wrote:

I spent a couple of hours this weekend getting Claude to answer a question for me: "do the x86 and aarch64 JAR files differ?"

The 3.4.3 release ended up using the ARM build as its source of JAR files, because the cloud x86 VM I was using for the release produced multiple staging repos in Apache Nexus, something suggested to be VPN/IP-address related: Nexus saw requests coming in from different source IP addresses and so assigned the artifacts to different repos.

I was curious whether the binaries were different, and also whether it would be possible to detect and defend against malicious release managers. That is: if I had added a back door into the code, would it have been detected?

Here then is my auditor: https://github.com/steveloughran/auditor

And here are the results: https://gist.github.com/steveloughran/d3c9ad6a718bfec68085b08584ae414e

The main issue to flag is that in hadoop-common the protobuf classes are somehow different. leveldbjni is different too, which is interesting but not too concerning, though auditor does flag that the differing code looks at the system environment, so it is extra suspicious. I doubt that's new; just something we've never noticed before.

Whatever the native protoc compilers are doing, they seem to be generating different classes, even with the same shaded protobuf being used.
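[For anyone who wants to reproduce the comparison by hand: the core of a jar-vs-jar audit like the one Steve describes can be sketched as a per-entry digest diff. This is a minimal illustration, not the auditor tool itself; the function names and jar paths are placeholders.]

```python
# Sketch: compare two jar (zip) files entry by entry by hashing the
# uncompressed bytes of each entry. Paths/names here are illustrative.
import hashlib
import zipfile

def entry_digests(jar_path):
    """Map each file entry name to the SHA-256 of its uncompressed bytes."""
    with zipfile.ZipFile(jar_path) as zf:
        return {info.filename: hashlib.sha256(zf.read(info)).hexdigest()
                for info in zf.infolist() if not info.is_dir()}

def diff_jars(left, right):
    """Return (only-in-left, only-in-right, changed) entry name lists."""
    a, b = entry_digests(left), entry_digests(right)
    only_left = sorted(set(a) - set(b))
    only_right = sorted(set(b) - set(a))
    changed = sorted(name for name in set(a) & set(b) if a[name] != b[name])
    return only_left, only_right, changed
```

Note that even identical source can yield byte-different class files (timestamps in the jar manifest, differing compiler versions), so a digest diff is a starting point: entries it flags, such as the generated protobuf classes here, then need class-level inspection.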
