[ https://issues.apache.org/jira/browse/HADOOP-19174?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Bilwa S T updated HADOOP-19174:
-------------------------------
    Description: 
There are two issues here:

*1. We are running Tez 0.10.3, which uses Hadoop 3.3.6. Tez ships protobuf 3.21.1.*

Below is the exception we get. It is caused by protobuf-2.5.0 on the Hadoop classpath:
{code:java}
java.lang.IllegalAccessError: class org.apache.tez.dag.api.records.DAGProtos$ConfigurationProto tried to access private field com.google.protobuf.AbstractMessage.memoizedSize (org.apache.tez.dag.api.records.DAGProtos$ConfigurationProto and com.google.protobuf.AbstractMessage are in unnamed module of loader 'app')
    at org.apache.tez.dag.api.records.DAGProtos$ConfigurationProto.getSerializedSize(DAGProtos.java:21636)
    at com.google.protobuf.AbstractMessageLite.writeTo(AbstractMessageLite.java:75)
    at org.apache.tez.common.TezUtils.writeConfInPB(TezUtils.java:170)
    at org.apache.tez.common.TezUtils.createByteStringFromConf(TezUtils.java:83)
    at org.apache.tez.common.TezUtils.createUserPayloadFromConf(TezUtils.java:101)
    at org.apache.tez.dag.app.DAGAppMaster.serviceInit(DAGAppMaster.java:436)
    at org.apache.hadoop.service.AbstractService.init(AbstractService.java:164)
    at org.apache.tez.dag.app.DAGAppMaster$9.run(DAGAppMaster.java:2600)
    at java.base/java.security.AccessController.doPrivileged(AccessController.java:712)
    at java.base/javax.security.auth.Subject.doAs(Subject.java:439)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1899)
    at org.apache.tez.dag.app.DAGAppMaster.initAndStartAppMaster(DAGAppMaster.java:2597)
    at org.apache.tez.dag.app.DAGAppMaster.main(DAGAppMaster.java:2384)
2024-04-18 16:27:54,741 [INFO] [shutdown-hook-0] |app.DAGAppMaster|: DAGAppMasterShutdownHook invoked
2024-04-18 16:27:54,743 [INFO] [shutdown-hook-0] |service.AbstractService|: Service org.apache.tez.dag.app.DAGAppMaster failed in state STOPPED
java.lang.NullPointerException: Cannot invoke "org.apache.tez.dag.app.rm.TaskSchedulerManager.initiateStop()" because "this.taskSchedulerManager" is null
    at org.apache.tez.dag.app.DAGAppMaster.initiateStop(DAGAppMaster.java:2111)
    at org.apache.tez.dag.app.DAGAppMaster.serviceStop(DAGAppMaster.java:2126)
    at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:220)
    at org.apache.tez.dag.app.DAGAppMaster$DAGAppMasterShutdownHook.run(DAGAppMaster.java:2432)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
    at java.base/java.lang.Thread.run(Thread.java:840)
2024-04-18 16:27:54,744 [WARN] [Thread-2] |util.ShutdownHookManager|: ShutdownHook 'DAGAppMasterShutdownHook' failed, java.util.concurrent.ExecutionException: java.lang.NullPointerException: Cannot invoke "org.apache.tez.dag.app.rm.TaskSchedulerManager.initiateStop()" because "this.taskSchedulerManager" is null
java.util.concurrent.ExecutionException: java.lang.NullPointerException: Cannot invoke "org.apache.tez.dag.app.rm.TaskSchedulerManager.initiateStop()" because "this.taskSchedulerManager" is null
    at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
    at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:205)
    at org.apache.hadoop.util.ShutdownHookManager.executeShutdown(ShutdownHookManager.java:124)
    at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:95)
Caused by: java.lang.NullPointerException: Cannot invoke "org.apache.tez.dag.app.rm.TaskSchedulerManager.initiateStop()" because "this.taskSchedulerManager" is null
    at org.apache.tez.dag.app.DAGAppMaster.initiateStop(DAGAppMaster.java:2111)
    at org.apache.tez.dag.app.DAGAppMaster.serviceStop(DAGAppMaster.java:2126)
    at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:220)
    at org.apache.tez.dag.app.DAGAppMaster$DAGAppMasterShutdownHook.run(DAGAppMaster.java:2432)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
    at java.base/java.lang.Thread.run(Thread.java:840)
{code}
*2. Running Hive, which ships protobuf 3.24.4, with Hadoop 3.3.6*

Containers fail with the exception below:
{code:java}
2024-04-20 13:23:28,008 [INFO] [Dispatcher thread {Central}] |container.AMContainerImpl|: Container container_e02_1713455139547_0111_01_000004 exited with diagnostics set to Container failed, exitCode=-1000. [2024-04-20 13:23:27.799]com/google/protobuf/ServiceException
java.lang.NoClassDefFoundError: com/google/protobuf/ServiceException
    at org.apache.hadoop.hdfs.protocolPB.PBHelperClient.convert(PBHelperClient.java:807)
    at org.apache.hadoop.hdfs.protocolPB.PBHelperClient.convertLocatedBlockProto(PBHelperClient.java:680)
    at org.apache.hadoop.hdfs.protocolPB.PBHelperClient.convertLocatedBlocks(PBHelperClient.java:985)
    at org.apache.hadoop.hdfs.protocolPB.PBHelperClient.convert(PBHelperClient.java:837)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getBlockLocations(ClientNamenodeProtocolTranslatorPB.java:337)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.base/java.lang.reflect.Method.invoke(Method.java:568)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:433)
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:166)
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:158)
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:96)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:362)
    at jdk.proxy2/jdk.proxy2.$Proxy16.getBlockLocations(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.callGetBlockLocations(DFSClient.java:900)
    at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:889)
    at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:878)
    at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1046)
    at org.apache.hadoop.hdfs.DistributedFileSystem$4.doCall(DistributedFileSystem.java:343)
    at org.apache.hadoop.hdfs.DistributedFileSystem$4.doCall(DistributedFileSystem.java:339)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:356)
    at org.apache.hadoop.fs.FileSystem.lambda$openFileWithOptions$0(FileSystem.java:4776)
    at org.apache.hadoop.util.LambdaUtils.eval(LambdaUtils.java:52)
    at org.apache.hadoop.fs.FileSystem.openFileWithOptions(FileSystem.java:4774)
    at org.apache.hadoop.fs.FileSystem$FSDataInputStreamBuilder.build(FileSystem.java:4913)
    at org.apache.hadoop.yarn.util.FSDownload.unpack(FSDownload.java:342)
    at org.apache.hadoop.yarn.util.FSDownload.downloadAndUnpack(FSDownload.java:314)
    at org.apache.hadoop.yarn.util.FSDownload.verifyAndCopy(FSDownload.java:292)
    at org.apache.hadoop.yarn.util.FSDownload.access$000(FSDownload.java:72)
    at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:425)
    at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:422)
    at java.base/java.security.AccessController.doPrivileged(AccessController.java:712)
    at java.base/javax.security.auth.Subject.doAs(Subject.java:439)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1899)
    at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:422)
    at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer$FSDownloadWrapper.doDownloadCall(ContainerLocalizer.java:255)
    at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer$FSDownloadWrapper.call(ContainerLocalizer.java:248)
    at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer$FSDownloadWrapper.call(ContainerLocalizer.java:236)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
    at java.base/java.lang.Thread.run(Thread.java:840)
Caused by: java.lang.ClassNotFoundException: com.google.protobuf.ServiceException
    at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:641)
    at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:188
{code}
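Both failures come down to which jar the application class loader resolves com.google.protobuf.* classes from first. As a minimal sketch (not a fix), a classpath string can be checked for protobuf-java jars older than 3.x, i.e. the ones that can shadow the protobuf 3 classes Tez 0.10.3 and Hive expect. The paths in {{CP}} are hypothetical placeholders; on a cluster one would populate it from {{hadoop classpath}} instead.

```shell
# Sketch: flag protobuf-java jars older than 3.x on a classpath string.
# Placeholder paths; on a cluster, CP=$(hadoop classpath) instead.
CP="/opt/hadoop/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/opt/hive/lib/protobuf-java-3.24.4.jar"

# Split into one entry per line, keep protobuf-java jars, and print any
# whose version component (the text after "protobuf-java-") is not 3.x.
echo "$CP" | tr ':' '\n' \
  | grep 'protobuf-java-' \
  | awk -F'protobuf-java-' '$2 !~ /^3\./ {print "conflicting: " $0}'
```

With the placeholder classpath above this prints only the 2.5.0 jar, which is the one that would need to be excluded or relocated for the Tez and Hive runs described here.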
 

> Tez and hive jobs fail due to google's protobuf 2.5.0 in classpath
> ------------------------------------------------------------------
>
>                 Key: HADOOP-19174
>                 URL: https://issues.apache.org/jira/browse/HADOOP-19174
>             Project: Hadoop Common
>          Issue Type: Bug
>            Reporter: Bilwa S T
>            Assignee: Bilwa S T
>            Priority: Major
>



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
