[jira] [Commented] (HADOOP-13809) hive: 'java.lang.IllegalStateException(zip file closed)'

2017-12-24 Thread jiangxy (JIRA)

[ https://issues.apache.org/jira/browse/HADOOP-13809?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16302808#comment-16302808 ]

jiangxy commented on HADOOP-13809:
--

It seems like a JDK bug.
We upgraded OpenJDK and put a "jaxp.properties" file under JAVA_HOME/lib:

{code}
# jaxp.properties
javax.xml.parsers.DocumentBuilderFactory=org.apache.xerces.jaxp.DocumentBuilderFactoryImpl
{code}

and the exception disappeared.
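
For what it's worth, the pin works because DocumentBuilderFactory.newInstance() checks the javax.xml.parsers.DocumentBuilderFactory system property first, then JAVA_HOME/lib/jaxp.properties, and only falls back to scanning META-INF/services on the classpath (the URLClassLoader.getResourceAsStream frame that fails above) when neither is set. A minimal sketch to confirm the pin is being picked up; it assumes Xerces is on the classpath and is illustrative only, not Hadoop or Hive code:

{code}
// CheckJaxpPin.java -- illustrative only; assumes xercesImpl is on the classpath.
import javax.xml.parsers.DocumentBuilderFactory;

public class CheckJaxpPin {
    public static void main(String[] args) {
        // With JAVA_HOME/lib/jaxp.properties (or the equivalent
        // -Djavax.xml.parsers.DocumentBuilderFactory=... system property) in place,
        // newInstance() resolves the factory without scanning jars under
        // META-INF/services, so the closed-jar code path is never reached.
        DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
        System.out.println(factory.getClass().getName());
        // Expected: org.apache.xerces.jaxp.DocumentBuilderFactoryImpl
    }
}
{code}

The same effect can be had per process by passing -Djavax.xml.parsers.DocumentBuilderFactory=org.apache.xerces.jaxp.DocumentBuilderFactoryImpl on the JVM command line.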

> hive: 'java.lang.IllegalStateException(zip file closed)'
> ---------------------------------------------------------
>
>                 Key: HADOOP-13809
>                 URL: https://issues.apache.org/jira/browse/HADOOP-13809
>             Project: Hadoop Common
>          Issue Type: Bug
>          Components: conf
>    Affects Versions: 2.8.0
>            Reporter: Adriano
>
> Randomly some of the hive queries are failing with the below exception on HS2:
> {code}
> 2016-11-07 02:36:40,996 ERROR org.apache.hadoop.hive.ql.exec.Task: [HiveServer2-Background-Pool: Thread-1823748]: Ended Job = job_1478336955303_31030 with exception 'java.lang.IllegalStateException(zip file closed)'
> java.lang.IllegalStateException: zip file closed
> at java.util.zip.ZipFile.ensureOpen(ZipFile.java:634)
> at java.util.zip.ZipFile.getEntry(ZipFile.java:305)
> at java.util.jar.JarFile.getEntry(JarFile.java:227)
> at sun.net.www.protocol.jar.URLJarFile.getEntry(URLJarFile.java:128)
> at sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:132)
> at sun.net.www.protocol.jar.JarURLConnection.getInputStream(JarURLConnection.java:150)
> at java.net.URLClassLoader.getResourceAsStream(URLClassLoader.java:233)
> at javax.xml.parsers.SecuritySupport$4.run(SecuritySupport.java:94)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.xml.parsers.SecuritySupport.getResourceAsStream(SecuritySupport.java:87)
> at javax.xml.parsers.FactoryFinder.findJarServiceProvider(FactoryFinder.java:283)
> at javax.xml.parsers.FactoryFinder.find(FactoryFinder.java:255)
> at javax.xml.parsers.DocumentBuilderFactory.newInstance(DocumentBuilderFactory.java:121)
> at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2526)
> at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2503)
> at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2409)
> at org.apache.hadoop.conf.Configuration.get(Configuration.java:982)
> at org.apache.hadoop.mapred.JobConf.checkAndWarnDeprecation(JobConf.java:2032)
> at org.apache.hadoop.mapred.JobConf.<init>(JobConf.java:484)
> at org.apache.hadoop.mapred.JobConf.<init>(JobConf.java:474)
> at org.apache.hadoop.mapreduce.Cluster.getJob(Cluster.java:210)
> at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:596)
> at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:594)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:415)
> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
> at org.apache.hadoop.mapred.JobClient.getJobUsingCluster(JobClient.java:594)
> at org.apache.hadoop.mapred.JobClient.getTaskReports(JobClient.java:665)
> at org.apache.hadoop.mapred.JobClient.getReduceTaskReports(JobClient.java:689)
> at org.apache.hadoop.hive.ql.exec.mr.HadoopJobExecHelper.progress(HadoopJobExecHelper.java:272)
> at org.apache.hadoop.hive.ql.exec.mr.HadoopJobExecHelper.progress(HadoopJobExecHelper.java:549)
> at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:435)
> at org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:137)
> at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
> at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100)
> at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1770)
> at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1527)
> at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1306)
> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1115)
> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1108)
> at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:178)
> at org.apache.hive.service.cli.operation.SQLOperation.access$100(SQLOperation.java:72)
> at org.apache.hive.service.cli.operation.SQLOperation$2$1.run(SQLOperation.java:232)
> at java.security.AccessController.doPrivileged(Native Method)
> at 

[jira] [Commented] (HADOOP-13809) hive: 'java.lang.IllegalStateException(zip file closed)'

2017-12-19 Thread Steve Loughran (JIRA)

[ https://issues.apache.org/jira/browse/HADOOP-13809?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16296867#comment-16296867 ]

Steve Loughran commented on HADOOP-13809:
-

Configuration XML parsing has been completely redone for 2.0+, moving from a DOM to a streaming parser. This is probably going to be a "Cannot Reproduce".
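
For anyone unfamiliar with the distinction: a streaming (StAX) parser pulls events off the input as it reads, rather than materializing a whole DOM tree first. A minimal, generic sketch of that style, with an inline XML string standing in for a real config file; illustrative only, not the actual Hadoop code:

{code}
// StaxConfigSketch.java -- generic StAX example, not Hadoop's Configuration code.
import java.io.StringReader;
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;

public class StaxConfigSketch {
    public static void main(String[] args) throws Exception {
        String xml = "<configuration><property><name>a</name><value>1</value></property></configuration>";
        XMLStreamReader r = XMLInputFactory.newFactory().createXMLStreamReader(new StringReader(xml));
        String current = null;
        while (r.hasNext()) {
            int event = r.next();
            if (event == XMLStreamConstants.START_ELEMENT) {
                current = r.getLocalName();            // remember which element we are in
            } else if (event == XMLStreamConstants.CHARACTERS && current != null) {
                System.out.println(current + " = " + r.getText());
            } else if (event == XMLStreamConstants.END_ELEMENT) {
                current = null;
            }
        }
        r.close();
    }
}
{code}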

W.r.t. CDH, you'll have to talk to them, I'm afraid.


[jira] [Commented] (HADOOP-13809) hive: 'java.lang.IllegalStateException(zip file closed)'

2017-12-19 Thread jiangxiyang (JIRA)

[ https://issues.apache.org/jira/browse/HADOOP-13809?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16296701#comment-16296701 ]

jiangxiyang commented on HADOOP-13809:
--

Any progress on this bug?
We hit the same problem with OpenJDK 1.7.0_79 and CDH 5.13.


[jira] [Commented] (HADOOP-13809) hive: 'java.lang.IllegalStateException(zip file closed)'

2017-07-27 Thread frank luo (JIRA)

[ https://issues.apache.org/jira/browse/HADOOP-13809?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16103509#comment-16103509 ]

frank luo commented on HADOOP-13809:


I believe HIVE-11681 and this one are both related to https://bugs.openjdk.java.net/browse/JDK-6947916, whose fix hasn't been released yet.

I am able to reproduce it with Oracle JDK 1.8.0_131.
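
For reference, that JDK issue concerns the JVM-wide cache behind jar: URLs: once any consumer closes the cached JarFile, later consumers of the same URL are handed back the already-closed instance. A minimal sketch of the hazard; the jar path is made up and this is illustrative only, not the Hive/Hadoop code path:

{code}
// JarCacheDemo.java -- illustrative only; /tmp/example.jar is a made-up path.
import java.net.JarURLConnection;
import java.net.URL;
import java.util.jar.JarFile;

public class JarCacheDemo {
    public static void main(String[] args) throws Exception {
        URL url = new URL("jar:file:/tmp/example.jar!/");

        // Consumer 1 takes the cached JarFile and closes it when it is done.
        JarURLConnection c1 = (JarURLConnection) url.openConnection();
        c1.setUseCaches(true);
        JarFile shared = c1.getJarFile();
        shared.close();                       // the instance stays in the JVM-wide cache

        // Consumer 2 is handed the same, already-closed JarFile from the cache.
        JarURLConnection c2 = (JarURLConnection) url.openConnection();
        c2.setUseCaches(true);
        c2.getJarFile().getEntry("META-INF/MANIFEST.MF");
        // -> java.lang.IllegalStateException: zip file closed
    }
}
{code}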


[jira] [Commented] (HADOOP-13809) hive: 'java.lang.IllegalStateException(zip file closed)'

2017-07-27 Thread frank luo (JIRA)

[ https://issues.apache.org/jira/browse/HADOOP-13809?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16103420#comment-16103420 ]

frank luo commented on HADOOP-13809:


We are seeing it happen once every few days on HDP 2.5.3 with JDK 1.7.0_67, in the HiveServer2 log.


[jira] [Commented] (HADOOP-13809) hive: 'java.lang.IllegalStateException(zip file closed)'

2016-12-08 Thread Wangda Tan (JIRA)

[ https://issues.apache.org/jira/browse/HADOOP-13809?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15733678#comment-15733678 ]

Wangda Tan commented on HADOOP-13809:
-

It looks related to an open Hive JIRA: https://issues.apache.org/jira/browse/HIVE-11681.

See the analysis: https://issues.apache.org/jira/browse/HIVE-11681?focusedCommentId=14736752&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-14736752


[jira] [Commented] (HADOOP-13809) hive: 'java.lang.IllegalStateException(zip file closed)'

2016-11-26 Thread Fei Hui (JIRA)

[ https://issues.apache.org/jira/browse/HADOOP-13809?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15699200#comment-15699200 ]

Fei Hui commented on HADOOP-13809:
--

I encountered the same problem. Maybe it's a JDK bug. DocumentBuilderFactory.newInstance() is in the stack and the method takes no parameters, so I think it is related to the JDK.


[jira] [Commented] (HADOOP-13809) hive: 'java.lang.IllegalStateException(zip file closed)'

2016-11-11 Thread Steve Loughran (JIRA)

[ https://issues.apache.org/jira/browse/HADOOP-13809?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15657566#comment-15657566 ]

Steve Loughran commented on HADOOP-13809:
-

Different cause, I suspect. The stack trace you've got means that whatever was looking for Xerces on the classpath hit a problem; it's not the stream/resource passed in that is triggering the failure. Put another way: something else on the classpath went away and the service loader hit it.

Looking at the code, what may actually help is for the DocumentBuilderFactory to be instantiated once (static constructor? or lazy creation of a static field?), because the same builder options are always used and there's no need to rescan the classpath. Would that help? Maybe, but having things disappear off the classpath is still pretty risky.
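
A rough sketch of that idea (illustrative only, not an actual patch; the class name and options below are placeholders): resolve the factory once in a holder so the classpath scan happens a single time, and synchronize around newDocumentBuilder() since DocumentBuilderFactory isn't guaranteed thread-safe:

{code}
// XmlFactoryHolder.java -- a sketch of the "instantiate once" idea, not Hadoop code.
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.parsers.ParserConfigurationException;

final class XmlFactoryHolder {
    // The classpath/service-loader lookup runs exactly once, at class-init time.
    private static final DocumentBuilderFactory FACTORY = createFactory();

    private static DocumentBuilderFactory createFactory() {
        DocumentBuilderFactory f = DocumentBuilderFactory.newInstance();
        // Options shown are typical for loading config XML; adjust to whatever
        // Configuration.loadResource actually sets.
        f.setIgnoringComments(true);
        f.setNamespaceAware(true);
        return f;
    }

    static DocumentBuilder newBuilder() throws ParserConfigurationException {
        // DocumentBuilderFactory is not guaranteed thread-safe, so serialize
        // builder creation; the builders themselves are per-caller.
        synchronized (FACTORY) {
            return FACTORY.newDocumentBuilder();
        }
    }

    private XmlFactoryHolder() { }
}
{code}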


