[ https://issues.apache.org/jira/browse/SPARK-17573?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sean Owen resolved SPARK-17573.
-------------------------------
    Resolution: Not A Problem

No, this is covered by {{context.addTaskCompletionListener(context => cleanup())}}. Re-editing a JIRA probably isn't a productive way to analyze these; I'd focus on instances where a stream can clearly be cleaned up early, locally, and isn't.

> The FileInputStream may be left unclosed when an exception occurs
> ------------------------------------------------------------------
>
>                 Key: SPARK-17573
>                 URL: https://issues.apache.org/jira/browse/SPARK-17573
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 2.0.0
>            Reporter: Jianfei Wang
>            Priority: Trivial
>              Labels: performance
>
> I think the InputStream may never be closed when an exception occurs; we should wrap this in try/catch.
> {code}
> private def addFilesToZipStream(parent: String, source: File, output: ZipOutputStream): Unit = {
>   if (source.isDirectory()) {
>     output.putNextEntry(new ZipEntry(parent + source.getName()))
>     for (file <- source.listFiles()) {
>       addFilesToZipStream(parent + source.getName() + File.separator, file, output)
>     }
>   } else {
>     val in = new FileInputStream(source)
>     output.putNextEntry(new ZipEntry(parent + source.getName()))
>     val buf = new Array[Byte](8192)
>     var n = 0
>     while (n != -1) {
>       n = in.read(buf)
>       if (n != -1) {
>         output.write(buf, 0, n)
>       }
>     }
>     output.closeEntry()
>     in.close()
>   }
> }
>
> // some code in TestUtils.scala
> val in = new FileInputStream(file)
> ByteStreams.copy(in, jarStream)
> in.close()
>
> // some code in IvyTestUtils.scala
> val in = new FileInputStream(file._2)
> ByteStreams.copy(in, jarStream)
> in.close()
> {code}
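
For the local case the reporter quotes, the straightforward fix is a try/finally around the copy loop, so the stream is closed even when {{putNextEntry}} or {{write}} throws. A minimal sketch of the pattern, not the patch that was eventually committed:

{code}
import java.io.{File, FileInputStream}
import java.util.zip.{ZipEntry, ZipOutputStream}

private def addFilesToZipStream(parent: String, source: File, output: ZipOutputStream): Unit = {
  if (source.isDirectory()) {
    output.putNextEntry(new ZipEntry(parent + source.getName()))
    for (file <- source.listFiles()) {
      addFilesToZipStream(parent + source.getName() + File.separator, file, output)
    }
  } else {
    val in = new FileInputStream(source)
    try {
      output.putNextEntry(new ZipEntry(parent + source.getName()))
      val buf = new Array[Byte](8192)
      var n = in.read(buf)
      while (n != -1) {
        output.write(buf, 0, n)
        n = in.read(buf)
      }
      output.closeEntry()
    } finally {
      // Runs on both the normal and the exceptional path, so the
      // FileInputStream can no longer leak.
      in.close()
    }
  }
}
{code}

The same shape applies to the TestUtils.scala and IvyTestUtils.scala snippets: open, copy inside try, close in finally.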
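The {{addTaskCompletionListener}} point in the resolution covers the task-scoped cases: when a stream has to outlive the local block (for example, because it backs an iterator consumed as the task runs), it cannot be closed early, but its lifetime can be tied to the task. A minimal sketch, assuming code running inside a Spark task so that {{TaskContext.get()}} is non-null; the file path here is hypothetical:

{code}
import java.io.FileInputStream
import org.apache.spark.TaskContext

val in = new FileInputStream("/tmp/shuffle-data.bin")  // hypothetical path
TaskContext.get().addTaskCompletionListener { _ =>
  // Invoked when the task completes, whether it succeeded or failed,
  // so the stream is closed even if downstream code throws.
  in.close()
}
{code}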