veghlaci05 commented on code in PR #4293:
URL: https://github.com/apache/hive/pull/4293#discussion_r1219154790
##########
ql/src/test/org/apache/hadoop/hive/ql/cleanup/TestCleanupService.java:
##########
@@ -77,6 +83,54 @@ public void testEventualCleanupService_finishesCleanupBeforeExit() throws IOExce
assertTrue(cleanupService.await(1, TimeUnit.MINUTES));
}
+  /**
+   * Testing behaviour of ClearDanglingScratchDir service over local tmp files/dirs
+   * @throws Exception
+   */
+  @Test
+  public void localDanglingFilesCleaning() throws Exception {
Review Comment:
Please move the test to `TestClearDanglingScratchDir`, as `TestCleanupService` is not related to `ClearDanglingScratchDir`; they are even in different packages. Also, the method name should be **test**LocalDanglingFilesCleaning() to conform to naming conventions.
##########
ql/src/test/org/apache/hadoop/hive/ql/cleanup/TestCleanupService.java:
##########
@@ -77,6 +83,54 @@ public void testEventualCleanupService_finishesCleanupBeforeExit() throws IOExce
assertTrue(cleanupService.await(1, TimeUnit.MINUTES));
}
+  /**
+   * Testing behaviour of ClearDanglingScratchDir service over local tmp files/dirs
+   * @throws Exception
+   */
+  @Test
+  public void localDanglingFilesCleaning() throws Exception {
+    HiveConf conf = new HiveConf();
+    conf.set("fs.default.name", "file:///");
+    FileSystem fs = FileSystem.get(conf);
+
+    // constants
+    String appId = "appId_" + System.currentTimeMillis();
+    String userName = System.getProperty("user.name");
+    String hdfs = "hdfs";
+    String inuse = "inuse.lck";
+    String l = File.separator;
+
+    // simulating hdfs dangling dir and its inuse.lck file
+    Path hdfsRootDir = new Path(HiveConf.getVar(conf, HiveConf.ConfVars.SCRATCHDIR) + l + userName + l + hdfs);
+    Path hdfsSessionDir = new Path(hdfsRootDir + l + userName + l + appId);
+    Path hdfsSessionLock = new Path(hdfsSessionDir + l + inuse);
+    fs.create(hdfsSessionLock);
+
+    // simulating local dangling files
+    String localTmpDir = HiveConf.getVar(conf, HiveConf.ConfVars.LOCALSCRATCHDIR);
+    Path localSessionDir = new Path(localTmpDir + l + appId);
+    Path localPipeOutFileRemove = new Path(localTmpDir + l + appId + "-started-with-session-name.pipeout");
+    Path localPipeOutFileNotRemove = new Path(localTmpDir + l + "not-started-with-session-name" + appId + ".pipeout");
+    fs.mkdirs(localSessionDir);
+    fs.create(localPipeOutFileRemove);
+    fs.create(localPipeOutFileNotRemove);
+
+    // running only the new method; the main service will identify which session files/dirs are dangling
+    ClearDanglingScratchDir clearDanglingScratchDirMain = new ClearDanglingScratchDir(false, false, true, hdfsRootDir.toString(), conf);
+    clearDanglingScratchDirMain.run();
+
+    // should remove all except localPipeOutFileNotRemove, because it does not start with session name
+    Assert.assertFalse("Local session dir '" + localSessionDir
+        + "' still exists, should have been removed!", fs.exists(localSessionDir));
+    Assert.assertFalse("Local .pipeout file '" + localPipeOutFileRemove
+        + "' still exists, should have been removed!", fs.exists(localPipeOutFileRemove));
+    Assert.assertTrue("Local .pipeout file '" + localPipeOutFileNotRemove
+        + "' does not exist, should not have been removed!", fs.exists(localPipeOutFileNotRemove));
Review Comment:
You may also create a file which is:
1. not the last in the listing [here](https://github.com/apache/hive/blob/9d17812f7d6958cf843b399ddfc6595f38137a11/ql/src/java/org/apache/hadoop/hive/ql/session/ClearDanglingScratchDir.java#L256), and check whether the service still deletes every other file.
2. one the service cannot delete (either still open, or requiring rights the service doesn't have), and check whether all the other files are still deleted.
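The point of the second case can be illustrated with a minimal, self-contained sketch (plain JDK, not the actual `ClearDanglingScratchDir` code; the `removeAll` helper and the file names are made up for illustration): the removal loop should swallow a failed delete and keep going, so one undeletable entry does not stop the remaining files from being cleaned up. Here the failure is simulated with a path that was never created.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Arrays;
import java.util.List;

public class ContinueOnFailureSketch {

    // Hypothetical stand-in for the service's per-path removal:
    // attempt the delete, log the failure, and continue with the rest.
    static int removeAll(List<Path> paths) {
        int removed = 0;
        for (Path p : paths) {
            try {
                Files.delete(p);
                removed++;
            } catch (IOException e) {
                System.out.println("could not remove " + p + ", continuing");
            }
        }
        return removed;
    }

    public static void main(String[] args) throws IOException {
        Path dir = Files.createTempDirectory("dangling-test");
        Path a = Files.createFile(dir.resolve("a.pipeout"));
        // deliberately NOT last in the listing, and never created,
        // so Files.delete() on it throws NoSuchFileException
        Path locked = dir.resolve("undeletable.pipeout");
        Path b = Files.createFile(dir.resolve("b.pipeout"));

        int removed = removeAll(Arrays.asList(a, locked, b));
        System.out.println("removed=" + removed
            + " aExists=" + Files.exists(a)
            + " bExists=" + Files.exists(b));
    }
}
```

With a loop like this, a file that cannot be deleted (whether mid-listing or permission-protected) still leaves the other entries cleaned up, which is what the suggested test cases would assert.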
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]