LiuGuH commented on code in PR #6926:
URL: https://github.com/apache/hadoop/pull/6926#discussion_r1887840665


##########
hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/datanode/fsdataset/impl/BlockPoolSlice.java:
##########
@@ -1152,4 +1153,14 @@ void setDeleteDuplicateReplicasForTests(
     this.deleteDuplicateReplicas = deleteDuplicateReplicasForTests;
   }
 
+  public File hardLinkOneBlock(File src, File srcMeta, Block dstBlock) throws IOException {
+    File dstMeta = new File(tmpDir,
+        DatanodeUtil.getMetaName(dstBlock.getBlockName(), dstBlock.getGenerationStamp()));
+    HardLink.createHardLink(srcMeta, dstMeta);
+
+    File dstBlockFile = new File(tmpDir, dstBlock.getBlockName());
+    HardLink.createHardLink(src, dstBlockFile);

Review Comment:
   > do you use this feature on a large scale? Does creating a lot of hard links have a maintenance impact on the system?
   
   The Hadoop version upgrade already depends on Linux hard links. Hard links are lightweight: creating one only adds a new directory entry pointing at the original file's data, so no data is copied.
   
   > Have you considered implementing it based on rename?
   
   Rename does not fit here: src and dst have different block names but must refer to the same file data, and that is exactly what a hard link provides. @tomscut
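
   To illustrate the point above, here is a minimal sketch (outside Hadoop, using plain `java.nio.file` rather than the `HardLink` utility in the patch; the file names are made up): a hard link gives the block a second name, but both names resolve to the same underlying data, so nothing is copied.
   
   ```java
   import java.io.IOException;
   import java.nio.file.Files;
   import java.nio.file.Path;
   
   public class HardLinkDemo {
     public static void main(String[] args) throws IOException {
       Path dir = Files.createTempDirectory("hardlink-demo");
   
       // Hypothetical source block file with some data in it.
       Path src = dir.resolve("blk_1001");
       Files.writeString(src, "block data");
   
       // Create a hard link under a different (destination) block name.
       // Both directory entries now reference the same inode.
       Path dst = dir.resolve("blk_2002");
       Files.createLink(dst, src);
   
       // The new name reads the same bytes; no data was duplicated.
       System.out.println(Files.readString(dst));      // prints "block data"
       System.out.println(Files.isSameFile(src, dst)); // prints "true"
     }
   }
   ```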



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

