jhungund commented on code in PR #5829:
URL: https://github.com/apache/hbase/pull/5829#discussion_r1572161825


##########
hbase-server/src/main/java/org/apache/hadoop/hbase/io/hfile/bucket/BucketCache.java:
##########
@@ -980,9 +991,18 @@ void freeSpace(final String why) {
       BucketEntryGroup bucketMemory =
         new BucketEntryGroup(bytesToFreeWithExtra, blockSize, getPartitionSize(memoryFactor));
 
+      long bytesFreed = 0;
+
       // Scan entire map putting bucket entry into appropriate bucket entry
       // group
       for (Map.Entry<BlockCacheKey, BucketEntry> bucketEntryWithKey : backingMap.entrySet()) {
+
+        if (coldFiles.containsKey(bucketEntryWithKey.getKey().getHfileName())) {
+          //coldBlocks.add(bucketEntryWithKey.getKey());
+          bytesFreed += bucketEntryWithKey.getValue().getLength();
+          evictBlock(bucketEntryWithKey.getKey());

Review Comment:
   You mean we should evict these cold-file blocks even before the 
bytesToFreeWithoutExtra calculation? That will require one additional 
traversal of backingMap. I will incorporate this change, and we can take a 
look to see if it looks OK.
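The extra pre-pass traversal discussed above could be sketched roughly as follows. This is a minimal, self-contained sketch, not the PR's code: `Key`, `Entry`, `evictColdBlocks`, and the in-place removal are simplified stand-ins for HBase's `BlockCacheKey`, `BucketEntry`, and `evictBlock`.

```java
import java.util.HashMap;
import java.util.Iterator;
import java.util.Map;

public class ColdEvictionSketch {
  // Simplified stand-ins for BlockCacheKey and BucketEntry (hypothetical types).
  record Key(String hfileName) {}
  record Entry(long length) {}

  /**
   * Pre-pass over the backing map: evict every block that belongs to a cold
   * file BEFORE the bytesToFreeWithoutExtra calculation, and return the
   * number of bytes reclaimed. This is the additional traversal of
   * backingMap mentioned in the review.
   */
  static long evictColdBlocks(Map<Key, Entry> backingMap, Map<String, String> coldFiles) {
    long bytesFreed = 0;
    Iterator<Map.Entry<Key, Entry>> it = backingMap.entrySet().iterator();
    while (it.hasNext()) {
      Map.Entry<Key, Entry> e = it.next();
      if (coldFiles.containsKey(e.getKey().hfileName())) {
        bytesFreed += e.getValue().length();
        it.remove(); // stands in for evictBlock(e.getKey())
      }
    }
    return bytesFreed;
  }

  public static void main(String[] args) {
    Map<Key, Entry> backingMap = new HashMap<>();
    backingMap.put(new Key("coldFile"), new Entry(100));
    backingMap.put(new Key("hotFile"), new Entry(50));
    Map<String, String> coldFiles = new HashMap<>();
    coldFiles.put("coldFile", "coldFile");
    System.out.println(evictColdBlocks(backingMap, coldFiles)); // 100
    System.out.println(backingMap.size());                      // 1
  }
}
```

Running the cold-file eviction as its own pass keeps the subsequent free-space accounting simple: by the time bytesToFreeWithoutExtra is computed, the cold blocks are already out of the map.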



##########
hbase-server/src/main/java/org/apache/hadoop/hbase/regionserver/DataTieringManager.java:
##########
@@ -219,4 +225,43 @@ private long getDataTieringHotDataAge(Configuration conf) {
     return Long.parseLong(
       conf.get(DATATIERING_HOT_DATA_AGE_KEY, String.valueOf(DEFAULT_DATATIERING_HOT_DATA_AGE)));
   }
+
+  /*
+   * Scans all stores of the online regions and returns the names of the
+   * store files that are cold as per the data-tiering logic.
+   * @return Map of names of files that are cold as per data-tiering logic.
+   */
+  public Map<String, String> getColdFilesList() {
+    Map<String, String> coldFiles = new HashMap<>();
+    for (HRegion r : this.onlineRegions.values()) {
+      for (HStore hStore : r.getStores()) {
+        Configuration conf = hStore.getReadOnlyConfiguration();
+        if (getDataTieringType(conf) != DataTieringType.TIME_RANGE) {
+          // Data-Tiering not enabled for the store. Just skip it.
+          continue;
+        }
+        Long hotDataAge = getDataTieringHotDataAge(conf);
+
+        for (HStoreFile hStoreFile : hStore.getStorefiles()) {
+          String hFileName =
+            hStoreFile.getFileInfo().getHFileInfo().getHFileContext().getHFileName();
+          OptionalLong maxTimestamp = hStoreFile.getMaximumTimestamp();
+          if (!maxTimestamp.isPresent()) {
+            // We could throw from here, but we are already in the critical
+            // code path of freeing space. Hence, we ignore that file for now.
+            // Or do we want to include it?

Review Comment:
   ack!
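The conservative choice being acknowledged here, i.e. treating a file whose maximum timestamp is unknown as hot rather than throwing, could look roughly like this. This is a hypothetical sketch, not the final PR code; `isCold` and its parameters are illustrative names.

```java
import java.util.OptionalLong;

public class ColdFileCheckSketch {
  /**
   * Decide whether a store file is cold under TIME_RANGE data tiering.
   * A file with no maximum timestamp is conservatively treated as hot
   * (skipped), matching the "ignore that file for now" choice discussed
   * in the review, so we never evict data whose age we cannot determine.
   */
  static boolean isCold(OptionalLong maxTimestamp, long now, long hotDataAgeMs) {
    if (!maxTimestamp.isPresent()) {
      return false; // unknown age: keep it cached rather than risk evicting hot data
    }
    return now - maxTimestamp.getAsLong() > hotDataAgeMs;
  }

  public static void main(String[] args) {
    long now = 1_000_000L;
    System.out.println(isCold(OptionalLong.empty(), now, 10_000L));          // false
    System.out.println(isCold(OptionalLong.of(now - 60_000L), now, 10_000L)); // true
    System.out.println(isCold(OptionalLong.of(now - 1_000L), now, 10_000L));  // false
  }
}
```

The alternative (counting unknown-age files as cold) would free space more aggressively but risks evicting recently written data, which is why skipping seems the safer default in the eviction path.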



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@hbase.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org
