[ https://issues.apache.org/jira/browse/HBASE-29825?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Kevin Geiszler updated HBASE-29825:
-----------------------------------
    Description: 
Running an incremental backup at a certain time of day results in an exception (see below). I noticed this error while running IntegrationTestContinuousBackupRestore on my machine in IntelliJ. The interesting part is that I did not see the issue during the day until at or after 4 PM Pacific Time, which also happens to be 00:00:00 UTC. The UTC timezone is not being set on the {{dateFormat}} variable in [BackupUtils.getValidWalDirs()|https://github.com/apache/hbase/blob/HBASE-28957/hbase-backup/src/main/java/org/apache/hadoop/hbase/backup/util/BackupUtils.java#L998].
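
A minimal sketch of the kind of change I have in mind for {{getValidWalDirs()}} is below. The "yyyy-MM-dd" pattern, the class, and the helper name are illustrative assumptions rather than code copied from the branch; the {{setTimeZone}} call is the point.
{code:java}
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

public class UtcWalDayFormatSketch {
  // Hypothetical helper mirroring what getValidWalDirs() should do when it
  // builds day-based WAL directory names; the pattern is an assumption.
  static SimpleDateFormat newWalDayFormat() {
    SimpleDateFormat dateFormat = new SimpleDateFormat("yyyy-MM-dd");
    // Pin the format to UTC so the computed day string matches the UTC-named
    // WAL directories regardless of the JVM's default timezone.
    dateFormat.setTimeZone(TimeZone.getTimeZone("UTC"));
    return dateFormat;
  }

  public static void main(String[] args) {
    System.out.println(newWalDayFormat().format(new Date()));
  }
}
{code}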
{code:java}
2026-01-08T21:26:21,300 ERROR [Thread-223 {}] impl.TableBackupClient(232): Unexpected exception in incremental-backup: incremental copy backup_1767936354830Can not convert from directory  (check Hadoop, HBase and WALPlayer M/R job logs)
java.io.IOException: Can not convert from directory  (check Hadoop, HBase and WALPlayer M/R job logs)
 at org.apache.hadoop.hbase.backup.impl.IncrementalTableBackupClient.walToHFiles(IncrementalTableBackupClient.java:525) ~[classes/:?]
 at org.apache.hadoop.hbase.backup.impl.IncrementalTableBackupClient.convertWALsToHFiles(IncrementalTableBackupClient.java:461) ~[classes/:?]
 at org.apache.hadoop.hbase.backup.impl.IncrementalTableBackupClient.execute(IncrementalTableBackupClient.java:344) ~[classes/:?]
 at org.apache.hadoop.hbase.backup.impl.BackupAdminImpl.backupTables(BackupAdminImpl.java:681) ~[classes/:?]
 at org.apache.hadoop.hbase.backup.IntegrationTestBackupRestoreBase.backup(IntegrationTestBackupRestoreBase.java:384) ~[test-classes/:?]
 at org.apache.hadoop.hbase.backup.IntegrationTestBackupRestoreBase.runTestSingle(IntegrationTestBackupRestoreBase.java:299) ~[test-classes/:?]
 at org.apache.hadoop.hbase.backup.IntegrationTestBackupRestoreBase$BackupAndRestoreThread.run(IntegrationTestBackupRestoreBase.java:136) ~[test-classes/:?]
 at java.lang.Thread.run(Thread.java:840) ~[?:?]
Caused by: java.lang.IllegalArgumentException: Can not create a Path from an empty string
 at org.apache.hadoop.fs.Path.checkPathArg(Path.java:173) ~[hadoop-common-3.4.1.jar:?]
 at org.apache.hadoop.fs.Path.<init>(Path.java:185) ~[hadoop-common-3.4.1.jar:?]
 at org.apache.hadoop.util.StringUtils.stringToPath(StringUtils.java:279) ~[hadoop-common-3.4.1.jar:?]
 at org.apache.hadoop.hbase.mapreduce.WALInputFormat.getInputPaths(WALInputFormat.java:348) ~[classes/:?]
 at org.apache.hadoop.hbase.mapreduce.WALInputFormat.getSplits(WALInputFormat.java:311) ~[classes/:?]
 at org.apache.hadoop.hbase.mapreduce.WALInputFormat.getSplits(WALInputFormat.java:301) ~[classes/:?]
 at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:311) ~[hadoop-mapreduce-client-core-3.4.1.jar:?]
 at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:328) ~[hadoop-mapreduce-client-core-3.4.1.jar:?]
 at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:201) ~[hadoop-mapreduce-client-core-3.4.1.jar:?]
 at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1677) ~[hadoop-mapreduce-client-core-3.4.1.jar:?]
 at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1674) ~[hadoop-mapreduce-client-core-3.4.1.jar:?]
 at java.security.AccessController.doPrivileged(AccessController.java:712) ~[?:?]
 at javax.security.auth.Subject.doAs(Subject.java:439) ~[?:?]
 at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1953) ~[hadoop-common-3.4.1.jar:?]
 at org.apache.hadoop.mapreduce.Job.submit(Job.java:1674) ~[hadoop-mapreduce-client-core-3.4.1.jar:?]
 at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1695) ~[hadoop-mapreduce-client-core-3.4.1.jar:?]
 at org.apache.hadoop.hbase.mapreduce.WALPlayer.run(WALPlayer.java:445) ~[classes/:?]
 at org.apache.hadoop.hbase.backup.impl.IncrementalTableBackupClient.walToHFiles(IncrementalTableBackupClient.java:516) ~[classes/:?]
 ... 7 more {code}
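
For anyone wondering why 4 PM Pacific is the trigger: once local time passes 16:00 (PST is UTC-8), the local calendar date lags the UTC date by a day, so a {{SimpleDateFormat}} left on the JVM's default timezone produces a day string for which no UTC-named WAL directory exists; the resulting empty directory string is presumably what surfaces above as "Can not create a Path from an empty string". A quick self-contained illustration of the day-boundary effect (the America/Los_Angeles timezone and the day pattern are assumptions made for the demo, not taken from the code):
{code:java}
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

public class WalDayBoundaryDemo {
  public static void main(String[] args) {
    // Assumed day pattern; used only to show the boundary effect.
    SimpleDateFormat localFormat = new SimpleDateFormat("yyyy-MM-dd");
    localFormat.setTimeZone(TimeZone.getTimeZone("America/Los_Angeles"));

    SimpleDateFormat utcFormat = new SimpleDateFormat("yyyy-MM-dd");
    utcFormat.setTimeZone(TimeZone.getTimeZone("UTC"));

    // 2026-01-09T00:30:00 UTC == 2026-01-08T16:30:00 Pacific (PST, UTC-8).
    Date afterFourPmPacific = new Date(1767918600000L);

    System.out.println("Pacific day: " + localFormat.format(afterFourPmPacific)); // 2026-01-08
    System.out.println("UTC day:     " + utcFormat.format(afterFourPmPacific));   // 2026-01-09
  }
}
{code}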
 

 


> Incremental backup is failing due to incorrect timezone
> -------------------------------------------------------
>
>                 Key: HBASE-29825
>                 URL: https://issues.apache.org/jira/browse/HBASE-29825
>             Project: HBase
>          Issue Type: Bug
>          Components: backup&restore
>            Reporter: Kevin Geiszler
>            Priority: Major
>



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
