[jira] [Commented] (KAFKA-6059) Kafka cant delete old log files on windows

2022-07-22 Thread Martin Pelak (Jira)


[ 
https://issues.apache.org/jira/browse/KAFKA-6059?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17569921#comment-17569921
 ] 

Martin Pelak commented on KAFKA-6059:
-

Hi, if you are a Windows user, you can use a fix in [this pull 
request|https://github.com/apache/kafka/pull/12331].

> Kafka cant delete old log files on windows
> --
>
> Key: KAFKA-6059
> URL: https://issues.apache.org/jira/browse/KAFKA-6059
> Project: Kafka
>  Issue Type: Bug
>  Components: log
>Affects Versions: 0.10.0.0, 0.10.0.1, 0.10.1.0, 0.10.1.1, 0.10.2.0, 
> 0.10.2.1, 0.11.0.0
> Environment: OS:windows 2016
> kafka:10.0.2.1
> zookeeper:3.5.2
>Reporter: rico
>Priority: Critical
>  Labels: windows
>
> I am having trouble deleting old log files, and this is now causing disk space usage issues.
> I found the following exception in Kafka's log:
> kafka.common.KafkaStorageException: Failed to change the log file suffix from  to .deleted for log segment 0
>   at kafka.log.LogSegment.kafkaStorageException$1(LogSegment.scala:340)
>   at kafka.log.LogSegment.changeFileSuffixes(LogSegment.scala:344)
>   at kafka.log.Log.asyncDeleteSegment(Log.scala:981)
>   at kafka.log.Log.deleteSegment(Log.scala:971)
>   at kafka.log.Log.$anonfun$deleteOldSegments$1(Log.scala:673)
>   at kafka.log.Log.$anonfun$deleteOldSegments$1$adapted(Log.scala:673)
>   at scala.collection.immutable.List.foreach(List.scala:378)
>   at kafka.log.Log.deleteOldSegments(Log.scala:673)
>   at kafka.log.Log.deleteRetenionMsBreachedSegments(Log.scala:703)
>   at kafka.log.Log.deleteOldSegments(Log.scala:697)
>   at kafka.log.LogManager.$anonfun$cleanupLogs$3(LogManager.scala:474)
>   at kafka.log.LogManager.$anonfun$cleanupLogs$3$adapted(LogManager.scala:472)
>   at scala.collection.TraversableLike$WithFilter.$anonfun$foreach$1(TraversableLike.scala:789)
>   at scala.collection.Iterator.foreach(Iterator.scala:929)
>   at scala.collection.Iterator.foreach$(Iterator.scala:929)
>   at scala.collection.AbstractIterator.foreach(Iterator.scala:1406)
>   at scala.collection.IterableLike.foreach(IterableLike.scala:71)
>   at scala.collection.IterableLike.foreach$(IterableLike.scala:70)
>   at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
>   at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:788)
>   at kafka.log.LogManager.cleanupLogs(LogManager.scala:472)
>   at kafka.log.LogManager.$anonfun$startup$2(LogManager.scala:200)
>   at kafka.utils.KafkaScheduler.$anonfun$schedule$2(KafkaScheduler.scala:110)
>   at kafka.utils.CoreUtils$$anon$1.run(CoreUtils.scala:57)
>   at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>   at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
>   at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
>   at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
>   at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>   at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>   at java.lang.Thread.run(Thread.java:748)
> Caused by: java.nio.file.FileSystemException: C:\kafka\kafka-logs\telegraf1-0\.log -> C:\kafka\kafka-logs\telegraf1-0\.log.deleted: The process cannot access the file because it is being used by another process.
>   at sun.nio.fs.WindowsException.translateToIOException(WindowsException.java:86)
>   at sun.nio.fs.WindowsException.rethrowAsIOException(WindowsException.java:97)
>   at sun.nio.fs.WindowsFileCopy.move(WindowsFileCopy.java:387)
>   at sun.nio.fs.WindowsFileSystemProvider.move(WindowsFileSystemProvider.java:287)
>   at java.nio.file.Files.move(Files.java:1395)
>   at org.apache.kafka.common.utils.Utils.atomicMoveWithFallback(Utils.java:711)
>   at org.apache.kafka.common.record.FileRecords.renameTo(FileRecords.java:210)
>   at kafka.log.LogSegment.changeFileSuffixes(LogSegment.scala:342)
>   ... 29 more
>   Suppressed: java.nio.file.FileSystemException: C:\kafka\kafka-logs\telegraf1-0\.log -> C:\kafka\kafka-logs\telegraf1-0\.log.deleted: The process cannot access the file because it is being used by another process.
>   at sun.nio.fs.WindowsException.translateToIOException(WindowsException.java:86)
>   at sun.nio.fs.WindowsException.rethrowAsIOException(WindowsException.java:97)
>   at
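
The underlying failure in the quoted trace is Windows refusing to rename a segment file that still has an open handle: the broker keeps the .log file open (and memory-maps the index files), and on Windows the rename needs delete access that those handles do not share. Below is a minimal, hedged sketch of that behaviour with made-up file names; on Windows it typically fails with the same FileSystemException, while on Linux/macOS the rename goes through despite the open handle.

import java.io.RandomAccessFile;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class WindowsRenameRepro {
    public static void main(String[] args) throws Exception {
        // Hypothetical scratch file standing in for a Kafka segment (.log) file.
        Path segment = Files.createTempFile("00000000000000000000", ".log");
        Files.write(segment, new byte[] {1, 2, 3});

        // Keep a handle open, as the broker does while a segment is still in use.
        try (RandomAccessFile handle = new RandomAccessFile(segment.toFile(), "rw")) {
            Path deleted = segment.resolveSibling(segment.getFileName() + ".deleted");
            // On Windows this typically throws java.nio.file.FileSystemException:
            // "The process cannot access the file because it is being used by another process."
            // On Linux/macOS the same rename succeeds despite the open handle.
            Files.move(segment, deleted, StandardCopyOption.ATOMIC_MOVE);
            System.out.println("Renamed to " + deleted);
        }
    }
}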

[jira] [Commented] (KAFKA-6059) Kafka cant delete old log files on windows

2018-09-20 Thread shadi (JIRA)


[ 
https://issues.apache.org/jira/browse/KAFKA-6059?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16621748#comment-16621748
 ] 

shadi commented on KAFKA-6059:
--

If this old issue is not going to be fixed, then there should at least be a manual 
procedure that is safe for deleting old log files. Please share the instructions.
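
For what it is worth, the manual workaround usually described for this issue is to stop the broker (so no handles or memory maps remain on the segment files), delete the expired segment files together with their .index/.timeindex companions (or, more bluntly, the whole partition directories of topics whose data is no longer needed), and then restart the broker. The following is a hedged sketch of that idea only, not an official procedure: the log directory and retention period are made up, and it must be run while the broker is stopped.

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.time.Duration;
import java.time.Instant;
import java.util.stream.Stream;

public class ManualSegmentCleanup {
    public static void main(String[] args) throws IOException {
        // Assumptions for illustration: adjust the log dir and retention to your setup,
        // and run this ONLY while the Kafka broker is stopped.
        Path logDir = Paths.get("C:\\kafka\\kafka-logs");
        Instant cutoff = Instant.now().minus(Duration.ofDays(7));

        try (Stream<Path> files = Files.walk(logDir)) {
            files.filter(Files::isRegularFile)
                 // Segment files and their companions; the mtime filter below is a crude
                 // stand-in for broker retention and does not special-case the newest
                 // (active) segment of each partition.
                 .filter(p -> {
                     String name = p.getFileName().toString();
                     return name.endsWith(".log") || name.endsWith(".index")
                             || name.endsWith(".timeindex") || name.endsWith(".deleted");
                 })
                 .filter(p -> {
                     try {
                         return Files.getLastModifiedTime(p).toInstant().isBefore(cutoff);
                     } catch (IOException e) {
                         return false;
                     }
                 })
                 .forEach(p -> {
                     try {
                         Files.delete(p);
                         System.out.println("Deleted " + p);
                     } catch (IOException e) {
                         System.err.println("Could not delete " + p + ": " + e.getMessage());
                     }
                 });
        }
    }
}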


[jira] [Commented] (KAFKA-6059) Kafka cant delete old log files on windows

2017-12-06 Thread Rainer Guessner (JIRA)

[ 
https://issues.apache.org/jira/browse/KAFKA-6059?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16280804#comment-16280804
 ] 

Rainer Guessner commented on KAFKA-6059:


I have the same exception. I can reproduce it by changing the server config as 
follows and then producing some traffic.

log.cleaner.backoff.ms=1000
log.cleaner.min.cleanable.ratio=0.01
log.segment.bytes=1048576
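
With log.segment.bytes=1048576 the broker rolls a new 1 MB segment very frequently, and the aggressive cleaner settings make the cleaner pick those segments up almost immediately, so files are constantly being created, cleaned and renamed while handles on them are still open, which is what makes the rename failure easy to hit on Windows. A hedged sketch of the "producing some traffic" part using the plain Java producer; the broker address and topic name are assumptions, and the topic is a hypothetical compacted topic standing in for the Streams changelog topic mentioned below.

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class TrafficGenerator {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");          // assumption: local test broker
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            for (int i = 0; i < 1_000_000; i++) {
                // A small key space means many overwritten keys, which gives the
                // log cleaner (compaction) something to do on a compacted topic.
                String key = "key-" + (i % 100);
                producer.send(new ProducerRecord<>("test-compacted-topic", key, "value-" + i));
            }
        }
    }
}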


[2017-12-06 14:05:30,845] ERROR Failed to clean up log for x-changelog-0 in dir C:\tmp\\kafka-logs due to IOException (kafka.server.LogDirFailureChannel)
java.nio.file.FileSystemException: C:\tmp\\kafka-logs\xxx-changelog-0\.log.cleaned -> C:\tmp\xxx\kafka-logs\xxx-changelog-0\.log.swap: The process cannot access the file because it is being used by another process.
at sun.nio.fs.WindowsException.translateToIOException(WindowsException.java:86)
at sun.nio.fs.WindowsException.rethrowAsIOException(WindowsException.java:97)
at sun.nio.fs.WindowsFileCopy.move(WindowsFileCopy.java:387)
at sun.nio.fs.WindowsFileSystemProvider.move(WindowsFileSystemProvider.java:287)
at java.nio.file.Files.move(Files.java:1395)
at org.apache.kafka.common.utils.Utils.atomicMoveWithFallback(Utils.java:682)
at org.apache.kafka.common.record.FileRecords.renameTo(FileRecords.java:212)
at kafka.log.LogSegment.changeFileSuffixes(LogSegment.scala:398)
at kafka.log.Log.replaceSegments(Log.scala:1635)
at kafka.log.Cleaner.cleanSegments(LogCleaner.scala:485)
at kafka.log.Cleaner.$anonfun$doClean$6(LogCleaner.scala:396)
at kafka.log.Cleaner.$anonfun$doClean$6$adapted(LogCleaner.scala:395)
at scala.collection.immutable.List.foreach(List.scala:389)
at kafka.log.Cleaner.doClean(LogCleaner.scala:395)
at kafka.log.Cleaner.clean(LogCleaner.scala:372)
at kafka.log.LogCleaner$CleanerThread.cleanOrSleep(LogCleaner.scala:263)
at kafka.log.LogCleaner$CleanerThread.doWork(LogCleaner.scala:243)
at kafka.utils.ShutdownableThread.run(ShutdownableThread.scala:64)
Suppressed: java.nio.file.FileSystemException: C:\tmp\xxx\kafka-logs\xx-changelog-0\.log.cleaned -> C:\tmp\xxx\kafka-logs\xxx-changelog-0\.log.swap: The process cannot access the file because it is being used by another process.
at sun.nio.fs.WindowsException.translateToIOException(WindowsException.java:86)
at sun.nio.fs.WindowsException.rethrowAsIOException(WindowsException.java:97)
at sun.nio.fs.WindowsFileCopy.move(WindowsFileCopy.java:301)
at sun.nio.fs.WindowsFileSystemProvider.move(WindowsFileSystemProvider.java:287)
at java.nio.file.Files.move(Files.java:1395)
at org.apache.kafka.common.utils.Utils.atomicMoveWithFallback(Utils.java:679)
... 12 more
