https://bugs.kde.org/show_bug.cgi?id=479204

            Bug ID: 479204
           Summary: Extremely slow compression - single-threaded
                    compression appears to be the limiting factor.
    Classification: Applications
           Product: kbackup
           Version: unspecified
          Platform: Other
                OS: Linux
            Status: REPORTED
          Severity: normal
          Priority: NOR
         Component: general
          Assignee: kol...@aon.at
          Reporter: pallasw...@proton.me
  Target Milestone: ---

SUMMARY
Compression is painfully slow.

I am currently backing up a profile that includes some rather large files: VM
images. One is 16.8 GB, another 84.1 GB. KBackup is compressing them, as per
the profile settings. So far it has finished the 16.8 GB file and is 23% of the
way through the 84.1 GB file, so about 35 GB in, at a running time of 1:50:00
(1 hour 50 minutes). Edit: while writing this, the backup completed at 2:22:48,
with a total of about 101 GB of files compressed down to an impressive 6.7 GB.

If I perform a full-disk backup (which I do occasionally) using Clonezilla, an
entire ~800 GB of data can be archived and compressed (to a similar ratio) in
around 30-35 minutes. Accordingly, I would expect KBackup to be capable of
making a backup 1/8th the size in roughly 1/8th the time.... which would be
about 5 minutes. But at 142 minutes, it took roughly 30 times as long as one
would reasonably expect.

A process monitor shows that the archiving is using 100% of just one of the 24
cores available to it, which lines up with the rough math above: KBackup's
compression simply isn't using the resources available to it.


STEPS TO REPRODUCE
1. Back up a large file or files
2. Grow grey hairs while waiting
3. .... Don't profit? 
:)

OBSERVED RESULT
Extremely long backup times

EXPECTED RESULT
Extremely short backup times

SOFTWARE/OS VERSIONS

OpenSUSE Tumbleweed
KDE Plasma Version: 5.27.10
KDE Frameworks Version: 5.113.0
Qt Version: 5.15.11

ADDITIONAL INFORMATION
Enough complaining: I generally *really* like KBackup, so I'd like to try to
help with a solution.

Perhaps supporting a new compression algorithm/format that uses multithreading
natively would be an easy solution to this issue?

Or, better still, perhaps the existing compression formats, which produce an
exceptional compression ratio, could be configured to use a multithreaded
implementation?

Another alternative: even if the existing compression algorithm were kept,
compressing multiple files simultaneously (in parallel rather than serially)
would make a great deal of difference. Given that the compression is this slow
and single-threaded, disk throughput should not become a bottleneck here.
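To illustrate that third idea (parallel-per-file, not KBackup's actual code,
which is C++/Qt; the helper names here are made up), here is a minimal Python
sketch. In CPython, zlib releases the GIL while compressing, so even a plain
thread pool genuinely overlaps the compression of several files:

```python
import concurrent.futures
import gzip
import shutil
from pathlib import Path

def compress_file(src: Path) -> Path:
    """gzip a single file; returns the path of the .gz archive."""
    dst = Path(str(src) + ".gz")
    with open(src, "rb") as fin, gzip.open(dst, "wb") as fout:
        shutil.copyfileobj(fin, fout)
    return dst

def compress_all(files, workers=4):
    """Compress several files in parallel, one per worker thread.

    CPython's zlib releases the GIL during compression, so the threads
    really do run on separate cores; a process pool would also work.
    """
    with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(compress_file, files))
```

The same scheme would apply to any per-file compressor; the archiver would
then only need to serialize the already-compressed members into the archive.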

I really like KBackup; it's just this one thing that makes it kinda painful. If
there's anything I can do to help, please do let me know. I used to work in
software development, and while I'm too inexperienced with KDE to provide a
solution single-handedly, I might be able to assist someone else if you would
like. I am retired on disability, so I have some time on my hands to assist.

PS: On looking into this, I have discovered that the 84 GB file which I just
spent two hours waiting to compress is not even included in the backup. The
16 GB file is there, the 1 GB file is there, but the 84 GB file is simply
missing. I will file a separate bug for this, as it is a serious problem.

-- 
You are receiving this mail because:
You are watching all bug changes.
