https://bugs.kde.org/show_bug.cgi?id=475205
--- Comment #4 from Benoît Tarrade <benoittarr...@hotmail.fr> ---
Hi Farid, thank you for your reply.

I just tested again with kdenlive and MLT, both on the master branch:
* kdenlive commit a89e93b672bf51f918945e1791bb8f24652bf871 (GitHub mirror)
* mlt commit 04e78fc3ecfa75de4c8d7a5ebb84d2160d606d37

# Loading time observations

It takes more than 4 minutes (Release builds of both) just to compute the hashes. This is far too long: during that time the whole application is frozen, and GNOME (in my case) keeps asking whether it should kill the unresponsive process. That is to be expected, since Qt cannot process its event loop while the hashes are being computed (a rough sketch of a non-blocking approach is at the end of this comment).

Once the hashes are computed and kdenlive has populated the resource folder, it proceeds to actually load the data into RAM through MLT. That takes between 1 and 2 minutes on my machine, but the UI is responsive again during this phase (the task count in the top-left corner first grows quickly, then shrinks slowly as kdenlive processes the data).

# RAM usage

A word about RAM usage (this probably belongs in a separate ticket; it is related to this one but the cause is different).

I have 33 GB of data referenced in the project tree and about 20 clips in the timeline, averaging 60 MB each. kdenlive (and therefore MLT, in this case) keeps the whole timeline loaded in memory (around 10 GB), and since I have a 16 GB machine it is easy to see that I won't be able to add many more files to the timeline.

I tried to reproduce something similar in Blender, which works differently. The major differences are:
* Blender has no project/source explorer, whereas kdenlive does. Having a project explorer is very useful for organizing data, and that is a real plus for kdenlive. However, on the Blender side there is absolutely no delay and no hash-computation-induced stall: the experience is 100% transparent, and IMHO that is what Kdenlive should aim for.
* With 20 clips added to Blender's timeline, RAM usage stays minimal and does not grow as more objects are added to the timeline. What makes RAM usage grow is the number of overlapping clips, because their data must be processed together. Clip data is streamed on the fly! As a result the editor can handle 100+ GB of clips while still using only about one clip's worth of memory, sometimes less, depending on the clip size.

=> I am not claiming that one approach is conceptually better than the other. Kdenlive is a really good piece of software and is in the right spot to do the job. Blender has different goals and is built in a very singular way, and its video editing tool, despite being really powerful, has a peculiar UX compared to what we are accustomed to in the video editing industry; I think kdenlive best fits this niche.
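To illustrate the kind of change I have in mind for the freeze: here is a minimal, hypothetical sketch (this is not Kdenlive's actual code, and names like hashFileAsync and computeHash are made up) of computing a file hash on a worker thread with QtConcurrent, so the main event loop keeps running and GNOME never flags the window as unresponsive.

```cpp
// Hypothetical sketch: hash a file off the GUI thread so Qt keeps
// processing events. Not Kdenlive's implementation.
#include <QtConcurrent>
#include <QFutureWatcher>
#include <QCryptographicHash>
#include <QFile>

static QByteArray computeHash(const QString &path)
{
    QFile file(path);
    if (!file.open(QIODevice::ReadOnly))
        return {};
    QCryptographicHash hash(QCryptographicHash::Md5);
    hash.addData(&file); // streams the file, no full read into RAM
    return hash.result();
}

void hashFileAsync(const QString &path, QObject *receiver)
{
    auto *watcher = new QFutureWatcher<QByteArray>(receiver);
    QObject::connect(watcher, &QFutureWatcher<QByteArray>::finished,
                     receiver, [watcher]() {
        const QByteArray digest = watcher->result();
        Q_UNUSED(digest); // e.g. update the project bin entry here
        watcher->deleteLater();
    });
    // Runs in the global thread pool; the main thread stays responsive.
    watcher->setFuture(QtConcurrent::run(computeHash, path));
}
```

The point is not this exact code, only that the hashing work happens in a thread pool while results come back through the watcher's finished() signal, so the UI never blocks for minutes.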