On 2018-12-23 15:28, Patrick Shanahan wrote:
* I. Ivanov <iv3...@gmail.com> [12-23-18 17:57]:
Hi Guys,

I had the unfortunate experience of clearing the metadata from about 600 images (it
took me close to 3 hours to restore as much as I could).

Because there is no "undo", this brings me to the following question.

What can I do to have a good backup for such cases? I can back up
"/home/myhome/.config/darktable" easily (and very frequently).

However, if metadata (or tags) get deleted by accident and I
restore the DB, how can I push the DB to repopulate the XMP files?
perhaps (not tried): remove the images from the db,
                      then import the images again;
the import will/should read the accompanying xmp files

or, maybe just telling dt to read the xmp files will work

I'm not sure if I expressed myself correctly... If I back up
"/home/myhome/.config/darktable", then data.db and library.db would be the
"correct" ones.
To the best of my knowledge, while darktable is running it works with the data in the .db
files (not the .xmp files), and when something changes it writes that data out to the .xmp sidecars.

The reason I am trying to back up the database (as opposed to the .xmp files)
is that I can easily back up the DB to the cloud with multiple versions (and
restore whichever one I need). I would not be able to restore the .xmp files (to
their proper locations) as easily, even if I managed to increase the frequency of
their backups.
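For what it's worth, this is roughly the kind of snapshot I have in mind. It is only a
rough sketch: the config and destination paths are placeholders, it uses Python's
sqlite3 online-backup API, and it assumes darktable is closed (or at least idle) while
the snapshot is taken.

#!/usr/bin/env python3
# Rough sketch: copy darktable's databases into a timestamped folder that a
# cloud sync client can then pick up. Paths below are placeholders.
import shutil
import sqlite3
from datetime import datetime
from pathlib import Path

CONFIG_DIR = Path.home() / ".config" / "darktable"    # default config directory
BACKUP_ROOT = Path.home() / "backups" / "darktable"   # placeholder destination

def snapshot():
    dest = BACKUP_ROOT / datetime.now().strftime("%Y%m%d-%H%M%S")
    dest.mkdir(parents=True, exist_ok=True)
    for name in ("library.db", "data.db"):
        src = CONFIG_DIR / name
        if not src.exists():
            continue
        # Use sqlite's online backup API so the copy stays consistent even if
        # the file is touched while we read it.
        source = sqlite3.connect(src)
        target = sqlite3.connect(dest / name)
        try:
            source.backup(target)
        finally:
            source.close()
            target.close()
    # darktablerc is plain text, so a straight copy is fine.
    rc = CONFIG_DIR / "darktablerc"
    if rc.exists():
        shutil.copy2(rc, dest / "darktablerc")
    return dest

if __name__ == "__main__":
    print("snapshot written to", snapshot())

Each timestamped folder is then just another directory for the cloud client to version.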

DT can check for changed .xmp files, but I am under the impression that this
checking and updating is unidirectional: xmp to DB (not DB to xmp). Is it
possible at all to push what is stored in the .db files back to the .xmp files? For
example, can I delete the .xmp files and rely on the information stored in
the .db to recreate accurate .xmp files for every image that is missing one?
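Before trying the "delete the .xmp files and regenerate them from the .db" route, I
would at least want a way to see which images currently have no sidecar at all. A
rough sketch (the photo directory is a placeholder, it assumes darktable's default
naming where the sidecar is the full image file name with ".xmp" appended, and it
does not account for duplicates such as IMG_0001_01.CR2.xmp):

#!/usr/bin/env python3
# Rough sketch: list image files that have no .xmp sidecar next to them.
from pathlib import Path

PHOTO_ROOT = Path.home() / "Pictures"   # placeholder photo directory
EXTENSIONS = {".cr2", ".nef", ".arw", ".orf", ".raf", ".dng", ".jpg", ".jpeg"}

def missing_sidecars(root: Path):
    for image in sorted(root.rglob("*")):
        if image.suffix.lower() not in EXTENSIONS:
            continue
        # default darktable scheme: IMG_0001.CR2 -> IMG_0001.CR2.xmp
        sidecar = image.with_name(image.name + ".xmp")
        if not sidecar.exists():
            yield image

if __name__ == "__main__":
    for image in missing_sidecars(PHOTO_ROOT):
        print(image)

Running it before and after a restore would at least show whether every image got a
sidecar back.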

Regards,

B

____________________________________________________________________________
darktable user mailing list
to unsubscribe send a mail to darktable-user+unsubscr...@lists.darktable.org
