On 06/12/18 02:38 AM, Joachim Durchholz wrote:
> Sorry for responding late, it took me a while to get around to checking
> what came of the advice.
> 
> On 02.12.18 at 14:59, Anton Aylward wrote:
>>> Q2: Is Darktable the right tool for the job?
>>
>> I would not think so.
> 
> The thing that made me interested in DT is that it has a good
> image-management GUI.

I can see how the central database of every image ever loaded can appeal if
you use it heavily for tagging and the like.  That aspect is not part of my
workflow.

And perhaps I've simply had more experience and am more familiar with the
other Linux file-management tools and their plugins, despite their being,
comparatively, non-centralized in other respects.

> During my experiments with Hugin and IM I found that it is pretty hard
> to keep track of which image variant, with which tool history, is in
> which directory. It's also hard to get a quick overview of all the
> images in a directory.

That may simply be your unfamiliarity with the other Linux GUI file browsers.
I'll freely admit that, in the minds of many users, the move from KDE3 to
KDE4 'crippled' aspects of tools such as the primary file browser.  But there
are other file browsers.  Yes, you may need a 'visualization' plugin for
JPEGs, for your specific RAW format, for PDFs, for office documents.  Yes,
that is documented, and yes, Linux documentation is not like Windows
documentation.

And yes, those plugins let the file browsers scale up the images, but you
always have, even in DT, the issue that bigger preview images mean fewer of
them visible on the screen at once.  And bigger screens, from the POV of a
computer user, become counter-productive once they get wider than your
immediate field of view.  A 55-inch screen may be nice in the living room for
viewing movies, but at computer operating distances it is too wide to take in
at once.

IF AND ONLY IF you want more screen real estate do I recommend a
more-than-one-screen approach.  A 27-inch screen for viewing and a 17-inch
side screen for the controls is very nice to work with.

I can do that with GIMP.  I'm just sad that DT doesn't have the ability to 'tear
off' the controls and move them to a side screen.

The other thing that newbies and Windows converts run into with Linux is
overloaded directories.
I grew up with hierarchical data models, call them 'taxonomies', so for me
the Linux file-tree model is quite natural.  But many people try putting
everything into one flat directory/folder and wonder why the system is slow.
It need not be.  Some Linux file systems don't handle very large directories
well, and many tools sort file names by default, which is not pleasant when a
directory has hundreds or thousands of entries.
Yes, there are file systems that don't have that problem, if you choose them
when installing Linux.  Yes, you can tell applications not to sort.  But in
general the designers of the installation set the defaults for what they
considered 'normal' operations, and what we are considering here is far from
that.  Not everything scales well.
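
For what it's worth, here is a minimal, untested sketch of the kind of
sharding I mean, in Python since that is what you've switched to.  The
directory names and the "bookid_page.tif" naming scheme are my inventions,
purely for illustration:

    #!/usr/bin/env python3
    # Sketch: move scans out of one flat directory into per-book
    # subdirectories, so no single directory holds thousands of entries.
    import shutil
    from pathlib import Path

    flat = Path("scans-flat")      # hypothetical flat source directory
    tree = Path("scans-by-book")   # hypothetical target root

    for scan in flat.glob("*.tif"):
        book = scan.stem.split("_")[0]        # "0042_0137.tif" -> "0042"
        dest = tree / book
        dest.mkdir(parents=True, exist_ok=True)
        shutil.move(str(scan), str(dest / scan.name))

With 2,000 books that keeps each directory down to a few hundred pages, which
every file browser and file system handles comfortably.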


> I was thinking that DT might help me with that, apart from the image blend
> operation.

This is what I meant when I said
>> I think your mistake is in trying to define a *single* tool to do everything

That DT can do many things reasonably well does not mean there aren't other
tools that do some of those things particularly well.

And what you are asking for, even the basic operation, never mind the scale,
is non-trivial.
Whatever you end up with, there is going to be a learning curve to surmount.


>>
>> I don't see why you dragged ImageMagick into this[1][2].
> 
> I agree that it is too complicated to use.
> Worse, the detailed behaviour of the individual operators can vary from
> version to version.
> Still, what's the alternative?

Unless I simply want to resize, I NEVER use ImageMagick.
If I want to do anything non-trivial to an arbitrary JPG I'd probably use
GIMP.  GIMP is also scriptable :-)  I'm pretty sure that GIMP can do anything
ImageMagick can.
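
To give a flavour, here is a rough, untested sketch of a GIMP Python-Fu batch
script doing what ImageMagick's "convert in.jpg -resize 50% out.jpg" does.
The file names are placeholders, and I'm assuming GIMP 2.x with Python-Fu
installed:

    # resize.py -- halve a JPEG using GIMP's Python-Fu (GIMP 2.x assumed).
    # Run headless with something like:
    #   gimp -idf --batch-interpreter python-fu-eval \
    #        -b "execfile('resize.py')" -b "pdb.gimp_quit(1)"
    from gimpfu import pdb

    image = pdb.gimp_file_load("in.jpg", "in.jpg")
    pdb.gimp_image_scale(image, image.width // 2, image.height // 2)
    drawable = pdb.gimp_image_flatten(image)
    # quality 0.9; the remaining arguments are the usual JPEG-save options
    pdb.file_jpeg_save(image, drawable, "out.jpg", "out.jpg",
                       0.9, 0, 1, 1, "", 0, 1, 0, 0)
    pdb.gimp_image_delete(image)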



> Issue is that this means a lot of scripting, and coding, and it's taking
> months to get a suitable workflow rolling.
> (Shell scripting won't work, I have switched to Python...)
> 
>> When you can demonstrate applying the 80/20 rule and have that 80%, you
>> will be feeling more confident and have a better feel for many of the
>> details.
> 
> It's roughly 400,000 pages, or 2,000 books. With 80/20 and just my free
> time, I'd be doing a book per day; you do the math on how long it would
> take me to finish digitizing that.

Are these constraints imposed upon you?  At a book per day, 2,000 books is
2,000 days, roughly five and a half years.  Can't you go back to your
principals on this and show them the scale of the issue?

I've read of other projects like this at universities, where they get the
students to 'volunteer' for this kind of tedious but intellectually demanding
job in return for course credits.



>> I say this proportion based on other projects in other fields of my life;
>> there seems to be something in the nature of human projects that means
>> the 80/20 rule works "all the way down".
> 
> Sort of. You can do better, but the effort to automate rises.
> If a few months of tool selection / improvement spare me years of
> postprocessing, it's still worth it.

The tools ARE 'postprocessing'.
More 'automation' means more scripting, so an appealing GUI becomes a
non-issue.  Taking the time to learn a difficult and awkward scripting
system, such as hugin's or GIMP's, becomes more worthwhile at the scale of
this project.
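
As an illustration of the shape that scripting takes, here is a rough,
untested sketch that drives hugin's command-line tools from Python over a
directory of books.  The directory layout follows my earlier sketch, the
output names are my assumptions, and you'd want to check each tool's flags
against your hugin version:

    # stitch_books.py -- rough sketch of batch-driving hugin's CLI tools.
    # Assumes each book directory holds overlapping page scans and that
    # pto_gen/cpfind/autooptimiser/hugin_executor are on $PATH.
    import subprocess
    from pathlib import Path

    def run(*cmd):
        print(" ".join(cmd))
        subprocess.run(cmd, check=True)   # stop on the first failure

    for book in sorted(Path("scans-by-book").iterdir()):
        if not book.is_dir():
            continue
        images = sorted(str(p) for p in book.glob("*.tif"))
        pto = str(book / "project.pto")
        run("pto_gen", "-o", pto, *images)           # initial project file
        run("cpfind", "-o", pto, pto)                # find control points
        run("autooptimiser", "-a", "-l", "-s", "-o", pto, pto)
        run("hugin_executor", "--stitching", pto)    # render the result

Fifteen lines of driver, and the GUI never enters into it; that is what
'automation at scale' ends up looking like.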

