[darktable-user] DT speed

2018-12-07 Thread I. Ivanov

Hi Guys,

I am trying to assess and hopefully improve the speed of DT. The 
particular machine that I am testing on is Windows 10 with DT 2.4.4. It 
has two GPUs - the built-in Intel one and an Nvidia Quadro P1000. The 
Nvidia GPU has 4 GB of RAM, the computer has 16 GB, and the CPU is an 
8th-generation Core i5.


Currently I can see it is using the proper GPU (the Nvidia P1000), but 
utilization-wise it only shows small spikes to about 40%. RAM usage 
usually doesn't go above 8 GB.


If I set the number of background threads to some high value - like 30 - 
I can get 100% CPU usage for the sake of testing, but only while 
thumbnails are being generated. I cannot see any speed gain or more GPU 
usage when exporting pictures. This GPU has 640 CUDA cores and I am 
under the impression that DT is under-utilizing it.
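
For what it's worth, darktable itself can report what it does with 
OpenCL - a minimal check, assuming the standard 2.4.x debug flags; on 
Windows the output typically lands in darktable-log.txt in the user's 
darktable folder rather than in a console:

    darktable-cltest
    darktable -d opencl -d perf

The first lists the OpenCL devices darktable can see; the second logs, 
per module, whether processing ran on the GPU and how long it took, 
which should show whether exports actually hit the P1000.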


Is there a configuration that I am missing, or am I misunderstanding 
what to expect?
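
For reference, these are the darktablerc keys that seem relevant (key 
names as in a 2.4.x install; the values below are purely illustrative, 
not recommendations):

    opencl=TRUE              # master switch for GPU processing
    worker_threads=8         # background threads (thumbnails etc.)
    host_memory_limit=4096   # MB of host RAM for image processing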


Regards,

B





Re: [darktable-user] Inconsistent Output Message

2018-12-07 Thread I. Ivanov



On 2018-12-07 09:00, David Vincent-Jones wrote:

> I occasionally get the message 'Inconsistent Output' on my screen ...
> most often when I am going through a screen zoom with the mouse wheel.
> Apparently there is no damage done. Anybody else seeing this?
>
> darktable 2.5.0+979~g22ab1bc66
>
> David



I have seen it too.

DT 2.4.4 on Ubuntu 16.04 here.

Regards,

B



 



[darktable-user] Inconsistent Output Message

2018-12-07 Thread David Vincent-Jones
I occasionally get the message 'Inconsistent Output' on my screen ...
most often when I am going through a screen zoom with the mouse wheel.
Apparently there is no damage done. Anybody else seeing this?

darktable 2.5.0+979~g22ab1bc66

David








Re: [darktable-user] opencl on fedora 29

2018-12-07 Thread David Vincent-Jones
You may be suffering from the same problem that I faced in Manjaro.

All of the OpenCL requirements were in place, but the Nvidia X
configuration file never got created automatically.

From root I ran nvidia-xconfig ... the config file was then created ...
I then did a reboot and dt now starts correctly with OpenCL.
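
In other words, roughly (assuming the stock Nvidia driver layout):

    # as root: writes /etc/X11/xorg.conf for the Nvidia driver
    nvidia-xconfig
    # after a reboot, confirm darktable sees the GPU
    darktable-cltest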

Try that ...

David

On 2018-12-06 10:53 p.m., Germano Massullo wrote:
> Your packages seem to be okay; I don't know why you cannot use OpenCL







Re: [darktable-user] Is Darktable for "multi-image moved-object removal"?

2018-12-07 Thread Anton Aylward
On 06/12/18 02:38 AM, Joachim Durchholz wrote:
> Sorry for responding late, it took me a while to get around to checking what
> came of the advice.
> 
> On 02.12.18 at 14:59, Anton Aylward wrote:
>>> Q2: Is Darktable the right tool for the job?
>>
>> I would not think so.
> 
> The thing that made me interested in DT is that it has a good image 
> management GUI.

I can see how the central database over all images that were ever loaded can
appeal if it is heavily used for tagging and the like.  That aspect is not in
my workflow.

And perhaps I've had more experience and am more familiar with the other Linux
file management tools and their plugins, despite their being, by comparison,
non-centralized in other aspects.

> During my experiments with Hugin and IM I found that it is pretty hard to keep
> track of what image variant with what tool history is in what directory. It's
> also hard to get a quick overview over all the images in a directory.

That may simply be your unfamiliarity with the other Linux GUI file browsers.
I'll freely admit that the move from KDE3 to KDE4 'crippled', in the minds of
many users, aspects of tools such as the primary file browser.  But there are
other file browsers.  Yes, you may need a 'visualization' plugin for JPEGs, for
your specific RAW format, for PDFs, for office documents.  Yes, that is
documented, and yes, Linux documentation is not like Windows documentation.

And yes, those plugins let the file browsers scale up the images, but you always
have, even in DT, the issue that bigger preview images mean fewer of them
visible on the screen at once.  And bigger screens, from the POV of a computer
user, become counter-productive once they get wider than your immediate field of
view.  A 55-inch screen may be nice in the living room for viewing movies, but
at computer operating distances it is too wide to take in at once.

IF AND ONLY IF you want more screen real estate, I do recommend a
one-screen-at-a-time approach.  A 27-inch screen for viewing with a 17-inch
side screen for controls is very nice to work with.

I can do that with GIMP.  I'm just sad that DT doesn't have the ability to 'tear
off' the controls and move them to a side screen.

The other thing that newbies or Windows converts encounter with Linux is
overloading directories.
I grew up with hierarchical data models, call them 'taxonomies', so for me the
Linux file tree model is quite natural.  But many people try putting everything
into one flat directory/folder and wonder why the system is slow.
It need not be.  Some Linux file systems don't handle very large directories
well, and many tools try sorting file names by default, which is not nice when
you have hundreds or thousands of entries.
Yes, there are file systems that don't have that problem, if you choose them
when installing Linux.  Yes, you can tell applications not to sort.  But in
general the designers of the installation set the defaults for what they
considered 'normal' operations, and what we are considering here is far from
that.  Not everything scales well.
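
To give one concrete example: GNU ls sorts its output by default but can be
told not to, and on a huge flat directory the difference is noticeable:

    ls -U    # list entries in directory order, without sorting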


> I was thinking that DT might help me with that, apart from the image blend
> operation.

This is what I meant when I said
>> I think your mistake is in trying to define a *single* tool to do everything

That DT can do many things reasonably well does not mean that there aren't
tools that do some of those things especially well.

And what you are asking for, even the basic operation, never mind the scale, is
non-trivial.
Whatever you end up with, there is going to be a learning curve to surmount.


>>
>> I don't see why you dragged ImageMagick into this[1][2].
> 
> I agree that it is too complicated to use.
> Worse, the detail behaviour of the individual operators can vary from
> version to version.
> Still, what's the alternative?

Unless I want to simply resize, I NEVER use ImageMagick.
If I want to do anything non-trivial to an arbitrary JPG, I'd probably use GIMP.
GIMP is also scriptable :-)  I'm pretty sure that GIMP can do anything
ImageMagick can.
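
For a plain resize the ImageMagick one-liner is hard to beat
('convert in.jpg -resize 50% out.jpg'), but to illustrate the scripting
point, here is a minimal Python-Fu sketch for GIMP 2.8/2.10; the file
names and target size are placeholders:

    # resize_one.py - load, scale and re-save a JPEG through GIMP's PDB.
    # Run headless with, e.g.:
    #   gimp -i --batch-interpreter python-fu-eval \
    #        -b "execfile('resize_one.py')" -b "pdb.gimp_quit(1)"
    from gimp import pdb

    image = pdb.gimp_file_load("in.jpg", "in.jpg")   # load source image
    pdb.gimp_image_scale(image, 1600, 1200)          # scale to target size
    drawable = pdb.gimp_image_flatten(image)         # flatten before export
    # quality 0.90; the rest: smoothing, optimize, progressive, comment,
    # chroma subsampling, baseline, restart markers, DCT method
    pdb.file_jpeg_save(image, drawable, "out.jpg", "out.jpg",
                       0.90, 0, 1, 1, "", 0, 1, 0, 0)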



> Issue is that this means a lot of scripting, and coding, and it's taking 
> months
> to get a suitable workflow rolling.
> (Shell scripting won't work, I have switched to Python...)
> 
>> When you can demonstrate applying the 80/20 rule and have that 80% you will
>> be feeling more confident and have a better feel for many of the details.
> 
> It's roughly 400,000 pages, or 2,000 books. With 80/20 and just my free time,
> I'd be doing a book per day; you do the math how long it would take me to
> complete digitizing that.
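
Doing that math: 2,000 books at one per day is about five and a half years,
with no days off.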

Are these constraints imposed upon you?  Can't you go back to your principals
on this and show them the scale of the issue?

I've read of other projects like this at universities, where they get the
students to 'volunteer' for this kind of tedious but intellectually demanding
job in return for course credits.



>> I say this proportion based on other projects in other fields of my life;
>> there seems to be something in the nature of