Hello,

Darktable is interesting for its ability to produce images with more natural fine detail than the JPEGs made by some cameras.

Sure, a lot of things are less fine-tuned in darktable than in a camera. For years now, cameras have been designed with specific knowledge of their sensor, its biases and its noise characteristics, especially in low light (think of darktable's "profiled denoising" on steroids).

## Current state

Often, denoising is a trade-off between keeping some noise and losing some information (often details, sometimes more).


Bilateral filtering manages to improve on that trade-off, but with one drawback: it "wears down" (trims, strongly flattens) the extrema of the signal.

In practice for darktable, this means that in strongly denoised images dark and bright areas become somewhat "dull", which is especially noticeable in dark areas (unwelcome on eye pupils). Non-local means does not seem to suffer from this.
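
To see why, here is a minimal 1-D sketch (assuming numpy; the window size and sigmas are made-up illustration values, not darktable's): near a smooth peak, all neighbors sit below the center value, and as soon as the range sigma is wide enough to include them, the weighted mean is pulled below the true maximum.

```python
import numpy as np

def bilateral_1d(signal, radius, sigma_s, sigma_r):
    """Plain 1-D bilateral filter: spatial Gaussian times range Gaussian."""
    offsets = np.arange(-radius, radius + 1)
    spatial = np.exp(-offsets**2 / (2 * sigma_s**2))
    padded = np.pad(signal, radius, mode='edge')
    out = np.empty_like(signal)
    for i in range(len(signal)):
        window = padded[i:i + 2 * radius + 1]
        rng = np.exp(-(window - signal[i])**2 / (2 * sigma_r**2))
        weights = spatial * rng
        out[i] = np.sum(weights * window) / np.sum(weights)
    return out

t = np.linspace(-3.0, 3.0, 121)
peak = np.exp(-t**2)                      # smooth extremum, true maximum = 1.0
noisy = peak + np.random.default_rng(0).normal(0.0, 0.05, t.shape)
smoothed = bilateral_1d(noisy, radius=10, sigma_s=5.0, sigma_r=0.3)
print(noisy.max(), smoothed.max())        # filtered maximum sits below the true peak
```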

## Limits and improvements on bilateral filter

Bilateral filtering does a good job on smooth areas and mostly respects edges (see e.g. http://www.darktable.org/2012/09/edge-aware-image-development/ ) but does not recover edges: edge areas get a "scattered" look. This is logical when looking at the equations: each pixel is averaged over a neighborhood whose shape is specific to that pixel, with a "center of gravity" that jumps around from pixel to pixel.
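
For reference, a standard formulation (following e.g. the Paris et al. course notes linked below, with $G_\sigma$ a Gaussian of standard deviation $\sigma$, $\sigma_s$ spatial and $\sigma_r$ range):

$$
BF[I]_p = \frac{1}{W_p} \sum_{q \in S} G_{\sigma_s}(\|p - q\|)\, G_{\sigma_r}(|I_p - I_q|)\, I_q,
\qquad
W_p = \sum_{q \in S} G_{\sigma_s}(\|p - q\|)\, G_{\sigma_r}(|I_p - I_q|)
$$

The range weight $G_{\sigma_r}(|I_p - I_q|)$ depends on the noisy center value $I_p$ itself, which is exactly why the effective averaging footprint changes from pixel to pixel near an edge.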

Does anyone know about variants or useful tips in this regard? What I usually do in darktable is tune the bilateral filter to moderate strength (weaker on green, stronger on red and even stronger on blue) to limit its defects, and complement it with a bit of non-local means (usually weakening its luminance part, around 20-30%). I avoid combining bilateral with profiled denoising because of the unnatural-looking result.

## Other possibilities ?

One known way of improving the bilateral filter is to compute the averaging weights not from the image itself but from a different image (a version pre-filtered with other non-linear filters such as a median or another despeckle algorithm, or a shot taken with flash if subject and camera don't move). The nice thing is that this ancillary filter can be set much stronger than usual, since its output is not a final image but only an input to the bilateral filter: its actual role is to hint at each pixel's neighborhood. The results can be stunning (ref: A Gentle Introduction to Bilateral Filtering and its Applications <http://people.csail.mit.edu/sparis/bf_course/>, see 07_variants.pdf pages 26-28). I personally find a "processed" or "video game" feel on some shots, but given the extreme noise levels in those cases that is quite an achievement, and we should not worry too much about real use cases.
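
A naive (slow, loop-based) sketch of that joint/cross bilateral idea, assuming grayscale float images in [0,1]; the guide would be e.g. a heavily median-filtered copy of the noisy image, or a registered flash shot. Parameter values are illustrative only.

```python
import numpy as np

def joint_bilateral(noisy, guide, radius=3, sigma_s=2.0, sigma_r=0.1):
    """Cross/joint bilateral: spatial weights as usual, range weights
    computed from a separate 'guide' image instead of the noisy one."""
    h, w = noisy.shape
    out = np.zeros_like(noisy)
    pad_n = np.pad(noisy, radius, mode='edge')
    pad_g = np.pad(guide, radius, mode='edge')
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(ys**2 + xs**2) / (2 * sigma_s**2))
    for y in range(h):
        for x in range(w):
            win_n = pad_n[y:y + 2 * radius + 1, x:x + 2 * radius + 1]
            win_g = pad_g[y:y + 2 * radius + 1, x:x + 2 * radius + 1]
            rng = np.exp(-(win_g - guide[y, x])**2 / (2 * sigma_r**2))
            weights = spatial * rng
            out[y, x] = np.sum(weights * win_n) / np.sum(weights)
    return out

# Example use, with a strong median pre-filter as the guide:
# from scipy.ndimage import median_filter
# denoised = joint_bilateral(noisy, median_filter(noisy, size=5))
```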

In the comments of edge aware image development | darktable <http://www.darktable.org/2012/09/edge-aware-image-development/>, G'MIC anisotropic smoothing was mentioned. I confirm it is indeed very slow, has too many parameters and can produce ugly results when mistuned (nothing extraordinary or unfixable so far). Yet in some cases I found it effective, with results similar to a bilateral filter but without the wide-area extrema flattening and with much nicer recovery of edges.


Does anyone know about Savitzky-Golay filtering?

In the article "Savitzky-Golay interpolation for smoothing values and derivatives" (Vector, the Journal of the British APL Association) <http://archive.vector.org.uk/art10500860>, the authors apply it to image denoising and demonstrate two interesting properties:
(1) it does not trim extrema
(2) it recovers edges nicely (little blur given the amount of denoising, no scatter)

See figure 11, top right:
* overall noise is reduced (well, high-frequency noise more than low-frequency)
* resolution loss seems reasonable
* no extrema flattening
* relevant high-frequency details are kept (e.g. the beaver's whiskers keep a nice contrast)
* the edge between the dark fur and the white water is smooth but not too blurry

It's very easy to download the picture (figure 8 is actually full resolution) and try it in darktable.
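
For a quick try outside darktable, here is a sketch using scipy (an assumption on my side; the window and polyorder values are also just starting points). Note that applying the filter separably along rows and then columns is only an approximation of the full 2-D least-squares polynomial fit described in the article.

```python
import numpy as np
from scipy.signal import savgol_filter

def savgol_smooth_2d(img, window=7, polyorder=2):
    """Savitzky-Golay smoothing applied separably: along columns, then
    along rows. window must be odd and larger than polyorder."""
    tmp = savgol_filter(img, window, polyorder, axis=0)
    return savgol_filter(tmp, window, polyorder, axis=1)

# img = the picture from figure 8, loaded as a float array (one channel at a time)
# smoothed = savgol_smooth_2d(img, window=9, polyorder=2)
```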

Here are the results of my very quick experiments:
* bilateral filtering loses details (flattened extrema in the beaver's fur) before it removes noise as well as the article does, and leaves scattered edges
* non-local means does a better job (no flattened extrema) but does no better on the whiskers and does not improve the edges

I could not get results as good as those in the article. Can anyone prove me wrong?

Conclusion: I'm eager to know whether I can make better use of the current denoising in darktable, and if these hints contribute to future improvements, I'd be happy.

Thank you for your attention.

--
Stéphane Gourichon
