Re: [darktable-dev] Proposing better jpeg encoder by Google

2017-03-17 Thread Nils Holle
Hi,

I definitely agree that it shouldn't replace the current library (especially 
considering the performance; I didn't think it would be that slow).
I think the nicest way would be a checkbox or a dropdown in the jpeg export 
section.

Nils

From: Heiko Bauke
Sent: Friday, 17 March 2017 19:43
To: darktable-dev@lists.darktable.org
Subject: Re: [darktable-dev] Proposing better jpeg encoder by Google

Hi,

On 17.03.2017 at 16:36, Holger Klemm wrote:
> I have read that the encoder is very slow...

I just compiled the guetzli tool, took some random image (a JPEG file of 
3000x2000 pixels) and recompressed it, which took 5 minutes and 
32 seconds on my laptop (Intel(R) Core(TM) i7-4500U CPU @ 1.80GHz), 
using more than a gigabyte of main memory.  The original file was 1.8 
MB; after recompression it shrank to about 1.2 MB.

The new Google tool is an option only if file size is a major concern 
and CPU time during compression is not.  It is not a general-purpose 
JPEG library, and therefore it should not replace the JPEG library 
currently employed in darktable.


    Heiko

-- 
-- Number Crunch Blog @ https://www.numbercrunch.de
--  Cluster Computing @ http://www.clustercomputing.de
--   Professional @ https://www.mpi-hd.mpg.de/personalhomes/bauke
--  Social Networking @ https://www.researchgate.net/profile/Heiko_Bauke
___
darktable developer mailing list
to unsubscribe send a mail to darktable-dev+unsubscr...@lists.darktable.org



Re: [darktable-dev] Proposing better jpeg encoder by Google

2017-03-17 Thread Heiko Bauke

Hi,

On 17.03.2017 at 16:36, Holger Klemm wrote:

I have read that the encoder is very slow...


I just compiled the guetzli tool, took some random image (a JPEG file of 
3000x2000 pixels) and recompressed it, which took 5 minutes and 
32 seconds on my laptop (Intel(R) Core(TM) i7-4500U CPU @ 1.80GHz), 
using more than a gigabyte of main memory.  The original file was 1.8 
MB; after recompression it shrank to about 1.2 MB.
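
(For reference, a minimal sketch of how such a measurement could be scripted; 
it simply shells out to the guetzli binary and compares file sizes.  The file 
names are placeholders and the code is purely illustrative, not part of 
darktable.)

  // Minimal benchmark sketch: recompress one JPEG with the guetzli CLI and
  // report wall-clock time and the size change.  Paths are placeholders.
  #include <chrono>
  #include <cstdint>
  #include <cstdio>
  #include <cstdlib>
  #include <filesystem>

  int main() {
    const char* in  = "input.jpg";   // placeholder input file
    const char* out = "output.jpg";  // placeholder output file

    auto t0 = std::chrono::steady_clock::now();
    int rc = std::system("guetzli input.jpg output.jpg");  // basic CLI usage
    auto t1 = std::chrono::steady_clock::now();
    if (rc != 0) { std::fprintf(stderr, "guetzli failed (%d)\n", rc); return 1; }

    double secs = std::chrono::duration<double>(t1 - t0).count();
    std::printf("%.0f s, %ju -> %ju bytes\n", secs,
                (std::uintmax_t)std::filesystem::file_size(in),
                (std::uintmax_t)std::filesystem::file_size(out));
  }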


The new Google tool is an option only if file size is a major concern 
and CPU time during compression is not.  It is not a general-purpose 
JPEG library, and therefore it should not replace the JPEG library 
currently employed in darktable.



Heiko

--
-- Number Crunch Blog @ https://www.numbercrunch.de
--  Cluster Computing @ http://www.clustercomputing.de
--   Professional @ https://www.mpi-hd.mpg.de/personalhomes/bauke
--  Social Networking @ https://www.researchgate.net/profile/Heiko_Bauke
___
darktable developer mailing list
to unsubscribe send a mail to darktable-dev+unsubscr...@lists.darktable.org



Fwd: [darktable-dev] Proposing better jpeg encoder by Google

2017-03-17 Thread Coding Dave
According to https://github.com/google/guetzli:

Introduction

Guetzli is a JPEG encoder that aims for excellent compression density at
high visual quality. Guetzli-generated images are typically 20-30% smaller
than images of equivalent quality generated by libjpeg. Guetzli generates
only sequential (nonprogressive) JPEGs due to faster decompression speeds
they offer.

Note: Guetzli uses a large amount of memory. You should provide 300MB of
memory per 1MPix of the input image.


So for a 24 MPix camera that works out to roughly 7.2 GB of RAM to export a
single JPEG.
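
(For reference, the arithmetic behind that figure, using the README's 
300 MB per MPix rule of thumb; the resolutions listed are just examples.)

  // Rough peak-memory estimate per the Guetzli README (300 MB per MPix).
  #include <cstdio>
  int main() {
    const double mb_per_mpix = 300.0;                       // figure from the README
    const double mpix_list[] = {12.0, 24.0, 36.0, 50.0};    // example sensor sizes
    for (double mpix : mpix_list)
      std::printf("%4.0f MPix -> ~%4.1f GB\n", mpix, mpix * mb_per_mpix / 1000.0);
  }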

That does not mean I am against adding it as an option.

___
darktable developer mailing list
to unsubscribe send a mail to darktable-dev+unsubscr...@lists.darktable.org

Re: [darktable-dev] Proposing better jpeg encoder by Google

2017-03-17 Thread Colin Adams
Then the choice of library could be made an option.

On Fri, 17 Mar 2017 at 15:37 Holger Klemm wrote:

> Hi,
> I have read that the encoder is very slow...
>
> Holger
>
>
> On Friday, 17 March 2017 at 15:16:08 CET, Nils Holle wrote:
> > Hi!
> >
> > Yesterday Google announced Guetzli, a jpeg encoder producing better
> > results than libjpeg as can be read here:
> >
> >
> > https://research.googleblog.com/2017/03/announcing-guetzli-new-open-source-jpeg.html
> >
> > Code is available on Github: https://github.com/google/guetzli/
> >
> > Wouldn't that be a nice thing to have in Darktable?
> >
> > Best
> > Nils
> >

___
darktable developer mailing list
to unsubscribe send a mail to darktable-dev+unsubscr...@lists.darktable.org



Re: [darktable-dev] Proposing better jpeg encoder by Google

2017-03-17 Thread Holger Klemm
Hi,
I have read that the encoder is very slow...

Holger


On Friday, 17 March 2017 at 15:16:08 CET, Nils Holle wrote:
> Hi!
> 
> Yesterday Google announced Guetzli, a jpeg encoder producing better
> results than libjpeg as can be read here:
> 
> https://research.googleblog.com/2017/03/announcing-guetzli-new-open-source-jpeg.html
> 
> Code is available on Github: https://github.com/google/guetzli/
> 
> Wouldn't that be a nice thing to have in Darktable?
> 
> Best
> Nils
> 


___
darktable developer mailing list
to unsubscribe send a mail to darktable-dev+unsubscr...@lists.darktable.org



[darktable-dev] Proposing better jpeg encoder by Google

2017-03-17 Thread Nils Holle

Hi!

Yesterday Google announced Guetzli, a JPEG encoder that produces better 
results than libjpeg, as can be read here:


https://research.googleblog.com/2017/03/announcing-guetzli-new-open-source-jpeg.html

Code is available on Github: https://github.com/google/guetzli/

Wouldn't that be a nice thing to have in Darktable?

Best
Nils

___
darktable developer mailing list
to unsubscribe send a mail to darktable-dev+unsubscr...@lists.darktable.org



[darktable-dev] Re: DT 2.2.3 much faster than 2.2.1 !!

2017-03-17 Thread Matthias Bodenbinder
FYI

The performance difference between self-compiled darktable and the distribution's 
binary package has been confirmed for Debian: 
http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=855359

Not sure about other distros, but it could be similar.

Matthias



___
darktable developer mailing list
to unsubscribe send a mail to darktable-dev+unsubscr...@lists.darktable.org



Re: [darktable-dev] Fuji Compressed RAW Files

2017-03-17 Thread Roman Lebedev
On Fri, Mar 17, 2017 at 12:30 AM, Uwe Müssel wrote:
> 2017-03-16 21:50 GMT+01:00 Roman Lebedev:
>>
>> Now, the important question: do you know if that code is noticeably
>> different from
>> the vanilla libraw/rawtherapee version? Does it have any
>> behavior-changing changes?
>> Like, producing different output for the same raw?
>
>
> The main difference between this code and the libraw/rawtherapee code is
> in reading the file. Libraw does it with fseek and fread into a local
> "small" buffer. In a first version I replaced this with memcpy operations. Now
> I simply use input.getData(...).

> The decompression algorithm itself is untouched. I do not even know how it
> works in detail. I expect the decoded raw output to be identical.
Ok, good.
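
(As an aside, a heavily simplified sketch of the two read styles being 
contrasted here; the ByteStream class and getData() call below are 
hypothetical stand-ins for illustration, not rawspeed's actual API.)

  // Hypothetical illustration only: contrasts fseek/fread into a small local
  // buffer (the libraw-style approach) with handing out a pointer into an
  // already-buffered stream (the getData-style approach described above).
  #include <cstddef>
  #include <cstdint>
  #include <cstdio>
  #include <vector>

  // libraw-style: seek within the file and copy a chunk into a small buffer.
  std::size_t readChunkWithFread(std::FILE* f, long offset,
                                 std::uint8_t* buf, std::size_t n) {
    std::fseek(f, offset, SEEK_SET);
    return std::fread(buf, 1, n, f);
  }

  // getData-style: the whole file is already in memory; hand out a pointer.
  class ByteStream {  // hypothetical stand-in, not rawspeed's class
  public:
    explicit ByteStream(std::vector<std::uint8_t> data) : data_(std::move(data)) {}
    const std::uint8_t* getData(std::size_t pos, std::size_t n) const {
      return (pos + n <= data_.size()) ? data_.data() + pos : nullptr;
    }
  private:
    std::vector<std::uint8_t> data_;
  };

  int main() {
    ByteStream bs(std::vector<std::uint8_t>(16, 0x42));
    const std::uint8_t* p = bs.getData(4, 8);  // no seek, no copy
    std::printf("%s\n", p ? "got 8 bytes" : "out of bounds");
  }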

> There is some more cleanup to be done, e.g. remove obsolete fields.
Then please don't do that just yet.

>> Oh, and, please apply clang-format to decompressors/FujiDecompressor.*
>> and install clang-format git hook, so that the formatting is uniform.
> I will do this on the next commit.
It probably makes sense to squash it into that first commit.

I think it may make sense for me to merge your work into a separate
branch (raf-compressed) of the repo first.

Did you try building rawspeed on its own, or only as part of darktable?
Does it actually compile with all the current warning flags?
If not, I'd just disable -Werror for now.

Roman.
___
darktable developer mailing list
to unsubscribe send a mail to darktable-dev+unsubscr...@lists.darktable.org