As many people have pointed out, storing the compression level is generally
meaningless, as even the reference encoder can be configured so that it
matches no predefined compression level at all.
I see two approaches for your case:
1. Monitor the CPU usage and adjust the clock frequency accordingly (a rough
sketch of the idea follows below). The possible problems
with this
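A minimal sketch of what I mean, assuming a Linux target with the cpufreq
userspace governor available. The sysfs path, the sampling interval, and the
two operating points are assumptions for illustration; embedded platforms
often expose a different clock interface entirely:

    #include <stdio.h>
    #include <unistd.h>

    /* Read aggregate busy/idle jiffies from the first line of /proc/stat. */
    static int read_cpu_times(unsigned long long *busy, unsigned long long *idle)
    {
        unsigned long long user, nice_, sys, idl, iowait, irq, softirq;
        FILE *f = fopen("/proc/stat", "r");
        if (!f)
            return -1;
        if (fscanf(f, "cpu %llu %llu %llu %llu %llu %llu %llu",
                   &user, &nice_, &sys, &idl, &iowait, &irq, &softirq) != 7) {
            fclose(f);
            return -1;
        }
        fclose(f);
        *busy = user + nice_ + sys + irq + softirq;
        *idle = idl + iowait;
        return 0;
    }

    /* Request a new core clock via the cpufreq userspace governor
       (assumed to be the active governor on cpu0). */
    static void set_cpu_khz(unsigned khz)
    {
        FILE *f = fopen("/sys/devices/system/cpu/cpu0/cpufreq/scaling_setspeed", "w");
        if (!f)
            return;
        fprintf(f, "%u\n", khz);
        fclose(f);
    }

    int main(void)
    {
        unsigned long long b0, i0, b1, i1;
        unsigned khz = 400000; /* hypothetical low/high operating points, in kHz */
        for (;;) {
            if (read_cpu_times(&b0, &i0) != 0)
                return 1;
            sleep(1);
            if (read_cpu_times(&b1, &i1) != 0)
                return 1;
            double load = (double)(b1 - b0)
                        / (double)((b1 - b0) + (i1 - i0) + 1);
            if (load > 0.85)
                khz = 800000; /* decoder falling behind: clock up */
            else if (load < 0.40)
                khz = 400000; /* plenty of headroom: clock down, save power */
            set_cpu_khz(khz);
        }
    }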
I would also add that the "level" is only relevant to the reference encoder,
as it is an alias to quickly set a number of parameters. Other encoders
might make different choices.
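To make the alias concrete, here is a minimal sketch against the libFLAC C
encoder API. The function names are from the public stream_encoder interface,
but the exact parameter bundle behind each level is version dependent, so the
values below are illustrative, not the actual definition of level 5:

    #include <stdbool.h>
    #include <FLAC/stream_encoder.h>

    int main(void)
    {
        FLAC__StreamEncoder *enc = FLAC__stream_encoder_new();
        if (enc == NULL)
            return 1;

        /* Option A: a predefined level, one call that aliases many settings. */
        FLAC__stream_encoder_set_compression_level(enc, 5);

        /* Option B: set the underlying knobs directly; change any one of them
           and the configuration no longer corresponds to any predefined level. */
        FLAC__stream_encoder_set_blocksize(enc, 4096);
        FLAC__stream_encoder_set_do_mid_side_stereo(enc, true);
        FLAC__stream_encoder_set_max_lpc_order(enc, 8);
        FLAC__stream_encoder_set_rice_parameter_search_dist(enc, 0);

        /* ...initialize and feed samples as usual... */

        FLAC__stream_encoder_delete(enc);
        return 0;
    }

Nothing in the resulting stream records which of the two paths was taken,
which is exactly why storing a "level" is not meaningful.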
Also, I haven't looked in detail at how the encoder works, but I think it
is based on heuristics to decide which is the
Hjalmar,
I recall that many hardware decoders have a limit on the level that
they can handle. At the very least, hardware encoders may only
support a subset of the compression levels. As for the decoders, I
cannot remember specifically whether they are all capable of decoding
all compression levels.
Hi all,
The current released version of flac (1.2.1) fails to build with
g++ version 4.5.2 in Debian:
if g++ -DHAVE_CONFIG_H -I. -I. -I../../../.. -DFLaC__INLINE=__inline__
-DNDEBUG -I../../../.. \
-I./include -I../../../../include -g -O2 -MT main.o -MD -MP -MF
".deps/main.Tp
Hi,
I'm developing a flac decoder in an embedded environment. I have it fully up
and running but I am trying to optimise the performance vs. the power it takes
to decode the stream.
Using the reference decoder code, with a few optimisations for the hw I'm on,
I experience quite a difference in
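On measuring that kind of difference: a throwaway harness like the sketch
below can put numbers on per-frame decode cost before committing to a
frequency policy. It uses the public libFLAC stream-decoder API; the POSIX
monotonic clock is an assumption for illustration, and on an embedded target
you would swap in your own cycle counter (link with -lFLAC, and -lrt on older
toolchains):

    #include <stdio.h>
    #include <time.h>
    #include <FLAC/stream_decoder.h>

    static FLAC__StreamDecoderWriteStatus
    write_cb(const FLAC__StreamDecoder *dec, const FLAC__Frame *frame,
             const FLAC__int32 *const buffer[], void *client)
    {
        (void)dec; (void)frame; (void)buffer; (void)client;
        /* Discard the PCM; only the decode cost is of interest here. */
        return FLAC__STREAM_DECODER_WRITE_STATUS_CONTINUE;
    }

    static void
    error_cb(const FLAC__StreamDecoder *dec,
             FLAC__StreamDecoderErrorStatus status, void *client)
    {
        (void)dec; (void)client;
        fprintf(stderr, "decode error: %s\n",
                FLAC__StreamDecoderErrorStatusString[status]);
    }

    static double now_ms(void)
    {
        struct timespec ts;
        clock_gettime(CLOCK_MONOTONIC, &ts); /* replace with your hw timer */
        return ts.tv_sec * 1000.0 + ts.tv_nsec / 1e6;
    }

    int main(int argc, char **argv)
    {
        if (argc != 2) {
            fprintf(stderr, "usage: %s file.flac\n", argv[0]);
            return 1;
        }
        FLAC__StreamDecoder *dec = FLAC__stream_decoder_new();
        if (dec == NULL ||
            FLAC__stream_decoder_init_file(dec, argv[1], write_cb, NULL,
                                           error_cb, NULL)
                != FLAC__STREAM_DECODER_INIT_STATUS_OK)
            return 1;

        /* Skip metadata so the timed loop covers audio frames only. */
        FLAC__stream_decoder_process_until_end_of_metadata(dec);

        unsigned frames = 0;
        double start = now_ms();
        while (FLAC__stream_decoder_get_state(dec)
                   < FLAC__STREAM_DECODER_END_OF_STREAM) {
            if (!FLAC__stream_decoder_process_single(dec))
                break;
            frames++;
        }
        printf("%u frames decoded in %.1f ms\n", frames, now_ms() - start);

        FLAC__stream_decoder_delete(dec);
        return 0;
    }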