On 2015-07-04 02:17, telmo bacile wrote:
Hi list,   I found some code that calculates the entropy of images with
Python and that can be used to separate interesting images from
uninteresting ones. Interesting images have more structured patterns,
while uninteresting ones are noisier or completely homogeneous.

I was thinking this code (entropy of an image) could be used to measure
the level of disorder of a group of points in an image.
For example:
Imagine that we have 3 images, each with 6 dots. The first one has very
ordered dots, the second has dots that are a little disordered, and the
third has very disordered dots.
Then the entropy of each image should measure the level of
disorganization of the dots.

But the weird thing is that when I experimented with this I got results
that make no sense. The result I get is that the image with intermediate
disorder has less entropy than the very ordered image. Does anybody have
an idea why I'm getting this result?

There is no single quantity that is "The Entropy" of a given image. Images don't have "An Entropy". Probability distributions do. Images aren't probability distributions, but there are several distributions of quantities associated with an image, and the entropy of each of these distributions tells you something different about the image.

What you are calculating is the entropy of the distribution of intensities of the R, G, and B channels. This entropy quantity is related, more or less, to the variety of colors that are in the image. This distribution (and thus the entropy computed from it) doesn't take into account the spatial layout of the pixels. You could take a spatially well-ordered image and rearrange the pixels completely randomly; the entropy quantity that your code is computing will be exactly the same because the pixels contribute to the histogram in the same way.
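
For illustration, here is a minimal sketch (not the code the original poster found, which wasn't shown) of this kind of channel-histogram entropy, assuming NumPy and Pillow are installed; the filename "dots.png" is just a placeholder. It also demonstrates that randomly rearranging the pixels leaves the value unchanged, because the histogram is the same.

import numpy as np
from PIL import Image

def histogram_entropy(image):
    # Shannon entropy (in bits) of the pixel-intensity histogram,
    # pooling all channels together. Spatial layout plays no role.
    values = np.asarray(image).ravel()
    counts = np.bincount(values, minlength=256)
    probs = counts[counts > 0] / values.size
    return -np.sum(probs * np.log2(probs))

img = Image.open("dots.png").convert("RGB")
pixels = np.asarray(img)

# Shuffle the pixel positions while keeping each RGB triplet intact.
flat = pixels.reshape(-1, 3).copy()
np.random.shuffle(flat)
scrambled = flat.reshape(pixels.shape)

print(histogram_entropy(pixels))
print(histogram_entropy(scrambled))   # same value: the histogram didn't change

An entropy that does respond to the spatial arrangement of the dots would have to be computed from a distribution that encodes positions, e.g. the distribution of local neighborhoods or of distances between the dots, rather than from the intensity histogram alone.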

--
Robert Kern

"I have come to believe that the whole world is an enigma, a harmless enigma
 that is made terrible by our own mad attempt to interpret it as though it had
 an underlying truth."
  -- Umberto Eco

