On 8/9/19 12:47 PM, Russell Reiter via talk wrote:
On Fri, Aug 9, 2019, 8:58 AM James Knott via talk <talk@gtalug.org> wrote:

    On 2019-08-09 08:03 AM, Russell Reiter via talk wrote:
    > This problem is amplified by making a copy of a copy of a copy etc.

    That is certainly the case with analog, but with digital the copies
    should be exactly the same, even if on a different medium.


Keeping your photos in raw form may protect against major copy errors, but whenever bits move through a data stream there is the possibility of transient error. JPEG was considered lossy because it could not fully recreate the full raw data. 25 MB or more of raw image data per image is, or was, a hefty size to move across the bus in the early days, much less across the internet. Now we have so-called lossless JPEG; however, its accuracy is based on predictive sampling rather than a pure collection of bits per pixel.
When you say keeping your photos in the raw format, you open up a few other questions. First, if the photos are film-based, then over time your master image will fade and any copying will result in some form of loss. Second, with most phones and cameras you have no actual access to the raw images unless you start hacking, so your first image is already a less-than-exact copy of the intensities of the raw pixels in the sensor.

Once you have an image file, copying will almost always result in a perfect copy of the original. Bit error rates in copying and storage are exceedingly low, and there are numerous methods (checksums, ECC, redundant copies) to decrease the risk to insanely low levels. Data loss during a copy in modern computer systems is much more of an academic discussion than a real-life one.
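
For example (just an off-the-cuff Python sketch on my part, with made-up file names), a checksum comparison is all it takes to prove a copy is bit-for-bit identical:

    import hashlib
    import shutil

    def sha256_of(path):
        # Hash the file in 1 MB chunks so a large raw image need not fit in RAM.
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    # "original.cr2" / "backup.cr2" are placeholder names, not files from this thread.
    shutil.copyfile("original.cr2", "backup.cr2")
    assert sha256_of("original.cr2") == sha256_of("backup.cr2"), "copy differs"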



https://en.m.wikipedia.org/wiki/Lossless_JPEG

The referenced article gives no indication that lossless JPEG is a lossy compression scheme. It may well work like any of the popular lossless compression schemes such as Huffman coding or LZP, where data can be compressed and exactly restored. All compression schemes take advantage of some kind of pattern in the data; once the data does not match that pattern, the compressed output can easily be larger than the raw data. There is no compression scheme that guarantees compression of an arbitrary data stream.
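
A two-minute Python experiment (my own toy, nothing from the article) shows both points: a lossless codec restores the input exactly, and data with no pattern comes out larger, not smaller:

    import os
    import zlib

    patterned = b"RGBRGB" * 10000      # highly repetitive, compresses well
    random_ish = os.urandom(60000)     # no pattern for the codec to exploit

    for label, data in (("patterned", patterned), ("random", random_ish)):
        packed = zlib.compress(data, 9)
        assert zlib.decompress(packed) == data   # lossless: restored exactly
        print(label, len(data), "->", len(packed))
    # The patterned stream shrinks to a tiny fraction of its size; the random
    # one typically comes out slightly larger than it went in.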


Humans lose thousands of skin cells a day, yet the fabric of the persona stays the same. I think photo data is a little like that: a few stray bits lost here or there won't change the picture that much. However, it is possible to corrupt a significant bit, which would make decoding the picture impossible.

A raw digital photo stored as an array of RGB pixels is very tolerant of bit errors: a flipped bit changes one pixel without destroying the whole image. GIF or JPEG compression has the downside of being sensitive to errors and will not fail gracefully, but given the reliability of digital copies I would be hard-pressed to say a compressed image is less reliable than a photo in a drawer.
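
To make that concrete (again just an illustrative Python toy): flip one bit in a flat RGB array and you nudge one channel of one pixel by one level; flip one bit in a compressed stream and the result usually refuses to decode, or decodes to something else entirely:

    import zlib

    # Toy 100x100 "raw" image: a flat grey RGB byte array.
    raw = bytearray(b"\x80" * (100 * 100 * 3))
    raw[5000] ^= 0x01     # one flipped bit = one channel of one pixel off by one

    packed = bytearray(zlib.compress(bytes(raw)))
    packed[len(packed) // 2] ^= 0x01   # one flipped bit mid-stream
    try:
        out = zlib.decompress(bytes(packed))
        print("decoded, but the data no longer matches:", out != bytes(raw))
    except zlib.error as err:
        print("stream no longer decodes at all:", err)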

Data loss by accident is way more likely than data loss due to bit-rot.

--
Alvin Starr                   ||   land:  (647)478-6285
Netvel Inc.                   ||   Cell:  (416)806-0133
al...@netvel.net              ||
