On Tue, Jul 30, 2019 at 7:05 PM Alan Gauld via Tutor <tutor@python.org> wrote:
>
> On 30/07/2019 18:20, boB Stepp wrote:
>
> > What is the likelihood of file storage corruption?  I have a vague
> > sense that in earlier days of computing this was more likely to
> > happen, but nowadays?  Storing and recalculating does act as a good
> > data integrity check of the file data.
>
> No it doesn't! You are quite likely to get a successful calculation
> using nonsense data and therefore invalid results. But they look
> valid - a number is a number...
Though I may be dense here, in the particular example I started with, the
total score in a solitaire game is equal to the sum of all of the preceding
scores plus the current one.  If the data in the file somehow got mangled,
it would be an extraordinary coincidence for every row to still yield a
correct total score when that total is recalculated from the corrupted data.

But the underlying question I am trying to answer is: how likely or
unlikely is it for a file to get corrupted nowadays?  Is it worthwhile
verifying the integrity of every file in a program, or, at least, every
data file accessed by a program, on every program run?  Which leads to
your point...

> Checking data integrity is what checksums are for.

When should this be done in normal programming practice?

-- 
boB
_______________________________________________
Tutor maillist  -  Tutor@python.org
To unsubscribe or change subscription options:
https://mail.python.org/mailman/listinfo/tutor
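[For concreteness, here is a minimal sketch of the checksum idea Alan
raises, using Python's standard-library hashlib.  The scores.csv name and
its two-column score,total layout are invented for illustration; the
thread does not specify the file format.]

```python
import hashlib
import tempfile
from pathlib import Path

def file_checksum(path):
    """Return the SHA-256 hex digest of a file's contents."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large files don't have to fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Demo: record a checksum at save time, then detect a one-digit change.
with tempfile.TemporaryDirectory() as d:
    scores = Path(d) / "scores.csv"          # hypothetical score,total rows
    scores.write_text("10,10\n15,25\n5,30\n")
    saved = file_checksum(scores)            # store this alongside the file

    assert file_checksum(scores) == saved    # unchanged file verifies

    scores.write_text("10,10\n15,26\n5,30\n")  # one digit flipped
    assert file_checksum(scores) != saved      # corruption detected
```

The usual practice is to compute and store the digest whenever the file is
written, then recompute and compare it each time the file is read; a
mismatch means the bytes changed, whatever the numbers inside happen to
look like.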