On Thursday 05 May 2016 17:34, Stephen Hansen wrote:

> Meh. You have a pedantic definition of wrong. Given the inputs, it
> produced right output. Very often that's enough. Perfect is the enemy of
> good, it's said.

And this is a *perfect* example of why we have things like this:

http://www.bbc.com/future/story/20160325-the-names-that-break-computer-systems

"Nobody will ever be called Null."

"Nobody has quotation marks in their name."

"Nobody will have a + sign in their email address."

"Nobody has a legal gender other than Male or Female."

"Nobody will lean on the keyboard and enter gobbledygook into our form."

"Nobody will try to write more data than the space they allocated for it."


> There's no situation where "&&&&&" and "     " will exist in the given
> dataset, and recognizing that is important. You don't have to account
> for every bit of nonsense.

Whenever a programmer says "This case will never happen", ten thousand 
computers crash.

http://www.kr41.net/2016/05-03-shit_driven_development.html
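The alternative is not to anticipate every conceivable piece of nonsense, but 
to validate instead of assume. A rough sketch, with rules invented purely for 
illustration (real name validation is much harder than this):

    def clean_name(name):
        # Reject obviously broken input loudly rather than quietly
        # assuming it "will never happen".
        if not name.strip():
            raise ValueError("blank or whitespace-only name: %r" % name)
        if not any(c.isalpha() for c in name):
            raise ValueError("name contains no letters: %r" % name)
        return name.strip()

Garbage like "&&&&&" or "     " now fails fast with a useful message, instead 
of sailing through because somebody decided it could never occur.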


-- 
Steven D'Aprano
