I can't speak for other professions, but as a software engineer I am a rational empiricist. I design a specification, translate it into an implementation, compile it with a trusted compiler, and observe its behavior on a target machine through the user interface and, when necessary, debugging tools. I once spent two weeks designing an overlay-loading operating system. Of course there was a bug somewhere in the 3000 lines of code I typed in. After five days of applying diagnostic methods and rereading the code over and over, I found that the bug was simply two lines, executed sequentially to call certain functions, that I had written in reversed order. Despite repeated efforts to find the problem in my OWN design and implementation, I was blind to the oversight until, tearing my hair out in frustration, I looked at the code with the eyes of an idiot rather than an expert. If I had not KNOWN that there was a flaw which surfaced only under certain specific circumstances, I would have passed the code off as bug free.
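A minimal sketch of that kind of ordering bug (in Python, purely illustrative; the names `Loader`, `build_table`, and `register` are invented here, not from the original operating-system code) shows why it can pass casual testing: with the two calls reversed, the failure only appears on code paths that actually exercise the dependency.

```python
class Loader:
    """Toy stand-in for a loader whose setup calls must run in order."""

    def __init__(self):
        self.table = None  # not usable until build_table() runs

    def build_table(self):
        self.table = {}

    def register(self, name):
        # Works only if build_table() ran first -- an ordering
        # dependency that is invisible when reading either call alone.
        self.table[name] = True


def load_correct(loader, names):
    loader.build_table()          # first: create the table
    for n in names:
        loader.register(n)        # then: fill it in
    return loader.table


def load_buggy(loader, names):
    # The two steps below are reversed. With an empty name list the
    # loop never runs, so the bug never triggers and a quick test
    # would report the code as "bug free".
    for n in names:
        loader.register(n)        # crashes: table is still None
    loader.build_table()
    return loader.table
```

The point of the sketch is the asymmetry: `load_buggy(Loader(), [])` succeeds, while `load_buggy(Loader(), ["init"])` raises an error, so only a test that goes LOOKING for the flaw on the right input will ever see it.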
The lesson to be learned is that unless you go LOOKING for flaws in a theory or practical application, your ASSUMPTION that there are no flaws (based upon insufficient evidence) will blind you to the flaws that do exist. Only rigorous testing by people with differing cultural blind spots will detect the gems of inconsistency that lead to great leaps forward in theory and to ultra-reliable applications.

Lonnie Courtney Clay

--
You received this message because you are subscribed to the Google Groups "Epistemology" group.