ChatGPT Strikes at the Heart of the Scientific World View
That this AI is adaptive and can produce complex outputs is a technical triumph. But at its heart, it's still just pattern recognition.
Blayne Haggart, January 23, 2023
https://www.cigionline.org/articles/chatgpt-strikes-at-the-heart-of-the-scientific-world-view/

quote <<ChatGPT itself highlights the intellectual emptiness of the correlation-as-knowledge world view. Many people have remarked that the tool produces outputs that read as plausible, but that subject matter experts tell us are often “bullshittery.” Engineers will almost certainly design more-convincing chatbots. But the fundamental problem of evaluating accuracy will remain. The data will never be able to speak for itself.

This is the paradox at the heart of the correlations-based faith in big data. In the scientific world view, the legitimacy of a piece of knowledge is determined by whether the scientist followed an agreed method to arrive at a conclusion and advance a theory: to create knowledge. Machine-learning processes, in contrast, are so complex that their innards are often a mystery even to the people running them. As a result, if you can’t evaluate the process for accuracy, your only choice is to evaluate the output. But to do that, you need a theory of the world: knowledge beyond correlations.

The danger of a dataist mindset is that a theory of the world will be imposed, unthinkingly, on the algorithm, as if it were natural rather than someone’s choice. And wherever they come from, whatever they are, these theories will shape what the program considers to be legitimate knowledge, making choices to prioritize some information over others.>>