Thank you for these links, which I will explore. I am reflecting on the following…
ChatGPT was first released by OpenAI on 30 November 2022.
In an article unrelated to ChatGPT, published at practically the same time (2 December 2022), Cory Doctorow makes the following point in How tech changed global labor struggles for better and worse:
The original sin of both tech boosterism and tech criticism is to focus unduly on what a given technology does, without regard to who it does it to and who it does it for. When it comes to technology’s effect on our daily lives, the social arrangements matter much more than the feature-sets.
This strikes me as precisely the place to start thinking from.
At the same time I happened to be re-reading Neil Postman's The End of Education: Redefining the Value of School (1996), which expresses similar concerns. In it Postman makes the point that what we need to know about important technologies “is not how to use them, but how they use us” (p. 44).
Postman goes on to say (p. 44):
I am talking here about making technology itself an object of inquiry, so that Little Eva and Young John in using technologies will not be used or abused by them, so that Little Eva and Young John become more interested in asking questions about the computer than in getting answers from it.
Postman elaborates on this (pp. 188-193, emphasis added), with specific reference to Marshall McLuhan:
McLuhan comes up here because he is associated with the phrase “the extensions of man.” [This brings me to] my third and final suggestion [which] has to do with inquiries into the ways in which humans have extended their capacities to “bind” time and control space. I am referring to what may be called “technology education.” … Technology may have entered our schools, but not technology education. …
McLuhan, while an important contributor [to the great story of humanity’s perilous and exciting romance with technology], was neither the first nor necessarily the best who has addressed the issue of how we become what we make. … This is a serious subject. …
[My suspicion for why there is no such subject in most schools] is that educators confuse the teaching of how to use technology with technology education. … As I see it, the subject is mainly about how [technologies] reorder our psychic habits, our social relations, our political ideas, and our moral sensibilities. It is about how the meanings of information and education change as new technologies intrude upon a culture, how the meanings of truth, law, and intelligence differ among oral cultures, writing cultures, printing cultures, electronic cultures. Technology education is not a technical subject. It is a branch of humanities. Technical knowledge can be useful, [but it is not necessary] …
It should also be said that technology education does not imply a negative attitude toward technology. It does imply a critical attitude. … Technology education aims at students’ learning about what technology helps us to do and what it hinders us from doing; it is about how technology uses us, for good or ill, and how it has used people in the past, for good or ill. It is about how technology creates new worlds, for good or ill. …
But in addition [to all the questions I have cited], I would include the following ten principles [in this subject] …
Finally, for now, I have just come across Noam Chomsky: The False Promise of ChatGPT (8 March 2023), a guest essay in The New York Times. I have not yet read this properly, but the following struck me as particularly insightful:
It is at once comic and tragic, as Borges might have noted, that so much money and attention should be concentrated on so little a thing — something so trivial when contrasted with the human mind, which by dint of language, in the words of Wilhelm von Humboldt, can make “infinite use of finite means,” creating ideas and theories with universal reach.
The human mind is not, like ChatGPT and its ilk, a lumbering statistical engine for pattern matching, gorging on hundreds of terabytes of data and extrapolating the most likely conversational response or most probable answer to a scientific question. On the contrary, the human mind is a surprisingly efficient and even elegant system that operates with small amounts of information; it seeks not to infer brute correlations among data points but to create explanations.
Next week I hope to integrate this practically with what I have written so far of our academic integrity policy.