I have come across three articles/viewpoints worth further consideration:
- Moving slowly and fixing things – We should not rush headlong into using generative AI in classrooms (LSE Impact Blog, 1 March 2023, by Mohammad Hosseini, Lex Bouter and Kristi Holmes) | “Reflecting on a recent interview with Sam Altman, the CEO of OpenAI, the company behind ChatGPT, [the authors] argue against a rapid and optimistic embrace of new technology in favour of a measured and evidence-based approach.”
- ‘I didn’t give permission’: Do AI’s backers care about data law breaches? (The Guardian, 10 April 2023) | “Regulators around the world are cracking down on content being hoovered up by ChatGPT, Stable Diffusion and others.”
- AI is remaking the world on its terms, and that’s a problem (Fast Company, 19 April 2023, by Zachary Kaiser) | “Artificial intelligence is making it harder for humans to have agency in their own lives.”
While all three of these articles/viewpoints strike a chord with me, the one that resonates most deeply is “AI is remaking the world on its terms, and that’s a problem”.