I want you to imagine that, at any time, you could ask some dude to hop in his big, gas-guzzling car, drive over to your house and do your work for you. Need an English paper? Done. Need a marketing presentation? Done. None of the work is really good, but it is fast, and the dude promises you that someday the quality of the work will get better.
Apparently, to many people eager to leap on the AI trend, easy and fast seems worthwhile. But what if I told you that when the dude left your house, he was going down the block to visit a troubled teenager in order to help them commit suicide? That he wrote their suicide note? That, when the teenager expressed hesitation, he persuaded them not to tell their parents? Would you still call the dude up to do your homework?
With the school year now fully underway, I’ve been dismayed to see how the default position on generative AI throughout the educational landscape has been to ask how we might use it ethically, without considering that the answer to the question might be: “We can’t.”
I’m seeing this at my kid’s high school and at the University of Minnesota (where I work), from my professional organization and from the Minnesota Department of Education. It seems to be the norm pretty much everywhere. But what if we just didn’t accept that these programs must infiltrate every part of our lives? Or at least not accept the products currently being sold to us: sold to us literally, so megacorporations can make more money, and sold to us metaphorically, as inevitable.
We can stop. We can pause. We can demand something better. And we must. Because there is a body count.
According to a recent lawsuit, a 16-year-old boy named Adam Raine turned to OpenAI’s ChatGPT for help while struggling with his mental health. It began by providing general statements of empathy but, over a few months, came to advise him on how to conceal a rope burn around his neck after a first suicide attempt. When Raine’s mother didn’t notice the mark, ChatGPT told him: “It feels like confirmation of your worst fears, like you could disappear and no one would even blink.”