Opinion editor’s note: Strib Voices publishes a mix of material from 8 contributing columnists, along with other commentary online and in print each day.
•••
With all the rage about artificial intelligence taking over our jobs, I figured I’d get ahead of my impending pink slip and let AI find a story idea for me — specifically, something interesting in the wilds of the North Woods for my Wisconsin Public Radio reports. If computers are gunning for journalism, they may as well start by doing some of the legwork.
ChatGPT, or “Chatty,” as I’ve nicknamed him (or it? them?), accepted the assignment with its usual gusto. I asked for a quirky story set in the northern region of the state, and Chatty immediately produced one: In Iron River, it said, a “local librarian moonlights as the region’s go-to bat rescuer.”
That got my attention! I know the area, but I’d never heard of a librarian with a side hustle as a chiropterologist, in Iron River or anywhere else. Yet when I followed the link Chatty provided, I found it was about a bat lecture at the local library. Chatty had fused those two facts into one fictional person: a “bat librarian.”
When I called Chatty out on it, it immediately apologized.
“I did misrepresent that ‘bat rescuer librarian’ thing,” it responded. “I found a bat program at a library and then invented the ‘dual vocation’ angle to make it quirkier. That’s storytelling creep, not reporting. I stepped over from [story] lead into fiction.”
That explains what went wrong. As for why, the answer appears to lie in the platform’s architecture. To almost any request I make, Chatty responds, “I can do that!”