How a journalism professor got his class to put AI to the test and think like an AI model in the process

John Wihbey’s class, AI and Media Industries, is designed to get students working with AI to understand the uses –– and limits –– of these tools in journalism, social media and public relations.

Photo by Matthew Modoono/Northeastern University

In Northeastern University’s AI and Media Industries class, John Wihbey wants his students to not only use artificial intelligence but to think like it.

Wihbey, an associate professor of journalism at Northeastern and director of the AI-Media Strategies Lab, says it’s important for the next generation to understand a technology that is “going to reshape, affect [and] transform news media, social, public relations and strategic communications.”

“Once you get over the initial awe and you get over the initial disappointment that they’re not perfect, you start to realize that these are unique tools, but they are tools,” Wihbey says. “They can make your life easier and you can do work more efficiently, but you have to be able to use them in a really smart and targeted way and know what they’re good at and what they’re not good at.”

In Wihbey’s graduate-level course, students are exposed to ChatGPT, Claude, Gemini and other AI models in a variety of ways. They spend hours using these tools in “a sort of experiential education paradigm” that allows them “to learn to think like that model and … understand its strengths, its weaknesses [and] the limitations,” Wihbey says.

Students take the time to research potential use cases for AI in areas like social media and journalism, but Wihbey emphasizes the value of putting those lessons to work.

Part of the class involves partnering with nonprofits through Northeastern’s Service Learning program to find ways of integrating AI into their work. That could look like helping an organization map out a communications strategy or assembling historical archives.

“It has a sort of forcing function where you have to do something that’s not in the abstract but really think about how we would use these tools to make an impact in the real world with an organization that actually has needs,” Wihbey says. “That’s a terrific way of serving the community but also having a rich intellectual exercise and having that intellectual exercise might mean realizing that not everything is an AI problem.”

In another part of the class, Wihbey tasked his students with scoping out investigative stories that could potentially use AI. Students didn’t go as far as writing these stories, but they did pitch them to Jill Abramson, former executive editor of the New York Times and a distinguished professor of the practice at Northeastern.

Wihbey and his students approached all of their work with a “rigorous skepticism,” Wihbey says, acknowledging the pitfalls of the technology, like its tendency to hallucinate. His class had to reckon with some of the questions that still surround AI, including how much of what it produces belongs to a human and how much to a machine.

But Wihbey says these are the questions worth asking. He hopes his students are now better equipped to answer them.

“These are questions for every professional across the world and in almost every walk of life: Where is that line?” Wihbey says. “What we tried to do is just put in a lot of intellectual capital on our inputs and produce deliverables where it was substantively shaped by a lot of human thinking and a lot of craft and a lot of care and a lot of research. But the models are helping us in important ways, and I would like to think we learned more than we could’ve without them.”