MaryJo Webster has been data editor at the Star Tribune since 2015. She started her career as a reporter at small daily papers in Minnesota and Wisconsin before attending the University of Missouri-Columbia to specialize in investigative reporting and data journalism. She has worked for Investigative Reporters and Editors, the Center for Public Integrity in Washington, D.C., USA Today, and the St. Paul Pioneer Press. MaryJo teaches at the University of Minnesota and trains Star Tribune reporters on working with data. She is a regular speaker at journalism conferences.

Pretty much your entire journalism career has been data. What about data and the analysis of it appeals to you?

I realized early on in my career that I am the kind of person who wants to know every last detail and fully understand anything I write about. As a beat reporter, I would ask question after question and gather far more material than was needed for my stories. Once I ventured into data, I realized that this was the ultimate way to truly understand a topic. Analyzing data allows you to see the minute details, and then to step back and see the big picture, too.

How has your job changed over the years?

The main change in my job came in the late 2000s, when news organizations began using data for searchable databases and interactive maps on their websites. This required me to learn new digital skills, but it also provided a much bigger outlet for my work. Instead of merely publishing a few graphics and some paragraphs summarizing my findings in a story, I can now share the data with readers and let them explore it themselves.

How many data records requests do you generally have outstanding at a time?

I usually have at least a couple of pending data requests at any given time, maybe more if I'm in the midst of a big project that requires collecting data from numerous government agencies. However, in recent years I haven't had to rely as much on official data requests because so many agencies are making data readily available on their websites.

What is the longest you've had to wait to get a data request answered? What's the most you've been charged?

Among requests made to local or state agencies in Minnesota, the longest I've had to wait is about six months, but I've had some requests to federal agencies take up to two years to complete. Under Minnesota law, government agencies may charge a requestor for the time it takes to find and retrieve data, and typically agencies will require us to pay for larger datasets we request. Usually these bills are between $50 and $100, which means the agency spent about one or two hours working on our request. It might be higher for a large, complicated dataset. For example, I recently paid $250 for a large dataset that we had never requested before. Some agencies have asked me to pay as much as $10,000 for data, but after I point out the limitations on copy costs specified by the Minnesota Data Practices Act, the final amount usually ends up being much more reasonable.

Is it harder to get data in Minnesota compared to other states that you've worked in?

Compared to other states, it's relatively easy to get data in Minnesota. Part of that is because Minnesota has a fairly good open records law, which specifies that all government data is public unless otherwise exempted in law. Not all states operate that way. Beyond that, we have a lot of government agencies (at both the state and local levels) that seem to embrace this transparency. A fellow data journalist who works at the Boston Globe repeatedly tells me how jealous he is of the work I've been doing in recent years, because nearly every big project I've worked on has relied on data that we can get in Minnesota but the public can't get in Massachusetts. On the other hand, I'm jealous that he gets to work on the Spotlight investigative team that was featured in a famous movie.

As the news industry continues to shrink, do you worry about the future of this type of investigative reporting, or do you see more large and medium-sized news outlets putting an emphasis on it?

I think data journalism is helping news organizations do a better job of serving as government watchdogs, which in turn produces a greater quantity of important and compelling stories for readers. More and more news organizations are hiring data journalists and paying for reporters to get training in how to analyze data themselves. When I first started in the late 1990s, roles like mine existed only at large regional papers (such as the Star Tribune) and the big national outlets. It was typically one person attached to an investigative team, working primarily on the large investigative projects. As the industry started to shrink in the mid-2000s, we saw these positions cut, as well. One way to measure this is to look at attendance at the largest data journalism conference, hosted by Investigative Reporters and Editors each year. At my first conference, in 1999, there were just over 500 attendees. By 2009, that had dropped to just over 200. Many of us were worried that our niche would not survive. But that was also when news organizations started expanding the role that data played on their websites, and our ranks swelled. In 2013, the conference attracted over 600 people. The following year, attendance was almost 1,000, and it has been around that mark ever since. Now the term "data journalist" covers a wide array of skill sets and roles in newsrooms, including much more involvement in daily news stories.

Do you have a favorite project that you've worked on?

My favorite project was Denied Justice, a 2018 series on how Minnesota's criminal justice system is failing in its handling of sexual assault investigations. I got into journalism to hold the powerful accountable and give voice to the voiceless, and Denied Justice served that purpose far better than anything else I've ever worked on. It was so heartwarming to hear from victim/survivors that our stories were helping them heal, and to watch as changes have been made in police departments around the state. From a data perspective, it also was the ultimate challenge because we had to build our own database and come up with our own methodology (largely based on the work of academics) to measure these failures.