COVID-19 changed data collection. Without reliable data, what did researchers learn?
February 2020: Jason Thatcher, professor of Management Information Systems at the Fox School, launched a new survey to identify how training and professional development for IT professionals influence their career trajectories. It was going as smoothly as Thatcher and his team could have hoped.
Thatcher partnered with the Krcmar Lab at the Technical University of Munich, one of Germany’s top universities, and with one of the largest international tech companies in the world. After months of work, team members secured a meeting with the company’s vice president, a connection that would grant them unprecedented access to the company’s employees.
The first two rounds of surveys proceeded without a hitch. Abroad, the COVID-19 pandemic was spreading, but Thatcher and his team paid it little attention. They were focused on finishing what they had started.
“And then—kaput,” says Thatcher. Over the course of three weeks, the number of COVID-19 cases in Germany went from 16 to 150 to 4,838. The German borders closed. People started working from home. Unemployment rose dramatically.
It quickly became clear that the cross-time data would be unreliable. Such drastic economic changes introduced too many shifting variables for the team to draw any conclusive answer to their research question.
For Thatcher and his team, the excitement of the previous few weeks made the realization that the study had been sidetracked an even heavier blow to morale.
“It was crushing, there is no question about it,” says Thatcher. “We didn’t know what to do.”
However, once acceptance set in, they re-evaluated their situation. They could not answer their original research question, but they had still learned something.
“Why don’t we write a paper about what happens when everything blows up?” he says. “So that’s what we did.”
They retraced their steps and created a timeline of their entire research process. They plotted it alongside critical government announcements about COVID-19, pinpointing the moments when they had missed signs of an emerging crisis.
The final paper, “When your data has COVID-19: how the changing context disrupts data collection and what to do about it,” takes the form of a confessional tale, an unorthodox approach for an accomplished team of researchers.
“The confessional tale is one of those ones you hear about in your research methods [classes] as a grad student, but you never actually do it,” says Thatcher.
Researchers use confessional tales to recount lessons from studies that did not yield trustworthy results due to methodological issues. Drawing on his own tale, Thatcher develops several takeaways that researchers can use to account for the impact of context on data collection.
“The first thing is you need to keep track of the changing environment,” explains Thatcher. Researchers usually consider only environmental changes within a firm, but factors external to the firm are equally important. “And that doesn’t just mean asking questions about how you perceive external markets. It is actually looking at things like the Federal Reserve reports, Centers for Disease Control reports.”
“The second thing I would say is to hedge your bets,” Thatcher says. When conducting surveys, ask open-ended questions to gather qualitative data on the context that might complicate the results. A question such as, “Has anything happened in the past month that might have shaped your answers today?” can expose extraneous variables early in the research process.
“The third thing is go through the normal battery of tests, overtly, and report them for the possibility that these extra things impacted your analysis,” Thatcher continues. In other words, be transparent about how the context might be influencing your data. If possible, adjust your study to include control variables that can help gather more reliable data.
So despite those challenges, what can Thatcher tell us about training and IT professionals?
“I can tell you that during a pandemic, or during an emergency, the normal drivers of turnover don’t work,” he explains. “We found even people that were really unhappy in their jobs were staying put.”
Thatcher was unable to confirm or reject his hypothesis that employees who receive new job training are more likely to quit because they feel empowered to test the job market. The team’s takeaways raise concerns about how well research conducted during COVID-19 will hold up over time.
“I predict that in 15 years, policymakers will take the last three years and they will just have to drop them out of the panel of data,” he says.
Only time will tell whether this prediction holds true. Thatcher expects that his paper may not be popular among other researchers who regard the findings they gathered during the pandemic as valid. He understands the frustration as much as anyone.
But when the world faces other drastic upheavals, such as shifting weather patterns or Russia’s invasion of Ukraine, people’s behavior changes too. In today’s world, it is important that researchers pay close attention to the context in which they are collecting—and using—their data.
This article originally ran in On the Verge: Business With Purpose, the Fox School’s flagship research publication.