Research Fraud Allegations Trail a German B-School Wunderkind
Ulrich Lichtenthaler was a research wunderkind. The German management professor, an expert on technology licensing and innovation, published more than 50 journal articles, was a visiting scholar at Northwestern University’s Kellogg School of Management, and won a business school department chairmanship—all before he turned 34 last August. The newspaper Handelsblatt in 2009 named him the top young business researcher in the German-speaking world.
Now, Lichtenthaler’s reputation is in tatters. In recent months, academic journals say they have retracted 13 of his articles and are scrutinizing others, after finding that he mischaracterized data and engaged in “self-plagiarism,” offering slightly different versions of the same material to multiple publications while claiming each article was original.
In an e-mailed response to questions from Bloomberg Businessweek, Lichtenthaler said that some of his work contained “unintended statistical errors. I deeply regret these errors and would like to emphasize that no attempt was made to deliberately influence the results.”
He declined to comment further, pending an investigation by the University of Mannheim Business School, where he is chairman of management and organization. The university says it expects to complete the probe by the end of this month. The WHU Otto Beisheim School of Management, where Lichtenthaler previously studied and worked, also is investigating.
Lichtenthaler isn’t the only German scholar facing accusations of academic misconduct. According to an unofficial tally kept by the blog Retraction Watch, scholarly journals since 2010 have retracted 78 articles by German authors—more than those of any country except the U.S. and China, both of which have far bigger research communities than Germany does. Since 2011, two Cabinet ministers have resigned after their doctoral dissertations were found to have been plagiarized. Seven other elected officials have had their doctorates revoked for the same reason.
“These ‘little missteps’ have become commonplace,” says Debora Weber-Wulff, a professor at the University of Applied Sciences in Berlin who is active in VroniPlag Wiki, a group that monitors academic fraud.
Why is a country renowned for efficiency so sloppy about science? One reason, Weber-Wulff says, is the German constitution, which guarantees tenured professors the right to pursue their research without interference. The government, mindful of the atrocities committed by state-controlled researchers under Adolf Hitler, stays in the background and relies on universities to police their own ranks. But scholars usually shy away from challenging their peers, especially those with impressive publication records, Weber-Wulff says.
German academics face pressure to step up their publication rate as universities vie for billions of euros in new research funds under a government “excellence initiative” launched in 2006, says Ingo Rohlfing, an assistant professor of management economics at the University of Cologne. “The research landscape got more competitive,” he says. “Publishing is the means to getting more funding.”
Lichtenthaler, who was churning out journal articles at a rate of about one per month, appears to have used the same data over and over, presenting it in different ways to produce seemingly new findings in each article.
“In some cases, measures from the survey were relabeled from earlier papers so one had to look carefully to see whether a given finding had already been published,” says Russell Coff, a University of Wisconsin management professor who edits the journal Strategic Organization, which retracted a 2009 Lichtenthaler article on technology licensing.
The most serious problem, Coff says, was that Lichtenthaler had labeled some variables as statistically significant when a quick glance at the data showed they were not. “It did not seem that the mislabeling was necessarily accidental,” Coff says, “though we cannot be certain.” Lichtenthaler “approached us asking to retract the article and wanted to be clear that he was being proactive,” Coff adds.
Other publications have flagged similar problems. Editors of the Journal of Business Venturing write in the current issue that they are retracting a 2008 article by Lichtenthaler because of “an error in statistical analyses, an omitted variable bias, and a ‘new’ measure that was not ‘new’ because it was already used” in an article he had published in 2007.
The Open Innovation Blog, which has followed the story closely, says the Journal of Product Innovation Management retracted two Lichtenthaler articles, withdrew three others that were awaiting publication, and told readers in an editorial that it was taking additional measures to guard against what it terms “academic research misconduct.”
While such mass retractions are rare, the practice of manipulating data is surprisingly common, says Leslie John, an assistant professor at Harvard Business School. She co-authored a 2011 study in which 60 percent of research psychologists admitted to overemphasizing certain data or ignoring inconvenient facts in their published work. Peer review by other scholars, the traditional method of screening journal articles, rarely catches such problems, John says, because reviewers usually don’t get to see the underlying research data.
Tweaking data is “kind of like taking steroids,” John says. “It can lead to cleaner, clearer results. In academia we have tremendous incentive and pressure to publish interesting results. The incentive is to publish, not necessarily to publish the truth.”