Fri 9 Jul 2010
By Sarah Stanley
In January 2006, two high-profile papers by South Korean researcher Woo Suk Hwang were retracted after they were found to be based on fabricated data. Hwang and his lab members had claimed to be the first to derive patient-specific stem cell lines from cloned human embryos. But the work was discredited when investigators found that, among other forms of data fraud, the team had misleadingly altered images of stem cells.
The incident, which made headlines worldwide, is a perfect example of scientists succumbing to “the temptation of image manipulation,” according to Mike Rossner, executive director of the Rockefeller University Press and former managing editor of the peer-reviewed research publication Journal of Cell Biology (JCB).
Altering digital images is easy for anyone with access to Adobe Photoshop or similar digital image editing software. Many nonprofessionals regularly touch up their personal digital photos. It seems natural that scientists, who typically present their data in the form of images, would find it helpful to edit photos to clarify their results. But researchers who modify figures risk misleading their readers, whether or not they intend to deceive.
Rossner, who gave the MBL Special Lecture in Bioethics last week, leads a rigorous effort to detect image manipulation before papers are published in JCB. In many cases, detection is as simple as adjusting contrast or examining mirrored copies of an image in Photoshop. These techniques can reveal problems such as deletion or addition of part of an image, duplication of an image, and misleading contrast adjustments. JCB examines every image in papers submitted for publication to ensure that no manipulation violates its detailed guidelines.
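The contrast trick works because a patch pasted onto a “black” background is rarely exactly the same shade as its surroundings; brightening the image makes the seam jump out. Here is a rough sketch of the idea in Python with NumPy. The synthetic gel image, the intensity values, and the gain factor are all illustrative assumptions, not JCB's actual screening procedure:

```python
import numpy as np


def boost_contrast(img, gain=15.0):
    """Crude brightness/contrast boost: multiply intensities and clip
    to the displayable 0-255 range. Bright features saturate, but
    faint differences in the near-black background become visible."""
    return np.clip(img * gain, 0, 255)


# Synthetic "gel" image (hypothetical): a dim noisy background, one
# legitimate bright band, and a pasted-in patch whose background is
# only ~3 gray levels off -- effectively invisible at normal contrast.
rng = np.random.default_rng(0)
img = np.full((100, 100), 10.0) + rng.normal(0, 0.5, (100, 100))
img[40:60, 10:90] += 120.0   # a legitimate band
img[20:35, 30:70] += 3.0     # the pasted patch

enhanced = boost_contrast(img)

# Mean intensity gap between the pasted patch and untouched background,
# before and after the boost: the seam goes from imperceptible to obvious.
diff_before = img[20:35, 30:70].mean() - img[0:15, 30:70].mean()
diff_after = enhanced[20:35, 30:70].mean() - enhanced[0:15, 30:70].mean()
print(f"before: {diff_before:.1f} gray levels, after: {diff_after:.1f}")
```

The same reasoning explains why such manipulations survive casual review: at publication contrast the altered region is indistinguishable, and only a deliberate contrast sweep exposes it.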
JCB guidelines divide manipulation misconduct into two categories. Inappropriate manipulation violates the journal’s guidelines but does not lead to misinterpretation of data; fraudulent manipulation does. Rossner reports that more than 25 percent of all manuscripts submitted to JCB contain at least one inappropriately altered image that must be remade, while about one percent contain fraudulent images, which keeps those papers from being published.
Rossner shared some of the responses JCB receives from investigators when they are informed of inappropriate or fraudulent image manipulation in their manuscripts. Some are indignant. “Everyone does it,” read one author’s e-mail. Others insist that the manipulation is acceptable because the altered figure is more representative of their overall data set. But, Rossner says, “We do get mostly appropriate responses from authors.” Indeed, many authors are grateful to be notified of image issues before publication, since the repercussions of publishing a paper with fraudulent data can ruin careers.
After presenting tips on how to avoid inappropriate image manipulation, Rossner shared a new tool that allows readers to view the original, raw images that authors obtained from their lab equipment. Called the JCB DataViewer, it is currently used only for images in JCB articles but, according to Rossner, “we hope this may become a model for a standard for publication of image data in the publishing industry.” JCB developed the DataViewer in collaboration with Glencoe Software, whose platform builds on OMERO, open-source software from the Open Microscopy Environment project co-developed by Jason Swedlow, co-director of the MBL’s Analytical and Quantitative Light Microscopy course.
Rossner finished his talk by emphasizing that science does not equal art. “You are looking for the most accurate representation of your data, not the prettiest representation of your data.”