Just how rampant is scientific misconduct? In episode 51, Elisabeth Bik talks with us about her research suggesting that as many as 35,000 papers in biomedicine journals may be candidates for retraction due to inappropriate image duplication. Her open-access article, “Analysis and Correction of Inappropriate Image Duplication: the Molecular and Cellular Biology Experience” was published on September 28, 2018 with Ferric Fang, Amy Kullas, Roger Davis, and Arturo Casadevall in Molecular and Cellular Biology.
Websites and other resources
- Elise’s website Microbiome Digest and blog Science Integrity Digest
- Elise in the news
- Example #1, #2 and #3 of “Where’s Waldo” on Elise’s Twitter
- Elise’s 2016 article “The Prevalence of Inappropriate Image Duplication in Biomedical Research Publications”
- Interview with Elise on Everything Hertz
- Profile of Elise on The Scientist
- New policy at PLOS ONE and PLOS Biology requiring raw blot and gel image data following publication of Elise’s article
- Publons article on peer review for beginners
- Fiji image software
Media and Press
Bonus Clips
Patrons of Parsing Science gain exclusive access to bonus clips from all our episodes and can also download mp3s of every individual episode.
Support us for as little as $1 per month at Patreon. Cancel anytime.
🔊 Patrons can access bonus content here.
We’re not a registered tax-exempt organization, so unfortunately gifts aren’t tax deductible.
Hosts / Producers
Ryan Watkins & Doug Leigh
How to Cite
Watkins, R., Leigh, D., & Bik, E. M. (2019, June 15). Parsing Science – Double Trouble (Version 1). figshare. doi:10.6084/m9.figshare.8279714
Music
What’s The Angle? by Shane Ivers
Transcript
Elisabeth Bik: I think one of the things that I try to accomplish is making people aware that science — as in any other profession — has a dark side.
Ryan Watkins: This is Parsing Science. The unpublished stories behind the world’s most compelling science as told by the researchers themselves. I’m Ryan Watkins…
Doug Leigh: And I’m Doug Leigh. The well-known aphorism that “there are three kinds of lies: lies, damned lies, and statistics” is often — and erroneously — attributed to Mark Twain. But whatever its origin, no one in the 19th century could have foreseen the advances in biomedicine that have occurred since Twain’s time, nor the unethical behavior that has accompanied some of it. Today, in episode 51 of Parsing Science, we’re joined by Elisabeth Bik who’ll discuss her research suggesting that as many as 35,000 papers in biomedicine are candidates for retraction due to scientific misconduct. Here’s Elise Bik.
Bik: Hi! My name is Elisabeth Bik. I am a microbiologist; I grew up in the Netherlands and did my PhD there, and I moved to the US in 2001 and started to work at Stanford University, again in microbiology. But around 2013 or 2014 or so, I started to become more interested in science misconduct, and that became my strange hobby. And so I actually quit my job, and now I want to focus this next year on doing more searches, just basically taking a sabbatical and looking for more cases of image misconduct. So nowadays it’s my full-time job.
Types of images examined
Leigh: Various types of images are used in biomedicine to provide evidence supporting the claims that researchers make in their published articles. Chief among these is the Western blot, a technique used to detect specific proteins in a sample of tissue or cells after they have been separated on a solid, jelly-like material called a ‘gel’. We began by asking Elise to describe the various types of images that she examined for evidence of inappropriate duplication.
Bik: As in any science paper, images illustrate findings that the researchers have made. So you can say, oh, you know, we treated cells with a particular compound, and that increased the expression of a particular protein of interest. And to illustrate that, you actually show a photo of the blot. We focused specifically on these Western blots, but we also looked at other images such as microscopy images or FACS images, which are sort of where you measure whether or not cells have a particular marker, and the result is a figure that contains a lot of little dots. And so even though they’re not technically photographic images, we found that these images sometimes are manipulated, so I also included them in my screen. But most of the images are photos, either of blots of a protein, blots of a DNA gel, or of a microscopy image, so basically a photo of cells. And images are a little bit irregular, like our faces are all different, or like clouds: if you look at the clouds, every cloud is a little bit different, and most of us tend to see things in clouds, right? You look at the clouds in the sky and you think, oh, that looks like a dog, or it looks like a monkey or a tree. But if you were to look at the sky and see two clouds that are exactly the same, that is something that would never happen. And so similarly, if you look at these images of protein bands or DNA bands in a gel, all these bands are slightly different from each other: they have little ridges and irregularities, they have slightly different shapes. So if you focus on that, you can see that every blot is unique, and if you see the same band — or the same stretch of bands — multiple times in the same paper, that cannot happen by accident. Rather, there’s some duplication going on, and you can spot that.
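For readers who would like to play with this idea themselves, here is a minimal, hypothetical sketch of how a duplicated band might be flagged automatically: it slides a cropped band across a figure panel and looks for a near-perfect normalized cross-correlation match. This is only an illustration of the concept Elise describes, not the method used in her study; the file names and the 0.95 threshold are placeholders.

```python
# Minimal sketch: flag a possibly duplicated Western blot band by sliding
# a cropped band across a figure panel and looking for a near-perfect
# normalized cross-correlation match.
# File names and the 0.95 threshold are placeholders for illustration.
import numpy as np
from skimage import io
from skimage.feature import match_template

panel = io.imread("figure_5_blot.png", as_gray=True)         # full figure panel
suspect = io.imread("figure_1_band_crop.png", as_gray=True)  # band cropped from elsewhere

# Normalized cross-correlation: values near 1.0 mean the cropped band
# reappears almost pixel-for-pixel somewhere in the panel.
response = match_template(panel, suspect)
peak = float(response.max())
row, col = np.unravel_index(response.argmax(), response.shape)

print(f"Best match score: {peak:.3f} at (row={row}, col={col})")
if peak > 0.95:
    print("Near-identical region found; worth a closer manual look.")
```

A high score is only a prompt for human inspection, since legitimately re-used loading controls or heavy compression can also produce strong matches.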
Origin of Elise’s interest in image duplication
Watkins: The identification of inappropriately duplicated images in scientific journals is definitely a niche occupation. So Doug and I were curious to learn what got Elise interested in the topic.
Bik: I don’t actually remember the one moment that started it off, but I did listen, I think, to maybe a radio podcast, or I saw something about plagiarism in science: that people published papers that were taken from other people’s writings. And they said, oh, you can just do a Google Scholar search and, you know, you’ll find sometimes that some text is duplicated; it’s written by one person, and then another person, another scientist, might take it and put it in their own paper. And that sounded really interesting. So I took one of my sentences that I had written for a review paper and put it in Google Scholar, and sure enough I found my own paper, obviously, but I also found another paper that came out after I had written mine, and they had taken my sentence and put it in their paper, and I’m like, no, that’s my sentence. So yeah, that got me started, and I was very surprised and mad, because I thought that scientists were honest people, like myself, and I had never thought that scientists would do such a thing. So it ruined my view of the honorable scientist, but it also got me very interested in the dark side of science. And so I started to investigate plagiarism first, and that led me eventually to image duplication, which is what became this paper.
How Elise discovered her skill for detecting image duplication
Leigh: Ryan and I first became aware of Elise’s work through her Twitter feed, on which she often posts “Where’s Waldo”-like challenges to find suspicious elements in images published in biomedicine journals. You can see some examples at www.parsingscience.org/e51. She has an almost preternatural ability to see patterns in images that have slipped past even the most seasoned peer reviewers. Given that spotting questionable images is no easy task, Ryan and I were interested in learning more about how she discovered that she had this skill in the first place.
Bik: I’ve always, I guess, had this skill, because when I look at the tiles in a house or in a bathroom or so, I tend to see repeating patterns. And when I actually selected tiles for my own house, I drove everybody in the tile shop mad, because I can see that, you know, these porcelain tiles only have four patterns, and I can see they’re the same. I wanted a tile that has way more patterns; I don’t want to see the same tile over and over again, and that’s very hard: you basically have to buy natural tile, or you’ll see these repeating patterns. So I’ve always had an eye for that, but I never got to put it to use until I started to work on image duplication. I don’t see it as anything special. I’ll just flip through the images in a paper, and I will try to remember all the images that I see within that paper, and then I’ll see, oh, this is the same blot that I’m seeing again: I first saw it in figure one and now I see it again in figure five. Is that the same experiment, or is this blot representing something else? But it is hard to remember all the figures within a paper if there are many of them, say ten different figures where each figure is composed of five or six different panels; it becomes very hard to remember all of them, so I might not find all the duplications. But all the ones I am finding are ones that, I guess, the reviewers or the editors did not see, because these were published papers and I’m spotting these duplications in them. So I might have this skill, but I’m also lacking another skill that most people actually do have: I’m very bad at recognizing faces.
Three types of image duplication
Watkins: According to Scopus, the largest database of abstracts and citations for peer-reviewed academic literature, articles appearing in the journal Molecular and Cellular Biology are cited more often than those in any other molecular biology journal. Given the impact that such a journal has, it’s impressive that the professional organization which publishes it, the American Society for Microbiology, was up for shining a spotlight on misconduct in its own journal. So Doug and I asked Elise to describe the three types of image duplication that she and her colleagues examined in this study.
Bik: We categorized these images into three categories. The first one is a simple duplication: it’s basically where the exact same photo is published two times. There’s no manipulation; it’s just that the same picture is published twice, and that could just be an honest error where, you know, imagine you make a photo album and you just happen to pick the same photo twice. So you might see the same photo on page one and again on page 13; it’s the exact same photo, and that could just be a simple error. So that’s the simplest category. Then we have category two, which we call duplication with repositioning, and that could be where the photo is either rotated or flipped or overlapping. So it might be a photo of a couple of cells, and then there’s another photo, also of a couple of cells, but you can see these photos have a slight overlap with each other, even though they’re supposed to illustrate two different experiments. If you look carefully, you can see they overlap, so they must have been derived from the same sample: they just took one photo more to the left and another more to the right, but there’s an overlap. Or it could be, like I said, a mirroring, a duplication with rotation or flipping of the image. So that’s our category two, repositioning. And then category three is a photo in which, within the same photo, we see the same elements twice. So that would be, say, a microscopy photo where you see the same cell or the same group of cells multiple times, or a Western blot photo where you see the same bands twice within the same photo, and it’s really hard to imagine that that could have happened by accident. Of course, we never know if something is an accident or done on purpose, but category three, where we have duplicated elements within the same photo, is the most likely to have been done on purpose.
Image “enhancements”
Leigh: Elise examined 960 papers published in Molecular and Cellular Biology from 2009 to 2016 and found that 59, or 6.1%, contained inappropriately duplicated images. In addition to the duplicated images, she also identified images that were either beautified or spliced. We asked her to describe what these image enhancements refer to.
Bik: Beautification of an image is where, let’s say, we have a Western blot and we have a particular band of interest in a certain area, but in another area of the gel that is not important there’s a big background stain, a big blob, or, like, something ugly, and you can maybe see evidence that the authors have tried to take that stain away and, like, beautify it. So maybe they copy-paste a clean area over that big blob, but it’s not in the area of interest, not over the bands that the image is actually showing; they didn’t do any beautification in that area, so it’s in a part of the gel that is not important. It’s not really changing the message of the gel, it’s just making it look a little bit better. But most journals now have rules against these beautifications, because it is, of course, a slippery slope: if you allow a person to clean up a smudge in a corner of the gel that is not important, then deciding which areas of the gel are important might become a problem. Splicing is copy-pasting and making a new image out of two other images. It’s sort of like when you make a group photo of your lab and there’s one person missing because they were on vacation, but you have another photo, and you copy-paste that person into the background, and suddenly that person looks like they’re in the group photo. So you make an image that is composed of two other images, but there’s no duplication. If you allow splicing, you can actually take a band that was, you know, a 25-kilodalton band and make it look like a 15-kilodalton band, because you splice the images together but also move the bands a little bit in height. And that sometimes happens when people do experiments and they have a blot with, I don’t know, ten different experimental conditions on that particular blot, but in the paper they decide to only show five, so they take out the five middle ones, for example, and then they sort of glue together, in Photoshop, the left side of the image and the right side, taking out those middle five lanes. It is sometimes allowed in journals under a very specific condition: you have to show a big black or white vertical line in between these two halves of the photo, to really show that these two parts were not originally next to each other. And more and more journals now will also ask authors to submit a photo of the original blot, because it’s a slippery slope, where it’s hard to tell what an author did to a particular photo and whether things are still next to each other at the same heights and with the same intensities.
How authors responded to detection of image duplication
Watkins: Elise reached out to many of the authors whose articles appeared to have inappropriately manipulated images. We were curious how these authors responded to her inquiries about their papers, and what she made of those responses.
Bik: So most authors will reply that they made an error, and most journals will accept that. And I sometimes ask myself, is that really an error? It’s really hard to explain how I can see the same group of cells twice within the same image; you might say it’s an error, but I don’t know, it’s really hard to accept these excuses sometimes. It’s like “the dog ate my homework”: you cannot always accept that. But I think whether or not people did it on purpose is maybe less important. I think the most important thing, from my point of view, is that the science needs to be corrected. Whether or not people did it on purpose is, you know, something I’m definitely interested in, but with respect to the science, the correction of a paper would be a good thing no matter what the excuses are. If they correct the science and provide a new figure that can now be included in the paper, and the journal issues a correction, that would be a good thing for science. At least now we know, hopefully, that the science is correct, and people won’t do it again. But yeah, I sometimes ask myself whether they really did that by accident.
Conclusions and implications
Leigh: Prior to the article that we’ve been speaking about, Elise published another analysis in which she examined 20,000 papers from 40 biomedical journals. She found that approximately 1 in 25 papers contained at least one inappropriately duplicated image. She tells us more about what she concluded from that work after this short break.
Advertisement: sciencepods.com
Leigh: Here again is Elise Bik.
Bik: One of the things that we considered when we did our 20,000-paper search is, you know, how much time did I spend on finding these images, but also how much time does a journal then need to spend on addressing these issues. When a reader finds problems in a published scientific paper, the correct action usually is for that concerned reader to write to the editor of the journal, and then they will start an investigation. They will contact the authors and ask: a reader found a problem in your figure, can you explain? And then maybe the author doesn’t respond, or, you know, there’s usually a lot of back-and-forth, and it takes a lot of time for an editor and the staff of the journal to follow up on these concerns. So the idea we had is that it’s probably much better for a journal to do a little bit of extra screening upfront, before a paper gets published. You know, an author sends in a paper and you’re in contact with them, their email is current, so you can ask the author to explain a duplication and they can easily fix it. If you try to contact an author years after they’ve published the paper, the blot might be gone or, you know, the lab books might be at a different location or have been tossed away. It’s much harder to follow up on these images and to find out what really happened. So if you do it in the moment, when an author submits a paper, it probably takes much less time to fix these problems, plus you save everybody the embarrassment of having these duplicated images published. You give the author a chance to gracefully correct the error, and you give the reader more certainty that the images have been screened. It’s very similar to a plagiarism check: most journals will check for plagiarism once they have accepted papers, before they go out in print, whether, you know, real print or print online. So we thought it’s probably much easier to build in a check for duplicate images, similar to a plagiarism check, before these papers are accepted and published.
Image comparison software
Watkins: Though Elise’s primary means of detecting manipulated images is her own eyes, Doug and I wondered whether there might be software that could be used to automate the process. Here is what Elise had to say.
Bik: There is software, for example tools that work with Photoshop, that can actually detect differences in background, say pixelation or compression artifacts. I know these tools exist, but I wouldn’t know how to operate them. So I just use my eye, and sometimes I might use Preview, which is on a Mac; it’s a very simple piece of software. It has these sliders where you can increase the contrast, you know, make the picture basically a little bit darker or a little bit lighter, and I might play with that sometimes to bring out the background in these images. But I don’t use any specialized software such as Photoshop, although I know that some editorial staff at a journal might have more specialized tools within Photoshop that can look specifically for these things. And you can also play with false color, so occasionally I might do that when I see something and I want to convince myself that I indeed see a duplication. For that I’m using Fiji, which is image-analysis software where you can make a false-color rendering of a black-and-white photo. Basically, different gray scales become colors, and it becomes a little bit easier to see similarities or differences between different bands.
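Fiji does this with its built-in lookup tables (LUTs), but the same idea can be sketched in a few lines of Python. The snippet below is only an illustration of the false-color trick Elise describes, not part of her workflow; the file name is a placeholder.

```python
# Minimal sketch of a false-color rendering of a grayscale blot photo,
# similar in spirit to applying a lookup table (LUT) in Fiji/ImageJ.
# "western_blot.png" is a placeholder file name.
import matplotlib.pyplot as plt
from skimage import io

blot = io.imread("western_blot.png", as_gray=True)

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
ax1.imshow(blot, cmap="gray")
ax1.set_title("Original grayscale")
ax2.imshow(blot, cmap="viridis")  # map gray levels onto colors
ax2.set_title("False color")
for ax in (ax1, ax2):
    ax.axis("off")
plt.tight_layout()
plt.show()
```

Mapping gray levels onto a color scale makes small intensity differences in the background easier to see, which is often where repeated texture gives a duplication away.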
Absence of image duplication prior to 1997
Leigh: Elise found no evidence of image manipulation in the articles published in the journal prior to 1997. Ryan and I wondered whether it might be that the necessity of hiring other professionals to assist in the creation of images in that largely pre-digital era slowed down the process sufficiently to prevent errors.
Bik: Indeed. You know, 20 years ago, I remember I would take my blots to the photographer at my institute and they would make a photo, and you would have to submit five copies of your manuscript and five copies of your photos, and that whole package was sent to the journal. And these five copies were then actually physically sent to different reviewers with the original photos. So the whole process of submitting a paper was much, much slower and much more work than it is now, when you just do everything electronically. With packages like Photoshop it becomes much easier to manipulate photos. So that is one thing: it’s easier to manipulate a photo, whether by accident or on purpose. And another thing is that it’s much easier to submit it; the whole process of submitting papers is much faster, so you have less time to think about it. Papers have also become much more complex. If you look at papers from 20 years ago, you had one Western blot or one sequencing gel, you know, a couple of figures, but now figures are much more complex, there are many more figures within a paper, and there are also many more data that aren’t photographic images. So if you think about photo manipulation, that is something we can still see and detect, but manipulation in tables or line graphs is almost impossible to detect. So I don’t know; it’s really hard to know if these things already existed 20 or 30 years ago in tables and line graphs. That’s really hard to imagine, but it might have been the case. But in photos, it was much less easy to do manipulation 20 years ago.
Motivation of scientists to engage in misconduct
Watkins: Probably the most famous contemporary case of scientific misconduct is that of Andrew Wakefield, who in 1998 published an article in the renowned British journal The Lancet. In it, he claimed that vaccination was the likely culprit behind the rise in autism and inflammatory bowel disease. Later investigative work by the London Sunday Times led to his being found guilty of dishonesty in his research, and he was eventually barred from practicing medicine by the UK General Medical Council. Though debunked and retracted, Wakefield’s article is at the heart of the anti-vax movement, which is responsible for the measles epidemic currently gripping the United States. So we asked Elise what it is that she thinks motivates researchers to engage in such dishonesty.
Bik: Behind the manipulated images that I write to journals about, or that I post on Twitter, there’s a sad story in each case. There’s, you know, a PI who might want results, or who’s looking for, like, their tenure, so they have to publish a paper, and if they publish this paper they might get tenure, or they might get a bonus, or they might get an award, or they might get a grant; if they don’t have this result, they’re going to lose all of that. Or there’s a graduate student who is looking to finish their PhD in a lab and they need to have this paper out, otherwise their PhD is not going to happen this year, and then they lose everything. So there’s a lot at stake for each of these persons, each of these players, each of these authors. And each of these papers has a sad story where people were so, so desperate that they felt they had to do this. And I realize that I might break careers, and it’s not just the person who did this, but all their other co-authors, who might not have realized that this was going on; they’re also being dragged into it, and I’m very well aware of that. I know that there will be a lot of collateral damage, and there will be a lot of crying and desperation, but like I said before, I cannot look the other way. We have to correct the science, and we have to call out these manipulated images and bring these false data out in the open, even though I know that there are people whose careers might be damaged, who might be fired and have to look for another job because of this, or who might never be able to find a job in science again.
Dangers of reporting scientific misconduct
Leigh: “Doxxing” is when someone’s personal information — often including their address, phone number, and place of work — is leaked onto the internet. On March 30th, 2019, Elise posted to Twitter: “A convicted felon with several papers that raised concerns on PubPeer is now personally attacking me by publishing my home address, discrediting my former place of work, and threatening to do a radio show on me. I will still fight against science misconduct.” We asked if she was comfortable discussing the situation and, if so, to explain the background of her being doxxed.
Bik: There was one particular individual who did not like that, and he published my home address. I imagine that this person is not very happy that I wrote about this paper, and he was so mad that he published my home address and started to attack my previous employer; I don’t even work at that company anymore, but he started to drag them into it. And yes, then it becomes a little bit dangerous for me, because I don’t know what people like that might do. They could take revenge on me, and I obviously need to make sure that I only make objective comments, that I don’t accuse people of anything, that I just call out what I think is wrong. It’s a dangerous line of work, I realize that, but I also cannot look the other way; I feel I have to say something. So now that I’m not employed, at least I cannot drag an employer into these things, because I did this work as a side project; obviously, during my daytime job I was doing microbiology work. This image manipulation and broader science misconduct work is my hobby; it’s, you know, not my employed work, and I don’t want to drag any of the people I work for during the daytime into this line of work. So now that I’m taking a year off, hopefully I can at least focus on this work.
Recommendations to journals and their editors
Watkins: Lastly, we asked Elise what she thinks journals can and should do to enhance the accuracy and integrity of the work they publish.
Bik: I think one of the best actions that a journal can take to prevent these cases is to have clear guidelines on which manipulations or which beautifications are allowed or not allowed for their images. So have strict rules on whether splicing is allowed, or whether original blots should be sent with the submission of the manuscript, and train their staff on the cases that I’m finding and that others are reporting. If you look at PubPeer, you’ll see many examples of these images that are duplicated or manipulated. Being aware of that is already a big thing, but journals should also follow up on them. But I think the main thing is that knowing how a person works in the lab, and working together, will give you an idea of the quality of another person’s work, and every co-author on a paper is responsible for all the components of that paper. That’s something you should be aware of: if you accept an authorship, you’re actually responsible for all the parts of the paper, and you should screen all the parts of the paper and make sure that they’re good. So working together in a group is one of the ways we can hopefully check each other’s work. In our example, we found that 4% of the papers had manipulated images, but on the other hand, you can also flip it around and say that means that 96% of the papers did not have these problems. That’s reassuring. So I still think most scientists are very, very honest, and that most papers are correct. But yeah, it’s just a tiny part of these papers; as in any profession, there’s a tiny fraction of people who are less honest and who produce results that we cannot trust. And it’s really hard sometimes to know which of these papers the problems are in.
Leigh: That was Elise Bik, discussing her article “Analysis and Correction of Inappropriate Image Duplication: the Molecular and Cellular Biology Experience,” which she published with four other co-authors on September 28th, 2018 in the journal Molecular and Cellular Biology. You’ll find a link to their paper at www.parsingscience.org/e51, along with bonus audio and other materials we discussed during the episode.
Watkins: Thanks to our generous patrons, we’ve recently been able to double the speed of our website. But we now face double the monthly fees as well, so if you’re enjoying Parsing Science, consider becoming a patron for as little as $1 a month. As a sign of our thanks, you’ll get access to hours of unreleased audio from all of our episodes so far, as well as the same for all of our upcoming ones. You’ll help us continue to bring you the unpublished stories of researchers from around the globe while supporting what we hope is one of your favorite science shows. If you’re interested in learning more, head over to www.parsingscience.org/support for more information.
Leigh: Next time, in episode 52 of Parsing Science, we’ll be joined by Andreas Schilling from the University of Zurich, who will discuss the development of an amazingly simple device that allows heat to flow temporarily from a cold to a warm object without an external power supply; a process that initially appears to contradict the second law of thermodynamics.
Andreas Schilling: If you claim this to somebody the first time, he will always say — especially if he’s a physicist — he will say this is not possible, this is simply not possible because it seems to contradict physical laws. So heat cannot flow by itself from cold to hot.
Leigh: We hope that you’ll join us again.