2008-02-07

On Peer Review

Recently somebody goofed. Bad. The journal Proteomics (which I had never heard of but I understand was heretofore reputable) published (Epub, ahead of print) an article that, if Myers' excerpts are representative, has conclusions that not only don't follow from any data, but are in fact unfalsifiable. This is bad. It's not only bad for the journal and the authors, it's bad for science. It is the “one peer reviewed article” that we've been smugly demanding from IDiots all this time. Bad.

It is also being held up as a failure of the peer review process, and I can see why. But there's something important that's getting lost in the (justifiably) panicked shuffle, and I want to talk about it. My intention is that this post be a reference that people can point to in the fight I know is coming. So here it is.

There are two things that peer review can mean. I'll call them peer review(S) and peer review(G). Peer review(S) is what people normally think of when we talk about peer review. It's the process whereby a journal sends articles to just a few experts in the field, and those experts read and evaluate the article to recommend whether it should be published and, if so, what changes should be made first. When a reviewer is reviewing an article, she is looking for two things. One, she is looking for good science: Is the methodology sound? Is the measure valid? Do the conclusions actually follow from the data? and so on. And two, she is looking for integrity: Is the methodology well enough described to be reproduced? Are potential sources of error disclosed? Do citations accurately represent their sources?

This is an incredibly important process. Whatever I may say later in this post, I want it understood that one cannot overstate the importance of this process. It serves several functions. Firstly, it keeps out the riffraff. Writings from the likes of John A. Davison don't generally make it as far as a reviewer, but some less gibbering nonscience will, and the reviewers can keep it from getting published. This is good not only because it keeps the rest of the community from having to slog through mountains of nonsense, but also because nonexperts reading the journals (such as science reporters) are less likely to come away from them believing the nonsense, which is good for everyone.

Secondly, it helps the editor to keep her biases from dictating what gets published. As important as it is to keep gibbering idiocy out of a journal, it's equally important to let legitimate disagreement in, and that's a big part of what peer review(S) is for.

But here's the thing. Not everything that has ever passed peer review(S) is still believed today, and the reason for that is what I'm calling peer review(G). The review process doesn't stop once the article is sent to a typesetter.

Once an article gets published, it doesn't just sit there in some sciencey archive, revealed truth to be believed from here on out. It gets tested. This is where the real strength of the scientific institution lies. Many more people will read the article than reviewed it. Most will read it with a critical eye, asking themselves if the conclusions follow from the given data. Some will check the sources to see if they say what they're claimed to say. And a few will check the results. They'll re-run the numbers, redo the experiments, do different experiments to check for convergence.

Peer review(G) is what allows someone (but not everyone) who publishes books rather than articles to still be taken seriously as a scientist. It's what decides how reliable a journal is.

As important as peer review(S) is (and dammit, it is), peer review(G) is more important. Peer review(G) is how hoaxes get identified. It's how knowledge moves forward. The real test of the strength of a conclusion is whether it stands the test of time and rigorous retesting, not whether it can make it past a couple of reviewers.

So while I agree that what happened with Proteomics is an obscene failure of peer review(S), at best a truly embarrassing mistake and at worst an appalling miscarriage of the process, ultimately this is a success story for the peer review process. A process which "Mitochondria, the missing link between body and soul: Proteomic prospective evidence" is failing, by the way.

2008-02-01

I Am So Old

Life is not fair. Wikipedia just told me that the xkcd guy is almost two years younger than me. Why don't I have a talent?