[StATS]: The role of a statistician on an IRB (March 29, 2006)
Someone on the IRBForum wrote in and asked what guidance their IRB could offer to a statistician who was just assigned to an Institutional Review Board (IRB). There were many good responses. I wrote in with a few comments of my own.
A statistician, especially one fresh from graduate school, might feel a bit perplexed at all the issues that arise in an IRB. Certainly they are not qualified to answer questions like whether lumbar puncture qualifies as minimal risk. But they can and should offer unique contributions to the IRB.
There's a quote that I find particularly relevant here.
"The statistician who supposes that his main contribution to the planning of an experiment will involve statistical theory, finds repeatedly that he makes his most valuable contribution simply by persuading the investigator to explain why he wishes to do the experiment, by persuading him to justify the experimental treatments, and to explain why it is that the experiment, when completed, will assist him in his research."<U+FFFD> -- Gertrude M. Cox
My belief is that statisticians, because of their intimate involvement with the scientific method, can ask a lot of important and valuable questions. They are not qualified to answer those questions, of course.
There is a lot of research that just doesn't make any sense because the researchers are collecting the wrong data. Statisticians are especially adept at sniffing this out. They are also good at understanding the limitations of observational data, and the controversies associated with early stopping of clinical trials.
From my experience on various committees where I am the sole statistician amidst a group of content experts, the most common questions you will get will be about whether the data analysis approach suggested in the research is appropriate or not. I get lots of comments like "Aren't you ALWAYS supposed to use a median to summarize your data when it is highly skewed?" The people who ask these types of questions are what I call "half smart" in that they know enough to comment on an important controversy, but not quite smart enough to think things through to their logical conclusion. Beware of people who get excessively dogmatic about data analysis issues ("This is the way I analyzed the data for my dissertation, so any other approach must be wrong.")
Life is too short to quibble about data analysis. There are a few things worth fighting about (such as failure to recognize and take advantage of pairing in a data set), but most of these mean/median style controversies are overblown. There is more than one way to analyze the data, and your goal ought to be to make sure that the analysis is defensible rather than trying to make the analysis methods perfect.
I've had several people argue with me about this, but I believe that an important role of the statistician is to get people to think more about how the data is being collected and less about how it is being analyzed. After all, the fanciest analysis in the world won't save you if you end up collecting the wrong data.
As a statistician, your participation on an IRB will make you a better statistician. You will learn about a lot of important issues, such as confidentiality, and learn them in great detail. Then, when you are helping a colleague in the early planning stages of a research project (all your doctors come to you early in the planning stages, don't they?), you will be able to make them aware of important ethical issues that can and should influence their research design.
By the way, we statisticians have a certain reputation when it comes to ethical issues, and it is best illustrated with the following story. A manager wanted to hire someone to work in her department and she had applications from a lawyer, an archeologist, and a statistician. She invited the lawyer for an interview and asked just a single question, "What is one plus one?" The lawyer explained, "In matters of commercial law, a Supreme Court precedent in 1867 established that one plus one is considered to be prima facie evidence of equaling two." The manager was very impressed. Then she invited the archeologist for an interview and asked the same question. The archeologist replied, "Although many ancient cultures had solid concepts of numbers, the concept of addition was first established in the Egyptian culture, circa 2000 B.C., and it was at the tomb of Amenhotep that the first written evidence was found to establish that one plus one was equal to two." Again the manager was very impressed. She then invited the statistician in and asked, "What is one plus one?" The statistician looked around very carefully, then went up and closed the door, lowered the window shades, and asked very quietly, "What do you want it to be?"
This page was written by Steve Simon while working at Children's Mercy Hospital. Although I do not hold the copyright for this material, I am reproducing it here as a service, as it is no longer available on the Children's Mercy Hospital website. Need more information? I have a page with general help resources. You can also browse for pages similar to this one at Category: Human side of statistics.