My niece, Professor Dahlia Remler of CUNY, has a fascinating and timely blog post, Bridgegate: The Case of the Missing Ethical Research Review, asking this very question, and then suggesting the need for aligning human subject protections better with the risk of harm. She asks:
Why didn’t the Fort Lee mayor turn to that [human subject experimentation protection] body? Don’t research studies that affect humans have to show that they don’t harm those humans? Or at least that benefits exceed harms? Decades ago, after scandals like the Tuskegee study, which kept poor sharecroppers ignorant of their syphilis, and therefore untreated, we created rules and bureaucracies to protect human research subjects. Currently, regulation 45 CFR 46 ensures this, mandating the creation of Institutional Review Boards (IRBs), which are charged with ensuring that research on humans is ethical.
The mayor of Fort Lee couldn’t turn to an IRB because there wasn’t one. IRBs don’t apply to a government agency—or school or business—trying to improve operations. Studies that are for “internal management” purposes don’t count as research, which is defined as producing “generalizable knowledge.” (Generalizable means providing information beyond just the specific setting, place, and time of the study.)
After discussing the lack of logic and risk of harm from this absence, as well as the dangers of over-bureaucratizing the process, she suggests the following:
At this point, I don’t know how to expand the good of ethical review without a lot of bad side effects. Perhaps going for norms and training before regulation would be best. Perhaps we could require anyone doing any investigation to think about and write up potential harms and benefits, and require them to make it available if concerns arise.
Of course, ethical review of Port Authority studies would not have deterred the Bridgegate perpetrators. After all, the Port Authority does have extensive rules and processes for closing lanes for any purpose and the perpetrators ignored all of those rules. At best, ethical review of studies on humans would have forced the perpetrators to find another excuse. But the ease with which they used that excuse highlights the immense gaps and inconsistencies in which studies are regulated. As studies explode in all corners of our lives, let’s work on a better approach.
If you are interested, read the whole blog. Dahlia also discusses the general issue in her textbook, Research Methods in Practice.
I think you are right that putting information out there, so that those concerned (“groups concerned with the common good”) can examine what’s going on, is an important part of the solution. That covers a lot of what is needed.
But I also think it is important to get those who conduct the studies to think in advance. Many problems arise simply from not thinking the issues through. Over a decade ago, a Columbia business school professor did a study in which he sent letters to many top NYC restaurants, falsely complaining of food poisoning, to see how they would respond to quality complaints. (He had never even eaten at the restaurants, much less gotten food poisoning.) It was not that he was unconcerned about the employees who might get blamed; he had simply not thought about it. People in biomedical research have had this drummed into them, and the stakes are certainly highest in biomedical research, but for many in social and policy research, just pausing to think can be valuable.
Fascinating dilemma posed by your professor-niece. But perhaps the answer lies not so much in concern for harm done to individual humans as in diligence by groups concerned with the common good. Surely the individuals who gave a stupid order (apart from the well-known meanness and lack of concern for the common good shown by Christie and his minions) deserve some sort of punishment. Perhaps publicity for their behavior is the best punishment for office-seeking politicians.