I’ve outlined the way things generally work for obtaining federal research grant funding before, but the pressures of lower funding success rates have other consequences as well. The proposal review process is designed to reduce the risk associated with ‘real science’ so that money is not wasted on unrealistic or unattainable projects. But are these safeguards against risk really guiding us to the best investments among submitted proposals? This question is harder to answer than you might think, as the pool of worthy applicants grows and the available funds do not.
Fierce competition for funds means highly polished proposals full of preliminary supporting data, with meticulous experimental plans to pursue those findings. All of this is neatly wrapped in a well-written document highlighting the critical nature of the research topic to human existence, with bonus features like thorough budgets along with descriptions of science education and outreach activities and mentoring plans. Sounds pretty good; what’s wrong with that? The simple answer is: nothing. Except that many impressive proposals do not receive funding. When proposals are reviewed during panel sessions, there are many high-quality candidates, but they must be ranked according to recommendation for funding (a decision ultimately made by the program officer within the agency). When it comes to selecting the top 20%, the margin between the proposals that get funded and those that wait to be resubmitted is slim. It is difficult to predict which small imperfections will separate these two groups.
These pressures have caused a trend in the way that proposals are assembled and how science is done. In order for a proposal to appear like a safe investment, increasing amounts of preliminary data have become the norm. This trend has been the subject of a PhD comic. While I don’t think things have gotten as extreme as that example, there is a fine line between showing enough data to demonstrate the likelihood of future success and showing so much that the scope of the proposal ought to be significantly extended. When we’ve reached that point, we are undercutting ourselves, and I think the current system of peer review does a good job of distinguishing this margin in favor of good science.
There is also a narrow region between safe, incremental science and the outrageously unfeasible. In a perfect world, all science would advance logically, going deeper by defined steps. In real life, this may seem boring and inconsequential, both to perform and to read about in a grant proposal. Some sound experimental plans may get lost in the unfunded pile after panel review if the investigator hasn’t dressed them up a little better in the proposal. Even so, some proposals may seem more or less fashionable (read: fundable) if they aren’t using the catchy technique of the month (anything with lasers that requires sequencing an entire genome) or working in an emerging interdisciplinary field (computational paleontology or aerospace biochemistry). At the same time, the reviewers and program officers must discern revolutionary proposals from those that are all flash with no chance in hell of actually producing results.
In science, we are always skeptical, and second-guessing ourselves is part of the job description. It should not be surprising that we feel that way about our funding system as well. Is it really the best way of supporting research? I have no doubt that the projects receiving funding are of the highest quality, in areas of importance, led by investigators who value mentoring and outreach. Some draft legislation earlier this year called the review process into question regarding its ability to achieve that goal*. My larger concern is this: what are we missing out on? How many great research projects have we left on the bench for lack of funding? Are labs really functioning efficiently with the trend toward escalating preliminary data? What fundamental research is being overlooked because it is less trendy? Finally, are we not being bold enough with our research projects? Have we become too skeptical and jaded to see visionary innovation when it lands in the review pile? I wish I could answer these questions favorably.
How do we resolve these systemic issues? There’s no one clear answer. I’ve mentioned that additional funding would always be welcome, but it doesn’t address all of these issues. Other suggestions have been made to change the system**, but they all have their own merits and flaws. Scientists, along with our investors, the public, must honestly evaluate our funding system to make sure it’s working to give us all the best returns. No, the irony is not lost on me that I, a scientist, am suggesting that we study and experiment with our scientific infrastructure to make it better. It is the epitome of nerd-dom, but it also encapsulates the seriousness of the issue. All hypotheses are welcome.
*Earlier this spring, Science, Space, and Technology Committee Chairman Lamar Smith (R-Texas) drafted legislation called the High Quality Research Act aimed at improving the NSF peer review process. The spirit of the legislation was to increase accountability with respect to the research funded with your tax dollars. However, the language in the draft bill did little to meaningfully extend the current guidelines for the NSF peer review system, but it did succeed in riling the scientific community into thinking all research would subsequently be subject to what amounts to political review. There was some drama between the committee and administrators at NSF regarding specific projects, but ultimately the HQRA legislation was not formally introduced. For me, as a scientist doing basic research, it was a wake-up call about the importance of science communication and public outreach.
**These include models where agencies would support individual investigators based on past and predicted career success regardless of specific research direction, increasing the duration of funding cycles (2-5 years is the current grant lifespan), dividing funding equally among all submitted proposals, and lottery-based funding among worthy proposals.