Background Papers

The Initiative has produced two background papers for the user community that guide the design of infrastructure for evaluating how research projects are selected for funding.

Impact Evaluation for Science Funding

Best Practices for Funding Early Careers of Scientists: Evidence and Unanswered Questions

Research Papers on the Science of Science Funding

This is a short bibliography of research papers that may be helpful to anyone learning about the science of science funding:

•  Arora, A., & Gambardella, A. (2005). The impact of NSF support for basic research in economics. Annales d’Economie et de Statistique, Contributions in memory of Zvi Griliches, 91–117. (Summary)
This study tests the impact of NSF Economics funding on research output by comparing the subsequent publication success of successful and unsuccessful applicants.

•  Azoulay, P., Graff Zivin, J., & Manso, G. (2011). Incentives and Creativity: Evidence from the Academic Life Sciences. RAND Journal of Economics, 42(3), 527–554. (Summary)
This paper examines the pace, impact, and direction of research produced under NIH R01 grants and HHMI Investigator grants.

•  Azoulay, P., Graff Zivin, J. S., Li, D., & Sampat, B. N. (2015). Public R&D investments and private-sector patenting: evidence from NIH funding rules (Working Paper No. 20889). National Bureau of Economic Research. Retrieved from http://www.nber.org/papers/w20889. (Summary)
This paper measures the effect of NIH research funding on patenting by private-sector firms.

•  Boudreau, K. J., Guinan, E. C., Lakhani, K. R., & Riedl, C. (2016). Looking across and looking beyond the knowledge frontier: Intellectual distance, novelty, and resource allocation in science. Management Science, 62(10), 2765–2783. (Summary)
This study examines how the closeness of reviewers' own research to the topic of a research proposal affects their evaluation of the proposal.

•  Bush, V. (1945). Science: The Endless Frontier. Washington, DC: US Government Printing Office.

•  Dasgupta, P., & David, P. A. (1994). Toward a new economics of science. Research Policy, 23(5), 487–521.

•  Ganguli, I. (2017). Saving Soviet Science: The Impact of Grants When Government R&D Funding Disappears. American Economic Journal: Applied Economics, 9(2), 165–201. (Summary)

•  Gush, J., Jaffe, A. B., Larsen, V., & Laws, A. (2015). The Effect of Public Funding on Research Output: The New Zealand Marsden Fund (Working Paper No. 21652). National Bureau of Economic Research. Retrieved from http://www.nber.org/papers/w21652

•  Jaffe, A. B. (2002). Building programme evaluation into the design of public research-support programmes. Oxford Review of Economic Policy, 18(1), 22–34.

•  Myers, K. (2018). The Elasticity of Science. https://ssrn.com/abstract=3176991. (Summary)
This study uses NIH Requests for Applications (RFAs) in particular research areas to measure how much scientists change the direction of their research in response to directed funding opportunities.

•  Nelson, R. R. (1959). The simple economics of basic scientific research. Journal of Political Economy, 67(3), 297–306. (Summary)
This paper provides a classic analysis of the economics of basic research.

•  OECD Global Science Forum (2018). Effective operation of competitive research funding systems. https://doi.org/10.1787/2ae8c0dc-en
This report provides results of a survey of OECD members on how they run research grant processes, and whether they have undertaken any evaluation of their approaches.

•  Stephan, P. E. (1996). The economics of science. Journal of Economic Literature, 34(3), 1199–1235.

•  Stephan, P. E. (2012). How Economics Shapes Science. Cambridge, MA: Harvard University Press.

•  Wang, J., Veugelers, R., & Stephan, P. (2017). Bias against novelty in science: A cautionary tale for users of bibliometric indicators. Research Policy, 46, 1416–1436. Also published as NBER Working Paper No. 22180; see also http://www.nber.org/digest/jun16/w22180.html. (Summary)
This paper proposes a new metric for novelty of scientific research, based on a measure of diversity of the references cited by papers, and examines how papers that score high on this metric compare in terms of other impact measures.