This is the first of what will hopefully be somewhat regular posts on Division of Environmental Biology (DEB) data. Our goals for these posts include: stimulating discussions with our PI community, providing deeper context for understanding the grant review process, and sharing insights we gain through ongoing examination of our programs.
This introductory post outlines some of the things we hope to cover in future posts and provides an overview of the challenges that apply to reporting and interpreting DEB Numbers.
Who will be writing these posts?
In DEB we have several analysts whose duties include assisting in the management of the review process, working with the NSF databases to produce reports and analyses, and taking part in communications with our PI community. Expect most of the posts under the DEB Numbers heading to be written by, and based on the work of, the analysts. Program Officers in the Division will also contribute to Numbers posts, but they have other priorities (e.g., managing review of your individual proposals) and will author these posts less frequently.
What types of data will be presented?
We will provide our best efforts to clearly and accurately present the numbers you care about for DEB programs. This includes information on things like submission trends, program demographics, and various portfolio metrics. The scope of the presented data will be limited to the programs housed in the Division of Environmental Biology, though we may make comparisons to public data presented elsewhere. We hope your questions, feedback, and comments will help us advance our data presentations beyond the static glimpses offered in outreach talks.
We will not, as stated in this blog’s policy notes, discuss or provide data about individual proposals, or about groups of proposals at a level of detail that would allow the information to be tied to specific applicants.
Timeliness, completeness, and terminology: challenges to presenting data clearly and definitively
Timeliness: The U.S. federal government operates on a fiscal year calendar that runs from October 1 through September 30. NSF reporting and analyses account for each proposal by the fiscal year in which a decision was recorded. For instance, full proposals received for the August 2012 deadline ultimately become part of the FY2013 dataset because they are reviewed and processed to completion after October 1, 2012. However, because we process reviews and decisions throughout the fiscal year, proposal data come in right up until the end of business on September 30. This creates a tension between our desire to present information quickly and our responsibility to be accurate.
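For readers who think in code, the fiscal-year convention described above can be sketched as a simple date rule. This is an illustrative example only (the function name and the sample decision date are our own), keyed to the date a decision is recorded, as described above:

```python
from datetime import date

def fiscal_year(decision_date: date) -> int:
    """U.S. federal fiscal year: October 1 of year N-1 through September 30 of year N."""
    if decision_date.month >= 10:
        return decision_date.year + 1
    return decision_date.year

# A proposal submitted for an August 2012 deadline is reviewed and decided
# after October 1, 2012, so it counts toward FY2013:
print(fiscal_year(date(2012, 11, 15)))  # -> 2013
```

Note that the rule turns on the decision date, not the submission date, which is why an August submission lands in the following fiscal year's dataset.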
Completeness: We can only show you the data that we have. Some things we recognize as important and know you care about cannot be definitively reported because of gaps in the data. For instance, much of the key demographic information needed for reporting on historically underrepresented groups by gender, ethnicity, or career status can be based only on records that individuals voluntarily self-report through approved information collections (the optional PI Information fields of your FastLane profiles). Other potentially useful data are incomplete because the data fields were recent additions or are specific to certain programs, and so are missing from portions of the record.
Terminology: We recognize that the phrasing and terminology we use may be interpreted to different ends by blog participants. There are many key words and phrases that have a common English use, one or more colloquial uses in the PI community, and a specific technical meaning inside NSF. Just a few that spring to mind are: proposal, project, jacket, collaborator, year, broader impact, postdoc, ethnicity, underrepresented, minority, program, peer review, and merit review.
Our aim is to explain and consistently apply technical wording (e.g., specifying “fiscal year”). However, this is no small feat at NSF, where we have hundreds of solicitations, submission mechanisms, and specialized considerations that vary from office to office and year to year. We will inevitably miss some terms, but a blog gives us the opportunity to quickly see when we have been confusing and to clarify.
Responses to these challenges
1) Values for the current fiscal year are tentative. They will be noted, as applicable, with qualifiers such as “estimated (est.)”, “tentative” or “to date (t.d.)”. Generally, final values will not be available until after the start of the following fiscal year or later.
2) Descriptive language will be attached to values where the data have known gaps or limitations (e.g., the success rate for proposals self-reporting a female PI was XX%, at least YY proposals were received from minority PIs).
3) Technical terms will be explained when presented and the explanation referenced when the term is used in future posts.
4) If you are not seeing what you are looking for or something seems off, speak up in the comments.
Coming up next in DEB Numbers: Revisiting performance of PI demographic groups during the first preliminary proposal cycle.