DEB Numbers: Revisiting performance of PI demographic groups, Part 1

Just before the end of December 2012, the Division of Environmental Biology sent out an email message to a list of all people listed as PIs and Co-PIs on DEB proposals since the start of fiscal year 2008. (Aside: if you did not get the email and think you should have, make sure your FastLane profile information is up to date.) This message included a notice of our plans to start blogging, among other efforts to enhance interactions between DEB and the research communities. About one-third of the message consisted of several snippets of Division-wide data from the two-stage proposal process, with specific tables focused on two groups: Early Career Investigators and Primarily Undergraduate Institutions (PUIs). We received helpful feedback from several readers of the original email pointing out ways in which the presentation could have been clearer. We thank you for that; it is already helping to make this blog better.

Since many out there may not have seen the original message, and others may have been intrigued to learn more, this post revisits and expands on those numbers.

The data snippets in the email were meant to begin a discussion; they were not intended to be comprehensive or the final word. There are other ways to look at the numbers, and significant context and nuance simply could not be crammed into a reasonable email. Actual performance numbers from the two-stage review process are just starting to come in, and even those will change somewhat as Program Officers pursue every opportunity to secure funding through the fiscal year’s end.

In the spirit of the prior Numbers post, before we come back to the data, we need a brief detour to explain some terminology (in bold):

Our numbers are reported by fiscal year (which runs Oct. 1 to Sept. 30). The first round of full proposals under the new solicitation, those reviewed in October of calendar year 2012, counts as FY2013 proposals according to the fiscal calendar.

The term proposal can be used in several ways, and it is not always clear what is meant when it is used as a stand-alone data label (e.g., “the program reviewed 1200 proposals”). This perennial confusion stems from a technical distinction: the individual document packages we review and process are properly called “proposal jackets”. A single submitting organization prepares and submits an individual proposal jacket. While many proposal jackets involve only a single organization, sub-awards enable two or more organizations to be included in a single proposal jacket. Alternatively, two or more organizations can submit separate proposal jackets that are bundled together as a single package through the review process, with one organization identified as the lead organization. The NSF PAPPG defines both single proposal jackets with sub-awards and bundled proposal jackets as “collaborative proposals”[1]. Thus, when numbers are presented for “proposals”, it is not clear whether bundled proposal jackets are being counted once per package or each proposal jacket is being counted separately[2].

Our convention in DEB is to avoid using “proposal” alone as a formal data label. We use projects to refer to whole packages that go through review (whether single institution or collaborative bundle) and count each bundle of proposal jackets once.  In cases where numbers are being reported by individual proposal jacket, we will identify the data as based on proposal jackets.
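To make the counting convention concrete, here is a minimal sketch using hypothetical records; the field names ("project_id", "organization") are illustrative placeholders, not FastLane fields. It simply shows how jacket counts and project counts diverge when a collaborative bundle is involved.

```python
# Hypothetical sketch of the DEB counting convention described above: each record
# is a proposal jacket, and jackets in a collaborative bundle share a project id.
# Field names and values are made up for illustration only.

jackets = [
    {"project_id": "P-001", "organization": "University A"},  # single-institution project
    {"project_id": "P-002", "organization": "University B"},  # lead jacket of a collaborative bundle
    {"project_id": "P-002", "organization": "College C"},     # non-lead jacket, same bundle
    {"project_id": "P-003", "organization": "University D"},  # single jacket (could include sub-awards)
]

jacket_count = len(jackets)                                # counts every jacket separately
project_count = len({j["project_id"] for j in jackets})    # counts each bundle once

print(f"{jacket_count} proposal jackets, {project_count} projects")  # 4 jackets, 3 projects
```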

Finally, and hopefully enough to start looking at the numbers from reasonably common ground, we need to address what is meant by the percentage calculations variously called the success rate, invite rate, or funding rate in DEB.

For a group of projects under review, we can categorize them into one or more submission sub-groups. Each project is ultimately assigned an outcome of acceptance (invited or awarded) or rejection (not invited or declined). So, we can look at the review outcomes like this:

                         # Accepted   # Rejected   Total Reviewed
# Projects Sub-Group A   Xa           Ya           (Xa + Ya)
# Projects Sub-Group B   Xb           Yb           (Xb + Yb)
# Projects Sub-Group C   Xc           Yc           (Xc + Yc)
# Projects Total         X            Y            (X + Y)

Note: Sub-Groups may not be mutually exclusive, so (Xa + Xb + Xc) can be > X.

When DEB presents success rates, those are the percent calculations across rows, such as:

Proportion of Sub-Group A that was accepted = Xa/(Xa+Ya)*100%  (e.g., 17% of PUI full proposal projects were funded, 32% of all preliminary proposals were invited [both fictional examples])
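For readers who want to see the arithmetic spelled out, here is a minimal sketch of that row-wise calculation using made-up counts; none of these figures are DEB data.

```python
# Minimal sketch of the row-wise success rate calculation described above:
# accepted / (accepted + rejected) for each sub-group. Counts are fictional.

reviewed = {
    "Sub-Group A": {"accepted": 17, "rejected": 83},
    "Sub-Group B": {"accepted": 40, "rejected": 160},
    "Sub-Group C": {"accepted": 30, "rejected": 170},
}

for name, counts in reviewed.items():
    total = counts["accepted"] + counts["rejected"]
    rate = 100.0 * counts["accepted"] / total
    print(f"{name}: {counts['accepted']} of {total} accepted ({rate:.1f}% success rate)")
```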

Success rates formulated this way have historically been presented as one of the main metrics, if not the sole metric, of NSF program health. But this number says more about the need for resources and program management on the NSF side than it does about the performance of submitters. We can easily see from tracking success rates over time that demand for funding is growing and outstripping budget growth widely across NSF programs and categories of submitters. However, success rate is a blunt and superficial measure when it comes to comparing performance between sub-groups.

If you want to compare performance of sub-groups over time, between programs, etc., it is much more revealing to look at the contribution of those sub-groups to the portfolio of projects for a given review stage or outcome. In the table above, we can calculate Portfolio Representation using the numbers down the columns, such as:

Proportion of Total Reviewed projects from Sub-Group C = (Xc+Yc)/(X+Y)*100%; similarly, the proportion of Accepted projects from Sub-Group C = Xc/X*100% (e.g., 15% of funded full proposal projects were from PUIs, 21% of all reviewed preliminary proposals were from early career investigators [both fictional examples])

With Portfolio Representation numbers, one can quickly and easily see the extent to which the awards coming out of a program reflect the diversity of submissions coming in, and how that has changed over time, without the numbers being drowned out by overall growth in demand.
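Here is a companion sketch of the column-wise Portfolio Representation calculation, using the same made-up counts as above. For simplicity it treats the sub-groups as mutually exclusive; as noted earlier, real sub-groups may overlap, in which case the division-wide totals would serve as the denominators.

```python
# Minimal sketch of Portfolio Representation: each sub-group's share of all
# reviewed projects and of all accepted projects. Counts are fictional, and the
# sub-groups are assumed to be mutually exclusive so their sums equal the totals.

reviewed = {
    "Sub-Group A": {"accepted": 17, "rejected": 83},
    "Sub-Group B": {"accepted": 40, "rejected": 160},
    "Sub-Group C": {"accepted": 30, "rejected": 170},
}

total_accepted = sum(c["accepted"] for c in reviewed.values())
total_reviewed = sum(c["accepted"] + c["rejected"] for c in reviewed.values())

for name, counts in reviewed.items():
    share_reviewed = 100.0 * (counts["accepted"] + counts["rejected"]) / total_reviewed
    share_accepted = 100.0 * counts["accepted"] / total_accepted
    print(f"{name}: {share_reviewed:.1f}% of reviewed projects, "
          f"{share_accepted:.1f}% of accepted projects")
```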

On to the data…

These three tables display the Success Rates for projects in DEB’s core program panels since FY2007 and include tentative data for the FY2013 full proposal stage: the October 2012 review panels and subsequent award making. They present the same information that was included in the original email message; the formatting has been updated for clarity based on feedback, and the reported values have been updated to ensure consistent back-casting of historical numbers and to present more recent tentative numbers and estimates.

Project Submissions and Success Rate for DEB Core Program Panels FY2007-FY2013

[Table image: Numbers2.1]

Project Submissions and Success Rate of Projects Identifying the Lead PI as a Beginning Investigator^ for DEB Core Program Panels FY2007-FY2013

[Table image: Numbers2.2]

Project Submissions and Success Rate of Projects Identifying the Lead Institution as a Primarily Undergraduate Institution for DEB Core Funding Programs FY2007-FY2013

[Table image: Numbers2.3]

*Note: Tentative numbers for the current fiscal year, which is operating under the Continuing Resolution budget plan at 80% of the FY2012 budget. The FY2013 full proposal counts include full proposal projects received as CAREER, OPUS, RCN, and co-reviews.

^Note: As indicated on the BIO-specific Proposal Classification Form completed in FastLane and associated with the lead proposal jacket of each project.

What these numbers show is pretty much the same point we illustrated in our August 2011 webinar introducing the two-stage proposal review mechanism and in subsequent talks at professional meetings.

Success Rates of Projects for DEB Core Programs as shown at ESA 2012.

[Figure image: Numbers2.4]

Success Rates have been declining overall, and the trend extends back to at least the early 2000s, when success rates were above 20%. Decreasing success rates are due largely to increases in submissions. The dynamics behind increasing submissions are complex and reach beyond the scope of this particular post. However, the end product of those dynamics was a pattern of unsustainable growth: ever more submissions driving an ever-falling success rate.

The collective effort required to create, submit, and review projects gradually grew ever more out of whack relative to the payback seen in support for research and feedback to PIs. We heard this from the community over many years in panels, at professional meetings, and in phone and email exchanges with the vast majority of PIs whose proposals were declined. This state of affairs was not unique to DEB or even to BIO; in 2007, NSF published the results of a survey of PIs and reviewers which reported this same basic message across the entire agency. As in the graphic above, the trends continued to worsen up to the present.

The decision to develop and launch a two-stage review cycle was made in this context, following models employed elsewhere in the agency.

In implementing a preliminary proposal stage, DEB recognized that success rates would likely not improve over the entire process. Especially in the first year, we expected a sizable increase in submissions, because news of a change would inspire new or previously dissuaded PIs to enter the process. Additionally, preliminary proposals, perceived as a less burdensome entry to the system, should incentivize participants to submit even more projects. In practice, we did see a significant increase in projects submitted:

  1,626 preliminary proposal projects
+   163 full proposal projects via CAREER, OPUS, RCN, and co-review
= 1,789 unique projects reviewed in the core programs during the first full review cycle.

This total project load is roughly 35% larger than during the last full fiscal year of the old system (1,329 projects in FY2011) and 25% larger than the previous high-water mark in 2010. Given flat program budgets, and assuming the same average award size, we would therefore expect the overall funding rate to be between 8% and 9% over the entire two-stage cycle. However, we are currently operating under a continuing resolution that gives us 80% of our FY2012 budget to both start new awards and cover continuing grant increments on awards made in prior years. The tentative numbers at this point show a success rate of about 6% over the two stages.
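To make that arithmetic concrete, here is a small sketch of the overall two-stage calculation. The project counts are the FY2013 figures quoted above; the award count is only a placeholder chosen to illustrate a rate near the tentative 6% figure and will not match the final numbers.

```python
# Sketch of the overall two-stage funding-rate arithmetic. Project counts are the
# FY2013 figures given above; the award count is an illustrative placeholder,
# since final FY2013 award numbers depend on the budget and remain tentative.

preliminary_projects = 1626    # preliminary proposal projects
direct_full_projects = 163     # CAREER, OPUS, RCN, and co-reviewed projects
total_projects = preliminary_projects + direct_full_projects   # 1,789 unique projects

illustrative_awards = 100      # placeholder award count, not a final figure
overall_rate = 100.0 * illustrative_awards / total_projects
print(f"Overall two-stage success rate: {overall_rate:.1f}%")  # about 5.6% with this placeholder
```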

The major difference under the two-stage review system is that PIs and reviewers are not required to do all of the work up-front for this huge number of projects just to see 10% of them funded. Instead, during the preliminary proposal stage, the success rate can be higher than under the old system, while the cost to the PI of producing a preliminary proposal (especially the administrative portions of a submission) and the cost to the community of reviewing it are reduced relative to a full proposal[3]. Then, in the full proposal stage, a significantly smaller number of projects are submitted, allowing the work of preparing and reviewing the full-length submissions to be balanced against a better success rate.

The tables above bear that out, within the limitations of only a single full cycle of the new system. Even when looking at Beginning Investigators or PUIs, the success rates at either stage of the review process were generally higher than the success rates seen in prior years. Additionally, the differences between the success rates for these groups and the success rate for all projects do not appear to have been exacerbated under the new system. We can see the relative performance of these groups better by looking at the contributions of Beginning Investigators and PUIs to the portfolios of submissions, invitations, and awards, which we explore in Part 2.


[1] Further exploration of “collaborative proposals”, differences between the PAPPG definition and colloquial use, and how we are counting them will be the subject of a future post.

[2] Another potential post to look forward to: how differences in counting choices provide different pictures of DEB growth.

[3] Note: This refers to effort per proposal. Individual panelists likely experience little change in total workload, since in most cases they will see a greater number of shorter proposals. While this is not a quantitative improvement for preliminary proposal panelists, we received many comments indicating that the quality of their experience improved.

14 thoughts on “DEB Numbers: Revisiting performance of PI demographic groups, Part 1”

  1. I don’t think I’m following your numbers right. In the first table, for example, in 2007 there was a 17.2% success rate for the 1264 projects submitted. Therefore, 217 or 218 projects actually got funded. Yes?

    In the same table in 2012 there were 1626 pre-proposals with a 23.6% invite rate. I understand that to mean 383 or 384 pre-proposals got invited for full proposals. Then 547 full proposals were submitted. How were there more full proposals than pre-proposals that were invited?

    I imagine there must be a different process for the 163 or 164 extra proposals that didn’t go through the pre-proposal process. If someone is *really* interested in the raw funding rate (that is, how many folks *got* money out of how many folks *wanted* money), then these tables aren’t very helpful. What we really want to do is say that there were 1626 + 163 (or 164) = 1789 (or 1790) projects that requested money. And 100 projects (that is, 18.3% of 547) that got money. So the raw funding rate for 2013 is 100 / 1789 (or 1790) = 5.6%. Except this number is so much lower than previous years, I suspect something is wrong.

    So what IS the proper percentage to compare against previous years? It’s not 23.6% or 18.3% — it’s some combination of those two. But I doubt it’s as low as 5.6%. Can you help explain?

    • Thank you for the comments and checking back. On the project numbers, you’re right about the total unique count for FY2013 full proposals being higher than FY2012 invited pre-proposals. See the * note below those tables. There are several routes by which projects can come directly to the full proposal panel, such as CAREER submissions. These are included in all the numbers for the years prior to the launch of the new system. We emphasize again that until Congress approves a FY13 budget, the numbers for funding are tentative; expect them to change.

      We will be discussing the long-term decline in the success rate in more detail in future posts, but an increased proposal load driven largely by an increased number of PIs (rather than an increase in the number of proposals per PI) and increasing research costs per proposal are the major factors. However, success rate alone is neither sufficient to understand the local and national research enterprise dynamics manifest in the DEB community nor refined enough to identify where and how specific challenges and opportunities may be addressed; those conversations require more information viewed in different ways.

      • Thanks for your explanations. So I take it that 5.6% is a reasonable 2013 number for raw funding rate, given the increase in (pre-)proposals this year over previous years. And I understand that success rate is not the only metric that you as a funding agency need to look at to understand dynamics, challenges, and opportunities. But it’s also important for early career researchers like me to understand that funding rates aren’t really as high as 18% — they’re more like 6%.

        On a separate note, in some future post, can you elaborate on the different proposal types? I’ve heard of CAREER grants and know that they’re targeted at early-career researchers. But I have not heard of “OPUS, RCN and co-reviews.” What are they? (This might be old hat for your established PI’s, but useful education for the rest of us.)

      • Margaret,

        You raise a couple of issues here. Funding rates vary depending on which programs you are including or excluding. The 18% figure shown on the BIO directorate web page is for all research proposals handled by DEB in Fiscal Year 2012 and counts each proposal jacket within multi-institutional projects separately. Also, FY2012 saw only a single full proposal deadline in DEB before the pre-proposal system launch, hence significantly fewer projects submitted than in prior years. As this discussion demonstrates, there are lots of different kinds of proposals, each with its own success rate. Pulling out specific numbers is tricky and, importantly, depends on what you want to know.

        Many of the topics you bring up are in the works for future posts but you can always contact a program officer if you want more information. We will have a brief guide to funding opportunities coming soon. You can always get a list of funding opportunities relevant to DEB PIs from our web page at http://www.nsf.gov/div/index.jsp?div=DEB. Following those links takes you to each program page, where you can look at the solicitation, find abstracts of funded projects, and see who at NSF you can contact for more information about that program.

        Finally, co-reviews are reviews involving more than one NSF program; we use them to cover projects with questions that span two or more programs. That’s another good suggestion for a future blog post!

  2. Pingback: What should the PUI share of NSF awards be? | Prof-Like Substance

  3. Pingback: DEB Numbers: Preproposals and Collaboration, Part 1 | DEBrief

  4. Pingback: DEB Numbers: An Introduction | DEBrief

  5. Pingback: Discussion: DEB Review Calendar (Part 1 of 2) | DEBrief

  6. Pingback: Discussion: DEB Review Calendar (Part 2 of 2) | DEBrief

  7. Thank you for these analyses. This post and others on the blog are a huge benefit to our community. I know October is a busy time for NSF; it is also a busy time for academics, as ’tis the season of promotion and tenure reviews. As a letter writer for tenure candidates, I plan to use the sort of information you present here to help university administrators better understand the challenges that our junior colleagues face in grantsmanship. Funding is so different than it was even 5 years ago. I have heard more than one program officer say “NSF is not in the tenure business,” nor should you be. I’m not sure university administrators buy that, though. Do you have other suggested language that should be used to support a candidate who hasn’t cracked the multi-year NSF grant despite an otherwise stellar record? What other NSF programs in the BIO directorate have moved to the pre-proposal/full proposal submission schedule? Are their numbers similar to DEB’s? What is the DEB numbers outlook for 2013/14 (or is that too depressing even to ask)? Thanks!

    • It’s great to hear you have found the information helpful. In BIO, only the core programs in DEB and IOS currently use the pre-proposal/full proposal submission cycle. The numbers are generally similar between the two. We don’t have an outlook for this new fiscal year yet (we’re operating under a temporary budget called a continuing resolution) but be sure to check back for updates.

  8. Pingback: DEB Numbers: FY2013 Wrap-Up | DEBrief

  9. Pingback: Are US researchers slowly boiled frogs? – or thinking out of the box about the future of NSF | Dynamic Ecology
