A topic we have been interested in since long before the launch of the two-stage review process is how collaboration plays into the review process in DEB. In this post, we explore the various definitions of collaborative proposals and look at measures of the extent of collaboration in DEB project submissions and awards. If this is your first DEB Numbers post, we suggest you read our introductory message to familiarize yourself with some of the conventions we use here to describe what can be confusing information.
What does collaboration mean in DEB?
We mentioned in a previous post that the NSF PAPPG recognizes two different arrangements as “collaborative proposals”: 1) single proposal jackets with additional institutions as subawards, and 2) linked proposal jackets submitted by multiple institutions. Only the second arrangement is explicitly labeled as “Collaborative Research:…” in the project title. The common feature of these two arrangements is that the full proposal projects contain budgetary information submitted by two or more organizations.
In conversation, DEBers often use the shorthand term “collabs.” This use usually refers to only those projects consisting of multiple linked proposal jackets with “Collaborative Research:…” titles. We are particularly interested in this subset of collaborative proposals internally because they pass through review as a single unit but become individual grants if awarded and that has effects on the processing workflows.
DEB Numbers posts report collaborative proposal information incorporating both arrangements described in the PAPPG.
We also recognize that there is more to collaboration than organizations coordinating project budgets. Counts of PAPPG-defined “collaborative proposals” do not account for the vast majority of cooperative arrangements that fall under a reasonable general-public understanding of collaboration. For instance, Co-PIs from the same university in the same or complementary fields are clearly collaborating, but only a single institution receives an award. A foreign counterpart providing access to a collection or dataset may be thought of as a collaborator (and may even have provided what solicitations call a “letter of collaboration” confirming their willingness to support a project) but is not a PI or Co-PI, and their organization is not an awardee. Neither of those constitutes a “collaborative proposal,” but both are aspects of collaboration that interest us. However, data on such collaborations are not always systematically captured during the submission process or award life-cycle.
Our ability to scrape meaningful data on these deeper facets of collaboration from NSF records varies, ranging from difficult to currently impossible. But the future holds a lot of promise. The development of altmetrics in the wider community, StarMetrics within federal agencies, and the continued upgrading and replacement of old systems with new ones designed with data needs in mind suggest that in a couple of years we will have operational tools to better explore project collaboration.
One immediate development in this direction is the migration of project reporting from FastLane to Research.gov. Current grantees should have heard about this by now. The switch will make the information in awardee reports easier to draw out and analyze. However, it will also be a fresh start. We do not expect to have backward conversion of old records. At the program level, we do not know what all the outputs and data products from the new form will look like (either internal or external to NSF). It will definitely require time and exploration to get enough data into the reporting system to figure out how to recombine it with existing data sources and produce new insights.
With those limitations in mind, there are several pieces we can look at today to give a picture of the landscape of collaboration over the last several years.
For starters, we can look at the numbers of individuals appearing on project cover pages. We can also look at the numbers of institutions represented in project budgets. With these numbers we can look for trends in the representation of various collaborative arrangements in the submission and award portfolios.
Proportions of Three Collaboration Strategies in the Portfolio of Project Submissions to DEB Core Program Panels FY2007-FY2013
^Institutional involvement beyond the lead organization on a proposal jacket is captured in budgetary information. This data is not generated in the submission of preliminary proposals.
*Tentative numbers for FY2013 under the Continuing Resolution budget scenario.
Here we have the data for three types of collaboration strategies: Collaborative Proposals (the PAPPG definition, requiring 2 or more institutions involved in the project budget), Multi-Investigator Projects (a broader concept of collaboration including all projects with Co-PIs, even if from a single institution), and Single Investigator Projects (no named collaborations with other institutions or Co-PIs). There is not much to interpret: the relative contributions of each collaboration strategy to the submitted project portfolio have been amazingly constant, even through the FY2010 post-stimulus submission spike and the preliminary proposal debut. The only notable feature is the apparent relative increase in formal collaborative proposals at the FY2013 full proposal stage. That change would appear to run counter to some of the concerns voiced at the launch of the two-stage review process. However, that number is also a decent, if imperfect, proxy for the proportion of collaborative proposals among invited preliminary proposals. When viewed in that context, it is less exciting.
Proportions of Three Collaboration Strategies in the Portfolio of Awards or Invitations from DEB Core Program Panels FY2007-FY2013
*Tentative numbers for FY2013 under the Continuing Resolution budget scenario.
Here we see a bit more variation over the reviewed period, but we are also considering a much smaller population. Again, however, the numbers coming out of FY2013 are not out of place when compared against the range of values encountered since FY2007. If the X in this table were replaced with the 43% collaborative proposals from the FY2013 full proposal submission portfolio, it too would fit right in. Regarding the preliminary proposals in FY2012, Multi-Investigator Projects appear to have fared a little better than normal at the point of invitation, but we have only a single data point, and that difference must be interpreted in light of several factors: single investigator CAREER projects were not part of that review stage, and both panelists and Program Officers were aware of publicized concerns about negative impacts on collaboration and could have responded with behavioral changes.
These data initially suggest that the two-stage review process is not having an effect on collaborative proposal submissions or outcomes. “But,” you may say, “these data only reflect the presence or absence of a technical marker of collaboration. What about measures of ‘how much’ collaboration is happening?” Currently such information pushes the limits of our ability to glean data from the records, but we will take a look at what we can in part two.