What do panelists think of the preliminary proposal process?
Here we present the results of recent surveys of DEB panelists, covering the first round of preliminary proposal panels in March/April 2012 through the fall 2014 full proposal panels.
This survey was meant to gauge panelist reactions to the preliminary proposal process during its first few years, a period for which baseline opinion data were lacking. Since many of the survey items relate to perceived differences between the new system and panelist experiences prior to 2012, the usefulness of these questions heading into the future will be limited as the pool of panelists continues to grow and memories of those prior experiences fade.
Quick Summary
Over the six panel seasons, more than 70% of panelists completed the survey. The respondent counts were higher for preliminary proposal rounds (~170 respondents per round) than for full proposal rounds (~80 respondents per round), consistent with the larger number of panelists required for preliminary proposals.
The majority of respondents reported an improved panel experience in both preliminary and full proposal panels compared to pre-2012 full proposal panels.
Preliminary proposal panelists overwhelmingly agreed that preliminary proposals are conducive to a quality panel review.
Most panelists perceived no change in the intellectual merit of highly rated proposals.
Panelist Experience
A majority of respondents reported their panel experiences under the preliminary proposal system were better than in years prior. This is consistent across all preliminary proposal panels and the first two years of full proposal panels, with slightly more positive responses from preliminary proposal panelists. The latest full proposal panelists were less positive, with a larger proportion reporting “no change” but also a smaller proportion reporting “worse” than in the first two full proposal panel cycles.
Preliminary proposal panels represented a greater departure from prior panels in size, goals, and implementation than full proposal panels, so the potential for changes to panelist experience (better and worse) seemed greater here. This was borne out by the large majority of panelists reporting a directional change and of those, positive experiences greatly outweighed negative experiences. For full proposal panels, the major difference from prior panels was that most reviewed proposals had already been through a preliminary proposal review and subsequent revision. We did not expect much change in overall experience of full proposal panelists, so the extent of panelists reporting positive change is encouraging.
Written comments shed light on what full proposal panelists saw as positive and negative changes to their full proposal panel experience. On the plus side, a greater proportion of time and effort was spent on providing feedback to strong proposals. The flip side of that was that the invited full proposals necessitated difficult choices and highlighted the discrepancy between worthwhile science and available resources.
In addition to their overall experience with the panel, we also asked preliminary proposal panelists about their preparations for the panel. In order to manage the review of the large volume of shorter preliminary proposals, DEB planned for an increased number of assignments per panelist to avoid increasing total panelist burden; this assumed a relationship between proposal length and review work.
The majority of preliminary proposal panelists reported a decreased time burden to prepare for panels, even though the number of proposals per panelist was increased. This indicated to us that we succeeded in balancing the volume/work-per-proposal trade-off. Comments from panelists also indicated a qualitative improvement in their panel preparation and individual review writing experience. This was generally ascribed to a feeling that, with shorter proposals, less time was required to simply cover the entire proposal and they could instead engage more deeply with the project description and literature. A minority of respondents reported that extra preparation time was needed, citing difficulty in adjusting to the new format and changing old review habits.
View of Preliminary Proposal Mechanism
Across the 3 rounds of preliminary proposals, we asked panelists to provide feedback on the preliminary proposal format. Over 90% of respondents agreed that the content of the preliminary proposals provides an adequate basis for evaluating the projects. The 2012 panel highlighted two issues for NSF to consider regarding the preliminary proposal content: reviewer expectations should be better aligned with the preproposal format, and low-cost projects might be disadvantaged relative to higher-cost competitors. The former was resolved, in part, through experience with a new process and by changes to panelist instructions. The latter provided support for the “small grants” track, adopted for the 2013 preliminary proposal submissions. Since then, panelists have been nearly unanimous in finding the content adequate for review.
We separately asked panelists to weigh in on the adequacy of a 4-page project description. Again, these results are overwhelmingly positive. Based on written comments, some reviewers suggested the page limit would be improved either by adding a page or by setting aside specific lengths for various sub-components to ensure PIs sufficiently address them (e.g., 1 page exclusively for broader impacts, 1 page for graphics, limiting the length of introductory material). Others felt that 4 pages was too long and that if preliminary proposals are to stay, DEB should go “all in” with 1- or 2-page descriptions that leave only the main idea and no possibility for reviewers to expect or demand preliminary data and detailed experimental designs. And a few suggested that while the length was adequate for most work, the complexity of their own specialized research interests defied such succinct description and deserved special dispensation. These conflicting opinions, however, do not appear much different from the concerns about proposal structure that were typical for the 15-page full proposals prior to the preliminary proposal system. For the vast majority of reviewers, the 4-page project description works for a preliminary proposal evaluation.
Perceived Changes in Proposal Content
The questions about proposal content were deliberately selective. The wording we used specified a perceived change in the quality of the “highest rated” or “top 20%” of panel ratings. The thought behind this when developing the questions prior to the first panel was that 1) we are primarily concerned with changes that might affect the eventual awards, and 2) the relative ease of preparing preliminary proposal packages might invite more off-the-cuff submissions, which would be screened out at that stage but also depress the perception of average quality without altering any actual award decisions.
The results have been largely consistent with our expectations. The majority of respondents reported no change in intellectual merit of the top proposals for both preliminary and full proposals. During the preliminary proposal panels, respondents reporting a perceived change were roughly evenly split between the proposals being better and worse. Opinions were more positive during the full proposal panels; however, we aren’t putting much weight on that difference since the majority still reported no change. As for the positive response by full proposal panelists regarding improved quality of proposals, there are at least two non-exclusive explanations: 1) panelists didn’t respond to the question as worded and instead reflected their view of the entire body of full proposals, from which most non-competitive applicants had already been removed, and 2) full proposals actually improved in quality after feedback from the preliminary proposal review.
Respondents’ perceptions of changes in broader impacts mirror those for intellectual merit, though they were more polarized. In all 3 preliminary proposal cycles, the majority reported no change to broader impacts in the top proposals. However, greater proportions reported both positive and negative changes. This seems to reflect a still-divided opinion in the community on what ought to be the emphasis for broader impacts. Comments suggested that the broader impacts were both improved and worsened through less detail in the preliminary proposal. Quite unexpectedly, few respondents thought broader impacts declined at the full proposal stage; far more panelists, even a majority in 2013, felt this component improved over prior panels. We’re not sure how to explain this response, although we note this coincides with the release (January 2012) and incorporation into NSF documents (January 2013) of the revised merit review criteria that sought to clarify the context for assessing broader impacts.
Synthesis
Through 3 cycles of two-stage review, the preliminary proposal process in DEB appears to be improving panelist workload and experience. Panelists also report a high degree of satisfaction with the preliminary proposal mechanism and, allowing for individual differences in formatting preferences, generally find that preliminary proposals supply sufficient evidence on which to base their evaluations. Further, few returning panelists perceived any decline in the quality of projects rated highly by preliminary and full proposal panels. This supports a view that the preliminary proposal process is achieving positive changes and not adversely affecting the overall quality of the review process and award portfolio.
Supplemental Survey Background
Traditionally, DEB includes one or two drop-in sessions with DEB and BIO leadership toward the end of the panel for some general Q&A. As an informal discussion, it’s helpful for sharing information with panelists and for us to hear about external concerns. However, it’s not at all standardized: the topics jump around, much of the discussion depends on who is most willing to speak up first, and the take-away points are mainly just a general sense about several disparate issues. With the launch of the preliminary proposal process, we realized it would be helpful to collect better information from panelists. A survey that didn’t link individuals to their responses was thought to be a “safer” venue to encourage all panelists to voice their opinions. This would hopefully avoid the professional and interpersonal dynamics that may bias who is willing to speak up, how forcefully things are said, and what we ultimately interpret as important or common sentiments. The downsides, of course, were that we were asking for subjective (perception and opinion) information and lacked an established baseline against which to measure any change.
Since the first preliminary proposal panels in April of 2012, we have been asking our panelists to answer a few questions about their perceptions of the preliminary proposal process. Within the limitations of the context, we asked respondents questions about 1) their views of the preliminary proposal mechanism, 2) their panel experience relative to prior participation, and 3) their perception of the proposal content relative to prior participation. A similar and shorter set of questions was used for full proposal panels. There have been minor changes to the survey from year to year as we received feedback that helped clarify wording, and some questions were added or removed as our specific needs changed. This post presents the responses to the core questions that continued across the lifetime of the survey in DEB.