Preliminary Proposal Evaluation Survey Reminder


Check your inbox.

Check your spam folder.

Complete the survey!

End the reminder messages.


Background (if the above doesn’t make sense to you).

This is about the Preliminary Proposal system in use in both NSF BIO’s Division of Environmental Biology and Division of Integrative Organismal Systems.

We are in the midst of an external evaluation of the effects of this system on the merit review process.

We posted an initial notification letter about the stakeholder surveys, and copies of that letter were sent to everyone in the sample ahead of the formal invitations.

The formal survey invitations, with the active survey links, were sent out by the evaluator, Abt Associates, by mid-September.

Reminder emails are also going out and will continue to do so at regular intervals while the survey remains open and incomplete.

If you have been receiving these messages, please complete the survey. If your colleagues have been receiving these messages and have not completed the survey, encourage them to do so.

If you received an invitation to take the survey,

  • Please take the 10 or so minutes to register your responses via the link in the email.
  • Remember that these are single-use individualized links.
  • Your response matters. This isn’t a census: your invitation is part of a stratified random sample selected for inference to the population.

Thank you for your participation!

MacroSystems Biology and Early NEON Science

A cross-posting from the NSF BIO Division of Biological Infrastructure blog (DBInfo) that we thought would be of interest to our readers too.


What do a fungal disease, lake sediments, and weather radar have in common?

They are all components of research projects funded by the NSF Macrosystems Biology and Early NEON Science Program (MSB). (You can find a list of active awards here.)

Last week, the NSF headquarters served as the gathering place for a meeting of Principal Investigators (PIs) and other researchers working on MSB projects from across the country. We wanted to share with you a little bit more about this unique program in the NSF BIO portfolio and some of the outcomes of the meeting.

The NSF’s Assistant Director for Biological Sciences, Dr. Jim Olds, speaks with MacroSystems Biology researcher Dr. Kristen Waring in front of Dr. Waring’s scientific poster.

About the Program

Originally just called “Macrosystems Biology,” the Macrosystems Biology and Early NEON Science program is an NSF BIO funding competition that made its first round of awards in FY 2011. The next…


Meet DEB: Paula Mabee, Division Director


Basic Profile

Paula Mabee, Division Director, BIO/DEB

Name: Paula Mabee

Education: Ph.D. Duke University, 1987 (Zoology)

Home Institution: The University of South Dakota

NSF Experience/History:

I have been the Division Director of the Division of Environmental Biology in the NSF’s Directorate for Biological Sciences (BIO) since August of 2015, but my history with NSF goes way back. I’ve served on 16 panels and participated in multiple site visits across BIO since 1996, and I have served as an ad hoc reviewer since 1990. I have also been a fortunate NSF awardee. I received my first NSF award, a postdoctoral research fellowship, in 1989, during which I trained in experimental methods in developmental morphology. This led to my first award as a faculty member at San Diego State University in 1994 for a comparative experimental study of cranial development in teleost fishes. Following this, I diverged a bit to pursue the development of mobile platform-based field guides with SBIR and STTR funding, leading to the National Geographic Birds birding guide app. In 2004, NSF supported a collaborative Assembling the Tree of Life (AToL) project for cypriniform fishes (carps, minnows, loaches), for which I was one of the PIs. Support from the NSF-funded National Evolutionary Synthesis Center for a synthesis working group with the Zebrafish Model Organism Database folks (ZFIN) led to an NSF DBI award in 2007 and another in 2011. These awards supported development of bioinformatics methods to compute across the full range of very diverse anatomical traits and link those traits to candidate developmental genes. An RCN award in 2010 broadened this type of community-driven data integration by establishing an international Phenotype Research Coordination Network with over 400 participants.

Research Experience/History:

My Ph.D. work and subsequent research focus on questions at the intersection of evolutionary and developmental biology, such as: What is the relationship between developmental and evolutionary change? What is the genetic basis for anatomical structures that are evolutionarily new? I’ve explored these questions using a variety of experimental and computational approaches, beginning with comparative morphology and phylogenetic systematics, continuing with developmental genetics, and currently bioinformatics. My research contributions include 52 peer-reviewed journal publications, many with student co-authors, spanning the fields of evolution, phylogenetics, ichthyology, anatomy, bioinformatics, developmental biology, and cross-cutting journals.

Since 2006 my research has been highly collaborative and focused on developing a new bioinformatics approach for connecting the diverse phenotypes of species to the genetic and developmental data from model organisms such as zebrafish and other vertebrates.  Our research team has established methods to connect, search, and compare data from the zebrafish community database (ZFIN) and other vertebrate databases (e.g., Xenbase and MGI).  This has been challenging from the biodiversity phenotype perspective, because in contrast to genomics, where resources are well-developed for computation, it is difficult to render diverse morphological and behavioral features computable.  For example, representing ‘segmented fin rays’ of fishes such that a computer can reason that they are part of fins, composed of bone, and develop from mesoderm, requires a basic logical dictionary of terms called an ontology.  Ontologies appropriate to represent multiple species or phenotypic diversity had not previously been built when we began this research, so we developed these methods and, at the same time, built resources to promote discovery of new knowledge.

The outcome is a resource that combines new software, a database infrastructure, and an interface to serve evolutionary biologists and geneticists. The connection of genetic, medical, and evolutionary data in one resource, the Phenoscape Knowledgebase (KB), enables over 500,000 testable hypotheses regarding which specific genes might underlie specific traits in vertebrates and how those traits have changed over time. For example, modern catfishes do not have tongues, and data in the KB implicated a role for the brpf1 gene in evolutionary changes in this trait, a prediction that would have been difficult to formulate without computational methods. Data from our recent wet-lab work validated this prediction [i].


Flathead Catfish; Photo by USFWS, used under Creative Commons License

More recently we have used this machine logic to ‘expand’ the trait data available for phylogenetic and evolutionary research [ii].  This is an application of these data that had not been envisioned when we initially planned this research.

What gets me excited about this research is the prospect, some day, of being able to join different data types across environment, ecology, phylogenetics, traits, and genetics to make discoveries that are very difficult at present.


Why did you want to work for DEB?

I think that DEB broadly encompasses my scientific home. In the past, I pitched some unconventional ideas and always found they were appreciated in DEB panels. From the outside, the Division seemed to be welcoming to creative research, and now, as an NSF insider, I can see that it is.

Of particular interest to me are questions that both enable and require the integration of different kinds of data.  DEB is faced with increasing numbers of projects that require data integration. This is true in both core programs and DEB’s special programs such as Dimensions of Biodiversity and the Long Term Ecological Research program. These projects require that we be able to integrate data across scales. To do this, I think that looking across traditional knowledge domains is critical.

I feel that biologists are at a particularly interesting juncture in comparative work. Although tools are available to aggregate and analyze some data such as genetic sequences, they are only just emerging to integrate other very central pieces such as phenotypes, phylogenies, and environmental variables. An important challenge is to scale up this integrative approach across all extinct and extant biodiversity, enabling visualization of linked data in time and space, and integrating environmental data to produce a fully informed and machine-enabled comparative biology of the future.

Biggest surprise you’ve encountered coming to DEB from the academic world?

Being here has made me realize the extent to which we are living through the Wild West of data. I noticed this first in the Dimensions of Biodiversity program, where PIs have been undertaking a lot of one-off processes without the ease or benefit of standards or best practices. This raises questions about how to use funding to organize the community for greatest efficiency in these early times. DEB-funded research has included incredibly heterogeneous data types, so finding effective strategies for integrating data is very challenging.

This leads me to another surprise, which is the level of introspection among DEB Program Officers and staff.  I had no idea the extent to which Program Officers reflected on whether the existing slate of funding mechanisms sufficed to cover the spectrum of science proposed by the PI community.  No one wants a proposal to fall through a gap.  What was not surprising is that DEB Program Officers have a great deal of respect for the PI community.

What would someone find you doing in your down time?

In short, eating and going to art galleries.  Because I’m living in the DC area now, I’m taking full advantage of all it has to offer. It seems there is no end to the variety and deliciousness of area cuisine. I’ve also been enjoying docent tours at the National Gallery of Art and the show currently at the Renwick Gallery.

Simultaneously, I’m reconnecting with colleagues and many old friends. Because of my interdisciplinary, roaming life, I have been a part of several communities that have had little to no interaction, such as evolutionary biology and genetics. When I got involved with data interoperability for example, I didn’t know anyone in that community. Because there are so many meetings and government agencies in DC, people from all of these disciplines come together here.  It is great fun to have the opportunity to be reacquainted with old friends and meet so many new interesting people.

Personally I love to travel, and I love anything having to do with water, from swimming to scuba diving, sailing, and fly-fishing.  Aquatic pursuits are a big part of my life in South Dakota.  I am also enjoying time with my two college-age sons when they visit D.C.

Where should someone go to eat when they visit NSF?

Wow – this is the hardest question of all because there are so many awesome places. I especially love the Spanish and Middle Eastern restaurants.

[i] Edmunds, R.C., Su, B., Balhoff, J.P., Dahdul, W.M., Lapp, H., Lundberg, J.G., Vision, T.J., Dunham, R.A., Mabee, P.M., Westerfield, M. 2016. Phenoscape: Identifying candidate genes for species-specific phenotypes. Molecular Biology and Evolution 33 (1): 13-24. doi:10.1093/molbev/msv223

[ii] Dececchi, T.A., Mabee, P.M., Blackburn, D. 2016. Data Sources for Trait Databases: Comparing the Phenomic Content of Monographs and Evolutionary Matrices. PLOS One 11(5): e0155680. doi:10.1371/journal.pone.0155680

A dozen things all PIs should know about the U.S. Federal budget as it relates to NSF research grants


Things upstream from a grant decision


1. There is an annual budget cycle (see graphic, below):

a.    Request: The President puts out a plan for a budget in a request to Congress.

b.    Appropriation: Congress decides how much (described in this downloadable PDF) to actually provide to each agency (e.g., NSF). This is signed into law by the President. Annual appropriations start on October 1 each year. Even if Congress is delayed in finalizing the budget for that year, the October 1 “birthday” of the funds applies retroactively.

c.    Allocation and Allotment: The appropriations are passed down from the Treasury through the agency to funding programs (e.g., Population and Community Ecology, Dimensions of Biodiversity).

d.    Commitment and Obligation: Funding is applied to projects (typically as grants) after merit review. Technically, Program Officers “recommend” funding (d-i), Division Directors concur with the decision to “commit” funds (d-ii), Grants Specialists make the “award obligation” (d-iii), and the award is made to the institution (not the PI).

e.    Expenditure & Reimbursement: Over the subsequent months and years, the PIs of funded projects use the funding to make science happen and receive reimbursement from Treasury accounts.

Diagram of the relationship between the annual U.S. Federal budget process and NSF merit review system.



2. At any given time, we are thinking about 3 or 4 different years’ budgets:

a.    Reporting on last year

b.    Managing this year

c.    Planning for next year

d.    Building momentum for the year after next


3. While we often refer to “the budget” in the singular abstract form, there are different pots of money at different levels in the agency.


4. At the highest level, there are 6 different pots (described in this PDF), called accounts[i]. These pots can’t be mixed[ii]. And, only 2 typically matter directly to researchers: Research & Related Activities (R&RA) and Education & Human Resources (EHR).


5. Individual program[iii] budgets, scopes, and lifespans are usually managed by each Division, but specific guidance from the White House Office of Management & Budget (OMB) or Congress can lead to changes and cancellations.


6. Our window to put the funding onto projects through grants (item 1d, above) is the most constrained step. Funds are supposed to arrive by October 1 each year, but it’s not uncommon that delays in the budget cycle mean we don’t see the full (i.e., appropriated and allocated) budget at the program level until the following March, April, or later. And, all funds need to be obligated by the end of each fiscal year (September 30)[iv].


Things downstream from a grant decision


7. Every dollar that supports your NSF research grant has an expiration date. The same is true for much of the Federal budget appropriated by Congress. For NSF research (R&RA) funds, the expiration date is 7 years from the start of the fiscal year (October 1 annually) in which the funds were provided to the agency (i.e., appropriated).


8. Because most DEB awards made in a given fiscal year have start dates well after October 1, the clock started ticking even before you received a grant. For example, if your award start date is July 1, then the funds you received are already 9 months old (a quick sketch of this arithmetic follows this list).


9. Although you can request a delay in the official start date of an award, which affects when you start spending your funds, you can’t delay the aging of your award funds. A delayed start doesn’t provide you any extra time to complete the work. The ultimate limit on how long you can extend a funded project (no-cost extensions) depends on when those dollars expire.


10. Money doesn’t actually go to your institution when you get a grant. It stays in the US Treasury until spent. We refer to your award as a federal obligation because it authorizes your institution to charge for expenditures incurred in the conduct of that award and to be reimbursed from the Treasury. We can see how much of your funds you have spent at any time.


11. There is a whole lot of regulation defining what projects can and can’t spend money on; meeting those regulatory obligations is largely the responsibility of your Sponsored Research Office (SRO)[v]. The ability of your SRO to meet those obligations is one of the things NSF reviews between the time when we (the programs) say “this is a good project” and the formal issuing of the grant. The consequences for failing to follow these rules are serious.


12. When we make a grant, we want you to use your full award. Funds that expire at the end of the 7-year clock don’t support your research or our mission. When we see expiring funds, we realize that we could have funded someone else but now we can’t (and there are lots of others who would have been happy for any funding). It also looks like you inflated your budget and/or can’t manage your projects effectively. And, it sends a message that the community has more money than it can put to good use.
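For those who like to see the arithmetic in items 7, 8, and 9 written down, here is a minimal sketch. It is illustrative only, not NSF’s accounting system: the function names and the example award are hypothetical, and it simply applies the 7-year window counted from October 1 of the appropriation fiscal year described above.

```python
from datetime import date

def fiscal_year_start(fiscal_year: int) -> date:
    """U.S. federal fiscal year N begins on October 1 of calendar year N-1."""
    return date(fiscal_year - 1, 10, 1)

def funds_expiration(appropriation_fy: int, window_years: int = 7) -> date:
    """Expiration counted from the start of the fiscal year in which the
    funds were appropriated (item 7 above)."""
    start = fiscal_year_start(appropriation_fy)
    return date(start.year + window_years, start.month, start.day)

def funds_age_in_months(appropriation_fy: int, award_start: date) -> int:
    """Whole months elapsed between the fiscal-year start and the award
    start date (item 8 above)."""
    start = fiscal_year_start(appropriation_fy)
    return (award_start.year - start.year) * 12 + (award_start.month - start.month)

# Hypothetical example: an award funded from the FY 2016 appropriation
# with a July 1, 2016 start date.
print(funds_expiration(2016))                       # 2022-10-01
print(funds_age_in_months(2016, date(2016, 7, 1)))  # 9
```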

[i] In 2009, ARRA “stimulus” funding was a 7th pot of money.

[ii] Without specific authority granted through legislation.

[iii] E.g., this list

[iv] Technically, NSF has two years in which to obligate our R&RA annual appropriation, but DEB, like most of NSF, does not “carryover” any funds into a second year. We commit and obligate every dollar allocated to us in a fiscal year and typically do so by mid-August. This allows maximum time for the funded projects to put the funds to use and minimizes the complexities of accounting across different appropriations.

[v] Therefore, your questions about use of funds already awarded should be directed at your SRO, not NSF Program Officers!

Preliminary Proposal Evaluation Update for DEB and IOS: Surveys Arriving Soon

Dear Researcher,

We are writing today to alert you that you may soon receive an invitation by e-mail to participate in a short survey conducted by Abt Associates on behalf of the US National Science Foundation (NSF). This is a legitimate request and we invite you to participate.

This survey is part of an independent evaluation of the two-step merit review process (preliminary proposal system) implemented by the Division of Environmental Biology (DEB) and Division of Integrative Organismal Systems (IOS). Some of you may have also received an invitation to complete a broader merit review satisfaction survey within the past year. These are complementary but completely separate activities.

The goal of this survey is to examine the level of satisfaction with the two-step merit review process (preliminary proposal system) pilot in DEB and IOS and to estimate the workload associated with preparing and reviewing proposals. The survey is being sent to a sample of DEB and IOS applicants and reviewers and to a comparison group from similar NSF programs that have not adopted the two-step process. This approach will enable us to understand the relative advantages and limitations of the change as well as to capture everyone’s perspective.

While participation in the survey is voluntary, we hope that you will take a few minutes to share your views. Research community input is vital to developing the best approaches to handling the large number of grant applications without compromising the quality of review, which is why your participation is very important. Thank you in advance for your help.


Paula Mabee, Division Director
Division of Environmental Biology
National Science Foundation

Heinz Gert de Couet, Division Director
Division of Integrative Organismal Systems
National Science Foundation

Note: Pursuant to 5 CFR 1320.5(b), an agency may not conduct or sponsor, and a person is not required to respond to an information collection unless it displays a valid OMB control number. The OMB control number for this collection is 3145-0215. Public reporting burden for this collection of information is estimated to average less than 10 minutes per response, including the time for reviewing instructions. Send comments regarding this burden estimate and any other aspect of this collection of information, including suggestions for reducing this burden, to: Reports Clearance Officer, Office of the General Counsel, National Science Foundation, Suite 1265, 4201 Wilson Boulevard, Arlington, VA 22230.

DEB Numbers: Historical Proposal Loads

Last spring we posted on the per-person success rate and pointed out several interesting findings based on a decade of DEB data. We were seeing a lot of new PIs and, conversely, a lot of PIs who never returned after their first shot. And, the vast majority of PIs who managed to obtain funding are not continuously funded.

This post is a short follow-up to take a bigger picture look at submission rates.

Since preliminary proposals entered the scene, DEB really hasn’t seen much change in the submission pattern: 75% of PIs in any year submit one preliminary proposal and the other 25% submit two (and a small number submit three ideas in a year, if one also counts full proposals to special programs).

Before the preliminary proposals were launched, we ran some numbers on how often people tended to submit. The results were that, in the years immediately prior to preliminary proposals (~2008-2011), around 75% of PIs in a year were on a single proposal submission (25% on two or more). Fewer than 5% of PIs submitted more than two proposals in a year. Further, most PIs didn’t return to submit proposals year after year (either new ideas or re-working of prior submissions); skipping a year or two between submissions was typical. These data conflicted with the perceptions and anecdotes that “everyone” submitted several proposals every year and was increasing their submission intensity. Although recent data don’t support those perceptions, we still wondered if there might be a kernel of truth to be found on a longer time scale. What is the bigger picture of the history of proposal load and submission behavior across BIO?

Well, with some digging we were able to put together a data set that lets us take a look at full proposal research grant submissions across BIO, going all the way back to 1991 when, it seems, the NSF started computerized record-keeping. Looking at this bigger picture of submissions, we can see when changes have occurred and how they fit into the broader narrative of the changing funding environment.

Total BIO full research grant submissions per year (line, right axis) and proportions of individuals submitting 1, 2, 3, 4, 5, or more proposals each calendar year from 1991 to 2014. (Note: 2015 is excluded because proposals submitted in calendar year 2015 are still being processed at the time of writing.)

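For readers who want to see how the proportions in the chart above could be tabulated, here is a minimal sketch in Python. The flat table of proposal records, its file name, and its column names (proposal_id, pi_id, calendar_year) are hypothetical stand-ins for whatever export one has on hand; the sketch mirrors the counting logic only, not NSF’s internal systems.

```python
import pandas as pd

# Hypothetical input: one row per (proposal, PI) pairing for full research
# grant submissions, e.g. loaded from a CSV export.
records = pd.read_csv("bio_full_proposals.csv")  # columns: proposal_id, pi_id, calendar_year

# Number of distinct proposals each PI appears on in each calendar year.
per_pi = (records.groupby(["calendar_year", "pi_id"])["proposal_id"]
                 .nunique()
                 .rename("n_proposals")
                 .reset_index())

# Bin the counts as 1, 2, 3, 4, or 5+ and compute the share of PIs in each bin.
per_pi["bin"] = per_pi["n_proposals"].clip(upper=5)
shares = (per_pi.groupby("calendar_year")["bin"]
                .value_counts(normalize=True)
                .unstack(fill_value=0))

# Total proposal load per year (the line on the right axis of the chart).
totals = records.groupby("calendar_year")["proposal_id"].nunique()

print(shares.round(2))
print(totals)
```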


1990s: Throughout the 1990s BIO received about 4000 proposals per year. This period of relative stability represents the baseline for more than a decade of subsequent discussions of increasing proposal pressure. Interestingly, the proportion of people submitting two or more proposals each year grew over this period, but without seeming to affect total proposal load; this could result from either increasing collaboration (something we’ve seen) or a shrinking PI pool (something we haven’t seen). At this time NSF used a paper-based process, so the cost and effort to prepare a proposal was quite high. Then….

2000s: In 2000, FastLane became fully operational and everyone switched to electronic submission. BIO also saw the launch of special programs in the new Emerging Frontiers division. In a single year, it became easier to submit a proposal and there were more deadlines and target dates to which one could potentially submit. The new electronic submission mechanism and new opportunities likely both contributed to increased submissions in subsequent years.

Following the switch to FastLane, from 2001 to 2005, total annual submissions grew to about 50% above the 1990s average and stayed there for a few years. This period of growth also coincided with an increasing proportion of people submitting 2+ proposals. Increasing numbers of proposals per person had only a limited effect on the total proposal load because of continued growth in collaboration (increasing PIs per proposal). Instead, the major driver of proposal increases was the increasing number of people submitting proposals. This situation was not unique to BIO.

This period of rapid growth from 2001 to 2005 sparked widespread discussion in the scientific community about overburdening of the system and threats to the quality of merit review, as summarized in the 2007 IPAMM report.

Eventually, however, the community experienced a declining success rate because BIO budgets did not grow to match the 50% increase in proposal submissions. From 2005 to 2008, submissions per person seemed to stabilize, and total submissions peaked in 2006. We interpret this as a shift in behavior in response to decreasing returns for proposal effort (a rebalancing of the effort/benefit ratio for submissions). It would have been interesting to see if this held, but….

2009/2010: In 2009 and 2010, BIO was up another ~1000 proposals over 2006, reaching an all-time high of nearly 7000 proposal submissions. These were the years of ARRA, the economic stimulus package. Even though NSF was very clear that almost all stimulus funding would go toward funding proposals that had been already reviewed (from 2008) and that we wouldn’t otherwise be able to afford, there was a clear reaction from the community. It appears that the idea of more money (or less competition) created a perception that the effort/benefit relationship may have changed, leading to more proposals.

2011: We see a drop in 2011. It is plausible that this reflected the realization that the ARRA money really was a one-time deal, that there were still many more good proposals than could be funded, and that obtaining funding hadn’t suddenly become easier. As a result, the effort/benefit dynamic could be shifting back; or, this could’ve been a one-time off year. We can’t know for sure because…

2012: Starting in 2012 IOS and DEB, the two largest Divisions in BIO, switched to a system of preliminary proposals  to provide a first-pass screening of projects (preliminary proposals are not counted in the chart). This effectively restricted the number of full proposals in the two largest competitions in BIO such that in 2012, 2013, and 2014 the full proposal load across BIO dropped below 5000 proposals per year (down 2000 proposals from the 2010 peak). The proportion of individuals submitting 2+ full proposals per year also dropped, consistent with the submission limits imposed in DEB, IOS, and MCB. PIs now submitting multiple full proposals to BIO in a given year are generally submitting to multiple programs (core program and special program) or multiple Divisions (DEB and [IOS or MCB or EF or DBI]) and diversifying their submission portfolios.

In summary, the introduction of online and multi-institutional submissions via FastLane kicked off a decade of change marked by growth in proposal submissions and per-PI submissions to BIO. The response, a switch to preliminary proposals in IOS and DEB, caused a major (~1/3) reduction in full proposals and also a shift in the proportion of individuals submitting multiple proposals each year. In essence, the pattern of proposal submission in BIO has shifted back to what it was like in the early 2000s. However, even with these reductions, it is still a more competitive context than the 1990s baseline, prior to online submissions via FastLane.

DEB Numbers: Are aquatic ecologists underrepresented?

Editor’s note: This post was contributed by outgoing rotating Program Officer Alan Wilson and is a write-up of part of a project performed by DEB summer student Zulema Osorio during the summer of 2015.

Generalizability has been fundamental to the major advances in environmental biology and is an important trait for current research ideas proposed to NSF.  Despite its significance, a disconnect between terrestrial and aquatic ecological research has existed for several decades (Hairston 1990).

For example, Menge et al. (2009) quantitatively showed that authors heavily (~50%-65%) cite studies from their respective habitats but that terrestrial ecologists are less likely to include citations from aquatic systems than the converse.  Failure to broadly consider relevant literature when designing, conducting, and sharing findings from research studies not only hinders future scientific advances (Menge et al. 2009) but may also compromise an investigator’s chances for funding[i] when proposing research ideas.

More recently, there have been anecdotal reports from our PI community that freshwater population or community ecology research is under-represented in NSF’s funding portfolio.  To explore potential bias in proposal submissions and award success rates for ecological research associated with focal habitat, we compared the submissions and success rates of full proposals submitted to the core Population and Community Ecology (PCE) program from 2005-2014 that focused on terrestrial systems, aquatic systems, or both (e.g., aquatic-terrestrial linkages, modeling, synthesis).  Data about focal ecosystems were collected from PI-reported BIO classification forms.  To simplify our data analysis and interpretation, all projects (including collaboratives) were counted only once.  Also, the Division of Environmental Biology (DEB) switched to a preliminary proposal system in 2012.  Although this analysis focuses only on full proposals, the proportions of preliminary and full proposal submissions for each ecosystem type were nearly identical for 2012-2014.  Some projects (2.7% of total projects) provided no BIO classification data (i.e., non-BIO transfers or co-reviews) and were excluded from this analysis.  Finally, several other programs inside (Ecosystem Science, Evolutionary Processes, and Systematics and Biodiversity Science) and outside (e.g., Biological Oceanography, Animal Behavior, Arctic) of DEB fund research in aquatic ecosystems.  Thus, our findings relate only to the PCE portfolio.

In total, 3,277 core PCE projects were considered in this analysis. Means ± 1 SD were calculated for submissions and success rates across 10 years of data from 2005-2014. Terrestrial projects (72% ± 2.8% SD) have clearly dominated submissions to the core PCE program across all ten years surveyed (Figure 1).  Aquatic projects accounted for 17% (± 2.6% SD) of the full proposal submissions, while projects that included both aquatic and terrestrial components accounted for only 9% (± 1.6% SD) (Figure 1).  The full proposal success rate has been similar across studies that focused on terrestrial or aquatic ecosystems (calculated as number of awards ÷ number of full proposal submissions; Figure 2; terrestrial: 20% ± 6.9% SD; aquatic: 18% ± 6.5% SD).  Proposal success rates for projects that focus on both ecosystems are more variable (Figure 2; 16% ± 12.7% SD), in part due to the small number of such projects (9.5% of the projects considered in this study).
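As a concrete illustration of the calculation just described (success rate = number of awards ÷ number of full proposal submissions, summarized as mean ± SD across the ten years), here is a minimal sketch. The input file and column names (fiscal_year, ecosystem, awarded) are hypothetical; the sketch mirrors the arithmetic only, not the actual PCE data pipeline.

```python
import pandas as pd

# Hypothetical input: one row per full proposal submitted to PCE, 2005-2014,
# with the PI-reported focal ecosystem and whether it was awarded (0/1).
proposals = pd.read_csv("pce_full_proposals.csv")  # columns: fiscal_year, ecosystem, awarded

# Success rate per ecosystem per year: awards divided by submissions.
by_year = (proposals.groupby(["ecosystem", "fiscal_year"])["awarded"]
                    .agg(submissions="count", awards="sum"))
by_year["success_rate"] = by_year["awards"] / by_year["submissions"]

# Mean and standard deviation of the yearly success rates for each ecosystem,
# analogous to the "20% ± 6.9% SD" style figures quoted in the text.
summary = (by_year.groupby(level="ecosystem")["success_rate"]
                  .agg(["mean", "std"])
                  .round(3))
print(summary)
```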


Figure 1. Proportion of full proposals submitted to PCE based on focal ecosystem from 2005 to 2014.

Figure 2. Success rate of full proposals submitted to PCE based on focal ecosystem from 2005 to 2014.


In summary, anecdotal PI concerns about fewer funded aquatic proposals in PCE are consistent with available data, but the pattern is an artifact of fewer aquatic proposal submissions rather than lower success rates.  Although funding rates for all full PCE proposals have varied from 2005-2014 (mean: 19.9% ± 6.4% SD; range: 11%-29%) as a function of available funds and the number of proposals considered, terrestrial- and aquatic-focused research proposals have fared similarly for the past decade.  PCE, like the rest of DEB and NSF, is motivated to have a diverse portfolio and encourages ecologists from varied institutions and backgrounds to submit ideas that address interesting, important questions that will generally move the field of population and community ecology forward.


Figure 1. Submission history of full proposals submitted to the core PCE program from 2005-2014 for terrestrial (brown), aquatic (blue), or both ecosystems (red).  Proposals were classified based on PI-submitted BIO classification forms.  Note that some projects did not provide BIO classification data.  These projects were ignored for this analysis and explain why yearly relative data may not total 100%.

Figure 2. Success rate history of full proposals submitted to the core PCE program from 2005-2014 for terrestrial (brown), aquatic (blue), or both ecosystems (red).  Proposal success rate is calculated for each ecosystem type as the number of awards ÷ number of full proposal submissions.   Proposals were classified based on PI-submitted BIO classification forms.


Hairston, Jr., N. G. 1990. Problems with the perception of zooplankton research by colleagues outside of the aquatic sciences. Limnology and Oceanography 35(5):1214-1216.

Menge, B. A., F. Chan, S. Dudas, D. Eerkes-Medrano, K. Grorud-Colvert, K. Heiman, M. Hessing-Lewis, A. Iles, R. Milston-Clements, M. Noble, K. Page-Albins, R. Richmond, G. Rilov, J. Rose, J. Tyburczy, L. Vinueza, and P. Zarnetska. 2009. Terrestrial ecologists ignore aquatic literature: Asymmetry in citation breadth in ecological publications and implications for generality and progress in ecology. Journal of Experimental Marine Biology and Ecology 377:93-100.

[i] Generalizability “within its own field or across different fields” is a principal consideration of the Intellectual Merit review criterion: