DEB Numbers: FY 2016 Wrap-Up



Fiscal year 2016 officially closed out on September 30. Now that we are past our panels in October and early November, we have a chance to look back and report on the DEB Core Program merit review and funding outcomes for FY 2016.

This post follows the format we’ve used in previous years. For a refresher, and lengthier discussions of the hows and whys of the metrics, you can visit the 2015, 2014, and 2013 numbers.

Read on to see how 2016 compares.

Your project titles matter; choose wisely


This post was inspired by a bit of musing about what would happen if PIs tried to crowd-source parts of their proposals. The obvious answer, to us at least, was that we would almost certainly, and immediately, receive a proposal titled “Granty McGrantface.” We presume you’re familiar with the reference; if not, see these links. While the saga of our friends at NERC turned out pretty well, it reminded us of two things: 1) asking the internet to decide for you is a risky proposition, and (the focus of this post) 2) no matter our intentions, some of the stuff[i] we do, or that stems from the funding we provide to you, will get noticed by a wide audience. Most stuff goes unnoticed, but from time to time something goes viral.

Therefore: What you choose to call your project matters.

Why the project title matters to NSF

The project title is the most meaningful and unique piece of your proposal that carries over to the public award description. Everything else in your proposal is distilled into a couple of paragraphs of “public abstract” and a few dozen metadata records available via the NSF award search and research.gov[ii]. Consider, too, that the project title is a part of your proposal for which NSF takes responsibility and exercises editorial power. We can, and sometimes do, change project titles (about a quarter are changed, mostly for clarity, such as writing out abbreviations).

Why the project title matters to you

The project title and PI info are the only things most potential reviewers will ever see before deciding whether to review your proposal. The title is your first (and typically only) shot to communicate to a reviewer that your proposal is interesting and worth their time to review[iii].  And as we said above, if your proposal gets funded, the title gets posted on the NSF public awards website along with the PI name and institution.

 

You can (and should) provide effective project titles

When you receive an award, the title will be searchable by anyone and permanently associated with your name. Over the years, we’ve seen a vast array of proposal titles. We’ve also seen how they affect the audiences (reviewers, panels, and the public) who read or hear them. Based on our accumulated observations and experience in DEB, we’ve put together these eight tips to consider when composing your project titles.

Keep in mind: The following are not any sort of universally enforced rules or NSF policy. The proposal title is initially your responsibility, but as we said, once it comes into NSF, we can edit it as needed. Ultimately, what makes a good title is subjective and is probably not constant across disciplines or over time. These are just some broad and general tips we hope you’ll find helpful.

Tip 1: Know your broader audiences

Reviewers, including panelists, are specialists, but not necessarily from the same sub-sub-specialty as you. Public readers of award titles cover an even wider range of knowledge and expertise. These are the people who are going to read that title and decide whether to take action. Reviewers will first decide whether to read, and then whether to support, your proposal. The public will decide whether to read your award abstract, and the media will decide whether to contact you.

There are both good and bad potential outcomes of public attention. It can seem like a strong, scientifically precise, and erudite proposal title should inform and impress readers. But that misses half the point: it’s not simply about avoiding misunderstanding. A good title is a vehicle for audience engagement; it seeks to cultivate positive responses. This happens when you use straightforward, plain language with a clear message, minimizing jargon and tech-speak. The rest of these tips are essentially more specific ways to do this.

Tip 2: Write to your (proposal’s) strengths

Most of us feel some twinge of annoyance when we see a misleading headline or publication title, e.g. “Transformative Biology Research to Cure All Diseases.” This is your chance to get it right! Don’t bury the lede. Focus your title on the core idea of the proposal. In many cases, details like the organism, the location, or the specific method are secondary[iv]; if you include them, do so carefully, in supporting roles and not swamping the central conceptual component[v]. If you wrote your title before your proposal, it’s a good idea to come back around to it before hitting submit.

Tip 3: Using Buzzwords #OnFleek

It’s a bit cliché to say this, but it bears mention: don’t tell us your project is great; demonstrate it. That is what the project description is for. We like “transformative” and “interdisciplinary” projects, but placing those words in your title doesn’t imbue your project with those qualities. Similarly, loading up on topical or methodological buzzwords (“*omics”, “CRISPR”, etc.) adds little when the major consideration is the knowledge you’re seeking to uncover, not the shiny new tool you want to wield or a loose connection to a hot topic. The space you save by dropping this extra verbiage lets you address other important aspects of your project.

Tip 4: Acronyms

They save space in your title. And NSF seems to have them all over the place (it’s an ARE: Acronym Rich Environment). So, why not use them, right? Well… tread carefully.

The various title prefixes (e.g. RUI, CAREER) we ask for are used by us to 1) ensure reviewers see that special review criteria apply and 2) check that we’ve applied the right processing to your proposal. They’re often acronyms because we don’t want to waste your character count. So, we want those on your proposals[vi] but, after merit review, we may remove them before making an award. Other acronyms added by you tend to fall into two categories:

  • Compressed jargon: for example, “NGS” for Next Generation Sequencing. Without the whole proposal immediately behind it, an acronym in your title may never actually be defined in the public description, and it may imply something unintended to part of your audience.
  • Project-name shorthand: a handful of projects have, through longevity and productivity, attained enough visibility and distinctiveness to be known by an acronym or other shorthand within their particular research community. Even if your project has achieved this distinction, remember that your audience goes beyond your community: not everyone will know of it. Further, trying to create a catchy nickname for a project (or program) usually doesn’t add anything to your proposal and can lead to some real groan-inducing stretches of language.

Tip 5: Questions to consider

How will reviewers respond to a title phrased as a question? Is the answer already an obvious yes or no? If so, why do you need the proposal and more money? Is this question even answerable with your proposed work? Is this one of the very rare projects that can be effectively encapsulated in this way?

Tip 6: Attempted humor

This can work; it may also fall flat (see the entry on questions, above). To some audiences, humor can make your project seem unprofessional and illegitimate. That is a sizeable risk. It used to be, and still is to some extent, a fairly common practice to drop a joke or cartoon into a slide deck to “lighten the mood” and “connect with your audience.” If you’ve ever seen a poor presenter do this, you know it’s not a universally good thing. With a proposal title, the joke is always there and doesn’t get buried under the rest of the material as might happen with a slide. The alternative is to skip the joke and write something that connects to your reader through personality and creativity instead. This can be hard to do, but practice helps. For example, “I Ain’t Afraid of No Host: The Saga of a Generalist Parasite” is a title we made up that struck us as funny – but will everyone reading it find it funny, and does the humor help the grant? The title isn’t very informative – again, tread lightly.

Tip 7: Latin vs Common terms

Per tip 2, you may not always list an organism in your project title; but when you do, make it accessible. The Latin name alone places a burden of prior knowledge or extra work on readers. It is a courtesy to public readers (not to mention your own SRO, who may be filling out paperwork about your proposal, and panelists who may be far afield from your system and unfamiliar with your organism) to add a common name label too. But be careful: some common names are too specific, jargon-y, or even misleading for a general audience. You don’t want, for instance, someone to see “mouse-ear cress” for Arabidopsis thaliana and think you’re working on vertebrate animal auditory systems (this has happened![vii]).

Tip 8: Thoughtful Word Choice

This tip extends the point about confusing language, made above for Latin names and acronyms, to jargon in general. Some jargon is problematic simply because it is dense; as with Latin names and acronyms, it can be addressed by adding or substituting common terms. Other jargon is problematic because the audience understands it, but differently than intended. Meg Duffy over at Dynamic Ecology had a post on this some time back in the context of teaching and communication. These issues arise in proposals too. There are some very core words in our fields that don’t necessarily evoke the same meaning for a general audience or even across fields. The most straightforward example we can point to is our own name: the “E” in DEB stands for “environmental.” To a general audience, “environmental” is more evocative of “environmentalism,” “conservation,” recycling programs, and specific policy goals than it is of any form of basic research[viii]. Addressing this sort of jargon in a proposal title is harder because the word already seems common, and concise alternative phrasings are hard to come by.

For jargon, it might help to bounce your title off a neighbor, an undergrad outside your department, or an administrator colleague. In some cases, you might find a better, clearer approach. In others, maybe there’s no better wording, but at least you’ll be more aware of the potential misunderstandings.

Final Thoughts

Most of the project titles we see won’t lead to awards and will never be published; and even when an award is made, most titles attract little notice. A few, however, will be seen by thousands or be picked up by the media and broadcast to millions. The title seems like a small and inconsequential thing, until it’s suddenly important. So even though the project title is a small piece of your proposal, it is worthy of attention and investment. We have provided the tips above to help you craft a title that uses straightforward, plain language to convey a clear and engaging message to your audiences.

We can’t avoid attention. In fact, we want to draw positive attention to the awesome work you do. But audience reactions are reliably unpredictable. The best we can do is to make sure that what we’re putting out there is as clear and understandable as possible.

 


[i] Anything related to research funding from policies on our end to research papers to tweets or videos mentioning projects.

[ii] At the close of an award, you are also required to file a “Project Outcomes Report” via Research.gov. This also becomes part of the permanent project record and publicly visible when your work is complete. We don’t edit these.

[iii] For the “good titles” argument as applied to research papers, see here: https://smallpondscience.com/2016/10/19/towards-better-titles-for-academic-papers-an-evaluative-approach-from-a-blogging-perspective/

[iv] There are obvious exceptions here, like a proposal for a targeted biodiversity survey in a geographical region.

[v] For what it’s worth, this is a common “rookie mistake” even before writing a proposal. We get lots of inquiries along the lines of “do you fund studies on organism X” or “in place Y”. The short answer is yes, but it’s often irrelevant because that doesn’t differentiate DEB from MCB or IOS or BioOCE. We don’t define the Division of Environmental Biology by organisms, or places, or tools, or methods. We define it by the nature of the fundamental questions being addressed by the research.

[vi] Some prefixes are mutually exclusive of one another. For example, CAREER and RUI cannot both be applied to the same proposal (http://www.nsf.gov/pubs/2015/nsf15057/nsf15057.jsp#a16).

[vii] Better alternatives might have been “plant”, “wild mustard”,

[viii] And yes, we do get the same sorts of calls and emails about “sick trees”, “that strange bird I saw”, “what to do about spiders,” etc. as you do.

Fall 2016 DEB Panels status: “When will I have a decision?” edition


DEB’s full proposal panels finished in early November (for those full proposals submitted back in July and August). So, when will you receive review results?

Some of you may have already heard from us. Others will be hearing “soon” (as detailed below).

Right now, all of our programs have synthesized the recommendations of their panels, considered their portfolios, and come up with their planned award and decline recommendations. These are then documented, sent through administrative review, and finally signed off, “concurred,” by the head or deputy for the Division.

DEB’s first priority is processing the decline notices. We’re trying to get your reviews back to you to provide as much time as possible to consider your options for January pre-proposal submissions.

For potential awards, it’s a bit more complicated. We expect award recommendation dates to be later this year than typical. At present, NSF is operating under a temporary budget measure, called a Continuing Resolution (or CR). The current CR runs through December 9, 2016. We won’t have significant funds available to cover new grants until a longer-term funding measure is enacted.

So, while we have a prioritized list of award recommendations, we don’t yet have the funds needed to take action on those recommendations. Moreover, we don’t know how much funding we’ll actually have available, so uncertainty is part of the plan. Thus, between “definite award recommendation” and “definite decline recommendation” we have a recommendation gray zone.

How are we handling this?

If your proposal fell into the definite decline group, then you’ll be getting an official notice from DEB. Once the formal decline recommendation is approved, the system updates the proposal status in FastLane and queues up a notification email. We are planning to have all declines approved by December 20, 2016. Note: our IT system sends the notification emails in batches at the end of the day[i]. Thus, if you are frequently refreshing FastLane you will likely see the news there before you get a letter from us.

If your proposal fell into the definite award group or the gray zone, you will first be getting a call or email from your Program Officer. They will be letting you know what the plan is for your particular proposal and how you can get things ready (e.g., submitting budget revisions or abstract language) for an eventual award. Formal action, including the release of reviews, cannot happen until we have funding available. However, folks in this group should also hear from their Program Officers by December 20.

After December 20, if you have not received any communication from us, first check your spam folder and then look up your proposal number and give us a call. But please remember, the lead PI for a proposal or collaborative group is the designated point of contact; if you’re a co-PI you need to get in touch with the lead PI and have them inquire.


[i] We’re not totally sure why this is, but suspect it has to do with email traffic volume and security features: discriminating an intentional batch of emails from an account taken over by a bot.

Preliminary Proposal Evaluation Survey Reminder


TL;DR

Check your inbox.

Check your spam folder.

Complete the survey!

End the reminder messages.

 

Background (if the above doesn’t make sense to you).

This is about the Preliminary Proposal system in use in both NSF BIO’s Division of Environmental Biology and Division of Integrative Organismal Systems.

We are in the midst of an external evaluation of the effects of this system on the merit review process.

We posted an initial notification letter about stakeholder surveys. And, copies of this letter were sent out to everyone in the sample ahead of the formal invitations.

The formal survey invitations with the active survey links were sent out by mid-September from the evaluator, Abt Associates.

Reminder emails are also coming out and will continue to do so at regular intervals while the survey remains open and incomplete.

If you have been receiving these messages, please complete the survey. If your colleagues have been receiving these messages and have not completed the survey, encourage them to do so.

If you received an invitation to take the survey,

  • Please take the 10 or so minutes to register your responses via the link in the email.
  • Remember that these are single-use individualized links.
  • Your response matters. This isn’t a census: your invitation is part of a stratified random sample selected for inference to the population.

Thank you for your participation!

A dozen things all PIs should know about the U.S. Federal budget as it relates to NSF research grants



Things upstream from a grant decision


1

There is an annual budget cycle (see graphic, below):

a.    Request: The President puts out a plan for a budget in a request to Congress.

b.    Appropriation: Congress decides how much (described in this downloadable PDF) to actually provide to each agency (e.g., NSF). This is signed into law by the President. Annual appropriations start on October 1 each year. Even if Congress is delayed in finalizing the budget for that year, the October 1 “birthday” of the funds applies retroactively.

c.    Allocation and Allotment: The appropriations are passed down from the Treasury through the agency to funding programs (e.g., Population and Community Ecology, Dimensions of Biodiversity).

d.    Commitment and Obligation: Funding is applied to projects (typically as grants) after merit review. Technically, Program Officers “recommend” funding (d-i), Division Directors concur the decision to “commit” funds (d-ii), Grants Specialists make the “award obligation” (d-iii), and the award is made to the institution (not the PI).

e.    Expenditure & Reimbursement: Over the subsequent months and years, the PIs of funded projects use the funding to make science happen and receive reimbursement from Treasury accounts.

Diagram of the relationship between the annual U.S. Federal budget process and the NSF merit review system.


2

At any given time, we are thinking about 3 or 4 different years’ budgets:

a.    Reporting on last year

b.    Managing this year

c.    Planning for next year

d.    Building momentum for the year after next


3

While we often refer to “the budget” in the singular abstract form, there are different pots of money at different levels in the agency.


4

At the highest level, there are 6 different pots (described in this PDF), called accounts[i]. These pots can’t be mixed[ii]. And, only 2 typically matter directly to researchers: Research & Related Activities (R&RA) and Education & Human Resources (EHR).


5

Individual program[iii] budgets, scopes, and lifespans are usually managed by each Division, but specific guidance from the White House Office of Management & Budget (OMB) or Congress can lead to changes and cancellations.


6

Our window to put the funding onto projects through grants (item 1d, above) is the most constrained step. Funds are supposed to arrive by October 1 each year, but it’s not uncommon that delays in the budget cycle mean we don’t see the full (i.e., appropriated and allocated) budget at the program level until the following March, April, or later. And, all funds need to be obligated by the end of each fiscal year (September 30)[iv].


 

Things downstream from a grant decision


7

Every dollar that supports your NSF research grant has an expiration date. The same is true for much of the Federal budget appropriated by Congress. For NSF research (R&RA) funds, the expiration date is 7 years from the start of the fiscal year (October 1 annually) in which the funds were provided to the agency (i.e., appropriated).


8

Because most DEB awards made in a given fiscal year have start dates well after October 1, the clock started ticking even before you received a grant. For example, if your award start date is July 1, then the funds you received are already 9 months old.
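If you like the date arithmetic in items 7 and 8 spelled out, here is a minimal Python sketch. The function names and the FY 2016 example are ours, purely for illustration:

```python
from datetime import date

def fiscal_year_start(fy: int) -> date:
    """U.S. federal fiscal year N begins on October 1 of calendar year N-1."""
    return date(fy - 1, 10, 1)

def funds_expiration(fy: int) -> date:
    """R&RA funds expire 7 years after the start of the fiscal year
    in which they were appropriated (item 7 above)."""
    start = fiscal_year_start(fy)
    return date(start.year + 7, start.month, start.day)

# Item 8's example: an FY 2016 award starting July 1, 2016 draws on
# funds that began aging on October 1, 2015.
award_start = date(2016, 7, 1)
age_days = (award_start - fiscal_year_start(2016)).days
print(f"funds are {age_days} days (~9 months) old at award start")
print(f"funds expire on {funds_expiration(2016)}")  # 2022-10-01
```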


9

Although you can request a delay in the official start date of an award, which affects when you start spending your funds, you can’t delay the aging of your award funds. A delayed start doesn’t provide you any extra time to complete the work. The ultimate limit on how long you can extend a funded project (no-cost extensions) depends on when those dollars expire.


10

Money doesn’t actually go to your institution when you get a grant. It stays in the US Treasury until spent. We refer to your award as a federal obligation because it authorizes your institution to charge for expenditures incurred in the conduct of that award and be reimbursed from the Treasury. We can see at any time how much of your funds you have spent.


11

There is a whole lot of regulation defining what projects can and can’t spend money on; meeting those regulatory obligations is largely the responsibility of your Sponsored Research Office (SRO)[v]. The ability of your SRO to meet those obligations is one of the things NSF reviews between the time when we (the programs) say “this is a good project” and the formal issuing of the grant. The consequences for failing to follow these rules are serious.


12

When we make a grant, we want you to use your full award. Funds that expire at the end of the 7-year clock don’t support your research or our mission. When we see expiring funds, we realize that we could have funded someone else but now we can’t (and there are lots of others who would have been happy for any funding). It also looks like you inflated your budget and/or can’t manage your projects effectively. And, it sends a message that the community has more money than it can put to good use.


[i] In 2009, ARRA “stimulus” funding was a 7th pot of money.

[ii] Without specific authority granted through legislation.

[iii] E.g., this list http://www.nsf.gov/funding/programs.jsp?org=DEB

[iv] Technically, NSF has two years in which to obligate our R&RA annual appropriation, but DEB, like most of NSF, does not “carryover” any funds into a second year. We commit and obligate every dollar allocated to us in a fiscal year and typically do so by mid-August. This allows maximum time for the funded projects to put the funds to use and minimizes the complexities of accounting across different appropriations.

[v] Therefore, your questions about use of funds already awarded should be directed at your SRO, not NSF Program Officers!

DEB Numbers: Historical Proposal Loads


Last spring we posted on the per-person success rate and pointed out several interesting findings based on a decade of DEB data. We were seeing a lot of new PIs and, conversely, a lot of PIs who never returned after their first shot. And, the vast majority of PIs who managed to obtain funding are not continuously funded.

This post is a short follow-up to take a bigger picture look at submission rates.

Since preliminary proposals entered the scene, DEB really hasn’t seen much change in the submission pattern: 75% of PIs in any year submit one preliminary proposal and the other 25% submit two (and a small number submit three ideas in a year, if one also counts full proposals to special programs).

Before the preliminary proposals were launched, we ran some numbers on how often people tended to submit. The results were that, in the years immediately prior to preliminary proposals (~2008-2011), around 75% of PIs in a year were on a single proposal submission (25% on two or more). Fewer than 5% of PIs submitted more than two proposals in a year. Further, most PIs didn’t return to submit proposals year after year (either new ideas or re-working of prior submissions); skipping a year or two between submissions was typical. These data conflicted with perceptions and anecdotes that “everyone” was submitting several proposals every year and increasing their submission intensity. Although recent data don’t support those perceptions, we still wondered if there might be a kernel of truth to be found on a longer time scale. What is the bigger picture of history of proposal load and submission behavior across BIO?

Well, with some digging we were able to put together a data set that lets us take a look at full proposal research grant submissions across BIO, going all the way back to 1991 when, it seems, the NSF started computerized record-keeping. Looking at this bigger picture of submissions, we can see when changes have occurred and how they fit into the broader narrative of the changing funding environment.
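As a rough illustration of the kind of tally behind the chart below, here is a minimal Python sketch that computes, per year, the total proposal load and the share of PIs submitting 1, 2, 3, etc. proposals. The records are made-up placeholders, not the actual BIO data:

```python
from collections import Counter, defaultdict

# Hypothetical submission records: (calendar_year, pi_id), one per full
# research grant proposal; the real data cover BIO from 1991 to 2014.
submissions = [(1991, "pi_a"), (1991, "pi_a"), (1991, "pi_b"),
               (1992, "pi_a"), (1992, "pi_c"), (1992, "pi_c")]

per_pi = defaultdict(Counter)  # year -> number of proposals per PI
for year, pi in submissions:
    per_pi[year][pi] += 1

for year in sorted(per_pi):
    total = sum(per_pi[year].values())       # total proposal load that year
    counts = Counter(per_pi[year].values())  # {1: PIs with one, 2: PIs with two, ...}
    n_pis = sum(counts.values())
    shares = {k: f"{v / n_pis:.0%}" for k, v in sorted(counts.items())}
    print(year, f"total={total}", shares)
```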

Total BIO full research grant submissions per year (line, right axis) and proportions of individuals submitting 1, 2, 3, 4, 5, or more proposals each calendar year from 1991 to 2014. (Note: 2015 is excluded because proposals submitted in calendar year 2015 are still being processed at the time of writing.)

 

1990s: Throughout the 1990s BIO received about 4000 proposals per year. This period of relative stability represents the baseline for more than a decade of subsequent discussions of increasing proposal pressure. Interestingly, the proportion of people submitting two or more proposals each year grew over this period, but without seeming to affect total proposal load; this could result from either increasing collaboration (something we’ve seen) or a shrinking PI pool (something we haven’t seen). At this time NSF used a paper-based process, so the cost and effort to prepare a proposal was quite high. Then…

2000s: In 2000, FastLane became fully operational and everyone switched to electronic submission. BIO also saw the launch of special programs in the new Emerging Frontiers division. In a single year, it became easier to submit a proposal and there were more deadlines and target dates to which one could potentially submit. The new electronic submission mechanism and new opportunities likely both contributed to increased submissions in subsequent years.

Following the switch to FastLane, from 2001 to 2005, total annual submissions grew to about 50% above the 1990s average and stayed there for a few years. This period of growth also coincided with an increasing proportion of people submitting 2+ proposals. Increasing numbers of proposals per person had only a limited effect on the total proposal load because of continued growth in collaboration (increasing PIs per proposal). Instead, the major driver of proposal increases was the increasing number of people submitting proposals. This situation was not unique to BIO.

This rapid growth from 2001 to 2005 sparked widespread discussion in the scientific community about overburdening of the system and threats to the quality of merit review, as summarized in the 2007 IPAMM report.

Eventually, however, the community experienced a declining success rate because BIO budgets did not rise to match the 50% increase in proposal submissions. From 2005-2008 submissions per person seemed to stabilize, and total submissions peaked in 2006. We interpret this as a shift in behavior in response to decreasing returns for proposal effort (a rebalancing of the effort/benefit ratio for submissions). It would have been interesting to see if this held, but…

2009/2010: In 2009 and 2010, BIO was up another ~1000 proposals over 2006, reaching an all-time high of nearly 7000 proposal submissions. These were the years of ARRA, the economic stimulus package. Even though NSF was very clear that almost all stimulus funding would go toward funding proposals that had been already reviewed (from 2008) and that we wouldn’t otherwise be able to afford, there was a clear reaction from the community. It appears that the idea of more money (or less competition) created a perception that the effort/benefit relationship may have changed, leading to more proposals.

2011: We see a drop in 2011. It is plausible that this was the realization that the ARRA money really was a one-time deal, there were still many more good proposals than could be funded, and that obtaining funding hadn’t suddenly become easier. As a result, the effort/benefit dynamic could be shifting back; or, this could’ve been a one-time off year. We can’t know for sure because…

2012: Starting in 2012 IOS and DEB, the two largest Divisions in BIO, switched to a system of preliminary proposals to provide a first-pass screening of projects (preliminary proposals are not counted in the chart). This effectively restricted the number of full proposals in the two largest competitions in BIO such that in 2012, 2013, and 2014 the full proposal load across BIO dropped below 5000 proposals per year (down 2000 proposals from the 2010 peak). The proportion of individuals submitting 2+ full proposals per year also dropped, consistent with the submission limits imposed in DEB, IOS, and MCB. PIs now submitting multiple full proposals to BIO in a given year are generally submitting to multiple programs (core program and special program) or multiple Divisions (DEB and [IOS or MCB or EF or DBI]) and diversifying their submission portfolios.

In summary, the introduction of online and multi-institutional submissions via FastLane kicked off a decade of change marked by growth in proposal submissions and per-PI submissions to BIO. The response, a switch to preliminary proposals in IOS and DEB, caused a major (~1/3) reduction in full proposals and also a shift in the proportion of individuals submitting multiple proposals each year. In essence, the pattern of proposal submission in BIO has shifted back to what it was like in the early 2000s. However, even with these reductions, it is still a more competitive context than the 1990s baseline, prior to online submissions via FastLane.

DEB Numbers: Are aquatic ecologists underrepresented?


Editor’s note: This post was contributed by outgoing rotating Program Officer Alan Wilson and is a write-up of part of a project performed by DEB summer student Zulema Osorio during summer 2015.

Generalizability has been fundamental to the major advances in environmental biology and is an important trait for current research ideas proposed to NSF.  Despite its significance, a disconnect between terrestrial and aquatic ecological research has existed for several decades (Hairston 1990).

For example, Menge et al. (2009) quantitatively showed that authors cite studies predominantly (~50%-65%) from their respective habitats, but that terrestrial ecologists are less likely to include citations from aquatic systems than the converse.  Failure to broadly consider relevant literature when designing, conducting, and sharing findings from research studies not only hinders future scientific advances (Menge et al. 2009) but may also compromise an investigator’s chances for funding[i] when proposing research ideas.

More recently, there have been anecdotal reports from our PI community that freshwater population or community ecology research is under-represented in NSF’s funding portfolio.  To explore the potential bias in proposal submissions and award success rates for ecological research associated with focal habitat, we compared the submissions and success rates of full proposals submitted to the core Population and Community Ecology (PCE) program from 2005-2014 that focused on terrestrial systems, aquatic systems, or both (e.g., aquatic-terrestrial linkages, modeling, synthesis).  Data about focal ecosystems were collected from PI-reported BIO classification forms.  To simplify data analysis and interpretation, all projects (including collaboratives) were counted only once.  Also, the Division of Environmental Biology (DEB) switched to a preliminary proposal system in 2012; although this analysis covers only full proposals, the proportions of preliminary and full proposal submissions for each ecosystem type were nearly identical for 2012-2014.  Some projects (2.7% of the total) provided no BIO classification data (i.e., non-BIO transfers or co-reviews) and were excluded.  Finally, several other programs inside (Ecosystem Science, Evolutionary Processes, and Systematics and Biodiversity Science) and outside (e.g., Biological Oceanography, Animal Behavior, Arctic) of DEB fund research in aquatic ecosystems.  Thus, our findings relate only to the PCE portfolio.

In total, 3,277 core PCE projects were considered in this analysis. Means ± 1 SD were calculated for submissions and success rates across 10 years of data from 2005-2014. Terrestrial projects (72% ± 2.8% SD) have clearly dominated projects submitted to the core PCE program across all ten years surveyed (Figure 1).  Aquatic projects accounted for 17% (± 2.6% SD) of the full proposal submissions while projects that include aspects of both aquatic and terrestrial components accounted for only 9% (± 1.6% SD) (Figure 1).  The full proposal success rate has been similar across studies that focused on terrestrial or aquatic ecosystems (calculated as number of awards ÷ number of full proposal submissions; Figure 2; terrestrial: 20% ± 6.9% SD; aquatic: 18% ± 6.5% SD).  Proposal success rate dynamics for projects that focus on both ecosystems are more variable (Figure 2; 16% ± 12.7% SD), in part, due to the small population size (9.5% of the projects considered in this study).
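For readers who want the summary statistics spelled out, here is a minimal Python sketch of the calculation; the yearly counts are invented placeholders, not the actual PCE data:

```python
import statistics

# Hypothetical (submissions, awards) pairs for one ecosystem category,
# one pair per year from 2005 to 2014; the real values came from
# PI-reported BIO classification forms on PCE full proposals.
aquatic = [(55, 12), (60, 9), (58, 11), (62, 13), (57, 8),
           (59, 10), (61, 12), (56, 9), (60, 11), (58, 10)]

# Yearly success rate = number of awards / number of full proposal submissions
rates = [awards / subs for subs, awards in aquatic]

mean_rate = statistics.mean(rates)
sd_rate = statistics.stdev(rates)  # sample SD across the ten years
print(f"success rate: {mean_rate:.1%} ± {sd_rate:.1%} SD")
```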

Figure 1. Submission history of full proposals submitted to the core PCE program from 2005-2014 for terrestrial (brown), aquatic (blue), or both ecosystems (red). Proposals were classified based on PI-submitted BIO classification forms. Note that some projects did not provide BIO classification data; these projects were excluded from this analysis, which is why yearly relative data may not total 100%.

Figure 2. Success rate history of full proposals submitted to the core PCE program from 2005-2014 for terrestrial (brown), aquatic (blue), or both ecosystems (red). Proposal success rate is calculated for each ecosystem type as the number of awards ÷ number of full proposal submissions. Proposals were classified based on PI-submitted BIO classification forms.

In summary, anecdotal PI concerns about fewer funded aquatic proposals in PCE are consistent with the available data, but the pattern reflects fewer aquatic proposal submissions rather than a lower success rate.  Although funding rates for all full PCE proposals varied from 2005-2014 (mean: 19.9% ± 6.4% SD; range: 11%-29%) as a function of available funds and the number of proposals considered, terrestrial- and aquatic-focused research proposals have fared similarly for the past decade.  PCE, like the rest of DEB and NSF, is motivated to maintain a diverse portfolio and encourages ecologists from varied institutions and backgrounds to submit ideas that address interesting, important questions and will move the field of population and community ecology forward.


References

Hairston, Jr., N. G. 1990. Problems with the perception of zooplankton research by colleagues outside of the aquatic sciences. Limnology and Oceanography 35(5):1214-1216.

Menge, B. A., F. Chan, S. Dudas, D. Eerkes-Medrano, K. Grorud-Colvert, K. Heiman, M. Hessing-Lewis, A. Iles, R. Milston-Clements, M. Noble, K. Page-Albins, R. Richmond, G. Rilov, J. Rose, J. Tyburczy, L. Vinueza, and P. Zarnetska. 2009. Terrestrial ecologists ignore aquatic literature: Asymmetry in citation breadth in ecological publications and implications for generality and progress in ecology. Journal of Experimental Marine Biology and Ecology 377:93-100.

[i] Generalizability “within its own field or across different fields” is a principal consideration of the Intellectual Merit review criterion: http://www.nsf.gov/pubs/policydocs/pappguide/nsf16001/gpg_3.jsp#IIIA