Posted by: Jeremy Fox | May 7, 2012

Provocative new Oikos paper: should we reallocate funding away from the ecological 1%? (UPDATED)

My fellow Oikos editor and blogger Chris Lortie has a strong interest in scientific publication practices (see, e.g., here). His latest effort, now in press at Oikos, examines patterns of funding and impact among the ecological 1%: the most-cited 1% of ecologists over the last 20 years (Lortie et al. in press). Chris is too busy right now to blog it himself, so I’m going to do it, because I think it’s a must-read.

In a previous paper (UPDATE: link fixed), Chris and his colleagues reported detailed survey data characterizing 147 of the most-cited ecologists and environmental scientists. Not surprisingly, they are overwhelmingly male, middle-aged, employed in North America and Western Europe, have large, well-funded labs, publish frequently, and have high per-paper citation rates.* But there's a surprising amount of variation (multiple orders of magnitude) in lab size and funding level within this elite group, a feature of the data that Lortie et al. take advantage of to ask how citation rates vary with funding level within the elite.

In their previous work, Chris and his colleagues have shown that, for non-elite Canadian ecologists, more funding is associated with higher "publication impact efficiency" (PIE): more highly funded researchers also earn more citations per dollar of funding. But Lortie et al. find that the same is not true of the elite. Using a slightly different measure of PIE (citations per publication per dollar), Lortie et al. find no relationship (not even a hint!) between PIE and funding within the ecological elite. Again, that's despite multiple orders of magnitude of variation in funding level within the elite. Combined with their previous results, this indicates diminishing returns to really high funding levels (above several hundred thousand dollars, roughly). The implication is that funding agencies looking to maximize the "bang for their buck" arguably should reallocate funding away from really elite researchers and towards non-elite researchers. This wouldn't reduce the PIE of the former group, but would increase the PIE of the latter group. Actually, Lortie et al. are careful to suggest increased funding for non-elite ecologists, rather than a reallocation of existing funding. But opportunity costs are ever-present so long as total funding is finite, and so I don't think it's possible to avoid the implication that these data suggest reallocation of funding away from the "elite of the elite".
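To make the two efficiency measures concrete, here's a minimal sketch with made-up numbers. The exact operational definitions in Lortie et al. (e.g., how funding and citations are totaled, and over what time window) may differ; this is just the arithmetic:

```python
# Hypothetical illustration of the two PIE measures discussed above.
# The numbers below are invented, not taken from Lortie et al.

def pie_per_dollar(total_citations, total_funding):
    """The measure described for the non-elite Canadian sample: citations per dollar."""
    return total_citations / total_funding

def pie_per_pub_per_dollar(total_citations, n_publications, total_funding):
    """The measure described for the elite sample: citations per publication per dollar."""
    return (total_citations / n_publications) / total_funding

# A hypothetical researcher: 2,000 citations from 50 papers on $500,000 of funding.
print(pie_per_dollar(2000, 500_000))              # 0.004 citations per dollar
print(pie_per_pub_per_dollar(2000, 50, 500_000))  # 8e-05 citations per paper per dollar
```

The only difference between the two measures is the division by publication count, which is worth keeping in mind when comparing the elite and non-elite results.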

This is really thought-provoking stuff, and I hope that the folks who run our funding agencies take note.** One thing I'd like to see is for funding agencies to use this kind of information to give guidance to grant referees on how to evaluate applicants' track records. For NSERC Discovery Grants, the "excellence of the researcher" (basically, the reviewers' evaluation of your track record of publications and other outputs over the previous 6 years) accounts for 1/3 of the grant score. To my knowledge, NSERC currently offers no guidance as to whether "excellence" should be scaled relative to the applicant's funding level (which the applicant is obliged to report). These data suggest that it should be, and further that the appropriate scaling is nonlinear. Against that idea, one could note that increased funding gives researchers various advantages that let them increase their per-publication impact as well as their publication rate. So ultimately it's impossible for reviewers (or anyone) to tease apart how much of an applicant's track record is just due to their funding level (the idea being that anybody with lots of funding will have a good track record), and how much is due to their "intrinsic" excellence.

I am curious to see results for elite and non-elite researchers using exactly the same measure of PIE. Perhaps Chris can provide these numbers in the comments.

All of the usual caveats about citations as a measure of “impact” apply, obviously, and Lortie et al. recognize those caveats. But the conclusions here are, I think, robust to those caveats. Basically, there’s an upper limit to the mean quality of papers that any lab is capable of producing–and it turns out that even the most brilliant ecologist’s lab hits that limit at a funding level of several hundred thousand dollars or so.

One other caveat is that these are observational, comparative data. They aren’t necessarily a reliable guide to the effects of an “experimental manipulation” such as a reallocation of funding away from the elite. But they’re the only guide we have. Though having said that, I’d also be interested in analyses tracking changes over time in the funding level, PIE, and other relevant variables for individual researchers. Would it lead to the same conclusions?

In passing, one minor quibble I have is that Lortie et al. describe elite researchers as especially “collaborative”. But by this, they seem to mean simply that elite researchers have larger labs on average than non-elite researchers, not that they have more extensive collaborations with colleagues outside their labs. Which seems like a rather unconventional definition of “collaboration”.

There are analogies here to debates in economics over income and wealth inequality and their consequences, which are too obvious for me to ignore–but too incendiary for me to comment on!

This should be an interesting comment thread…

*Click through and read the whole thing for data on other characteristics of elite researchers–such as how hard they work and how much alcohol they drink!

**At least some of them are thinking about this stuff. I can’t find the citation, but a little while back NIH did analyses along these lines for their researchers, and discovered the same pattern of diminishing returns for really well-funded labs. Although the threshold funding level beyond which there was no further increase in efficiency was higher than in ecology, as you’d expect given the higher cost of much biomedical research.

Responses

  1. Getting this right is important, because a lot of people are going unfunded, or subsisting on very low funding rates. We need to fund the most exciting ideas the most, however. I hope social scientists of various sorts are involved in these analyses, because they do these human-based inquiries well. If the goal is scientific progress, we need to both fund the best people and keep a vibrant pool of researchers active.

    • Yes, you’re absolutely right, this is a topic on which we need advice from good economists and other social scientists. Indeed, I’m sure there’s a body of literature out there I’m just not aware of. And your remark about the need to balance rewarding the best people with maintaining a vibrant pool of researchers is a good way to put the central issue, I think.

      This is one of those topics that means a lot to us as scientists, and with which we all have some first-hand experience. Which I think means that it’s one of those topics where we’re all likely to be rather too quick to draw sweeping conclusions based on our own biases and personal anecdotal experiences. That’s why I welcome papers like Lortie et al., that bring data to bear.

  2. Another point occurred to me. The elite researchers are surely funded from multiple sources. Typically, different sources fund different projects; I don’t know of any funding agency that will fund an already-funded project (so-called “double dipping”). So in practice, reallocating funding away from the elite is likely to mean somehow reducing their ability to get many different, nominally-independent grants. The rationale would be that those “independent” grants aren’t actually independent, because they all share the same PI and the empirical evidence indicates that, once a PI’s funding gets past a certain level, other factors (like the PI’s time) become limiting for scientific efficiency.

    This makes intuitive sense to me. In grad school, I remember talking to a grad student from the big lab of a very well-funded superstar ecologist at another university. The student said that he basically got to meet with his supervisor one-on-one once a year for an hour. He said it was always a really great, very effective meeting, but still: one hour a year. It's easy to imagine that the student would've done better work with more supervision, rather than being left to rely on his own devices, or on advice from postdocs and other grad students.

    • That is a good point. I have wondered about that face time a lot. In fields with much larger groups, often in med schools, what I hear is that a totally amazing group of postdocs, as many as 20 or 30, can make up for that face time. I can't imagine having that many people in the group. I think democracy and freedom in research are important, but I also believe there are amazing things out there outside of my experience. I just hope we don't move too much towards the way Germany used to be, where a few top professors directed the research of the whole department. We work best when we work for ourselves, not in isolation, but surrounded by an exciting group of other researchers.

      • A search committee on which I once sat interviewed a candidate from a lab like that for a faculty position. The lab had 30 postdocs and 30 grad students. The PI essentially did no mentoring. The more senior postdocs were each assigned several grad students to mentor. Like you, I just can't imagine working that way. And as I noted at the end of the post, NIH has data indicating that it's not efficient for even biomedical researchers to have such large groups.

  3. One thing I wonder about here is the difference in actual need. For example, the funding needs of someone running a full-blown genomics program or paying for serious ship time are going to be very different from those of someone working in grassland ecology. And yet, at those very different funding levels, the same number of publications may be produced. So, if labs at very different funding levels are producing the same output, we would expect no difference in PIE. So is this lack of a relationship just driven by disciplinary differences, or is there really no difference within the elite, for no good reason?

    Basically, I’m taking issue with the statement “Admittedly, there is variation in the type of ecological research conducted and the associated cost, but it is important that funding does not directly predict the relative success of publications by the elite.”

    So, to pose it as a devil's-advocate question, if we were to take the extreme view and chop the upper end of the funding spectrum of the elite, saying that they should be able to be just as productive at lower levels of funding, would we just be damaging the elite from disciplines that require higher funding and giving precedence to types of ecology that can be done on the cheap? Is this a good thing?

    (Note, I think this question can be answered by looking at the direct and indirect effects of funding on PIE mediated via # of publications – Chris?)
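    Roughly, the kind of decomposition being suggested might look like the following sketch, which uses a simple regression-based ("product of coefficients") mediation on simulated data. The variable names and numbers are hypothetical stand-ins for the survey data; a proper path analysis or SEM on the real data would be the rigorous version.

```python
# Simulated sketch of a funding -> publications -> PIE mediation analysis.
# All names and numbers here are hypothetical, not from the Lortie et al. survey.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 147                                                # size of the elite sample
funding = rng.lognormal(mean=12.5, sigma=1.0, size=n)  # annual funding, dollars
pubs = 5 + 2e-5 * funding + rng.normal(0, 5, size=n)   # publication count
pie = 0.5 + 0.01 * pubs - 1e-7 * funding + rng.normal(0, 0.2, size=n)  # efficiency score

# Step 1: funding -> publications (mediator model); the slope is a
a = sm.OLS(pubs, sm.add_constant(funding)).fit().params[1]

# Step 2: PIE regressed on funding and publications together;
# the funding slope is the direct effect c', the publications slope is b
X = sm.add_constant(np.column_stack([funding, pubs]))
outcome_fit = sm.OLS(pie, X).fit()
direct_effect = outcome_fit.params[1]  # funding -> PIE, holding publications fixed
b = outcome_fit.params[2]              # publications -> PIE

indirect_effect = a * b                # funding -> PIE, via publication count
print(f"direct: {direct_effect:.3g}  indirect: {indirect_effect:.3g}")
```

    On real data one would want bootstrap or SEM-based standard errors for the indirect effect, but the basic decomposition of funding's effect into a direct path and a path mediated by publication count is this simple.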

    • Hi Jarrett,

      Chris would have the data to directly address this, at least informally. He knows who the survey respondents were (CORRECTION: I'm not sure if he knows this or not), and so would know whether any of them need ship time, trips to the Antarctic, or massive genome sequencing. It certainly is true that different disciplines or subdisciplines differ in their intrinsic costliness. You can't do particle physics without a massive particle accelerator and the engineers to run it. This is why I noted at the end of the post that, based on my recollection of the NIH data, the point of "diminishing returns" sets in at a higher funding level than in ecology (10s of millions of dollars, IIRC).

      But I doubt that's what's driving the results within ecology and environmental science. With some quite specific exceptions like those you noted, most people's expenses in ecology are mostly people: grad students, undergrad assistants, technicians, postdocs, and summer salary for themselves. This is mostly true even for really highly cited ecologists. And people cost about the same across fields, and certainly across subfields within ecology.

      I do have some questions of my own about funding sources for the super-elite, and how that might change the picture. For instance, consider someone like Dave Tilman, who runs an LTER site. Is that site’s entire funding “his”? Presumably not–but I’ll bet some of it is, at least in practice. Again, I’m sure Chris has the data to comment on this kind of thing. He plans to comment here or do his own post at some point.

      • The funding for LTER sites, including Cedar Creek, is split among a group of PIs at that site. Dave gets some LTER funding, and so do a dozen other professors who work at the site. The PI groups, in turn, submit massive proposals to NSF every five years with a suite of projects, each of which may be led by different PIs.

      • Thanks for the clarification Margaret. I’m Canadian, so I don’t really know how LTERs work. And just to be clear, I don’t know that Dave is one of the ecologists in Chris’ survey, nor do I know that he wasn’t. I just used him as an example (the first one that came to mind) of how it might be difficult to account for certain funding sources in these sorts of analyses. Having said that, I believe the survey respondents provided their own numbers on how much funding they have. Presumably individuals do know how much funding they have, even if it’s not immediately obvious to others just from reading the Acknowledgements sections of their papers.

  4. I think the link in your blog post for the previous paper goes to the new paper…

    • Thanks! Fixed now.
