Ecology’s Brave New World …

My travels have brought me to the kick-off conference of DNAqua-net at the University of Duisburg-Essen in Germany, to give a plenary talk on our progress towards using high-throughput next-generation sequencing (NGS) for ecological assessment. I went into the meeting feeling rather nervous, as I had never given a full-length talk to an audience of molecular ecologists before, but it was clear, even before I stood up, that we were in the almost unique position of having a working prototype that was under active consideration by our regulatory bodies. Many of the earlier speakers showed promising methods, but few had reached the stage where adoption for nationwide implementation was a possibility. There was, as a result, an audible intake of breath when I mentioned, during my talk, that, from 2017, samples would no longer be analysed by light microscopy but only by NGS.

That, in turn, brought some earlier comments by Florian Leese, the DNAqua-net chair, into sharp focus. He had talked about managing the transition from “traditional” ecology to the Brave New World of molecular techniques, something that weighs heavily on my mind at the moment. In fact, I said in my own talk that the structures and values of the organisations implementing NGS were as important as the quality of the underlying science. And this, in turn, raised another question: what is an ecologist?

If that sounds too easy, try this: is an ecologist more than just someone who collects ecological data? I put the question like this because one likely scenario, once environmental DNA is in routine use, is that sampling will be delegated to lowly technicians who will dispatch batches to large laboratories equipped with the latest technology for DNA extraction, amplification and sequencing on an enormous scale (see “Replaced by a robot?”), and the results will be fed into computer programs that generate the answer to the question being posed.

The irony, for me, is that the leitmotif of my consultancy since I started has been helping organisations apply ecological methods consistently across the whole country, so that the results generated represent real differences in the state of the environment and not variations in the practice or competence of the ecologists who collected the data. Over the past decade, I helped co-ordinate the European Commission’s intercalibration exercise, which extended the horizons of this endeavour to the extremities of the European Union. The whole process of generating ecological information had to be broken down into steps, each of which was taken apart, examined and put back together to produce, we hoped, a more effective outcome. There was, nonetheless, ample opportunity for the ecologist to bring higher cognitive skills to the process: in sampling and surveying, in species identification and, ultimately, in interpreting the data.

I often use the example of McDonalds as a model for what we are trying to achieve, simply because it is a brand with which everyone is familiar and we all know that their products will taste the same wherever we go (see “Simplicity is the ultimate sophistication …”). I admire them for that because they have achieved what ecologists involved in applying EU legislation should desire most: a completely consistent approach to a task across a territory. But that same consistency means that one is never tempted to pop into a McDonalds on the off chance that the chef has popped down to the market to buy some seasonal vegetables with which to whip up a particularly appetising relish. If you want the cook to have used his or her higher cognitive abilities to enhance your dining experience you do not go to a McDonalds.

But that is where we could end up as we go down the road of NGS. A reader of my post “A new diatom record from West Sussex” commented tartly that there would be no chance of that diatom being spotted once the Environment Agency replaced their observant band of diatom analysts with NGS, and he was right. Another mentioned that he had recently passed on a suspicion of a toxic pollution event to local staff, based on observations of the sample that were not captured by the metrics used to classify ecological status. Again, those insights will not be possible in our Brave New World.

Suppose we were somehow able to run a Monte Carlo permutation test on all the possible scenarios of where we might be in twenty years, in terms of the application of NGS to ecological assessment. Some of those outcomes would correspond to Donald Baird’s vision of “Biomonitoring 2.0”, but some would not, and here, for the sake of playing Devil’s Advocate, is a worst-case scenario (with a toy sketch of the thought experiment to follow):

In an effort to reduce costs, a hypothetical environmental regulator outsources eDNA sampling to a business service company such as Group 4 or Capita.   They batch the samples up and dispatch them to the high throughput laboratory that provides the lowest quote.   The sequencing results are uploaded straight to the Cloud and processed according to an automated “weight of evidence” template by data analysts working out of Shanghai, Beijing or Hyderabad before being passed back to staff in the UK.   At no point is a trained ecologist ever required to actually look at the river or stream.  I should stress that this “year zero” scenario will not come about because NGS is being used but because of how it is used (and a post in the near future will show how it is possible to use NGS to enhance our understanding of the UK’s biodiversity).   It brings us back to the question of the structure and values of the organisation.
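
As a purely illustrative aside, the “Monte Carlo” thought experiment above can be made concrete with a few lines of Python. Everything in the sketch below is invented: the scenario attributes, labels and probabilities have no empirical basis, and the point is only to show what sampling an “ensemble of futures” might look like.

```python
# A whimsical sketch: sample thousands of hypothetical NGS futures and count
# how many keep ecologists close to rivers versus the "year zero" worst case.
# All attributes and probabilities are invented purely for illustration.
import numpy as np

rng = np.random.default_rng(0)
n_futures = 10_000

sampling_by = rng.choice(["ecologist", "contractor"], size=n_futures, p=[0.3, 0.7])
interpreted_by = rng.choice(["local biologist", "remote pipeline"], size=n_futures, p=[0.4, 0.6])
site_visits = rng.choice([True, False], size=n_futures, p=[0.5, 0.5])

# Futures resembling "Biomonitoring 2.0": interpretation stays with a local
# biologist who still visits the water bodies being assessed.
biomonitoring_2 = (interpreted_by == "local biologist") & site_visits

# The "year zero" scenario sketched above: outsourced sampling, remote
# interpretation, and no ecologist ever looking at the river.
year_zero = (sampling_by == "contractor") & (interpreted_by == "remote pipeline") & ~site_visits

print(f"Futures keeping ecologists close to the field: {biomonitoring_2.mean():.0%}")
print(f"'Year zero' futures:                           {year_zero.mean():.0%}")
```

The output, of course, depends entirely on the probabilities one feeds in; the value of the exercise is in forcing those assumptions into the open.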

What I would like to see is a system of ecological assessment that makes full use of the higher cognitive abilities of the biologists responsible for it. Until now, much of a biologist’s skill has gone into identifying organisms in order to compile the list of species upon which assessments are based. It should be possible to use the new genetic technologies to free ecologists to play a greater role in interpretation and decision-making. However, that will not come about if the technologies are deployed with an overwhelming desire to reduce costs. One of the lessons we need to learn, in other words, is that there is more to applying molecular ecology than simply developing the method itself.

Reference

Baird, D.J. & Hajibabaei, M. (2012). Biomonitoring 2.0: a new paradigm in ecosystem assessment made possible by next-generation DNA sequencing. Molecular Ecology 21: 2039-2044.


Simplicity is the Ultimate Sophistication: McEcology and the dangers of call centre ecology

I’ve just had a short ‘opinion’ paper published in the journal Ecological Indicators, with the title ‘Simplicity is the ultimate sophistication: building capacity to meet the challenges of the Water Framework Directive’. One of my big concerns over recent years has been how we translate developments in science into advances in practice, particularly when the government agencies responsible for our environment are under severe financial constraints.

My worry is that an effective, if rather labour-intensive, approach to ecological assessment in the UK is currently being unpicked to accommodate new science (necessary, as the Water Framework Directive makes new demands) while, at the same time, more efficient (i.e. ‘cheaper’) business models are also being imposed.

I use the term ‘McEcology’ to summarise what is going on. You might have heard the derogatory term ‘McJob’ used to describe a low-paid, dead-end job with few career prospects, so you might think I’m just riding a popular bandwagon here. But read on.

Just over ten years ago I was on my way to a meeting to develop some Europe-wide standards for ecological assessment.   At Lille station I passed a McDonalds and remember thinking that it would be nice if we ecologists could get our methods to the point where the end-product was as consistent, across Europe, as a Big Mac.  

I think we have made progress towards this goal over the past decade (see posts on intercalibration) but I’ve also watched some less attractive aspects of the fast food business model being imported. The essence of producing homogeneous food is often a tightly-controlled ‘production line’, and this approach is increasingly used in ecology, with different individuals responsible for sampling, sample preparation, sample analysis and data interpretation. In theory, this makes better use of the training of specialist ecologists, allowing them to focus on high-level skills such as data interpretation. In practice, I suspect that the time ecologists spend travelling around catchments collecting samples and, in the process, observing rivers in all their moods, gives them a level of insight that will soon be lost. The ‘fast food’ model may be acceptable if the ‘product’ is data, but not if it is advice or guidance tailored to a particular water body or an explanation comprehensible to a stakeholder.

A second concern I addressed was whether advances in science were actually contributing to this slip towards McEcology, as innovations often lead to increased complexity, which can push methods beyond the capabilities of the ‘generalist’ and limit their use to ‘specialists’. We reach the point where knowledge of a particular technique or group of organisms trumps knowledge of a geographic area in determining who is best suited to a task.

The risk is that the search for short-term economies undermines the professionalism of ecologists. The old UK model, in which a team of ecologists acted as a ‘family doctor’ practice for a region, developing extensive local knowledge through close contact over a number of years, is at risk of being replaced by a ‘call centre’ model in which ecologists become increasingly desk-bound, samples are collected by lower-paid staff working to tight schedules and processed by technicians who may not work in the same part of the country, or even in the same country at all.

The title of my paper, ‘Simplicity is the ultimate sophistication’, comes from a quotation attributed to Leonardo da Vinci. Innovation is all very well, but if it breaks the links between biologists and the field, we risk losing accumulated wisdom, resulting in poorer decision-making. On the other hand, if we have a good knowledge of the underlying ecological processes, we should be able to develop a set of much simpler methods that allow ‘generalist’ ecologists to collect information on many attributes of ecosystems. This encourages a more joined-up view of ecosystems and, ultimately, better decision-making.

This is not just theory: we’re already making a few tentative steps towards this goal and the next post will talk about one of these in more detail.

Black Swan #2: McEcology and Steve Earle

I’m still mulling over the contents of Nassim Nicholas Taleb’s book The Black Swan (see earlier post). He spends many pages (too many, perhaps) unpicking the statistical foundations of the predictive models beloved of economists (and which are similar, in many ways, to the models ecologists use). Such models, he argues, have a track record of failing to predict the really significant events, because these, by their very nature, often fall outside the expectations of those of us raised to think in terms of “normal” (Gaussian) statistics. Rather than attempt imperfect predictions of the future, he suggests, we should build robustness into our institutions to enable them to react to change.
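
Taleb’s point about “normal” statistics can be illustrated with a toy simulation. This is my own sketch, not anything from his book or from my own work, and the distribution and thresholds are chosen purely for illustration: a Gaussian fitted to heavy-tailed data drastically underestimates how often extreme events occur.

```python
# A toy illustration of Gaussian blindness to extreme events.
# The heavy-tailed Student-t "shocks" stand in for any process with rare,
# large excursions; the figures are illustrative only.
import math
import numpy as np

rng = np.random.default_rng(1)

# 100,000 hypothetical shocks from a heavy-tailed distribution (Student-t, df=3).
shocks = rng.standard_t(df=3, size=100_000)

# A naive modeller fits a Gaussian to the same data ...
mu, sigma = shocks.mean(), shocks.std()

# ... and asks how likely a "six-sigma" event is.
threshold = mu + 6 * sigma
predicted = 0.5 * math.erfc(6 / math.sqrt(2))   # Gaussian tail probability beyond 6 sigma
observed = float((shocks > threshold).mean())   # what the heavy-tailed data actually do

print(f"Gaussian prediction of a >6-sigma event: {predicted:.1e}")
print(f"Observed frequency in the data:          {observed:.1e}")
```

On a typical run the observed frequency is several orders of magnitude higher than the fitted model predicts, which is, in miniature, why models built on Gaussian assumptions keep being blindsided by their own Black Swans.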

Yet, when I look around, I see the exact opposite happening, as the economic recession (itself the product of Black Swan events such as the subprime mortgage crisis in the USA) forces the public sector to retrench. All of the agencies with which I work in the UK, and many elsewhere in Europe, are having to survive on smaller budgets than was the case a few years ago. The mantra is always “efficiency”, but let’s dissect this and examine what it means through the lens that Taleb has given us. Efficiency implies a high ratio of useful work to effort expended and, as “effort” has to be paid for from our taxes, this all sounds very reasonable. My problem lies in how this is being achieved. You can bring about “efficiency” of a sort by adopting the approaches of manufacturing industry: splitting tasks into discrete steps and thinking about how to optimise each of these. The outcome should be efficient production of a homogeneous product. Think hamburgers. And welcome to “McEcology”.

This is fine if your product is data, as burger chains achieve a level of reproducibility in their outputs that most ecologists only dream about. However, if your product is “knowledge” or “advice”, then splitting the task into several steps and making ecologists spend less time in the field may well be counterproductive. It depends whether you think of ecologists as data monkeys or as professionals.

Taleb’s term for this process is “naïve optimisation”, though he approaches the topic from a different angle. He believes that organisations cope with the unexpected by having innate robustness and, interestingly, given my context, uses analogies from ecology. Robustness and stability arise from having many interdependencies within systems. This may look like redundancy to an outsider (why do we have two kidneys, for example?) and, therefore, like fair game when savings have to be made. Yet, viewed from another perspective, this “redundancy” is insurance against things going wrong. The human body can cope, for example, with only one kidney. The “optimised” ecology teams I meet in my work struggle to cope when colleagues are off sick or on maternity leave and, just as importantly for professionals who need to offer advice and opinions, have little time for reflection and ad hoc investigation of issues that fall outside scheduled activities.

These thoughts came together last night when I went to see the American singer Steve Earle. He had a role in the TV series Treme, about New Orleans in the months following Hurricane Katrina, and three of the songs in his set were written for the show. Introducing these songs, he reminded us how the aftermath of Katrina – a classic example of a Black Swan event – had been prolonged because the federal agencies responsible had had their budgets slashed by the Bush administration. All looked fine on paper … until the one time they had to respond to a real crisis.