Finding the balance …

Gammarus fossarum (Scale bar: 1 millimetre).  Photograph: Drew Constable.

Back in March I wrote about the challenges facing those who planned to implement Next Generation Sequencing (NGS) methods for ecological assessment (see “Ecology’s Brave New World”). In that post I argued that the success (or otherwise) of using DNA for ecological assessment was as much down to the structure and values of the organisation implementing the method as to the method itself. More particularly, there were likely to be problems if the process was viewed primarily as a means of gathering ecological data rather than as a means of enhancing the capabilities of ecologists.

This is an important distinction. Much of an ecologist’s time is spent collecting the basic data, whether in the field or laboratory, from which the condition of a particular habitat can be inferred. But, with traditional methods, there was always the possibility that this basic data collection would be supplemented by observations and insights that informed the ecologist’s judgements. These same ecologists have also added to our knowledge of the UK’s biodiversity over the years (see “A new diatom record from Sussex” for an example). My fear is that adopting NGS approaches in order to reduce costs will limit the potential for ecologists to make these serendipitous additions to our understanding of a habitat.

A recent paper by Rosie Blackman and colleagues from the University of Hull and the Environment Agency offers a good example of how traditional and DNA-based methods can be complementary. Rosie had looked at invertebrate assemblages in rivers in England using both approaches and discovered that some of the DNA in her samples came from a species, Gammarus fossarum, not previously recorded in the UK. Other representatives of this genus of small crustaceans, including the extremely common G. pulex, had been abundant in her samples. Now, however, going back to her sites with the knowledge that G. fossarum might also be present, she was on the lookout for the subtle differences in morphology that separate G. fossarum from other Gammarus species. She found it in large numbers at 23 out of 28 sites spread around the country, and in historical material stored at the Natural History Museum dating back to 1964, suggesting that the species had long been overlooked by those identifying Gammarus by traditional means.

This is a great example of biologists working in the sweet zone where traditional and molecular methods combine to give us new insights that are greater than the sum of their parts. The past shortcomings of traditional morphology-based taxonomy are clear but, at the same time, that taxonomy was essential as a verification step once the presence of Gammarus fossarum had been detected by molecular approaches. The obvious conclusion is that regulatory organisations should move into the future using both traditional and molecular methods in a complementary manner. Yet, looked at from another perspective, I have just advocated increasing the cost of ecological assessment at a time when budgets for such assessments are under extreme pressure.

The likelihood is that, as molecular methods are developed (and if they are shown to be substantially cheaper), traditional approaches to ecological assessment will be dropped. That would not be a problem were it not that the hours spent in the field and laboratory are an important pathway by which graduate ecologists deepen their understanding of organisms and habitats. Shifting wholesale to molecular methods without retaining at least some infrastructure for traditional methods will mean, first, that future discoveries such as Rosie’s will be harder to validate and, second, that the next generation of ecologists will first encounter these organisms not in a pond net but on a spreadsheet. That link between a name and an organism with distinctive qualities, and between that organism and particular habitats or conditions, will be lost.

Equally, it is unrealistic to assume that complementary use of both approaches will become the norm. That would place yet more pressure on already tight budgets and could only happen if everyone were happy to accept much smaller monitoring networks (see “Primed for the unexpected?”). So how do we retain this “sweet zone” between old and new? I have not yet heard a satisfactory answer to that question, so perhaps we should return to the point I made earlier about the structure and values of the organisations that take on these new methods. Broadly speaking, adopting these methods purely to save money is likely to be the road to perdition, because the savings will look most impressive to the senior levels of management (who are probably not biologists) only if there is a wholesale move to the new methods with no retention of traditional infrastructure.

The tragedy is that, within a decade, molecular technology may have moved on to such an extent that a biologist can detect invasive species and make other assessments in real time, rather than having to send samples off to remote high-throughput laboratories in order to maximise economies of scale. Instruments such as Oxford Nanopore’s MinION are still not the finished article from the point of view of ecological end-users, but it is only a matter of time. Unfortunately, in the here and now, the infrastructure that generates ecological data is already being dismantled in order to squeeze cost-savings from the shift to NGS. Whether there will be anyone left to inherit this Brave New World is, I am afraid, open to debate.

Two examples of Oxford Nanopore’s MinION portable DNA analysis system, which can be plugged into the USB port of a laptop.

Reference

Blackman, R.C., Constable, D., Hahn, C., Sheard, A.M., Durkota, J., Hänfling, B. & Lawson Handley, L. (2017). Detection of a new non-native freshwater species by DNA metabarcoding of environmental samples – first record of Gammarus fossarum in the UK. Aquatic Invasions 12 (in press).

Primed for the unexpected?

I was in Nottingham last week for a CIEEM conference entitled “Skills for the future”, where I led a discussion on the potential and pitfalls of DNA barcoding for the applied ecologist. It is a topic that I have visited in this blog several times (see, for example, “Glass half full or glass half empty?”). My original title was to have been “Integrating metabarcoding and ‘streamcraft’ for improved ecological assessment in freshwaters”; however, this was deemed by the CIEEM’s marketing staff to be insufficiently exciting and I was asked to come up with a better one. Mildly piqued by the implication that my intended analysis of how to blend the old with the new was not sufficiently interesting, I sent back “Metabarcoding: will it cost me my job?” as a facetious alternative. They loved it.

So all I had to do was find something to say that would justify the title. Driving towards Nottingham, it occurred to me that the last time I should have made this trip was for Phil Harding’s retirement party. I was invited, but had a prior engagement. I would have loved to have been there as I have known Phil for a long time. And, as I drew close to my destination, I realised that Phil’s career neatly encapsulated the development of freshwater ecological assessment in the UK over the past 40 years. He finished his PhD with Brian Whitton (who was also my supervisor) in the late 1970s and went off to work first for North West Water Authority and then for Severn Trent Water Authority. When the water industry was privatised in 1989, he moved to the National Rivers Authority, staying until that was absorbed into the Environment Agency in 1995. Were he more ambitious he could, I am sure, have moved further into management, but Phil was able to keep himself in jobs that got him out into the field at least occasionally throughout his career. That means he has experienced the many changes of the past few decades at first hand.


Phil Harding: early days as a biologist with North West Water Authority in the late 1970s.

Phil had a fund of anecdotes about life as a freshwater biologist. I remember one, in particular, about sampling invertebrates in a small stream in the Midlands as part of the regular surveys that biologists performed around their areas. On this particular occasion he noticed that some of the invertebrate nymphs and larvae that he usually saw at this site were absent when he emptied his pond net into a tray. Curious to find out why, he waded upstream, kicking up samples periodically to locate the point at which these bugs reappeared in his net. Once they did, he knew that he was upstream of the source of the problem and could focus on searching the surrounding land for the cause. In this case, it was a farmyard beside a tributary, where a container full of pesticides had leaked, poisoning the river downstream.

I recount this anecdote at intervals because it sums up the benefits of including biology within environmental monitoring programmes. Chemistry is very useful, but samples are typically collected no more than once a month and, once in the laboratory, you find a chemical only if you set out to look for it and only if it was present in the river at the time the sample was collected. Chemical analysis of pesticides is expensive and concentrations in rivers are notoriously variable, so the absence of a pesticide from a monthly water sample is no guarantee that it was never there. The invertebrates, by contrast, live in the river all the time, and the aftershocks of an unexpected dose of pesticide are still reverberating a few weeks later when Phil rolls up with his pond net. But the success of this particular investigation depended on a) Phil being alert enough to notice the change and b) his having time for some ad hoc detective work.

This encapsulates the “streamcraft” which formed part of my original title. This is the ability to “read” the messages in the stream that enable us to understand the processes taking place and, in turn, the extent to which man’s activities have altered these (see “Slow science and streamcraft”). It is something you cannot be taught; you have to learn it out in the field, and the Environment Agency and its predecessors were, for a long while, well set up to allow this process of personal development. Changes over the past few years, made in the name of greater efficiency (and, to be fair, in the face of enormous budget cuts), have, I fear, seriously eroded this capability, not least because biologists spend far less time in the field and are no longer responsible for collecting their own invertebrate or diatom samples.


Phil Harding: forty years on, sampling algae in the River Ashop in Derbyshire.

In my talk, I was thinking aloud about the interactions between metabarcoding and the higher-level cognitive skills that a good biologist needs. I feared that, in the wrong hands, it could be yet another means by which the role of the biologist was eroded to that of a technician feeding samples into one end of a series of swish machines, then staring at the spreadsheets of data that emerged from the other end. All the stages at which the old-school biologist might parse the habitat or sample under investigation, collecting signs and indications of its condition over and above the bare minimum set out in the protocol, would be stripped away.

A further reason why this might be a problem is that molecular ecology takes a step backwards from the ideal of biological assessment. Much as the chemist sees only what their chosen analyses allow them to see, so the molecular biologist will “see” only what their particular set of primers reveals. Moreover, their interpretation of the spreadsheets of data that emerge is less likely to be qualified by direct experience of the site, because their time is now too precious, apparently, to allow them to collect samples for routine assessments.

A few points emerged from the discussion that followed (the audience included representatives of both the Environment Agency and Natural England). First, we agreed that metabarcoding is not, in itself, the problem; applying metabarcoding within an already-dysfunctional organisation, however, might accentuate existing problems. Second, budgets are under attack anyway, and metabarcoding may well allow monitoring networks to be maintained at something approaching their present scale. Third, the issue of “primers” is real but, as the methods develop, primer sets are likely to be expanded so that a single analysis might pick up a huge range of information. And, finally, the advent of new technologies such as the MinION might put the power of molecular biology directly into the hands of field biologists, rather than requiring high-throughput laboratories to harness economies of scale.

That last point is an important one: molecular ecology is a fast-moving field with huge potential for better understanding of the environment. However, we need to be absolutely clear that an ability to generate huge amounts of data will not translate automatically into that better understanding. We will still need biologists with the ability to exercise higher cognitive skills and, therefore, organisations will need to provide biologists with opportunities to develop those skills. Metabarcoding, in other words, could be a good friend to the ecologist but will make a poor master. In the short term, the rush to embrace metabarcoding because it is a) fashionable and b) cheap may erode capabilities that have taken years to develop and which will be needed if we are to get the full potential out of these methods. What could possibly go wrong?