I’m tapping away at the back of a conference hall during the UK eDNA working group at Bangor University, absorbing the latest developments in the use of molecular technologies for applied ecology. Things have moved fast over the last couple of years, with one method, for the detection of Great Crested Newt, having moved out of the research laboratory and into widespread use. Several other methods – including our own method for ecological assessment using diatoms – are drawing close to the stage where their adoption by end-users is a real possibility, and lots more good ideas were bouncing around during the sessions and the breaks.
The fascination of much of this work lies in the interplay between the new and the old, using the sensitivity of molecular tools to push our understanding of what traditional methods have told us. At the same time, the new methods are largely underpinned by libraries of reference barcode sequences that provide the essential link between the ‘old’ and the ‘new’, and these are wholly dependent upon specialists rooted in old-school taxonomy.
Here’s the rub: if the goal is to produce methods for ecological assessment, then the point will come when the research scientists step back and the end-users start to use them. At this point, real-world exigencies take over and, with the public sector battered by funding cuts, any opportunity to save money will be grasped. What this means is that we cannot assume the same depth of ecological knowledge in the users as in the developers. This was not always the case, as the Environment Agency had a cadre of extremely capable ecologists with a deep knowledge of their local rivers and a broad understanding of ecology. I don’t believe that this was knowledge that they were taught; rather, they learnt it on the job, seeing streams in all of their moods as they collected samples, poring over identification guides as they named their specimens and sharing knowledge with colleagues. The result was a depth of understanding which, in turn, they drew upon to advise colleagues involved in regulation and catchment management.
In the last few years this system has come under scrutiny as managers have searched for cost savings. Environment Agency biologists are no longer expected to collect their own samples, which are taken for them by a separate team on the misguided assumption that highly-trained biologists will be more productive if they stay in the laboratory and focus on using their specialist identification skills. Moving to molecular techniques will just continue this process. Once the excitement of the research is over, the methods will be ripe for economies of scale; sample processing will be a task for molecular biology technicians, and the first time an ecologist encounters the sample it will be as processed output from a Next Generation Sequencing machine.
The function of the laborious process of collecting and analysing ecological samples is not just to produce the data that underpins evidence-driven catchment management. It also allows biologists to acquire the experience that lets them interpret these data sensitively and wisely. The urge to find short-term savings has focussed on the quantifiable (how many samples can a laboratory process?) and ignored the unquantifiable (how good is the advice that they offer their colleagues?). I don’t think the full calamity of what we are seeing will hit for a few years, because most of the ecology staff in the Environment Agency have done their apprenticeships in the field. Over time, however, a new generation will join the organisation for whom computer print-outs may be the closest they get to encountering river and stream life.
I am basically enthusiastic about the potential that molecular technologies can offer to the applied ecologist, but we do need to be aware that the implications of these approaches extend beyond issues of analytical performance characteristics and cost. This is because ecology is not just about making lists of what organisms we find at a particular location (and, judging by the presentations, this seems to be the focus of most current studies) but also about what those organisms do, and how they fit together to form ecosystems and food webs. OK, so you can read about Baetis rhodani in a textbook, but that is never going to be the same experience as watching it scuttle across the tray where you’ve dumped the contents of your pond net, or observing a midge larva graze on algae under a microscope. The problem is that we cannot put a value on those experiences in the same way that the cost of processing a sample can be calculated.
This is a theme that I’ve explored several times already (see “When a picture is worth a thousand base-pairs …”, “Simplicity is the ultimate sophistication: McEcology and the dangers of call centre ecology” and “Slow science and streamcraft”) but it bears repeating, simply because it is an issue that falls through the cracks of most objective appraisals of ecological assessment. The research scientists tend to see it purely in terms of the objective collection and interpretation of data, whilst managers look at it in terms of the cost to produce an identifiable product. It is more than both of these: it is a profession. The challenge now is to integrate molecular methods into the working practices of professional “frontline” ecologists, and to prevent this professionalism being degraded to a series of technical tasks, simply because the bean counters have worked out that this is the most efficient way to distribute their scarce resources around the organisation.
Biggs, J., Ewald, N., Valentini, A., Gaboriaud, C., Dejean, T., Griffiths, R.A., Foster, J., Wilkinson, J.A., Arnell, A., Brotherton, P., Williams, P. & Dunn, F. (2015). Using eDNA to develop a national citizen science-based monitoring programme for the great crested newt (Triturus cristatus). Biological Conservation 183: 19-28.