Survival of the most resistant

8 Jan 2016

I have been a life-long supporter of Leicester City and can now come out of the woodwork because they have the 40 points necessary to survive in the Premier League for another season. It would have been tempting fate to declare my support earlier in the season. I must admit that I have not watched them play for about 10 years and, instead, go and watch the mighty U’s (Cambridge United) a few times a season with one of my daughters.

Leicester City is not new to the dizzy heights that they have achieved so far this season. I watched them in the early 1960s when they were a top club. They were not far from doing the old League and FA Cup double in 1963 when I watched the best match that I have ever attended. They drew 2-2 at home to Spurs, then the glory team of the League. Jimmy Greaves of Spurs scored with a wonder strike and it was amazing that the great Gordon Banks in the Leicester goal even got his fingertips to the ball. That moment is forever imprinted on my mind.

There is a small core of Leicester supporters in NIAB, including Tina, the director. Normally our discussions centre on Leicester’s survival in the Premier League, particularly this time last year when they were bottom. I am mentioning this because I am desperately trying to link the Leicester story to the theme of this blog, which is that small percentages of a population can count, particularly in the context of the development of pesticide resistance.

It is generally accepted that pesticide resistance is a process of selection of naturally occurring mutations which happen to confer resistance to a specific pesticide or to a range of pesticides. Continual exposure to the mode(s) of action of the pesticide(s) to which there is resistance results in these resistant individuals becoming dominant in the pest populations, whether these pests be insects, diseases or weeds.
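
To put some rough numbers on that idea, here is a small back-of-the-envelope sketch in Python. The figures are invented purely for illustration and are not measurements for any real pest, disease or weed: a resistant type is assumed to start at one in a million and to survive treatment far more often than susceptible individuals, and the population is assumed to rebuild between treatments so that only the proportions carry over.

    # Toy illustration of selection for resistance under repeated treatment.
    # All figures are assumptions chosen only to show the principle.
    resistant_fraction = 1e-6   # assumed starting frequency of the resistant type
    surv_resistant = 0.95       # assumed chance a resistant individual survives treatment
    surv_susceptible = 0.02     # assumed chance a susceptible individual survives treatment

    for season in range(1, 11):
        resistant = resistant_fraction * surv_resistant
        susceptible = (1.0 - resistant_fraction) * surv_susceptible
        # The population rebuilds between treatments, so only the proportions
        # of the two types carry over into the next season.
        resistant_fraction = resistant / (resistant + susceptible)
        print(f"after season {season:2d}: resistant fraction = {resistant_fraction:.6f}")

On these made-up figures the resistant type goes from one in a million to the overwhelming majority of the population within five treated seasons. The exact speed depends entirely on the numbers assumed, but the direction of travel does not.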

There has long been a debate in the industry about whether high or low doses increase the rate of selection of these naturally occurring mutations. Logic suggests it must be high doses, in order to select more effectively for the most resistant individuals. Low or sub-optimal doses are more likely to result in the higher survival of non-resistant or less resistant individuals. This type of discussion always raises the question of what actually counts as a ‘high’ or ‘low’ dose.

Field experience and experimental evidence on fungicide resistance also suggest that it is usually higher doses that result in the more rapid selection of resistant individuals. The following is from a recent statement from the Fungicide Resistance Action Group, which represents the whole industry, on reducing the speed of development of septoria resistance to the SDHI fungicides. “All effective fungicides exert a selection pressure on pathogen populations and carry a risk of resistance. This risk can be modified and reduced by either mixing or alternating with fungicides with an alternative mode of action, or by reducing the number of applications or dose of the fungicide.” This suggests that, at last, there is a more general acceptance that the higher the dose, the greater the selection for resistance.

I am positive that the same is generally true for herbicides. When first introduced, the ‘fops’ and ‘dims’ controlled around 99% of black-grass at the recommended dose. They were rarely used at reduced doses, but resistance developed very quickly. The field experience with the sulfonylurea herbicide product Atlantis was that, when it was first used, there were always a few survivors of a full dose. I remember suggesting to farmers that these could be resistant and should be rogued. Resistance to Atlantis is now widespread and continues to increase rapidly.

However, there is some evidence to suggest that ‘low’ or sub-optimal doses can speed up the development of weed resistance to glyphosate. That gave me cause to ponder why this could be true. It did not take me long to conclude that the development of resistance is speeded up not by whether high or low doses are used as such but, initially at least, by doses that result in a low number of survivors. When first introduced, crop-safe herbicides such as the ‘fops’ and ‘dims’ and the sulfonylureas left only a few survivors when used at recommended doses. Those survivors were more likely to be resistant to these modes of action.
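
A similarly rough sketch illustrates the arithmetic behind that conclusion. Once again the figures are invented for illustration only: partially resistant plants are assumed to be present at one in ten thousand and to be about five times harder to kill than susceptible ones at any given dose, and the sketch deliberately leaves out the case where the dose is high enough to kill the partially resistant plants as well.

    # Toy illustration: the fewer susceptible survivors a dose leaves,
    # the more resistant the surviving population is. All figures are assumptions.
    resistant_fraction = 1e-4   # assumed frequency of partially resistant plants

    for susceptible_kill in (0.50, 0.90, 0.99, 0.999):
        resistant_kill = 0.2 * susceptible_kill   # assumption: roughly 5-fold protection
        surv_resistant = resistant_fraction * (1.0 - resistant_kill)
        surv_susceptible = (1.0 - resistant_fraction) * (1.0 - susceptible_kill)
        survivors = surv_resistant + surv_susceptible
        print(f"kill {susceptible_kill:6.1%} of susceptibles: "
              f"{survivors:8.4%} of plants survive, "
              f"{surv_resistant / survivors:6.2%} of them resistant")

On these assumed numbers, the dose that kills 99.9% of the susceptible plants leaves very few survivors, but a far higher proportion of those survivors are resistant than after a poorer kill. Push the dose to the point where the partially resistant plants are killed as well, as a correctly applied full dose of glyphosate should do, and there is nothing left to select.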

Optimal doses of glyphosate should kill everything, provided that it is applied well, growing conditions are conducive to control and the weeds are at the appropriate growth stages for good activity. However, sub-optimal doses may leave a few individuals which may be more likely to have a level of resistance. Where sub-optimal doses continue to be used, particularly where minimal or no-tillage is employed, these survivors may form the basis of future populations which could perhaps cross-fertilise, resulting in individuals with even higher levels of resistance. It may have been significant that the first case of a partially glyphosate-resistant weed in the UK was in sterile brome, a weed on which the dose recommended for control on stubble (540 g ae glyphosate/ha) is often only marginally effective.

So perhaps the speed of development of resistance is all about the dose required to select the most resistant types. This is often the recommended dose for crop-safe herbicides and fungicides, but could be a sub-optimal dose for glyphosate, at least on some weeds. Hence, whilst it is not absolutely proven that sub-optimal doses speed the development of glyphosate resistance, it would be advisable to apply glyphosate correctly in the right circumstances, to use doses that will kill all the black-grass, and to inspect the results of treatment to ensure that there are no survivors. This is particularly the case where control by glyphosate is not supplemented by cultivations. For more information, please see the guidelines for minimising the risk of glyphosate resistance.

Best wishes for 2016.

The constructive comments on the script of this blog by Stephen Moss of Stephen Moss Consulting are gratefully acknowledged.