Jumping to conclusions

17 Apr 2014

Recently I’ve started to go to the supermarket with my wife where we’ve developed a systematic approach to the food shop. Initially we visit the wine department and then I go off to the cafeteria for an Americano (which used to be ordinary coffee) whilst my wife chooses the food. I then turn up to help bag it and get it to the car. It’s perhaps not what one would call total participation but it is an advance from total zero.

Whilst sipping coffee in the cafeteria this morning I read The Times. There were two articles that particularly interested me - one was on invasive species and the other was on research by the University of Reading showing that adding oilseed supplements to the cows’ diet resulted in milk with the same overall amount of fats but 25% less saturated fats.

As you know we have been bombarded for years by statements that saturated fat is bad for us so this could be good news for the dairy industry. The preoccupation with saturated fats has meant that for years and years we’ve been told that dairy products may actually be bad for our health. Now, after some scientifically robust studies, it appears that those who eat or drink the most dairy products are less likely to suffer from cardiovascular problems than those who consume the least. Therefore, it seems that focussing on just the saturated fats in milk has resulted in advice that may have been deleterious to our health.

It’s always a danger to select just one factor within a complex system and thus jump to conclusions. This isn’t really science and, in this case in particular, is not advantageous to human health.

There are similarities with issues that have arisen in arable agriculture. Scientists, advisers and farmers sometimes latch onto a single simple factor and think that it will determine the outcome of a very complex system, only to be proven wrong in field trials.

Systems do not come much more complex than the soil, and yet we seem hooked on simple indicators. To tell you the truth, perhaps that’s all we can do but at least we should be wary of these guidelines.

One generally held view, at least amongst soil scientists and very much less so amongst practitioners, is that the level of Soil Mineral Nitrogen (SMN) in the early spring has a profound and predictable impact on the level of applied nitrogen required to optimise yields. Field trials show that this is clearly not true. There may be a broad relationship, but the trials suggest that SMN, at the levels which occur in long-term arable soils not receiving organic manures, appears to have a limited influence on optimum levels of applied nitrogen.

The same appears to be true of the seemingly logical statement that higher yielding crops require more nitrogen. Field trials show that there is either no relationship or a weak relationship between yields and nitrogen requirement for feed wheat. I personally think that there may be a weak relationship between optimum yield levels and nitrogen requirement for feed wheat, but it may be no more than a few kg of nitrogen for each additional tonne/ha above average yields. Bread wheat is another matter and the complexities of nitrogen nutrition mean that using previous farm experience is necessary to fine-tune the dose needed to achieve the required protein content in high yielding crops.
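To put that weak relationship in perspective, here is a minimal sketch of the arithmetic. All the numbers (the 8 t/ha average yield, the 200 kg/ha base dose, the 10 kg N per extra tonne) are hypothetical illustrations, not recommendations:

```python
# Illustrative sketch only: the base dose, average yield and per-tonne
# increment below are hypothetical numbers, not field-trial results.

def n_dose(expected_yield_t_ha, avg_yield_t_ha=8.0,
           base_dose_kg_ha=200.0, kg_n_per_extra_tonne=10.0):
    """Adjust a base nitrogen dose by a few kg N for each tonne/ha
    the expected yield sits above the farm average."""
    extra_tonnes = max(expected_yield_t_ha - avg_yield_t_ha, 0.0)
    return base_dose_kg_ha + kg_n_per_extra_tonne * extra_tonnes

# A crop expected to yield 2 t/ha above average gets only ~20 kg/ha
# more nitrogen - a small adjustment relative to the base dose.
print(n_dose(10.0))  # 220.0
```

Even under these generous assumptions, the adjustment is a tenth of the base dose, which is the point: yield expectation alone is a poor guide to nitrogen requirement.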

One of the worst examples of jumping to conclusions that I have come across was when Calixin (tridemorph) was first introduced in the 1970s, initially to control mildew in spring barley. BASF said that it should be applied around the first node stage of the crop, but the more academic plant pathologists said that protecting the flag leaf was so important that it should be applied at flag leaf emergence. BASF was relying on field trials and the academics were relying on their knowledge of crop physiology. Guess who was correct.

NIAB TAG had a similar experience a few years back. When strobilurins were first introduced, we ran an HGCA-funded project to identify the optimum time to apply them to winter wheat. The results were that the optimum timings for two applications of strobilurins within an overall three-spray programme were T1 (final leaf three emerged) and T3 (mid-flowering). This happened consistently in all the trials over two or three years. Despite this, the standard advice at the time was that T2 (flag leaf fully emerged) must be the most important timing for the strobilurins and that they should be applied either at T1 and T2 or at T2 and T3. However, as we finalised the project, septoria developed resistance to the strobilurins and the results were quietly forgotten.