1. Introduction
There has been a trend towards reduced soil tillage in many arable systems, and in England it was estimated that 46% of farms use some form of reduced tillage [1]. Importantly, this trend results in varying degrees of disturbance to the upper soil horizons (5–15 cm), while lower horizons remain undisturbed. Reduced tillage takes many forms: the whole field may be cultivated, or just parts of it in strips (strip tillage); less disturbance still is caused by direct drilling with tines, and least of all by drilling with discs. Along such a gradient of reducing tillage, the physical, chemical and biological composition of the soil increasingly diverges from that under ploughed and harrowed inversion tillage towards an almost undisturbed state. However, most cereal breeding and cultivar testing programmes, at least in temperate regions, are carried out at the more disturbed extreme of this gradient.
Cereal breeding programmes carried out under inversion tillage and high-input agronomy have successfully selected new cultivars with ever increasing yield potential. On farm, this cultivar performance can be realised in practice when similar agronomic conditions are used. The UK's Agriculture and Horticulture Development Board (AHDB) Recommended List project carries out independent variety assessments, mainly under similar inversion tillage with high agronomic inputs. An "untreated" yield is also reported for cereals, in which no fungicides are used but the agronomy is otherwise the same.
There is some evidence that, for specialised systems such as organic farms, adaptive traits may be lost during the conventional breeding selection process. Many elite cultivars selected by mainstream conventional breeders deliver higher yields under organic conditions than traditional organic cultivars, but there is evidence that the yield rankings of cultivars change according to the system [2]. The argument also applies to selection for non-inversion systems, where it has been demonstrated that both elite and older cultivars of spring barley change yield rankings compared with the inversion tillage under which they were bred [3].
We know that genetic traits can be identified for tillage adaptation, as quantitative trait loci for aspects of plant nutrition in a population of elite cereal genotypes were found to differ between non-inversion and conventional tillage [4]. This suggests that different sets of traits are appropriate for the varying tillage conditions. However, whilst some cultivars gain a reputation for being adapted to particular on-farm agronomic conditions, these reputations are seldom validated in controlled trials.
Tillage can affect disease in cereal crops, often through inoculum reservoirs in the enhanced crop debris left on or near the surface when inversion tillage is not practised. A classic example is the enhanced risk when cereal crops follow maize, where there is an inoculum build-up of Fusarium species, especially F. graminearum [5]. Therefore, in minimum tillage, where more crop debris is left around the stem base or emerging seedlings, saprophytic growth of the fungal pathogen can produce enhanced levels of inoculum able to infect the crop. Another example is eyespot of wheat, caused by Oculimacula yallundae, which is greater under minimum tillage than under ploughing [6,7,8]. However, under continuous minimum tillage this infection may decline, probably due to the development of balanced microbial populations attenuating the eyespot inoculum [8,9]. Another splash-dispersed pathogen able to sporulate as a saprophyte on crop debris and infect barley is Rhynchosporium commune, causal agent of "scald" or "rhynchosporium", which can be more severe when barley follows a previous barley crop, but there is no published evidence that it is affected by tillage method per se [10]. Nevertheless, conservation tillage farmers normally avoid two successive barley crops to avoid enhanced rhynchosporium problems (unpublished data and Doug Christie, Durie Farms, personal communication).
In spring barley, we were able to identify some cultivars that were consistently adapted to either inversion or non-inversion tillage across three or four contrasting seasons [3]. The traits associated with adaptation were not defined, but one cultivar showing non-inversion tillage adaptation, KWS Sassy, is characterised by an extensive rooting structure and was observed to perform better under reduced tillage in years with extreme weather (drought and flooding) than under conventional ploughing, and to suffer a smaller yield penalty than other varieties (George, Hawes pers. comm.). A mutant of Optic with no root hairs showed non-inversion tillage adaptation, whilst the parental Optic line and other mutants did not [3]. Winter barley is, like spring barley, a largely self-pollinated inbreeding crop, but it is exposed to a greater range of environmental stresses, including disease, over a longer period, so we carried out a series of trials using different ranges of cultivars across several seasons to determine whether any cultivars showed consistent tillage adaptation. Furthermore, these trials spanned the 10-year conversion period from plough to non-inversion tillage, allowing us to identify how maturation of the soil during this process affected both yield and disease outcomes. For this work we used two sites or research platforms: 'Mid Pilmore', where tillage was the main factor, and the 'Centre for Sustainable Cropping' (CSC), where tillage was one of several cropping system factors being investigated.
The aims of this study were to determine (a) whether there were yield ranking changes between inversion and non-inversion tillage in winter barley, (b) whether any such changes were affected by soil tillage maturity, and (c) how disease was affected by soil tillage status and any interaction with cultivar. We also compared the performance of four of the cultivars across four years in both the Mid Pilmore platform and the nearby CSC farming systems platform, which contrasts inversion and non-inversion tillage-based management practices, to assess the relative importance of tillage method within systems comparisons.
4. Discussion
Several cultivars were identified as differentially adapted to either inversion or non-inversion tillage in individual trials, and some of these cultivars performed consistently across several trials. The trials reported here span 10 seasons with contrasting weather patterns and cover the transition from early establishment of the tillage practices through to maturity. In addition, the Mid Pilmore trials were all carried out under continuous barley cropping, so no interactions of other crops with tillage practice need be factored in. Thus, soil microbial populations may be skewed, but this is offset by minimising any effects of differences in previous cropping history. Furthermore, cropping of spring barley for several consecutive years by farmers in the region is not uncommon, but the yield depression from continuous growth of the same crop should be considered in data interpretation. However, no differential interaction of different barley cultivars with continuous barley growing has been detected [18]. Most of the cultivars used were commercially available and recently on the UK AHDB Recommended List at the time of the trial, although these were supplemented with a wider selection from the AGOUEB project [17] representing commercially important historic cultivars in the pedigrees of many current cultivars. Using these trial environments and winter barley cultivars, it is possible to determine whether differential adaptation to tillage occurs, which cultivars show contrasting adaptation, and how robustly the adaptation is expressed across environments with respect to yield and disease resistance.
To aid interpretation of results, the seasonal weather was presented as summary graphs of mean monthly precipitation, days with rain greater than 0.5 mm, sunshine hours and temperature. The first two of these are often correlated, as might be expected, but high values of the latter might indicate more sustained or intense rain that may lead to more spore dispersal and infection and, therefore, disease [19]. The sunshine and temperature measurements are less likely to be correlated with disease expression, but more extreme values may affect the general stress state of the plants and, therefore, together with the precipitation data, their impact on yield. Notably high and low values of each of these parameters were highlighted for March to June in Table 2, and the precipitation values in particular might be expected to influence the epidemiology of rhynchosporium during these months. However, none of the highlighted values appears to correlate with the observed disease trends, and other factors, such as cultivar and cultivation, show stronger effects. For more general stress that might modulate the effects of cultivation, the exceptionally low April precipitation in 2011 is likely to be a factor in the relatively high non-inversion tillage yields in that year: the soil conditions were mature, so any advantage of soil structure in maintaining water and nutrient supplies to the crops would be realised.
The four core cultivars present in all years, and therefore compared across them, give a picture of both seasonal variation and the maturing soil conditions in the cultivation treatments. In these data, in the individual trials with more cultivars, and in published work with spring barley [3], the three inversion tillage treatments behave very similarly, as do the two non-inversion tillage treatments, but there is frequently a large contrast between these two groups. This is illustrated in Figure 3, where a clear trend can be noted: there is a general yield decline in the non-inversion tillage as the years progress, but the trend is much smaller and less progressive in the inversion tillage. This is probably due to several factors, some of which are specific to this platform rather than being attributes of non-inversion tillage per se. In this platform and others, soil carbon did not increase under non-inversion tillage, although its distribution changed [20,21]. Where non-inversion tillage is part of a systems-based approach, such as the conservation agriculture or biodiversity-based cropping [22,23] in place at the CSC platform [16], the contribution of cover crops and other management interventions should lead to enhanced soil quality, including carbon content, biophysical structure, microbial activity and related system processes such as litter decomposition and nutrient cycling [24].
Yield decline was not attributable to increased disease, as disease declined over time (Figure 4), but weed control became increasingly problematic, so residual treatment effects may have contributed. However, the greatest effect is probably that of continuous barley production on this site without break crops. Although this will have affected both tillage types, the lack of soil disturbance has a strong effect on soil microbial community structure and activity [25] and is likely to have enhanced the microbial communities responsible for such effects [26]. The barley-specific yield-depressing effects of continuous barley production on subsequent barley crops have been demonstrated on a site with similar soil close to this tillage platform by direct comparison with a restored soil health rotation [15]. No differential interaction with different barley cultivars was detected in that work, and we found no other evidence of one; therefore, comparisons will be valid across all the years in the experimental series reported here.
The 2006 data appear to break with the overall response difference due to cultivation, but this is explained by the exceptionally wet autumn. It was possible to access the land to sow the non-inversion tillage treatments at the optimum time, but the inversion tillage treatments were sown late under suboptimal conditions; these then suffered poor establishment and a yield reduction that clearly explains the departure from the trend. The 2009 exception occurred in a season with high minimum tillage yields, and six years after establishment we might expect any benefits of maturing soil structure to show most strongly. The meteorological data show that the highest monthly sunshine hours were recorded in March and May 2009, which could have caused stress that was alleviated, to the benefit of crop development, by this maturity of soil structure and biology.
A strong trend in the rhynchosporium %AUDPC data is also seen with respect to inversion and non-inversion tillage over the years, with all four core set cultivars responding similarly. In the early years, there was either more disease on the non-inversion tillage or little difference between the tillage practices, followed by a period of variable responses, through to very clearly lower rhynchosporium levels in the non-inversion tillage treatments in later years. The lack of progression in 2006 is likely due to the late sowing of the inversion tillage plots that year affecting canopy development and thereby epidemiological conditions, but the progression is otherwise clear. The likely explanation for this progressive transformation is that much more rhynchosporium inoculum is retained on the surface plant debris in the non-inversion tillage, ready to infect the developing plants, whereas it is buried in the inversion tillage treatments. Under favourable epidemiological conditions occurring early in the season, this factor has more impact than later in the season [27]. It is common knowledge that rhynchosporium can be a serious issue in second-year barley under direct drilling or conservation tillage (Doug Christie, Durie Farms; and unpublished data). However, as with other pathogens such as eyespot [8,9], antagonists are thought to build up, attenuating the inoculum reservoir issue as the microbial ecology of the soil surface environment matures. In addition, non-inversion yield generally declined across the years, affecting crop development, the canopy microclimate and the nutrient and defence-primed status of the crop, which may account for some of the differences.
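For readers unfamiliar with the metric, %AUDPC is conventionally the area under the disease progress curve, obtained by trapezoidal integration of repeated severity assessments and expressed as a percentage of the maximum possible area over the same assessment window. A minimal sketch of this standard calculation, using hypothetical assessment dates and severities rather than values from these trials:

```python
def audpc(times, severities):
    """Area under the disease progress curve via the trapezoidal rule."""
    return sum((severities[i] + severities[i + 1]) / 2 * (times[i + 1] - times[i])
               for i in range(len(times) - 1))

def percent_audpc(times, severities, max_severity=100.0):
    """AUDPC as a percentage of the maximum possible area, i.e. disease held
    at max_severity for the whole assessment window."""
    max_area = max_severity * (times[-1] - times[0])
    return 100.0 * audpc(times, severities) / max_area

# Hypothetical assessment days (days after sowing) and % leaf area infected
days = [150, 165, 180, 195]
severity = [2.0, 5.0, 12.0, 20.0]
print(round(percent_audpc(days, severity), 2))  # → 9.33
```

Expressing the area as a percentage in this way allows comparison across seasons whose assessment windows differ in length.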
Rhynchosporium in the other cultivars in the various trials from 2008 onwards also showed little clear differential response to cultivation treatment. Although some ranking changes were significant, there was no consistency in the rankings between the common set from 2011 to 2014 or in individual years (Table 6 and Table 16, Figure 8 and Figure 13). The cultivars at the extremes of the disease-cultivation rankings are not those showing tillage adaptation for yield. Kingston and Flagon showed positive rank differences in the 2008 trial but negative rank differences in the 2011 and 2012 trials, respectively. Therefore, we conclude that there is no evidence of a direct link between cultivar and tillage with respect to disease.
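The rank differences referred to above can be illustrated with a small sketch; the cultivar names and yield values below are hypothetical, and a positive difference denotes a better (lower-numbered) rank under non-inversion than under inversion tillage:

```python
# Hypothetical yields (t/ha) under inversion (inv) and non-inversion (ni) tillage
inv = {"CultivarA": 8.2, "CultivarB": 7.9, "CultivarC": 7.5, "CultivarD": 7.1}
ni = {"CultivarA": 6.8, "CultivarB": 7.2, "CultivarC": 6.5, "CultivarD": 7.0}

def ranks(yields):
    """Rank cultivars by yield; rank 1 = highest yielding."""
    ordered = sorted(yields, key=yields.get, reverse=True)
    return {cv: i + 1 for i, cv in enumerate(ordered)}

r_inv, r_ni = ranks(inv), ranks(ni)
# Positive value: the cultivar ranks better under non-inversion tillage
rank_diff = {cv: r_inv[cv] - r_ni[cv] for cv in inv}
print(rank_diff)
# → {'CultivarA': -2, 'CultivarB': 1, 'CultivarC': -1, 'CultivarD': 2}
```

As noted in the text, the magnitude of any such difference depends on the number of entries in the trial and on how far apart the absolute yields are, so rank differences are only interpretable alongside the yield data themselves.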
Both the Mid Pilmore and CSC platforms compared inversion tillage with non-inversion tillage, but the former had only barley, whereas the CSC had a 6-course rotation. By the 2011–2014 period, the Mid Pilmore platform was starting to show yield depression in the non-inversion tillage, whereas the CSC platform was less mature in its development. Due to the year-on-year replication of yield data in the CSC, no statistical comparison could be made between the two platforms. However, seed of the same four cultivars used in four consecutive years in both the Mid Pilmore and CSC platforms was sourced from the same seed batch from the same merchant, so that seed was not a source of variation when comparing observations of cultivar performance in the two platforms. A similar differential response of the cultivars would indicate that tillage is likely to be the main factor influencing cultivar adaptation, whilst a difference in the differential response might indicate the importance of interactions with other agronomic factors. However, there was no significant differential cultivar response to the treatments in the CSC platform, and only Retriever in the Mid Pilmore platform inversion tillage was significantly higher yielding than the other three cultivars. Nevertheless, this comparison indicates that cultivar comparisons are likely to be robust across systems contrasting in diversity due to rotation differences.
In individual trials, cultivars were identified that clearly show differential yield responses to inversion and non-inversion tillage, and these very likely represent significant yield ranking changes. However, some cultivars were identified as strongly adapted only in individual trials. This is not unexpected if adaptation is attributable to multiple traits, each with different environmental interactions, as environmental stress was very different each year. Ranking change depends on the number of entries in the trial, the absolute yield differences in each year, and the range represented by the entries chosen. Furthermore, year-to-year differential responses of each cultivar are compounded when trials are analysed together, as for the core set and the 2011–2014 common set. Nevertheless, these results do indicate that some cultivars are differentially adapted to cultivation treatments. To identify the traits responsible for tillage adaptation, it is therefore necessary to identify cultivars showing differential responses under single or multiple environments and to characterise their trait responses. Thus, to the cultivars that respond most strongly and consistently across years could be added those most strongly adapted in individual years (Table 15). Together these represent a set of 11 winter barley cultivars likely to express traits contributing to tillage adaptation, which can be characterised in more detailed trials and associated with mechanisms and heritable markers for breeding. Others might be included, such as Volume, a hybrid cultivar claimed to have enhanced vigour, particularly in rooting, and which might therefore be expected to interact more strongly with cultivation. Volume was identified in the 2011 trial as inversion tillage adapted.
Previous work identified comparable cultivars of spring barley showing differential adaptation to tillage, and limited data pointed to rooting traits such as root hairs as being amongst those potentially responsible for such adaptation [3]. Early vigour is commonly suggested as another criterion, and we have evidence that pull-out force, which might integrate several possible root structure traits, correlates with possible tillage adaptation (Newton and Bengough, unpublished data). However, more comprehensive growth analyses associated with cultivation treatments and yield assessments, on a limited number of cultivars expressing clear tillage adaptation differences, are needed to establish which trait combinations are desirable and what the trade-offs might be.
As these trial data do not include cultivars that have come to market in recent years, none of the 11 highlighted cultivars is commonly grown currently. However, the results presented here demonstrate the scope for testing varietal responses to non-standard, alternative and on-farm types of growing conditions. As non-inversion and low-input systems become more prevalent under increasing pressure for sustainable production, there is an urgent need for information on crop performance under these conditions. Screening of cultivars specifically for low-input, reduced tillage systems will provide growers with a choice of material allowing improved production efficiency and financial sustainability whilst minimising the negative environmental effects of conventional high-input systems. Such screening might include not only current cultivars on recommended lists but also candidate, heritage and other novel cultivars. It could be argued that our non-inversion tillage treatment may equate to, or serve as a proxy for, suboptimal agronomy under some on-farm conditions. If the inversion tillage conditions used in these trials equate to the near-optimal conditions of official national or Recommended List trials, then the data reported here provide evidence that the choice of cultivar should also consider the level of inputs and agronomic treatments, at least with respect to cultivations.