**4. Discussion**

Although our approach provides further insight into the factors (i.e., Integrated Pest Management (IPM), spatial and temporal) determining disease severity and crop performance, it could be argued that our experimentation was not adequate to provide solid evidence about the effect of the nitrogen rate. Indeed, we intentionally tested the nitrogen rate effect for only one year. Our main objective was to explore the introduction and spread of net form net blotch and barley leaf scald under the combined effect of nitrogen fertilization and genotype in a field with limited sources of infected host residues in the soil. Therefore, repeating Exp 2 for a second year, which would mean 3 years of barley cultivation in the same field, would inevitably lead to a wide spread of infected host residues and, in turn, to a poor estimation of the spatial dynamics of disease epidemics. Furthermore, it is quite clear that both experiments (Exp 1 and Exp 2) are interrelated, and this was essential for exploring a continuous process such as the entry, establishment and spread of a disease in a new area.

Despite the possible constraints, and also taking into consideration the fact that the tested experimental field was inside a disease-free area (cereals are not cultivated in this region), this study supports the hypothesis [3,18,48,49] that both NFNB and leaf scald can be carried over from one season to the next on infected seed. Furthermore, it was shown that the severity of both diseases differed between the two experimental years (Figure 3). However, the question is whether this difference can be attributed to the initial source of the inoculum or merely to the meteorological conditions that occurred during the tested years. On the one hand, our results revealed a higher disease severity in Exp 1 during the early development of the barley; on the other hand, there was a higher disease severity in Exp 2 from the onset of stem elongation onwards (Figure 3). What we do know is that rain episodes and moist conditions are essential for the dissemination of conidia and subsequent infection by both pathogens [5,50,51]. Therefore, the higher disease severity in Exp 2 cannot be explained by favorable meteorological conditions, given the drier conditions in Exp 2 compared to Exp 1 (Figure 2). In addition, it is widely accepted that the most important source of primary inoculum for NFNB is infected host residue [5], an argument that supports the hypothesis that the higher disease severity in Exp 2 can presumably be attributed to a greater quantity of infected host residue during Exp 2.

As far as we are aware, our study, for the first time, demonstrates a spatial epidemiology assessment of both diseases under a Mediterranean environment and also sheds more light on the role of crop residues in their establishment in a new barley field. The epidemiology assessment of both diseases, when the nitrogen rate and genotype were the main sources of variation (Exp 2), was implemented with hotspot and Anselin Local Moran's I analyses. We found that the location of the hotspots changed during the growing season (Figure 5). This can be explained either by soil heterogeneity or by the spatial presence of the pathogens in the soil (i.e., as infected host residue) and genotype susceptibility. Soil heterogeneity was considered negligible because (i) the acreage of the experimental field was small (approximately 0.1 ha), (ii) there was no land inclination and (iii) the variation in field soil moisture was rather small (Figure 9). Commonality analysis during Exp 2 revealed that the most important factor concerning NFNB disease severity was the distance of the plots from the hotspots during the period of the onset of stem elongation (Table 2). According to Liu et al. [4], NFNB is classified as a stubble-borne disease because the fungus usually produces the ascocarp as an over-seasoning structure on infected barley debris left after harvest. The primary inoculum early in the growing season consists of mature ascospores, which are dispersed by the wind. After initial colonization, the pathogen produces a large number of conidia, which serve as secondary inocula. These asexually produced spores can be dispersed by either the wind or rain to cause new infections on plants locally or at longer distances [4]. On the other hand, Zhana was the only cultivar that was not infected by NFNB in either season (i.e., it was infected only by *Rhynchosporium secalis*).
However, it was found that the distance of the Zhana experimental plots from the previous season's crop residues (i.e., the sites with Zhana) explained 58% of the variation in the disease severity (Figure 10). This result is also supported by the Anselin Local Moran's I spatial statistical analysis. Zhana was considered an outlier because it had lower disease severity values while being surrounded by plots with high values from stem elongation onwards (Figure 5).
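The outlier classification described above can be illustrated with a minimal sketch of the Local Moran's I statistic. This is a toy example, not the actual analysis (which was presumably run in GIS or spatial-statistics software): the 3×3 grid of severity scores, the rook-contiguity weights and the row standardization are all illustrative assumptions.

```python
import numpy as np

def local_morans_i(values, weights):
    """Local Moran's I for each observation.

    values : 1-D array of plot-level disease severity scores.
    weights: (n, n) spatial weights matrix (row-standardized here).
    A large positive I_i indicates a cluster (e.g. a hotspot of
    high values); a negative I_i indicates a spatial outlier, e.g.
    a low-severity plot surrounded by high-severity neighbours.
    """
    x = np.asarray(values, dtype=float)
    z = x - x.mean()                  # deviations from the mean
    m2 = (z ** 2).sum() / len(x)      # second moment (variance)
    lag = weights @ z                 # spatially lagged deviations
    return z * lag / m2

# Toy 3x3 grid of severity scores: centre plot low, neighbours high
sev = np.array([8, 9, 8,
                9, 1, 9,
                8, 9, 8], dtype=float)

# Rook-contiguity weights on the 3x3 grid, row-standardized (assumption)
n = 9
W = np.zeros((n, n))
for i in range(n):
    r, c = divmod(i, 3)
    for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        rr, cc = r + dr, c + dc
        if 0 <= rr < 3 and 0 <= cc < 3:
            W[i, 3 * rr + cc] = 1.0
W /= W.sum(axis=1, keepdims=True)

I = local_morans_i(sev, W)
print(I.round(2))
# The centre plot gets a strongly negative I (low value, high
# neighbours): a low-high spatial outlier, analogous to the Zhana plots.
```

A low-severity plot embedded in a high-severity neighbourhood yields a negative local I, which is exactly the low-high outlier pattern reported for Zhana.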

**Figure 9.** The variation in soil water content from anthesis until the end of grain filling (during Exp 2). Broad lines are medians, square open dots are means, boxes show the interquartile ranges, and whiskers extend to the last data points within 1.5 times the interquartile ranges.

**Figure 10.** Relationship between disease severity and the distance of the Zhana plots from the previous season's Zhana crop.

The late occurrence of *Rhynchosporium secalis* symptoms on Zhana compared to NFNB (Figure 3) during both experiments could possibly be attributed to its specific life cycle. According to Zhan et al. [3], *R. secalis* grows symptomlessly under the cuticle, especially where the walls of adjacent cells are joined before producing new conidia and, finally, visual symptoms. Further investigations concerning the infection process of *R. secalis* in barley were conducted by Linsell et al. [52]. In general, NFNB was more prevalent compared to leaf scald during all the tested developmental phases of malt barley (Figures 3 and 4). According to Robinson and Jalli [53], this could be a result of net blotch being comparatively less demanding of environmental conditions (mostly wind dispersed) than scald (mostly splash dispersed) for effective spore dispersal and epidemic development.

The effect of N on plant disease severity is quite variable in the literature [29]. Both increases [27,30] and decreases [28] in disease severity have been reported with increasing N in plants. In addition, Turkington et al. [31] found that the total leaf disease severity caused by NFNB in barley was not significantly affected by the N rate. Our results showed that the disease severity for both pathogens during the second year of malt barley in the same field (Exp 2) tended to increase from anthesis onwards with an increasing rate of nitrogen application (Figure 6). The lack of a significant relationship between the disease severity and N rate may be masked by spatial and genotypic effects. Indeed, according to the commonality analysis, the distance from the locations with the highest disease infections was a better predictor of disease severity (for both diseases) than the nitrogen rate during the pre-anthesis period. However, after anthesis, the disease severity was best explained by the nitrogen rate, but only for the cultivars most susceptible to NFNB (Table 2).
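Commonality analysis of the kind referred to above partitions the regression R² into components unique to each predictor and shared between them. The following two-predictor sketch shows the arithmetic on synthetic data; the variable names (`dist_hotspot`, `n_rate`), effect sizes and sample size are illustrative assumptions, not the study's data.

```python
import numpy as np

def r_squared(X, y):
    """R^2 of an OLS fit of y on the columns of X (with intercept)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - resid.var() / y.var()

rng = np.random.default_rng(0)
n = 120
# Synthetic stand-ins: distance (m) from the nearest hotspot, N rate
dist_hotspot = rng.uniform(0, 50, n)
n_rate = rng.choice([0.0, 60.0, 120.0], n)
# Severity driven mostly by distance, weakly by N rate, plus noise
severity = 40 - 0.5 * dist_hotspot + 0.03 * n_rate + rng.normal(0, 3, n)

r2_full = r_squared(np.column_stack([dist_hotspot, n_rate]), severity)
r2_dist = r_squared(dist_hotspot[:, None], severity)
r2_n    = r_squared(n_rate[:, None], severity)

unique_dist = r2_full - r2_n           # explained only by distance
unique_n    = r2_full - r2_dist        # explained only by N rate
common      = r2_dist + r2_n - r2_full # shared between the two

print(f"unique(distance) = {unique_dist:.3f}")
print(f"unique(N rate)   = {unique_n:.3f}")
print(f"common           = {common:.3f}")
```

The three components sum to the full-model R² by construction; comparing the unique components is what lets distance-from-hotspot be ranked as the stronger predictor pre-anthesis.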

The typical yield losses due to NFNB (*Pyrenophora teres* f. *teres*) and leaf scald (*Rhynchosporium secalis*) outbreaks can be up to 30–40% [3,6,8–11]. However, we did not detect any consistent relationship between the disease severity and grain yield when the main source of variation was the nitrogen rate (Figure 7). Jalakas et al. [54] also found a weak relationship between malt barley grain yield and net blotch (*Pyrenophora teres*) disease severity. This can be attributed to the time of disease occurrence and to the extent of the disease severity in relation to the barley developmental stage. It is widely accepted that grain yield determination in barley is mainly explained by the variation in the grain number per unit of land area [21,41,55,56]. According to Bingham et al. [57], the grain number in barley is a function of the production and survival of tillers and spikelets and the success of the fertilization of florets. Tiller production and spikelet initiation occur before the stem elongation phase, while the survival and further growth of tillers and spikelets are largely determined from stem elongation onwards. Accordingly, our results showed that the highest disease severity, which was recorded in Traveler during the tillering phase (Figure 3), exerted a more pronounced negative effect on the grain yield (Figure 7). In line with this, Jordan [48] demonstrated that the inoculation of spring barley before tillering can cause 30–40% yield loss, whereas inoculation from tillering to flowering decreased the grain yield by only 10%.

The higher disease severity in Grace compared to the rest of the studied cultivars during the onset of the grain filling phase (Figure 3) led to a significant reduction in grain yield, mainly through a decrease in the mean grain weight. Indeed, an increase in disease severity by 32.5% during the grain filling phase caused a reduction in the thousand grain weight by 18.3% in Grace. In line with this, Agostinetto et al. [58] demonstrated that the strongest relationship between grain yield reduction and barley spot blotch severity occurred after the booting stage of barley. Furthermore, Khan [9] observed a reduction in barley grain yield by 25–35% from net blotch, mainly due to a significant decrease in thousand grain weight.

The grain protein content is one of the most important factors in marketing malting barley. The primary objective, particularly in Mediterranean environments, is to maintain the grain protein content below a threshold of 11.5–12.0%, depending on the brewing industry [41]. Although there is some evidence from northern climates suggesting that NFNB infections do not exert any significant effect on grain protein content [30,31], our results revealed for the first time a positive relationship between NFNB disease severity and the grain protein content under Mediterranean conditions. Additionally, it was shown that the magnitude of this relationship was genotype dependent (Figure 8). It seems that the effect of NFNB disease severity on the grain protein content increases under terminal drought stress conditions in April–May (Figure 2A,B). According to Bertholdsson [34], drought stress during late grain filling limits carbohydrate incorporation into the grain, leading to premature maturation and less dilution of the protein in the grain.
