**1. Introduction**

Barley (*Hordeum vulgare* L.) is one of the leading cereal crops of the world and ranks second in Europe in terms of cultivated acreage, after bread wheat (*Triticum aestivum* L.) [1]. According to Meussdoerffer and Zarnkow [2], barley is a major source of brewing malts and constitutes the single most important raw material for beer production. *Pyrenophora teres* f. *teres*, an ascomycete that causes the foliar disease net form net blotch (NFNB), and *Rhynchosporium secalis*, the causal agent of barley leaf scald, are among the most important barley diseases worldwide [3–5]. It is estimated that both these diseases can decrease barley grain yield by up to 30–40% [5–11]. In addition, there are indications that these diseases can also have a negative effect on malt barley quality [5].

Understanding the temporal and spatial dynamics of disease epidemics is crucial for the development of more efficient, integrated disease-management systems [12]. For example, Gibson [13,14] developed a novel approach involving the spatio-temporal analysis of spatially referenced diseased plants when a sequence of disease maps is available. Recently, several authors have addressed the spatial and spatiotemporal structures of epidemics [15,16]. According to Luo et al. [17], geostatistical methods have been proposed in plant pathology to analyze the spatial patterns of epidemics. However, although these methods have several advantages for characterizing disease patterns, they do not explicitly account for the epidemiological mechanisms that determine disease spread. Despite the increasing importance of NFNB and leaf scald in Greece, only a few epidemiological studies have been conducted worldwide and, especially, under similar climatic conditions [18].

Compared to other cultural practice factors (e.g., the seeding rate, tillage practice, etc.), nitrogen management presents the highest variability in the Greek cropping belt of malt barley. The nitrogen fertilizer rate plays a major role in malt barley by affecting to a great extent the final yields and grain protein content (which has to be maintained below a threshold of 11.5–12.0%, depending on the brewing industry), as well as the susceptibility to leaf diseases. More nitrogen can increase the yield of malt barley [19–22] but can also exert an adverse effect on quality by increasing grain protein content [23–26]. In addition, high nitrogen rates can also increase the susceptibility of barley to leaf diseases [27–30]. Therefore, understanding the relationships among the nitrogen rate, grain yield, quality variables and leaf disease infections can be very useful for further raising yield while maintaining quality at a level that meets the requirements of the malt industry.

As far as we are aware, only a few studies have addressed, to date, the impact of NFNB and leaf scald on malt barley quality [30,31], and their results have been restricted to northern climates. Little evidence exists, however, on what happens under Mediterranean conditions, where the occurrence of malt barley diseases coincides with terminal drought. Malt barley has to meet specific quality requirements according to malt industry demands. The grain size and grain protein content are among the most important quality factors for malting barley [24]. Although the average grain weight and size are primarily determined during the post-anthesis period [32,33], the grain protein content can also be affected during the pre-anthesis period. For example, pre-anthesis drought stress can cause low nitrogen uptake during the vegetative period, thus reducing the yield potential. More nitrogen is then available during grain filling due to the low number of seeds, and the grain protein content is increased [34].

In this study we aimed (i) to estimate the epidemiology of NFNB and leaf scald in a barley disease-free area when the initial inoculation of the field occurred through infected seeds, (ii) to explore the spatial dynamics of disease spread under the interaction of the nitrogen rate and genotype when there were limited sources of infected host residues in the soil and (iii) to assess the relationship among the nitrogen rate, grain yield, quality variables (i.e., grain protein content and grain size) and disease severity.

#### **2. Materials and Methods**

#### *2.1. Study Site and Experimental Design*

The experiment was divided into three different phases, namely, (a) the selection of malt barley seeds from infected crops (i.e., with NFNB and leaf scald) grown in the main productive areas for malt barley in Greece (growing season 2013–2014), (b) the inoculation year (Exp 1; growing season 2014–2015), when the seeds from the infected malt barley varieties (i.e., Grace, Charles, Fortuna, KWS Asta and Zhana) were grown in a barley disease-free area (Spata is mainly a wine- and olive oil-producing region due to the occurrence of dry conditions; the nearest region with cereal crops is located more than 40 km away) and (c) the application in the same location (i.e., inoculated soil with infected crop residues from Exp 1) of nitrogen treatments on the most important (in terms of harvested areas) malt barley varieties in Greece, namely, Zhana, Grace, Traveler and RGT Planet (Exp 2; growing season 2015–2016). A conceptual diagram of the methodological approach is presented in Figure 1.

**Figure 1.** Conceptual diagram of the methodological approach.

The experiments (Exp 1 and Exp 2) were conducted in Spata, Greece (37°58′44.34″ N, 23°54′47.87″ E, 118 m above sea level), at the experimental station of the Agricultural University of Athens, during the growing seasons 2014–2015 and 2015–2016, respectively. The soil was clay loam. The physical and chemical characteristics of the soil at the beginning of the experiments (November 2013) were a pH of 7.7 (1:1 soil/water extract), organic matter at 2.02%, CaCO3 at 27.80%, an electrical conductivity (EC) of 0.29 mmhos cm−1, available P (Olsen) at 52.84 ppm and 452 ppm of exchangeable K.

In Exp 1, the treatments consisted of the five malt barley varieties stated above. The experimental design was a randomized complete block design with 9 replications per genotype (in order to achieve a better spatial distribution of the selected genotypes). During the second year (Exp 2), the experiment was arranged in a two-factorial randomized complete block design with three replications. The treatments were completely randomized within each block and included four two-rowed malt barley (*H*. *vulgare* L.) varieties (i.e., Zhana, Grace, Traveler and RGT Planet) and four nitrogen fertilization rates. The four N application rates were 0 (N0), 60 (N1), 100 (N2) and 140 (N3) kg N ha−1. In order to achieve more efficient N use, half of each rate was applied as ammonium nitrate at the onset of the tillering phase (stages 20–22 according to Zadoks et al.'s [35] scale) and the remaining half at the end of the tillering phase (stages 25–29 according to Zadoks et al.'s [35] scale).
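The Exp 2 layout described above can be sketched programmatically. The following minimal Python example (an illustration, not the authors' actual randomization procedure; the random seed is arbitrary) builds the 4 × 4 factorial of varieties and N rates and randomizes all 16 treatment combinations within each of the three blocks, as a randomized complete block design requires:

```python
# Sketch of the Exp 2 design: 4 varieties x 4 N rates = 16 treatment
# combinations, completely randomized within each of 3 blocks (RCBD).
import itertools
import random

varieties = ["Zhana", "Grace", "Traveler", "RGT Planet"]
n_rates = ["N0", "N1", "N2", "N3"]  # 0, 60, 100 and 140 kg N ha-1

# All 16 variety x N-rate combinations (the factorial treatment set).
treatments = list(itertools.product(varieties, n_rates))

random.seed(42)  # arbitrary seed, for a reproducible example layout
layout = {}
for block in (1, 2, 3):  # three replications (blocks)
    plots = treatments.copy()
    random.shuffle(plots)  # complete randomization within the block
    layout[block] = plots

# Every block contains each treatment combination exactly once.
for block, plots in layout.items():
    assert set(plots) == set(treatments)
```

Each block thus holds 16 plots (48 in total), and every variety is tested at every nitrogen rate in every replication.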

In both experimental years, the plot size was 9 m2, comprising 15 rows with a row spacing of 20 cm, and the crops were planted at a seed rate of approximately 350 seeds m−2. The plots in Exp 2 were established in the same location where the plots of Exp 1 had been seeded. In Exp 1, sowing was carried out following conventional soil tillage (i.e., ploughing and then disc cultivation), whereas only a rotary cultivator was used in Exp 2 in order to simulate conditions of increased soil-borne disease pressure. Only certified malt barley seeds were used in Exp 2; therefore, the only source for disease dispersal was the crop residues from Exp 1.

The soil water content (SWC) was determined frequently during each cultivation season using EC-5 sensors (Decagon Devices, Inc.) installed at a depth of 25 cm in four different plots.
