Search Results (111)

Search Parameters:
Keywords = L-moments estimation

17 pages, 1909 KB  
Article
Ergonomics Study of Musculoskeletal Disorders Among Tram Drivers
by Jasna Leder Horina, Jasna Blašković Zavada, Marko Slavulj and Damir Budimir
Appl. Sci. 2025, 15(15), 8348; https://doi.org/10.3390/app15158348 - 27 Jul 2025
Viewed by 751
Abstract
Work-related musculoskeletal disorders (WMSDs) are among the most prevalent occupational health issues, particularly affecting public transport drivers due to prolonged sitting, constrained postures, and poorly adaptable cabins. This study addresses the ergonomic risks associated with tram driving, aiming to evaluate biomechanical load and postural stress in relation to drivers’ anthropometric characteristics. A combined methodological approach was applied, integrating two standardized observational tools—RULA (Rapid Upper Limb Assessment) and REBA (Rapid Entire Body Assessment)—with anthropometric modeling based on three representative European morphotypes (SmallW, MidM, and TallM). ErgoFellow 3.0 software was used for digital posture evaluation, and lumbar moments at the L4/L5 vertebral level were calculated to estimate lumbar loading. The analysis was simulation-based, using digital human models, and no real subjects were involved. The results revealed uniform REBA and RULA scores of 6 across all morphotypes, indicating moderate to high risk and a need for ergonomic intervention. Lumbar moments ranged from 51.35 Nm (SmallW) to 101.67 Nm (TallM), with the tallest model slightly exceeding the recommended ergonomic thresholds. These findings highlight a systemic mismatch between cabin design and user variability. In conclusion, ergonomic improvements such as adjustable seating, better control layout, and driver education are essential to reduce the risk of WMSDs. The study proposes a replicable methodology combining anthropometric, observational, and biomechanical tools for evaluating and improving transport workstation design. Full article
(This article belongs to the Section Applied Biosciences and Bioengineering)
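
The lumbar loads quoted above come from the authors' digital human models in ErgoFellow; as a rough, hypothetical illustration of how a static L4/L5 moment can be approximated, the sketch below sums segment weights multiplied by their horizontal moment arms. The segment masses and arm lengths are invented for the example and are not values from the study.

```python
# Minimal static estimate of the sagittal-plane moment at the L4/L5 level:
# sum of (segment weight x horizontal distance of its centre of mass from
# the joint). All numbers below are illustrative, not taken from the paper.

G = 9.81  # gravitational acceleration, m/s^2

def lumbar_moment(segments):
    """segments: iterable of (mass_kg, horizontal_arm_m) pairs measured
    from the L4/L5 joint in the sagittal plane; returns the moment in Nm."""
    return sum(mass * G * arm for mass, arm in segments)

# Hypothetical seated tram-driver posture: trunk, head + neck, both arms
# reaching towards the controls.
posture = [(32.0, 0.12), (6.5, 0.20), (8.0, 0.35)]
print(f"Static L4/L5 moment ~ {lumbar_moment(posture):.1f} Nm")
```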

28 pages, 9894 KB  
Article
At-Site Versus Regional Frequency Analysis of Sub-Hourly Rainfall for Urban Hydrology Applications During Recent Extreme Events
by Sunghun Kim, Kyungmin Sung, Ju-Young Shin and Jun-Haeng Heo
Water 2025, 17(15), 2213; https://doi.org/10.3390/w17152213 - 24 Jul 2025
Viewed by 494
Abstract
Accurate rainfall quantile estimation is critical for urban flood management, particularly given the escalating climate change impacts. This study comprehensively compared at-site frequency analysis and regional frequency analysis for sub-hourly rainfall quantile estimation, using data from 27 sites across Seoul. The analysis focused on Seoul’s disaster prevention framework (30-year and 100-year return periods). Employing L-moment statistics and Monte Carlo simulations, the rainfall quantiles were estimated, the methodological performance was evaluated, and Seoul’s current disaster prevention standards were assessed. The analysis revealed significant spatio-temporal variability in Seoul’s precipitation, causing considerable uncertainty in individual site estimates. A performance evaluation, including the relative root mean square error and confidence interval, consistently showed regional frequency analysis superiority over at-site frequency analysis. While at-site frequency analysis demonstrated better performance only for short return periods (e.g., 2 years), regional frequency analysis exhibited a substantially lower relative root mean square error and significantly narrower confidence intervals for larger return periods (e.g., 10, 30, 100 years). This methodology reduced the average 95% confidence interval width by a factor of approximately 2.7 (26.98 mm versus 73.99 mm). This enhanced reliability stems from the information-pooling capabilities of regional frequency analysis, mitigating uncertainties due to limited record lengths and localized variabilities. Critically, regionally derived 100-year rainfall estimates consistently exceeded Seoul’s 100 mm disaster prevention threshold across most areas, suggesting that the current infrastructure may be substantially under-designed. The use of minute-scale data underscored its necessity for urban hydrological modeling, highlighting the inadequacy of conventional daily rainfall analyses. Full article
(This article belongs to the Special Issue Urban Flood Frequency Analysis and Risk Assessment)
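
The comparison above ranks methods by relative root mean square error and 95% confidence interval width computed over Monte Carlo replicates. A minimal sketch of those two metrics, assuming an array of simulated quantile estimates and a nominal "true" quantile (both synthetic here, not the study's data):

```python
import numpy as np

def rrmse(estimates, true_value):
    """Relative root mean square error of Monte Carlo quantile estimates."""
    return np.sqrt(np.mean((estimates - true_value) ** 2)) / true_value

def ci_width(estimates, level=0.95):
    """Width of the central (e.g., 95%) interval of the estimates."""
    lo, hi = np.percentile(estimates, [100 * (1 - level) / 2,
                                       100 * (1 + level) / 2])
    return hi - lo

# Hypothetical replicates of a 100-year sub-hourly rainfall quantile (mm):
rng = np.random.default_rng(1)
at_site = rng.normal(100.0, 19.0, 5000)   # single short record: wide spread
regional = rng.normal(100.0, 7.0, 5000)   # pooled regional information: tighter
for name, q in (("at-site", at_site), ("regional", regional)):
    print(f"{name:9s} RRMSE = {rrmse(q, 100.0):.3f}, 95% CI width = {ci_width(q):.1f} mm")
```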

28 pages, 2140 KB  
Article
Application of the GEV Distribution in Flood Frequency Analysis in Romania: An In-Depth Analysis
by Cristian Gabriel Anghel and Dan Ianculescu
Climate 2025, 13(7), 152; https://doi.org/10.3390/cli13070152 - 18 Jul 2025
Viewed by 1182
Abstract
This manuscript investigates the applicability and behavior of the Generalized Extreme Value (GEV) distribution in flood frequency analysis, comparing it with the Pearson III and Wakeby distributions. Traditional approaches often rely on a limited set of statistical distributions and estimation techniques, which may not adequately capture the behavior of extreme events. The study focuses on four hydrometric stations in Romania, analyzing maximum discharges associated with rare and very rare events. The research employs seven parameter estimation methods: the method of ordinary moments (MOM), the maximum likelihood estimation (MLE), the L-moments, the LH-moments, the probability-weighted moments (PWMs), the least squares method (LSM), and the weighted least squares method (WLSM). Results indicate that the GEV distribution, particularly when using L-moments, consistently provides more reliable predictions for extreme events, reducing biases compared to MOM. Compared to the Wakeby distribution for an extreme event (T = 10,000 years), the GEV distribution produced smaller deviations than the Pearson III distribution, namely +7.7% (for the Danube River, Giurgiu station), +4.9% (for the Danube River, Drobeta station), and +35.3% (for the Ialomita River). In the case of the Siret River, the Pearson III distribution generated values closer to those obtained by the Wakeby distribution, being 36.7% lower than those produced by the GEV distribution. These results support the use of L-moments in national hydrological guidelines for critical infrastructure design and highlight the need for further investigation into non-stationary models and regionalization techniques. Full article
(This article belongs to the Special Issue Hydroclimatic Extremes: Modeling, Forecasting, and Assessment)
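
For readers unfamiliar with the estimation step, the sketch below computes unbiased sample L-moments from an annual-maximum series via probability-weighted moments and applies Hosking's closed-form approximation to obtain GEV location, scale, and shape, then a T-year quantile. It follows the textbook estimators rather than the authors' own code, and the input series is synthetic.

```python
import math
import numpy as np

def sample_lmoments(x):
    """First three unbiased sample L-moments (l1, l2, l3) computed from
    probability-weighted moments (Hosking & Wallis, 1997)."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    j = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((j - 1) / (n - 1) * x) / n
    b2 = np.sum((j - 1) * (j - 2) / ((n - 1) * (n - 2)) * x) / n
    return b0, 2 * b1 - b0, 6 * b2 - 6 * b1 + b0

def gev_lmom_fit(x):
    """GEV parameters from L-moments, with the quantile function written as
    x(F) = xi + alpha * (1 - (-ln F)**k) / k (Hosking's sign convention)."""
    l1, l2, l3 = sample_lmoments(x)
    t3 = l3 / l2
    c = 2.0 / (3.0 + t3) - math.log(2) / math.log(3)
    k = 7.8590 * c + 2.9554 * c ** 2          # Hosking's approximation
    alpha = l2 * k / ((1 - 2.0 ** (-k)) * math.gamma(1 + k))
    xi = l1 - alpha * (1 - math.gamma(1 + k)) / k
    return xi, alpha, k

def gev_quantile(xi, alpha, k, T):
    """Quantile for return period T (annual exceedance probability 1/T)."""
    F = 1.0 - 1.0 / T
    return xi + alpha / k * (1 - (-math.log(F)) ** k)

# Synthetic annual-maximum discharges (m^3/s), for illustration only.
rng = np.random.default_rng(42)
ams = 800 + 300 * rng.gumbel(size=60)
xi, alpha, k = gev_lmom_fit(ams)
for T in (10, 100, 1000, 10000):
    print(f"T = {T:>5} yr  ->  {gev_quantile(xi, alpha, k, T):8.1f} m^3/s")
```

Packages such as lmoments3 wrap these relations for a wider set of distributions; the explicit version above is only meant to make the estimation step transparent.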

19 pages, 1595 KB  
Article
Probabilistic Forecasting of Peak Discharges Using L-Moments and Multi-Parameter Statistical Models
by Cristian Gabriel Anghel and Dan Ianculescu
Water 2025, 17(13), 1908; https://doi.org/10.3390/w17131908 - 27 Jun 2025
Cited by 1 | Viewed by 824
Abstract
Given the global rise in the magnitude and frequency of extreme events due to climate change, accurately determining these values—typically through frequency analysis—is especially important. The article analyzes three probability distributions with four and five parameters in flood frequency analysis (FFA), using L-moments as the parameter estimation method. The focus is on how the five-parameter Wakeby, four-parameter generalized Pareto, and four-parameter Burr distributions behave when generating maximum flow values at the low annual exceedance probabilities characteristic of rare and very rare events. After applying these distributions to four case studies, it was found that for the 10,000-year return period event the relative error between the multi-parameter distributions is under 20%—a more than acceptable margin given the extremely low exceedance probability. How much this spread matters depends on how the generated values are used: in some cases it can lead to avoidable, excessive costs for structural flood protection measures (urban planning), while in others it carries the risk of under-dimensioning such infrastructure, with potentially serious material and human consequences. Full article
(This article belongs to the Special Issue Risks of Hydrometeorological Extremes)
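
The generalized Pareto case mentioned above has particularly simple closed-form L-moment relations (Hosking and Wallis). The snippet takes the first two sample L-moments and the L-skewness as inputs; the numeric values below are made up, and the return-period conversion assumes an annual-maximum series.

```python
def gpa_from_lmoments(l1, l2, t3):
    """Generalized Pareto parameters from L-moments, with the quantile
    function written as x(F) = xi + alpha * (1 - (1 - F)**k) / k."""
    k = (1 - 3 * t3) / (1 + t3)
    alpha = l2 * (1 + k) * (2 + k)
    xi = l1 - l2 * (2 + k)
    return xi, alpha, k

def gpa_quantile(xi, alpha, k, T):
    """T-year quantile for an annual-maximum series (F = 1 - 1/T)."""
    F = 1.0 - 1.0 / T
    return xi + alpha / k * (1 - (1 - F) ** k)

# Hypothetical sample L-moments of an annual peak-discharge series (m^3/s):
xi, alpha, k = gpa_from_lmoments(l1=950.0, l2=210.0, t3=0.18)
for T in (100, 1000, 10000):
    print(f"T = {T:>5} yr  ->  {gpa_quantile(xi, alpha, k, T):8.1f} m^3/s")
```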

61 pages, 18163 KB  
Article
Regional Frequency Analysis Using L-Moments for Determining Daily Rainfall Probability Distribution Function and Estimating the Annual Wastewater Discharges
by Pau Estrany-Planas, Pablo Blanco-Gómez, Juan I. Ortiz-Vallespí, Javier Orihuela-Martínez and Víctor Vilarrasa
Hydrology 2025, 12(6), 152; https://doi.org/10.3390/hydrology12060152 - 16 Jun 2025
Viewed by 820
Abstract
The spatial distribution of precipitation is one of the major unknowns in hydrological modeling since meteorological stations do not adequately cover the territory, and their records are often short. In addition, regulations are increasingly restricting the amount of wastewater that can be discharged each year. Therefore, understanding the annual behavior of rainfall events is becoming increasingly important. This paper presents Rainfall Frequency Analysis (RainFA), a software package that applies a methodology for data curation and frequency analysis of precipitation series based on the evaluation of the L-moments for regionalization and cluster classification. This methodology is tested in the city of Palma (Spain), identifying a single homogeneous cluster comprising 7 (out of 11) stations, with homogeneity values less than 0.6 for precipitation values greater than or equal to 0.4 mm. In the evaluation of the prediction capacity, the selected cluster of 7 stations performed in the first quartile of the 120 possible combinations of 7 stations, both for the detection of the occurrence of rainfall—in terms of Probability of Detection (POD), False Alarm Ratio (FAR), Critical Success Index (CSI) and Bias Score (BS) statistics—and for the accuracy of rainfall—according to Root Mean Square Error (RMSE), Nash–Sutcliffe Efficiency coefficient (NSE) and Percent Bias (PBIAS). The cluster was also excellent for predicting different rainfall ranges, resulting in the best combination for both light—i.e., [1, 5) mm—and moderate—i.e., [5, 20) mm—rainfall prediction. The Generalized Pareto distribution gave the best probability distribution function for the selected region, and it was used to simulate daily rainfall and system discharges over annual periods using Monte Carlo techniques. The derived discharge values were consistent with observations for 2023, with an average discharge of about 700,000 m3 of wastewater. RainFA is an easy-to-use, open-source software package written in Python that can be applied anywhere in the world. Full article
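
The detection statistics cited above (POD, FAR, CSI, BS) all derive from the 2x2 rain/no-rain contingency table. A minimal sketch, with invented counts rather than the Palma data:

```python
def detection_scores(hits, misses, false_alarms):
    """Standard contingency-table scores for rainfall occurrence."""
    pod = hits / (hits + misses)                    # Probability of Detection
    far = false_alarms / (hits + false_alarms)      # False Alarm Ratio
    csi = hits / (hits + misses + false_alarms)     # Critical Success Index
    bs = (hits + false_alarms) / (hits + misses)    # Bias Score (frequency bias)
    return pod, far, csi, bs

# Hypothetical daily comparison between the 7-station cluster estimate and a
# reference gauge over one year:
pod, far, csi, bs = detection_scores(hits=182, misses=23, false_alarms=31)
print(f"POD={pod:.2f}  FAR={far:.2f}  CSI={csi:.2f}  BS={bs:.2f}")
```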

34 pages, 6341 KB  
Article
Statistical and Physical Significance of Homogeneous Regions in Regional Flood Frequency Analysis
by Ali Ahmed, Ataur Rahman, Ridwan S. M. H. Rafi, Zaved Khan and Haider Mannan
Water 2025, 17(12), 1799; https://doi.org/10.3390/w17121799 - 16 Jun 2025
Cited by 1 | Viewed by 1137
Abstract
This study investigates the formation of homogeneous regions in regional flood frequency analysis (RFFA) and compares two RFFA methods, the quantile regression technique (QRT) and the index flood method (IFM). A total of 201 gauged stations from southeast Australia were adopted in this study. Multivariate statistical techniques were applied to form candidate regions. Regions were also formed in the L-moments space (using the L coefficient of variation (LCV) and L coefficient of skewness (LCS) of the annual maximum flood data). Hosking and Wallis test statistics were used to identify discordant sites and to test the homogeneity of the assumed regions. No homogeneous regions were found in southeast Australia based on catchment characteristics data; however, homogeneous regions can be formed in the space of L-moments. It was found that regions formed in the L-moments space have little link with the catchment characteristics data space. The QRT provides more accurate flood quantile estimates than the IFM. Full article
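
The discordancy screening mentioned above compares each site's vector of L-moment ratios with the regional average. A sketch of the Hosking and Wallis D_i statistic, with invented site ratios (L-CV, L-skewness, L-kurtosis):

```python
import numpy as np

def discordancy(u):
    """Hosking & Wallis discordancy measure D_i for an (N, 3) array of
    site L-moment ratios (L-CV, L-skewness, L-kurtosis)."""
    u = np.asarray(u, dtype=float)
    n = len(u)
    d = u - u.mean(axis=0)
    A = d.T @ d                                   # 3x3 sums-of-squares matrix
    return n / 3.0 * np.einsum("ij,jk,ik->i", d, np.linalg.inv(A), d)

# Invented L-moment ratios for a handful of sites; for regions of 15 or more
# sites, D_i > 3 is the usual flag for a discordant site (the critical value
# is smaller for small regions).
u = [[0.21, 0.18, 0.14], [0.23, 0.20, 0.15], [0.19, 0.16, 0.12],
     [0.35, 0.41, 0.30], [0.22, 0.19, 0.13], [0.20, 0.17, 0.14]]
print(np.round(discordancy(u), 2))
```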

30 pages, 4887 KB  
Article
Regional Flood Frequency Analysis in Northeastern Bangladesh Using L-Moments for Peak Discharge Estimation at Various Return Periods in Ungauged Catchments
by Sujoy Dey, S. M. Tasin Zahid, Saptaporna Dey, Kh. M. Anik Rahaman and A. K. M. Saiful Islam
Water 2025, 17(12), 1771; https://doi.org/10.3390/w17121771 - 12 Jun 2025
Cited by 1 | Viewed by 1689
Abstract
The Sylhet Division of Bangladesh, highly susceptible to monsoon flooding, requires effective flood risk management to reduce socio-economic losses. Flood frequency analysis is an essential aspect of flood risk management and plays a crucial role in designing hydraulic structures. This study applies regional flood frequency analysis (RFFA) using L-moments to identify homogeneous hydrological regions and estimate extreme flood quantiles. Records from 26 streamflow gauging stations were used, including streamflow data along with corresponding physiographic and climatic characteristic data, obtained from GIS analysis and ERA5 respectively. Most stations showed no significant monotonic trends, temporal correlations, or spatial dependence, supporting the assumptions of stationarity and independence necessary for reliable frequency analysis, which allowed the use of cluster analysis, discordancy measures, heterogeneity tests for regionalization, and goodness-of-fit tests to evaluate candidate distributions. The Generalized Logistic (GLO) distribution performed best, offering robust quantile estimates with narrow confidence intervals. Multiple Non-Linear Regression models, based on catchment area, elevation, and other parameters, reasonably predicted ungauged basin peak discharges (R2 = 0.61–0.87; RMSE = 438–2726 m3/s; MAPE = 41–74%) at different return periods, although uncertainty was higher for extreme events. Four homogeneous regions were identified, showing significant differences in hydrological behavior, with two regions yielding stable estimates and two exhibiting greater extreme variability. Full article
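
The regression step for ungauged catchments described above relates flood quantiles to catchment descriptors. The sketch below fits an illustrative power-law form with scipy; the functional form, the two descriptors, and the data are assumptions for the example, not the study's fitted models.

```python
import numpy as np
from scipy.optimize import curve_fit

def power_law(X, a, b, c):
    """Illustrative Multiple Non-Linear Regression form: Q_T = a * Area^b * Elev^c."""
    area, elev = X
    return a * area ** b * elev ** c

# Synthetic catchment descriptors and 100-year peak discharges:
rng = np.random.default_rng(7)
area = rng.uniform(50, 5000, 30)                       # km^2
elev = rng.uniform(10, 300, 30)                        # m
q100 = 3.0 * area ** 0.75 * elev ** 0.1 * rng.lognormal(0.0, 0.2, 30)

params, _ = curve_fit(power_law, (area, elev), q100, p0=(1.0, 0.8, 0.1))
pred = power_law((area, elev), *params)
rmse = float(np.sqrt(np.mean((pred - q100) ** 2)))
print("a, b, c =", np.round(params, 3), "; RMSE =", round(rmse, 1), "m^3/s")
```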

19 pages, 392 KB  
Article
Szász–Beta Operators Linking Frobenius–Euler–Simsek-Type Polynomials
by Nadeem Rao, Mohammad Farid and Shivani Bansal
Axioms 2025, 14(6), 418; https://doi.org/10.3390/axioms14060418 - 29 May 2025
Viewed by 365
Abstract
This manuscript is concerned with the study of Frobenius–Euler–Simsek-type polynomials. In this research work, we construct a new sequence of Szász–Beta type operators via Frobenius–Euler–Simsek-type polynomials to discuss approximation properties for the Lebesgue integrable functions, i.e., L_p[0, ∞) with 1 ≤ p < ∞. Furthermore, estimates in view of test functions and central moments are studied. Next, the rate of convergence is discussed with the aid of the Korovkin theorem and a Voronovskaja-type theorem. Moreover, direct approximation results in terms of the first- and second-order moduli of continuity, Peetre’s K-functional, a Lipschitz-type space, and the rth-order Lipschitz-type maximal functions are investigated. In the subsequent section, we present weighted approximation results, and statistical approximation theorems are discussed. To demonstrate the effectiveness and applicability of the proposed operators, we present several illustrative examples and visualize the results graphically. Full article
(This article belongs to the Special Issue Applied Mathematics and Numerical Analysis: Theory and Applications)
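
For orientation, the classical Szász–Mirakjan operator and the moduli of continuity in which such convergence rates are usually expressed are recalled below. The authors' construction replaces the classical basis with Frobenius–Euler–Simsek-type polynomials and adds a Beta-type integral part, which is not reproduced here.

```latex
S_n(f;x) = e^{-nx}\sum_{k=0}^{\infty}\frac{(nx)^k}{k!}\,f\!\left(\frac{k}{n}\right),
\qquad x \in [0,\infty),

\omega(f,\delta) = \sup_{\substack{x,y \ge 0 \\ |x-y| \le \delta}} |f(x)-f(y)|,
\qquad
\omega_2(f,\delta) = \sup_{0 < h \le \delta}\; \sup_{x \ge 0}\,
\bigl|f(x+2h) - 2f(x+h) + f(x)\bigr|.
```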

32 pages, 2679 KB  
Article
An In-Depth Statistical Analysis of the Pearson Type III Distribution Behavior in Modeling Extreme and Rare Events
by Cristian-Gabriel Anghel and Dan Ianculescu
Water 2025, 17(10), 1539; https://doi.org/10.3390/w17101539 - 20 May 2025
Cited by 6 | Viewed by 1411
Abstract
Statistical distributions play a crucial role in water resources management and civil engineering, particularly for analyzing data variability and predicting rare events with extremely long return periods (e.g., T = 1000 years, T = 10,000 years). Among these, the Pearson III (PE3) distribution is widely used in hydrology and flood frequency analysis (FFA). This study aims to provide a comprehensive guide to the practical application of the PE3 distribution in FFA. It explores five parameter estimation methods, presenting both exact and newly developed approximate relationships for calculating distribution parameters and frequency factors. The analysis relies on data from four rivers with varying morphometric characteristics and record lengths. The results highlight that the Pearson III distribution, when used with the L-moments method, offers the most reliable quantile estimates, characterized by the smallest biases compared to other methods (e.g., 31% for the Nicolina River and 5% for the Siret and Ialomita Rivers) and the highest confidence in predicting rare events. Based on these findings, the L-moments approach is recommended for flood frequency analysis to improve the accuracy of extreme flow forecasts. Full article
(This article belongs to the Special Issue Urban Flood Frequency Analysis and Risk Assessment)
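
The frequency factors mentioned above enter the usual quantile relation. Under the standard convention, a return period T corresponds to an annual exceedance probability p = 1/T, and the T-year quantile of a fitted distribution can be written either through the inverse distribution function or in frequency-factor form:

```latex
p = \frac{1}{T}, \qquad
x_T = F^{-1}\!\left(1 - \tfrac{1}{T}\right), \qquad
x_T = \mu + K_T\,\sigma,
```

where mu and sigma are the mean and standard deviation of the annual maxima and K_T is the frequency factor of the chosen distribution; for Pearson III, K_T depends on the skew coefficient and on T, which is what the approximate relationships mentioned in the abstract concern.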

22 pages, 3171 KB  
Article
Determination of Hydrological Flood Hazard Thresholds and Flood Frequency Analysis: Case Study of Nokoue Lake Watershed
by Namwinwelbere Dabire, Eugene C. Ezin and Adandedji M. Firmin
Water 2025, 17(8), 1147; https://doi.org/10.3390/w17081147 - 11 Apr 2025
Viewed by 848
Abstract
With the impacts of climate change, floods have become increasingly frequent in recent years. Estimating flood hazard thresholds and peak floodwater levels based on flood frequency analysis is crucial for anticipating and preparing for potential flooding events. This study aims to estimate flood hazard thresholds, flood occurrence probabilities, and the return periods of peak floodwater levels in the Nokoue lake watershed in Benin. To achieve this, the standardized water level index, also known as the Flood Hazard Index, was calculated to estimate flood hazard thresholds. The three best probability distribution models, Gumbel, Generalized Extreme Value (GEV), and Generalized Pareto (GPA), were selected to project future floodwater levels using annual maximum daily water level data for extreme floods from 1997 to 2022, obtained from a water gauge site at Nokoue lake. Three goodness-of-fit tests were applied to identify the best-fitting probability distribution model: a Taylor diagram (three-dimensional analysis), a cumulative probability density diagram based on the root-mean-square error (RMSE), and an L-moment diagram (two-dimensional analysis). The Flood Hazard Index values ranged from −1.10 to +3.40, with 77.78% showing positive indices and 22.22% showing negative indices. The flood hazard thresholds were classified in ascending order of index values: limited hazards, moderate hazards, significant hazards, and critical hazards. The analysis results indicate that the flood hazard thresholds are defined as follows: below 3.94 m for limited hazards, from 3.94 m up to 4.04 m for moderate hazards, from 4.04 m to 4.14 m for significant hazards, and above 4.14 m for critical hazards. The distribution model analysis showed that the Gumbel distribution best fits the Nokoue lake watershed, with an RMSE of 0.0724, compared to 0.0754 and 0.0761 for the GEV and GPA models, respectively. The annual maximum daily water levels for various non-exhaustive return periods, 2, 3, 5, 10, 25, 50, and 100 years, were estimated and compared. The return period of the highest recorded annual maximum daily water level (4.4 m) in the Nokoue lake watershed was calculated to be 12, 15, and 15 years using the Gumbel, GEV, and GPA models, respectively. Quantile analysis revealed that the Gumbel distribution produced overestimated results compared to the GEV and GPA models for return periods exceeding 10 years. Exceptional and very exceptional hydrological events have return periods of 100 and 150 years, corresponding to peak flow levels of 4.95 m and 5.05 m, respectively. Finally, the results of this study will be invaluable for flood hazard managers in monitoring flood alerts and for water resource engineers in dimensioning flood control structures such as spillways, dams, and bridges, thereby improving the management of recurrent flooding events. Full article
(This article belongs to the Section Hydrology)
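
For the Gumbel model selected above, the L-moment estimators have a simple closed form. A sketch with a made-up water-level series (the real 1997–2022 Nokoue record is not reproduced here):

```python
import math
import numpy as np

EULER_GAMMA = 0.5772156649  # Euler-Mascheroni constant

def gumbel_lmom_fit(x):
    """Gumbel location xi and scale alpha from the first two sample
    L-moments, estimated via probability-weighted moments."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    j = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((j - 1) / (n - 1) * x) / n
    l1, l2 = b0, 2 * b1 - b0
    alpha = l2 / math.log(2)
    xi = l1 - EULER_GAMMA * alpha
    return xi, alpha

def gumbel_quantile(xi, alpha, T):
    """Water level with return period T years."""
    return xi - alpha * math.log(-math.log(1 - 1 / T))

# Made-up annual maximum daily water levels (m):
levels = [3.9, 4.1, 3.8, 4.2, 4.0, 4.3, 3.7, 4.4, 4.1, 3.9, 4.0, 4.2, 3.8,
          4.1, 4.3, 3.9, 4.0, 4.2, 4.1, 3.8, 4.3, 4.0, 4.1, 4.2, 3.9, 4.4]
xi, alpha = gumbel_lmom_fit(levels)
for T in (2, 10, 25, 50, 100):
    print(f"T = {T:>3} yr  ->  {gumbel_quantile(xi, alpha, T):.2f} m")
```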

19 pages, 700 KB  
Article
A Fast Finite Difference Method for 2D Time Fractional Mobile/Immobile Equation with Weakly Singular Solution
by Haili Qiao and Aijie Cheng
Fractal Fract. 2025, 9(4), 204; https://doi.org/10.3390/fractalfract9040204 - 26 Mar 2025
Cited by 1 | Viewed by 408
Abstract
This paper presents a fast Crank–Nicolson L1 finite difference scheme for the two-dimensional time fractional mobile/immobile diffusion equation with a weakly singular solution at the initial moment. First, the time fractional derivative is discretized using the Crank–Nicolson formula on uniform meshes, and a local truncation error estimate is provided. The spatial derivative is discretized using the central difference quotient on uniform meshes. Then, energy analysis methods are utilized to provide an optimal error estimate. In addition, the numerical scheme is optimized based on the sum-of-exponentials approximation, effectively reducing computation and memory requirements. Finally, numerical examples are simulated to verify the effectiveness of the algorithm. Full article
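
For context, the classical L1 approximation of a Caputo derivative of order alpha in (0, 1) on a uniform mesh t_n = n*tau, the building block that Crank–Nicolson-type variants of this kind modify, reads

```latex
D_t^{\alpha} u(t_n) \approx \frac{\tau^{-\alpha}}{\Gamma(2-\alpha)}
\sum_{k=0}^{n-1} b_k \bigl[u(t_{n-k}) - u(t_{n-k-1})\bigr],
\qquad b_k = (k+1)^{1-\alpha} - k^{1-\alpha};
```

the sum-of-exponentials idea mentioned in the abstract replaces the long convolution history in this sum with a short recursion, which is where the reduction in computation and memory comes from.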

24 pages, 2767 KB  
Article
Modeling Non-Normal Distributions with Mixed Third-Order Polynomials of Standard Normal and Logistic Variables
by Mohan D. Pant, Aditya Chakraborty and Ismail El Moudden
Mathematics 2025, 13(6), 1019; https://doi.org/10.3390/math13061019 - 20 Mar 2025
Viewed by 439
Abstract
Continuous data associated with many real-world events often exhibit non-normal characteristics, which contribute to the difficulty of accurately modeling such data with statistical procedures that rely on normality assumptions. Traditional statistical procedures often fail to accurately model non-normal distributions that are often observed in real-world data. This paper introduces a novel modeling approach using mixed third-order polynomials, which significantly enhances accuracy and flexibility in statistical modeling. The main objective of this study is divided into three parts: The first part is to introduce two new non-normal probability distributions by mixing standard normal and logistic variables using a piecewise function of third-order polynomials. The second part is to demonstrate a methodology that can characterize these two distributions through the method of L-moments (MoLMs) and method of moments (MoMs). The third part is to compare the MoLMs- and MoMs-based characterizations of these two distributions in the context of parameter estimation and modeling non-normal real-world data. The simulation results indicate that the MoLMs-based estimates of L-skewness and L-kurtosis are superior to their MoMs-based counterparts of skewness and kurtosis, especially for distributions with large departures from normality. The modeling (or data fitting) results also indicate that the MoLMs-based fits of these distributions to real-world data are superior to their corresponding MoMs-based counterparts. Full article
(This article belongs to the Section D1: Probability and Statistics)
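
The contrast between conventional moments and L-moments drawn above can be reproduced on any sample: the sketch below computes ordinary skewness and excess kurtosis with scipy and the L-moment ratios from probability-weighted moments. The heavy-tailed sample is synthetic, and the code does not implement the paper's mixed third-order polynomial transformation itself.

```python
import numpy as np
from scipy.stats import skew, kurtosis

def lmoment_ratios(x):
    """Sample L-skewness (t3) and L-kurtosis (t4) from unbiased
    probability-weighted moments."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    j = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((j - 1) / (n - 1) * x) / n
    b2 = np.sum((j - 1) * (j - 2) / ((n - 1) * (n - 2)) * x) / n
    b3 = np.sum((j - 1) * (j - 2) * (j - 3)
                / ((n - 1) * (n - 2) * (n - 3)) * x) / n
    l2 = 2 * b1 - b0
    l3 = 6 * b2 - 6 * b1 + b0
    l4 = 20 * b3 - 30 * b2 + 12 * b1 - b0
    return l3 / l2, l4 / l2

# Synthetic sample with a large departure from normality:
rng = np.random.default_rng(0)
sample = rng.lognormal(mean=0.0, sigma=1.2, size=500)
t3, t4 = lmoment_ratios(sample)
print(f"skewness = {skew(sample):.2f}, excess kurtosis = {kurtosis(sample):.2f}")
print(f"L-skewness = {t3:.3f}, L-kurtosis = {t4:.3f}")
```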

16 pages, 2860 KB  
Article
Analysis of the Dynamics of Hydroclimatic Extremes in Urban Areas: The Case of Grand-Nokoué in Benin, West Africa
by Vidjinnagni Vinasse Ametooyona Azagoun, Kossi Komi, Expédit Wilfrid Vissin and Komi Selom Klassou
Climate 2025, 13(2), 39; https://doi.org/10.3390/cli13020039 - 12 Feb 2025
Cited by 1 | Viewed by 1776
Abstract
As global warming continues, extremes in key climate parameters will become more frequent. These extremes are one of the main challenges for the sustainability of cities. The aim of this study is to provide a better understanding of the evolution of extremes in precipitation (pcp) and maximum (Tmax) and minimum (Tmin) temperatures in Grand-Nokoué to improve the resilience of the region. To this end, historical daily precipitation and maximum (Tmax) and minimum (Tmin) temperature data from the Cotonou synoptic station were used from 1991 to 2020. First, the extreme events identified using the 99th percentile threshold were used to analyze their annual and monthly frequency. Secondly, a Generalized Extreme Value (GEV) distribution was fitted to the annual maxima with a 95% confidence interval to determine the magnitude of the specific return periods. The parameters of this distribution were estimated using the method of L moments, considering non-stationarity. The results of the study showed significant upward trends in annual precipitation and minimum temperatures, with p-values of 0.04 and 0.001, respectively. Over the past decade, the number of extreme precipitation and Tmin events has exceeded the expected number. The model provides greater confidence for periods ≤ 50 years. Extreme values of three-day accumulations up to 68.21 mm for pcp, 79.38 °C for Tmin and 97.29 °C for Tmax are expected every two years. The results of this study can be used to monitor hydroclimatic hazards in the region. Full article
(This article belongs to the Section Climate and Environment)
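
The event-identification step described above amounts to thresholding the daily series at its 99th percentile and counting exceedances per year. A minimal sketch with synthetic daily rainfall in place of the Cotonou record:

```python
import numpy as np

rng = np.random.default_rng(3)
years = np.repeat(np.arange(1991, 2021), 365)             # 30 years of daily data
rain = rng.gamma(shape=0.3, scale=8.0, size=years.size)   # synthetic daily rainfall (mm)

threshold = np.percentile(rain, 99)        # 99th-percentile extreme-event threshold
extreme = rain > threshold
events_per_year = {int(y): int(extreme[years == y].sum()) for y in np.unique(years)}
print(f"threshold = {threshold:.1f} mm; "
      f"mean events/year = {np.mean(list(events_per_year.values())):.2f}")
```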

8 pages, 4076 KB  
Proceeding Paper
Regional Frequency Analysis of Annual Maximum Rainfall and Sampling Uncertainty Quantification
by Marios Billios and Lampros Vasiliades
Environ. Earth Sci. Proc. 2025, 32(1), 3; https://doi.org/10.3390/eesp2025032003 - 24 Jan 2025
Viewed by 878
Abstract
Accurate quantile estimation of extreme precipitation is crucial for hydraulic infrastructure design but is often hindered by limited data records, leading to uncertainties. This study applies regional frequency analysis (RFA) using L-moments, comparing classical and Bayesian approaches to quantify uncertainties. Data from 55 rainfall stations in Thessaly, Greece, are analyzed through clustering using PCA and k-means. The Generalized Extreme Value (GEV) distribution is fitted to delineated clusters, and uncertainties are assessed via bootstrap and MCMC methods. Results highlight consistency in location and scale estimates, with Bayesian methods offering narrower uncertainty bounds, demonstrating improved reliability for long-term rainfall prediction and design. Full article
(This article belongs to the Proceedings of The 8th International Electronic Conference on Water Sciences)
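
The classical (non-Bayesian) uncertainty quantification above can be sketched with a parametric bootstrap. Note that scipy's genextreme.fit uses maximum likelihood rather than L-moments, so this only illustrates the bootstrap and confidence-interval mechanics, not the estimator used in the study; the annual-maximum sample is synthetic.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(11)
annual_max = genextreme.rvs(-0.1, loc=60.0, scale=15.0, size=40,
                            random_state=rng)              # synthetic AMS (mm)

c, loc, scale = genextreme.fit(annual_max)                 # fit once (MLE)
T = 100
boot_quantiles = []
for _ in range(500):                                       # resample from the fit, refit
    resample = genextreme.rvs(c, loc=loc, scale=scale,
                              size=len(annual_max), random_state=rng)
    cb, locb, scaleb = genextreme.fit(resample)
    boot_quantiles.append(genextreme.ppf(1 - 1 / T, cb, loc=locb, scale=scaleb))

lo, hi = np.percentile(boot_quantiles, [2.5, 97.5])
point = genextreme.ppf(1 - 1 / T, c, loc=loc, scale=scale)
print(f"100-yr rainfall: {point:.1f} mm (95% bootstrap CI {lo:.1f}-{hi:.1f} mm)")
```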

27 pages, 9833 KB  
Article
A Network-Based Clustering Method to Ensure Homogeneity in Regional Frequency Analysis of Extreme Rainfall
by Marios Billios and Lampros Vasiliades
Water 2025, 17(1), 38; https://doi.org/10.3390/w17010038 - 26 Dec 2024
Cited by 1 | Viewed by 2127
Abstract
The social impacts of extreme rainfall events are expected to intensify with climate change, making reliable statistical analyses essential. High quantile estimation requires substantial data; however, available records are sometimes limited. Additionally, finite data and variability across statistical models introduce uncertainties in the final estimates. This study addresses the uncertainty that arises when selecting parameters in Regional Frequency Analysis (RFA) by proposing a method to objectively identify statistically homogeneous regions. Station coordinates, elevation, annual mean rainfall, maximum annual rainfall, and L-skewness from 55 meteorological stations are selected to study annual maximum daily rainfall. Principal Component Analysis (PCA) is employed to investigate the interdependency of these covariates as a preprocessing step in cluster analysis. Network theory, implemented through an iterative clustering process, is used to build a network in which stations are linked according to how often they co-occur in clusters. Communities are then formed by maximizing the modularity index of this station network. RFA is performed in the final communities using L-moment theory to estimate regional and at-site quantiles. Quantile uncertainty is calculated through parametric bootstrapping. The application of PCA has a negligible effect on network creation in the study area. The results show that the iterative clustering approach with network theory ensures statistically homogeneous regions, as demonstrated for the regionalisation of extreme rainfall in Thessaly’s complex terrain. Full article
(This article belongs to the Section Hydrology)
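
The community-detection step can be illustrated with networkx's greedy modularity maximization on a co-occurrence-weighted station graph. This is a generic sketch: the station names, co-occurrence counts, and the greedy algorithm are stand-ins, not the paper's iterative procedure.

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Hypothetical co-occurrence counts: how often two stations fell in the same
# cluster across repeated clustering runs.
cooccurrence = {("S1", "S2"): 95, ("S1", "S3"): 90, ("S2", "S3"): 88,
                ("S4", "S5"): 97, ("S4", "S6"): 93, ("S5", "S6"): 91,
                ("S3", "S4"): 12, ("S2", "S5"): 8}

G = nx.Graph()
for (a, b), weight in cooccurrence.items():
    G.add_edge(a, b, weight=weight)

communities = greedy_modularity_communities(G, weight="weight")
for i, community in enumerate(communities, start=1):
    print(f"community {i}: {sorted(community)}")
```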