Search Results (684)

Search Parameters:
Keywords = robust normal estimation

20 pages, 4589 KB  
Article
Autoencoder-Based Latent Representation Learning, SoH Estimation, and Anomaly Detection in Electric Vehicle Battery Energy Storage Systems
by Nagendra Kumar, Anubhav Agrawal, Rajeev Kumar and Manoj Badoni
Vehicles 2026, 8(4), 81; https://doi.org/10.3390/vehicles8040081 - 7 Apr 2026
Abstract
Accurate estimation of battery state of health (SoH) is an important aspect of improving the reliability, safety, and operating efficiency of an energy storage system. This study presents a unified deep learning pipeline for prediction, latent feature extraction, and anomaly detection. A convolutional neural network (CNN) autoencoder is used to learn compact latent features from the NASA battery datasets (B0005, B0006, B0007, and B0018). These features serve as inputs to random forest and linear regression models, which are further compared with CNN and GRU baselines. The system is evaluated using leave-one-group-out cross-validation to ensure robustness across different batteries. Latent space quality is studied using PCA, t-SNE, and UMAP analyses. Furthermore, clustering performance is measured using the Silhouette Score, and anomalies are detected using reconstruction error and the Isolation Forest technique. The obtained results show that the AE+RF model achieves the best performance, with a root mean squared error (RMSE) of 0.0285, a mean absolute error (MAE) of 0.0109, and a high coefficient of determination (R2) of 0.96, indicating high prediction accuracy and model reliability. The results show that latent features improve prediction accuracy, helping to clearly separate normal and abnormal patterns, providing a robust and accurate approach to battery SoH estimation that is suitable for battery management system applications. Full article
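As a rough illustration of the AE+RF pipeline this abstract describes, the sketch below trains a small autoencoder (a fully connected stand-in for the paper's convolutional one), extracts the latent code, and fits a random forest on it. The data, layer sizes, and hyperparameters are illustrative assumptions, not the authors' setup.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Synthetic stand-in for cycle-level battery features (the NASA sets are not bundled here).
X = rng.normal(size=(300, 16))
soh = X[:, :3].sum(axis=1) * 0.05 + 1.0          # toy SoH target

# Autoencoder: an MLP trained to reconstruct its own input through one hidden "latent" layer.
ae = MLPRegressor(hidden_layer_sizes=(4,), activation="relu",
                  max_iter=2000, random_state=0)
ae.fit(X, X)

# Extract the latent code by replaying the encoder half of the fitted network.
latent = np.maximum(0.0, X @ ae.coefs_[0] + ae.intercepts_[0])

# Downstream regressor on the compact latent features (the "AE+RF" combination).
rf = RandomForestRegressor(n_estimators=100, random_state=0).fit(latent, soh)
pred = rf.predict(latent)
print(latent.shape, pred.shape)
```

The same pattern extends to anomaly detection: the per-sample reconstruction error of `ae` can feed an Isolation Forest, as the abstract outlines.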

25 pages, 2870 KB  
Article
Robust Maximum Half-Normal Multivariate Control Chart Based on Det-MCD and Fast-MCD Estimators
by Muhammad Ahsan, Awang Putra Sembada R, Muhammad Mashuri, Wibawati, Dinda Ayu Safira and Muhammad Hisyam Lee
Appl. Sci. 2026, 16(7), 3548; https://doi.org/10.3390/app16073548 - 4 Apr 2026
Abstract
Every company conducts evaluations to ensure the quality of its products and services, often utilizing multivariate simultaneous control charts to monitor the process mean and variability concurrently. The objective of this study is to overcome a significant limitation in the Maximum Half-Normal Multivariate Control Chart (Max-Half-Mchart): its vulnerability to outliers, which can trigger masking and swamping effects and lead to inaccurate process monitoring. The primary scientific contribution is the development of two robust versions of the Max-Half-Mchart by integrating the fast minimum covariance determinant (Fast-MCD) and deterministic minimum covariance determinant (Det-MCD) estimators into the chart’s statistical framework. The evaluation criteria for these methods include the average run length (ARL) to assess process shift detection speed and classification accuracy, false positive (FP) rate, false negative (FN) rate, and area under the curve (AUC) to measure outlier detection performance. Simulation results indicate that, while both robust charts effectively detect process shifts, the Det-MCD-based robust Max-Half-Mchart is particularly superior for lower contamination levels (5–20%), whereas the Fast-MCD-based chart performs best at higher contamination levels (30%). An illustrative application to ordinary Portland cement (OPC) quality data confirms the practical superiority of the Det-MCD approach, which detected six out-of-control signals compared with only two identified by conventional methods. These results suggest that the proposed robust charts are highly sensitive tools for maintaining quality in the presence of contaminated data. Full article
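The core ingredient of such robust charts — replacing classical mean and covariance estimates with MCD ones inside a Mahalanobis-type statistic — can be sketched with scikit-learn's Fast-MCD implementation. The chi-square control limit, contamination level, and data below are illustrative assumptions; the paper's Max-Half transform and Det-MCD variant are omitted.

```python
import numpy as np
from scipy.stats import chi2
from sklearn.covariance import MinCovDet

rng = np.random.default_rng(1)

# In-control Phase I data with planted outliers (10% contamination).
X = rng.normal(size=(200, 3))
X[:20] += 6.0

# Fast-MCD gives outlier-resistant estimates of the mean vector and covariance,
# avoiding the masking/swamping that classical estimates suffer.
mcd = MinCovDet(random_state=0).fit(X)
t2 = mcd.mahalanobis(X)            # robust squared Mahalanobis distances

# Flag points beyond a chi-square-based limit (df = number of variables).
limit = chi2.ppf(0.9973, df=3)
flags = t2 > limit
print(int(flags[:20].sum()), "outliers flagged;", int(flags[20:].sum()), "false alarms")
```

With classical estimates the planted outliers would inflate the covariance and mask themselves; the robust distances keep them far above the limit.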

38 pages, 3132 KB  
Article
Lightweight Semantic-Aware Route Planning on Edge Hardware for Indoor Mobile Robots: Monocular Camera–2D LiDAR Fusion with Penalty-Weighted Nav2 Route Server Replanning
by Bogdan Felician Abaza, Andrei-Alexandru Staicu and Cristian Vasile Doicin
Sensors 2026, 26(7), 2232; https://doi.org/10.3390/s26072232 - 4 Apr 2026
Abstract
The paper introduces a computationally efficient semantic-aware route planning framework for indoor mobile robots, designed for real-time execution on resource-constrained edge hardware (Raspberry Pi 5, CPU-only). The proposed architecture fuses monocular object detection with 2D LiDAR-based range estimation and integrates the resulting semantic annotations into the Nav2 Route Server for penalty-weighted route selection. Object localization in the map frame is achieved through the Angular Sector Fusion (ASF) pipeline, a deterministic geometric method requiring no parameter tuning. The ASF projects YOLO bounding boxes onto LiDAR angular sectors and estimates the object range using a 25th-percentile distance statistic, providing robustness to sparse returns and partial occlusions. All intrinsic and extrinsic sensor parameters are resolved at runtime via ROS 2 topic introspection and the URDF transform tree, enabling platform-agnostic deployment. Detected entities are classified according to mobility semantics (dynamic, static, and minor) and persistently encoded in a GeoJSON-based semantic map, with these annotations subsequently propagated to navigation graph edges as additive penalties and velocity constraints. Route computation is performed by the Nav2 Route Server through the minimization of a composite cost functional combining geometric path length with semantic penalties. A reactive replanning module monitors semantic cost updates during execution and triggers route invalidation and re-computation when threshold violations occur. Experimental evaluation over 115 navigation segments (legs) on three heterogeneous robotic platforms (two single-board RPi5 configurations and one dual-board setup with inference offloading) yielded an overall success rate of 97% (baseline: 100%, adaptive: 94%), with 42 replanning events observed in 57% of adaptive trials. 
Navigation time distributions exhibited statistically significant departures from normality (Shapiro–Wilk, p < 0.005). While central tendency differences between the baseline and adaptive modes were not significant (Mann–Whitney U, p = 0.157), the adaptive planner reduced temporal variance substantially (σ = 11.0 s vs. 31.1 s; Levene’s test W = 3.14, p = 0.082), primarily by mitigating AMCL recovery-induced outliers. On-device YOLO26n inference, executed via the NCNN backend, achieved 5.5 ± 0.7 FPS (167 ± 21 ms latency), and distributed inference reduced the average system CPU load from 85% to 48%. The study further reports deployment-level observations relevant to the Nav2 ecosystem, including GeoJSON metadata persistence constraints, graph discontinuity (“path-gap”) artifacts, and practical Route Server configuration patterns for semantic cost integration. Full article
(This article belongs to the Special Issue Advances in Sensing, Control and Path Planning for Robotic Systems)
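The ASF range estimate described above — pooling the LiDAR returns that fall in a bounding box's angular sector and taking the 25th percentile — can be sketched as follows. The sector bounds and scan values are toy assumptions, not the authors' data.

```python
import numpy as np

def sector_range(angles, ranges, a_min, a_max, q=25):
    """Estimate object distance from LiDAR returns falling in [a_min, a_max].

    A low percentile (the 25th, as in the ASF description) is resistant to
    sparse far-background returns leaking into the sector.
    """
    angles = np.asarray(angles); ranges = np.asarray(ranges)
    in_sector = (angles >= a_min) & (angles <= a_max) & np.isfinite(ranges)
    if not in_sector.any():
        return None
    return float(np.percentile(ranges[in_sector], q))

# Toy scan: object at ~2 m occupying roughly 0.21-0.39 rad, background wall at 8 m.
ang = np.linspace(0.0, 1.0, 101)
rng_m = np.where((ang > 0.205) & (ang < 0.395), 2.0, 8.0)
print(sector_range(ang, rng_m, 0.15, 0.45))
```

Even though the sector contains a minority of background returns, the low percentile recovers the object's 2 m range rather than a mixed average.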

15 pages, 1217 KB  
Article
Detecting Phase Transitions from Data Using Generative Learning
by Xiyu Zhou, Yan Mi and Pan Zhang
Entropy 2026, 28(4), 406; https://doi.org/10.3390/e28040406 - 3 Apr 2026
Abstract
Identifying phase transitions in complex many-body systems traditionally necessitates the definition of specific order parameters, a task often requiring prior knowledge of the statistical model and the symmetry-breaking mechanism. In this work, we propose a framework for detecting phase transitions directly from raw (experimental) data without requiring knowledge of the underlying model Hamiltonian, parameters, or pre-defined labels. Inspired by generative modeling in machine learning, our method utilizes autoregressive networks to estimate the normalized probability distribution of the system from raw configuration data. We then quantify the intrinsic sensitivity of this learned distribution to control parameters (such as temperature) to construct a robust indicator of phase transitions. This indicator is based on the expectation of the change in absolute logarithmic probability, derived entirely from the raw data. Our approach is purely data-driven: it takes raw data across varying control parameters as input and outputs the most likely estimate of the phase transition point. To validate our approach, we conduct extensive numerical experiments on the 2D Ising model on both triangular and square lattices, and on the Sherrington–Kirkpatrick (SK) model utilizing raw data generated via Markov Chain Monte Carlo and Tensor Network methods. The results demonstrate that our generative approach accurately identifies phase transitions using only raw data. Our framework provides a general tool for exploring critical phenomena in model systems, with the potential to be extended to realistic experimental data where theoretical descriptions remain incomplete. Full article
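The transition indicator described above — the expected absolute change in log-probability between neighboring control-parameter values — can be sketched with a finite difference. Here an analytically known toy distribution stands in for the learned autoregressive model, and the sharp change it plants at T = 2 is an assumption for illustration only.

```python
import numpy as np

def sensitivity(logp, samples_by_T):
    """Finite-difference proxy for E[|d log p / dT|] between adjacent T values.

    logp(cfg, T) would be the learned autoregressive model's log-probability;
    any callable works here.  Peaks of this curve flag candidate transitions.
    """
    Ts = sorted(samples_by_T)
    out = []
    for t0, t1 in zip(Ts, Ts[1:]):
        diffs = [abs(logp(c, t1) - logp(c, t0)) for c in samples_by_T[t0]]
        out.append(((t0 + t1) / 2, float(np.mean(diffs))))
    return out

# Toy stand-in: independent spins whose up-probability jumps below T = 2,
# mimicking an order parameter switching on at a transition.
def logp(cfg, T):
    m = 0.9 if T < 2.0 else 0.0
    p_up = 0.5 * (1 + m)
    return float(np.sum(np.where(cfg > 0, np.log(p_up), np.log(1 - p_up))))

rng = np.random.default_rng(0)
samples = {T: [rng.choice([-1, 1], size=50) for _ in range(20)]
           for T in [1.0, 1.5, 2.0, 2.5, 3.0]}
curve = sensitivity(logp, samples)
best_T = max(curve, key=lambda p: p[1])[0]
print(best_T)
```

The indicator is flat wherever the distribution does not change with T and peaks on the interval straddling the planted transition.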

34 pages, 3026 KB  
Article
House Price Determinants: Evidence from Bulgaria as a New Eurozone Member State
by Andrey Zahariev, Galina Zaharieva, Larysa Shaulska and Mykhaylo Oryekhov
J. Risk Financial Manag. 2026, 19(4), 261; https://doi.org/10.3390/jrfm19040261 - 3 Apr 2026
Abstract
This study examines the relationship between house prices and the factors driving their growth during the transition from a long-standing currency board regime to Eurozone membership. The main objective is to identify and quantify the key factors explaining the variation in house price growth in Bulgaria under conditions of prolonged currency convergence. The study applies a set of econometric techniques, including stationarity tests (ADF and KPSS), diagnostic checks for normality, serial correlation and heteroscedasticity, and robustness checks. The study is based on 40 quarterly observations covering the period 2015Q4–2025Q3 and 48 selected predictors of the General house price index. The final ARIMAX(0,2,1) model is estimated using second-differenced data. The model includes a first-order moving average component and three exogenous regressors: the owner-occupiers’ housing expenditures, the actual rentals for housing in Bulgaria and the homeowners’ utility expenses. The model explains 87% of the variation in house price acceleration, with a comparatively low mean squared error. The diagnostic analysis confirms model adequacy. The three exogenous regressors are statistically significant at the 1% level with strong and stable effects on house price dynamics. No statistically significant relationship is found for the set of traditional macroeconomic, demographic, financial, and sectoral factors. The results show that during Bulgaria’s transition from a currency board to the Eurozone, the sustained house price growth was driven by country-specific factors. 
The three statistically significant determinants of the house price acceleration in Bulgaria reflect, respectively, the active investment behaviour of homeowners in improving existing properties, the rational assessment by housing market participants of the balance between mortgage and rental payments, and the burden of utility and maintenance costs borne by owners and tenants, depending on property size and energy efficiency. The first factor is most influential for homeowners, the second for tenants, and the third has a similarly significant impact on both groups. Full article
(This article belongs to the Special Issue Applied Public Finance and Fiscal Analysis)

19 pages, 1466 KB  
Article
Seasonal Variation of Plaque Psoriasis in Relation to Individualized MED-Adjusted Ultraviolet Exposure: A Cross-Sectional Study in Poland
by Michał Niedźwiedź, Agnieszka Czerwińska, Janusz Krzyścin, Joanna Narbutt and Aleksandra Lesiak
J. Clin. Med. 2026, 15(7), 2708; https://doi.org/10.3390/jcm15072708 - 3 Apr 2026
Abstract
Background: Patient-perceived seasonality of psoriasis is frequently reported, yet the independent contribution of objectively quantified, individualized ultraviolet (UV) exposure remains insufficiently characterized. We evaluated seasonal variation in plaque psoriasis and its association with geocoded, phototype-adjusted ambient antipsoriatic radiant exposures (ARE) using mixed-effects modeling. Methods: This cross-sectional study included 119 adults with plaque psoriasis (476 seasonal observations). Participants rated seasonal disease courses using a 7-point scale. Ambient ARE was geocoded to residential postal codes and quantified as a behaviorally weighted dose normalized to individual minimal erythema dose (MED). Mixed-effects logistic regression models, adjusted for relevant confounders, estimated associations with seasonal improvement and worsening. Results: Seasonality was reported by 89.9% of participants (p < 0.001). Summer was the most favorable season, whereas winter was the most detrimental. The highest ARE quartile was independently associated with increased odds of improvement (OR 4.65, 95% CI 2.04–10.58, p < 0.001) and reduced odds of worsening (OR 0.16, 95% CI 0.08–0.33, p < 0.001). Crucially, continuous quadratic modeling revealed a significant inverted U-shaped relationship between UV exposure and improvement, with an estimated turning point of 3.85 (95% CI 1.88–5.82, p < 0.001) for the declared daily ARE (UVdecl) normalized by MED. Beyond this threshold, the probability of improvement attenuated. The protective effect against seasonal worsening remained linear. Conclusions: Psoriasis seasonality demonstrates a robust exposure–response relationship with ambient UV. The estimated turning point (UVdecl/MED = 3.85) within our modeled exposure metric is exploratory and hypothesis-generating. It suggests an association where moderate UV exposure correlates with patient-perceived benefits, but these diminish at very high levels.
This threshold requires external prospective validation before being considered a clinically actionable recommendation. Full article
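The inverted U-shape rests on a quadratic term in the logistic model: for logit p = b0 + b1·x + b2·x², the turning point is x* = −b1/(2·b2). The sketch below recovers it from simulated data; the cohort data are not available here, and the value 3.85 is used only to generate the toy response.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Simulated MED-normalized UV doses and a binary "improved" response whose
# log-odds peak at x = 3.85 by construction (illustrative, not the cohort).
x = rng.uniform(0, 8, size=4000)
b1_true = 1.2
b2_true = -b1_true / (2 * 3.85)
logit = -2.0 + b1_true * x + b2_true * x ** 2
y = rng.random(4000) < 1 / (1 + np.exp(-logit))

# Fit logit(p) = b0 + b1*x + b2*x^2 (C large = essentially unpenalized),
# then recover the turning point -b1 / (2*b2).
X = np.column_stack([x, x ** 2])
fit = LogisticRegression(C=1e6, max_iter=5000).fit(X, y)
b1, b2 = fit.coef_[0]
print(round(-b1 / (2 * b2), 2))
```

A negative fitted b2 confirms the concave (inverted-U) shape; the mixed-effects structure and confounder adjustment of the actual study are omitted.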

25 pages, 2055 KB  
Article
Simultaneous Confidence Intervals for All Pairwise Differences of Coefficients of Variation of Delta-Inverse Gaussian Distributions
by Wasurat Khumpasee, Sa-Aat Niwitpong and Suparat Niwitpong
Symmetry 2026, 18(4), 604; https://doi.org/10.3390/sym18040604 - 2 Apr 2026
Abstract
This study develops and evaluates simultaneous confidence interval procedures for all pairwise differences of coefficients of variation under delta-inverse Gaussian distributions. The objective is to provide reliable comparative inference for relative variability in zero-inflated and highly skewed data, where standard normal-based methods may be unreliable. Five approaches were studied and compared in terms of coverage probabilities (CPs) and average widths: generalized confidence interval, adjusted generalized confidence interval, fiducial confidence interval, method of variance estimates recovery, and normal approximation. A Monte Carlo simulation study was conducted under varying shape parameters, zero-inflation probabilities, sample sizes, and numbers of populations (k = 3, 6, and 10). Although most methods produced CPs near the nominal 0.95 level, meaningful differences emerged when both coverage accuracy and interval efficiency were considered. The AGCI method consistently delivered stable coverage across parameter settings and remained robust as dimensionality increased. The MOVER approach achieved competitive coverage while frequently yielding narrower intervals. In contrast, GCI occasionally showed mild undercoverage, and FCI tended to produce overly wide intervals. An empirical application to zero-inflated mortality data supports the simulation findings. Overall, AGCI and MOVER provide reliable and practical tools for simultaneous inference on differences in CVs across delta-IG populations. Full article
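The MOVER idea mentioned above combines two marginal confidence intervals (l_i, u_i) for parameters θ_i into an interval for θ1 − θ2 by adding the inner CI distances in quadrature. The sketch below shows the generic combination rule only; the delta-inverse-Gaussian-specific variance recovery of the paper is not reproduced, and the numbers are toy inputs.

```python
import math

def mover_diff(theta1, l1, u1, theta2, l2, u2):
    """MOVER confidence interval for theta1 - theta2 from two marginal CIs.

    The lower limit borrows the lower distance of theta1 and the upper
    distance of theta2 (and vice versa for the upper limit), combined in
    quadrature -- the generic method-of-variance-estimates-recovery rule.
    """
    L = (theta1 - theta2) - math.sqrt((theta1 - l1) ** 2 + (u2 - theta2) ** 2)
    U = (theta1 - theta2) + math.sqrt((u1 - theta1) ** 2 + (theta2 - l2) ** 2)
    return L, U

# Toy CVs with symmetric marginal 95% CIs of half-width 0.10 each:
# the difference CI has half-width sqrt(2) * 0.10 around 0.05.
L, U = mover_diff(0.50, 0.40, 0.60, 0.45, 0.35, 0.55)
print(round(L, 4), round(U, 4))
```

For the simultaneous all-pairwise version, each pairwise interval would additionally use a multiplicity-adjusted confidence level for the marginal CIs.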

30 pages, 507 KB  
Article
Beyond MSE in Poisson Ridge Regression: New Ridge Parameter Estimators with Additional Distributional Performance Criteria
by Selman Mermi
Mathematics 2026, 14(7), 1190; https://doi.org/10.3390/math14071190 - 2 Apr 2026
Abstract
Despite its widespread use for mitigating multicollinearity in count data models, Poisson ridge regression (PRR) remains methodologically constrained by the choice of the ridge parameter k. Existing studies predominantly evaluate ridge parameter estimators using only the mean squared error (MSE) criterion, largely neglecting their distributional properties and estimation stability. Such a narrow evaluation framework may yield unreliable inference, particularly under high correlation and small sample sizes. This study makes two original contributions to the PRR literature. First, we conduct a comprehensive comparison of 13 commonly used ridge parameter estimators and introduce two new estimators that exhibit superior empirical performance. Second, we extend performance evaluation beyond MSE by incorporating outlier ratios and conformity to normality, thereby establishing a multidimensional framework that explicitly addresses distributional robustness and estimator stability. Monte Carlo simulations across 180 scenarios—varying the number of predictors, sample size, correlation level, and intercept value—show that several estimators deemed optimal under MSE perform poorly in terms of outlier prevalence and normality. In contrast, the proposed estimators consistently achieve a balanced performance between error minimization and distributional stability. Two real-data applications further support these findings. Full article
(This article belongs to the Special Issue Statistical Models and Their Applications)
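The shrinkage effect at the heart of Poisson ridge regression can be illustrated with scikit-learn's `PoissonRegressor`, whose `alpha` is an L2 penalty on the coefficients — the penalized-likelihood analogue of the ridge parameter k, not the paper's specific k-estimators. The collinear design below is a synthetic assumption.

```python
import numpy as np
from sklearn.linear_model import PoissonRegressor

rng = np.random.default_rng(0)

# Highly collinear design: four near-copies of one latent variable, the
# setting where ridge-type shrinkage matters for count models.
z = rng.normal(size=200)
X = np.column_stack([z + 0.05 * rng.normal(size=200) for _ in range(4)])
y = rng.poisson(np.exp(0.2 + 0.3 * X[:, 0]))

# L2-penalized Poisson GLM at a tiny and a substantial penalty: the larger
# penalty tames the wild coefficient splits that collinearity produces.
norms = {}
for k in (1e-6, 1.0):
    coef = PoissonRegressor(alpha=k, max_iter=1000).fit(X, y).coef_
    norms[k] = float(np.abs(coef).sum())
print(norms)
```

The paper's contribution concerns how to choose k from the data (and how to judge that choice beyond MSE); this sketch only shows what any given k does.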

18 pages, 2678 KB  
Article
Normalization of GC-MS Metabolomics Data in Adherent Cells: A Practical Comparison of Approaches
by Ilya Yu. Kurbatov, Svyatoslav V. Zakharov, Olga I. Kiseleva, Viktoriia A. Arzumanian, Igor V. Vakhrushev, Roza Yu. Saryglar, Victoria D. Novikova, Yan S. Kim and Ekaterina V. Poverennaya
Int. J. Mol. Sci. 2026, 27(7), 3219; https://doi.org/10.3390/ijms27073219 - 2 Apr 2026
Abstract
Data compatibility remains a major challenge in metabolomics, as commonly used measures of biological material—such as sample weight or cell count—are often poorly reproducible. Here, we systematically evaluated practical normalization strategies for GC × GC-MS-based metabolomic profiling of two widely used model cell lines: human hepatoblastoma (HepG2) and mesenchymal stromal cells (MSCs). We compared orthogonal biomass estimates, including total protein and double-stranded DNA quantified either directly in aliquots of the cell suspension lysate or in the post-extraction cell precipitate, alongside normalization based on extracted ion current (XIC). We also assessed three widely used extraction mixtures—methanol/chloroform/water (7:2:1); methanol/water (8:2); acetonitrile/isopropanol/water (3:3:2)—for metabolome coverage and normalization robustness. Under realistic biological variability, signal-to-biomass dependencies were moderate. In contrast, under strictly controlled conditions, DNA- and protein-based normalization yielded near-linear relationships with metabolite abundances (R2 > 0.90), demonstrating that biological variability is the dominant source of dispersion rather than technical factors. The methanol/chloroform/water system provided the broadest metabolome coverage and strongest correlation with injected biomass. Based on these findings, we recommend normalization to total precipitate protein or DNA using the methanol/chloroform/water extraction protocol, with XIC as a complementary quality control metric. Full article
(This article belongs to the Collection Advances in Cell and Molecular Biology)
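The two normalization families compared above reduce to simple per-sample scalings of the peak table: division by a biomass measure (protein or dsDNA) versus division by the total extracted ion current. The toy numbers below are illustrative, not the study's measurements.

```python
import numpy as np

# Toy peak table: rows = samples, columns = metabolites (arbitrary intensity units).
peaks = np.array([[120.0, 30.0, 50.0],
                  [240.0, 60.0, 100.0],     # same profile, 2x more biomass injected
                  [110.0, 45.0, 45.0]])
protein_ug = np.array([10.0, 20.0, 9.5])    # per-sample total precipitate protein

# Biomass normalization: intensity per microgram of protein (or dsDNA).
per_protein = peaks / protein_ug[:, None]

# XIC normalization: each sample scaled by its total extracted ion current,
# useful as a complementary QC metric when biomass measures are noisy.
per_xic = peaks / peaks.sum(axis=1, keepdims=True)

# Samples 0 and 1 differ only in injected biomass, so protein normalization
# makes their profiles coincide.
print(np.allclose(per_protein[0], per_protein[1]))
```

This is why a reliable biomass proxy matters: only it can distinguish "more material injected" from a genuine metabolic difference, which XIC scaling alone cannot.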

37 pages, 19817 KB  
Article
A New Exponential-Type Model Under Unified Progressive Hybrid Censoring: Computational Inference and Its Applications
by Refah Alotaibi and Ahmed Elshahhat
Mathematics 2026, 14(7), 1182; https://doi.org/10.3390/math14071182 - 1 Apr 2026
Abstract
A new odd exponential-type (NOT-Exp) distribution provides a flexible and analytically tractable framework for modeling lifetime data exhibiting non-constant hazard behaviors, including increasing, decreasing, bathtub-shaped, and unimodal forms, which are commonly observed in real-world reliability and survival studies. In this work, a comprehensive inferential methodology is developed for the NOT-Exp model under a unified progressive Type-II hybrid censoring, allowing several traditional censoring designs to be treated as special cases within a single unified structure. The main advantages of the proposed model lie in its ability to capture complex risk dynamics while maintaining mathematical simplicity, making it particularly suitable for censored lifetime data. Classical inference is conducted via maximum likelihood estimation, along with two asymptotic confidence interval constructions based on normal and log-normal approximations for both model parameters and reliability characteristics. In addition, a Bayesian estimation framework is introduced using independent gamma priors and Markov chain Monte Carlo techniques to obtain posterior estimates, credible intervals, and highest posterior density regions. Extensive simulations demonstrate the accuracy, stability, and robustness of the proposed estimators under varying sample sizes, censoring intensities, and prior specifications. Applications to airborne toxicological variation data and bank customer waiting times highlight the practical importance of the methodology, where the NOT-Exp model consistently outperforms twelve competing lifetime distributions according to standard goodness-of-fit criteria. These results confirm that the suggested design gives a strong and versatile tool for analyzing complex censored lifetime data across environmental and service-system applications. Full article
(This article belongs to the Special Issue Statistical Inference: Methods and Applications)
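The maximum-likelihood machinery under censoring can be illustrated on the simplest special case of the unified scheme — a plain exponential under Type-II censoring, where the estimate also has a closed form to check against. The NOT-Exp likelihood itself is more involved; this sketch only shows the censored-likelihood pattern, with made-up test parameters.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)

# n units on test, stop at the r-th failure (Type-II censoring), a special
# case of the unified progressive hybrid design treated in the paper.
n, r, true_rate = 50, 30, 0.5
t = np.sort(rng.exponential(1 / true_rate, size=n))[:r]

def neg_loglik(rate):
    # r observed failures contribute log-densities; the (n - r) surviving
    # units contribute log-survival at the r-th failure time.
    if rate <= 0:
        return np.inf
    return -(r * np.log(rate) - rate * (t.sum() + (n - r) * t[-1]))

res = minimize_scalar(neg_loglik, bounds=(1e-6, 10), method="bounded")
# Closed-form check for this special case: rate_hat = r / total time on test.
print(round(res.x, 3), round(r / (t.sum() + (n - r) * t[-1]), 3))
```

For the NOT-Exp model the same numeric optimization applies with its density and survival function substituted in; no closed form is then available.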

23 pages, 4073 KB  
Article
Robust Max-Half-Mchart Based on the Cellwise Minimum Covariance Determinant
by Syafi’ Bariq’ Syihabuddin Hidayatullah, Muhammad Ahsan and Wibawati
Processes 2026, 14(7), 1132; https://doi.org/10.3390/pr14071132 - 31 Mar 2026
Abstract
One of the main tools in Statistical Process Control (SPC) for monitoring quality is the control chart. The Max-Half-Mchart is a Shewhart-type simultaneous multivariate control chart designed to detect shifts in both process mean and variability. However, outliers can distort the estimation of process parameters used to set control limits, leading to masking and swamping effects. Recent studies have highlighted the importance of cellwise contamination, which can reduce the effectiveness of casewise robust estimators. To overcome this limitation, this study develops a robust Max-Half-Mchart using the cellwise Minimum Covariance Determinant (cellMCD) estimator for location and covariance estimation. The proposed chart was evaluated through simulation studies, average run length analysis, and applications to synthetic and real OPC cement quality data. Simulation results under different correlation levels and contamination proportions show that the proposed chart provides more stable outlier detection performance than the conventional Max-Half-Mchart and the Fast-MCD-based Max-Half-Mchart, with better discrimination between normal and contaminated observations. The ARL analysis also indicates faster detection of small to moderate shifts. In the synthetic-data application, it achieved an Accuracy of 0.9899 and an AUC of 0.9939 under 20% contamination, and in the real-data application it detected seven out-of-control signals. Overall, the findings demonstrate that incorporating cellMCD into the Max-Half-Mchart provides a more robust and effective approach for multivariate process monitoring under cellwise contamination. Full article

21 pages, 7358 KB  
Article
Climate-Smart Framework for Olive Yield Estimation: Integrating Soil Properties, Thermal Time, and Remote Sensing NDVI Time Series
by Rosa Gutiérrez-Cabrera, Javier Borondo and Ana Maria Tarquis
Agronomy 2026, 16(7), 722; https://doi.org/10.3390/agronomy16070722 - 30 Mar 2026
Abstract
Olive groves in Mediterranean regions are being increasingly exposed to drought and heat extremes, intensifying the interannual yield variability. This study presents an integrated smart-farming framework that links soil context, climate forcing and satellite-observed canopy dynamics to enhance the interpretability and transferability of yield indicators at the parcel scale in southern Spain. Using SoilGrids root-zone properties and the Sentinel-2 time series of the normalized difference vegetation index (NDVI), we first classified parcels into three edaphic clusters. The canopy development was then expressed in thermal time using growing degree days (GDD), enabling phenology-aligned comparisons across campaigns. Two robust patterns emerged: (i) the cumulative NDVI up to 520 GDD showed a consistent negative association with both the biomass and the oil yield, suggesting an early-season vegetation trade-off and carry-over effects typical of perennial systems, and (ii) the rainfall accumulated during a thermally defined window (120–480 GDD) strongly predicted the yield in the subsequent year (R2=0.83–0.97 across soil clusters). By anchoring both vegetation and precipitation indicators to physiologically meaningful thermal milestones, the proposed framework avoids arbitrary calendar windows and enhances the interpretability, cross-year comparability, and scalability. Under projected increases in drought frequency and heat extremes, such hydro-thermal scaling approaches offer a robust basis for early yield forecasting, cooperative-level production planning, and adaptive management in Mediterranean olive systems. Full article
(This article belongs to the Special Issue Smart Farming: Advancing Techniques for High-Value Crops)
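The thermal-time scale used above is the standard growing-degree-day accumulation: each day contributes max(0, (Tmax + Tmin)/2 − Tbase). The base temperature and the toy temperatures below are illustrative defaults, not the study's calibration.

```python
import numpy as np

def growing_degree_days(t_max, t_min, t_base=10.0):
    """Accumulated thermal time: sum of max(0, (Tmax + Tmin)/2 - Tbase) per day.

    t_base = 10 C is used here as an illustrative default; crop-specific
    values would be calibrated in practice.
    """
    daily = np.maximum(0.0, (np.asarray(t_max) + np.asarray(t_min)) / 2 - t_base)
    return np.cumsum(daily)

# Five toy days; a day whose mean temperature falls below Tbase contributes nothing.
gdd = growing_degree_days([22, 25, 14, 30, 28], [10, 13, 4, 16, 14])
print(gdd.tolist())
```

Windows such as the paper's 120–480 GDD interval are then defined on this cumulative axis rather than on calendar dates, which is what makes comparisons phenology-aligned across years.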

25 pages, 260979 KB  
Article
RDAH-Net: Bridging Relative Depth and Absolute Height for Monocular Height Estimation in Remote Sensing
by Liting Jiang, Feng Wang, Niangang Jiao, Jingxing Zhu, Yuming Xiang and Hongjian You
Remote Sens. 2026, 18(7), 1024; https://doi.org/10.3390/rs18071024 - 29 Mar 2026
Abstract
Generating high-precision normalized digital surface models (nDSMs) from a single remote sensing image remains a challenging and ill-posed problem due to the absence of reliable geometric constraints. In this work, we show that monocular depth provides structurally stable cues of local geometry but lacks the global scale and vertical reference required for absolute height recovery. This intrinsic mismatch limits direct depth-to-height regression, particularly when transferring across heterogeneous terrains, land-cover compositions, and imaging conditions. Building on this idea, we propose the Relative Depth–Absolute Height Prediction Network (RDAH-Net), a framework that exploits relative depth as a geometry-aware prior while learning terrain-dependent height mappings from image appearance to absolute height. As the backbone, we employ a lightweight MobileNetV2 enhanced with a Convolutional Block Attention Module (CBAM), and further incorporate a cross-modal bidirectional attention fusion scheme with positional encoding to achieve a deep and effective fusion of image appearance and depth prior cues. Finally, a PixelShuffle-based upsampling strategy is used to sharpen prediction details and mitigate typical upsampling artifacts. Extensive experiments across diverse regions demonstrate that RDAH-Net achieves robust and generalizable height estimation, providing a practical alternative for large-scale mapping and rapid update scenarios. Full article

23 pages, 26982 KB  
Article
Free Space Estimation Based on Superpixel Clustering for Assisted Driving
by Oswaldo Vitales, Ruth Aguilar-Ponce and Javier Vigueras
Sensors 2026, 26(7), 2120; https://doi.org/10.3390/s26072120 - 29 Mar 2026
Abstract
Free space detection in assisted driving applications is essential to provide vehicles with information about traversable surfaces and potential obstacles to be avoided. The current trend in free space detection favors deep learning techniques. However, deep neural networks require extensive training covering as many scenarios as possible, which makes it difficult to build a model that generalizes to all types of surfaces. Additionally, their lack of explainability contrasts with the growing interest in geometrically grounded and safety-oriented design principles for autonomous vehicle systems. To address these limitations, we propose a geometric approach based on coplanarity conditions and normal vector estimation, removing the dependence on datasets for different surface types. Additionally, the stereoscopic images are segmented into superpixels, which shortens processing times and exploits the spatial and color information within each superpixel to increase the robustness of the three-dimensional reconstruction of the scene. Experimental results show that the proposed superpixel-based approach achieves competitive performance compared to unsegmented dense stereo methods while significantly reducing algorithmic complexity. These results demonstrate the viability of integrating superpixel clustering into stereo-based free space estimation frameworks. Full article
(This article belongs to the Section Vehicular Sensing)
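The coplanarity and normal-vector tests at the core of this approach can be sketched in a few lines of pure Python. The helper names and the tolerance-based coplanarity criterion are illustrative assumptions, not the paper's actual formulation:

```python
import math

def plane_normal(p, q, r):
    """Unit normal of the plane through three 3-D points,
    computed as the normalized cross product (q - p) x (r - p)."""
    u = [q[i] - p[i] for i in range(3)]
    v = [r[i] - p[i] for i in range(3)]
    n = [u[1] * v[2] - u[2] * v[1],
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0]]
    norm = math.sqrt(sum(c * c for c in n))
    return [c / norm for c in n]

def is_coplanar(points, normal, origin, tol=1e-6):
    """Coplanarity test: every point's signed distance to the plane
    (origin, normal), i.e. n . (p - origin), must stay below tol."""
    return all(
        abs(sum(normal[i] * (p[i] - origin[i]) for i in range(3))) < tol
        for p in points
    )
```

A near-vertical normal combined with a passed coplanarity test would mark a superpixel's reconstructed points as part of a traversable ground plane; a tilted normal or large plane residuals would flag an obstacle.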

18 pages, 464 KB  
Article
Lower Bounds for the Asymptotic Relative Efficiency of Huber Regression
by Xiaoyi Wang and Le Zhou
Mathematics 2026, 14(7), 1138; https://doi.org/10.3390/math14071138 - 28 Mar 2026
Abstract
Huber regression serves as a prominent robust alternative to ordinary least squares (OLS), particularly in the presence of heavy-tailed error distributions. While the asymptotic relative efficiency (ARE) of Huber regression is well documented for the standard normal distribution, its worst-case efficiency across the class of all continuous and symmetric error distributions remains an important theoretical question. In this paper, we establish positive lower bounds for the ARE of Huber regression relative to OLS. By strategically selecting the robustification parameter based on the moments or quantiles of the error distribution, we first prove that the ARE is uniformly bounded away from zero across all continuous and symmetric error distributions. This result guarantees a baseline level of efficiency for Huber regression, sharing a similar theoretical spirit with the celebrated lower bound of the Wilcoxon rank estimator. Utilizing empirical process theory, we further establish that the relative efficiency of Huber regression remains unchanged if the theoretical tuning parameter is replaced by an estimator with a suitable convergence rate. Simulation studies are conducted to examine the performance of Huber regression under the proposed tuning strategies. Full article
(This article belongs to the Special Issue Computational Statistics and Data Analysis, 3rd Edition)
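The Huber loss and a quantile-based choice of the robustification parameter can be sketched in pure Python. The rho function below is Huber's standard definition; the quantile tuning rule is a hypothetical illustration of "selecting the robustification parameter based on ... quantiles of the error distribution", not the paper's exact strategy:

```python
def huber_loss(residual, c):
    """Huber's rho function: quadratic for |r| <= c, linear beyond.

    c is the robustification parameter; c -> infinity recovers the OLS
    loss, while c -> 0 approaches least absolute deviations.
    """
    a = abs(residual)
    return 0.5 * a * a if a <= c else c * (a - 0.5 * c)

def quantile_tuned_c(residuals, q=0.8):
    """Hypothetical tuning rule: set c to the q-th empirical quantile of
    the absolute residuals, so a fixed fraction of observations stays in
    the quadratic (OLS-like) regime regardless of tail heaviness.
    """
    s = sorted(abs(r) for r in residuals)
    k = min(len(s) - 1, int(q * len(s)))
    return s[k]
```

Tying c to a quantile rather than a fixed constant is what makes the guarantee distribution-free in spirit: heavy tails inflate the quantile, enlarging the quadratic region in step with the error scale.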
