Search Results (441)

Search Parameters:
Keywords = joint probability distribution

18 pages, 4451 KB  
Article
Radar Target Detection Based on Linear Fusion of Two Features
by Yong Huang, Yunhao Luan, Yunlong Dong and Hao Ding
Sensors 2025, 25(17), 5436; https://doi.org/10.3390/s25175436 - 2 Sep 2025
Abstract
The joint detection of multiple features significantly enhances radar's ability to detect weak targets on the sea surface. However, issues such as large data requirements and the lack of robustness in high-dimensional decision spaces severely constrain the detection performance and applicability of such methods. To address this, this paper proposes a radar target detection method based on the linear fusion of two features, approached from the perspective of feature dimension reduction. First, a two-feature linear dimensionality reduction method based on distribution compactness is designed to form a fused feature. Then, the generalized extreme value (GEV) distribution is used to model the tail of the probability density function (PDF) of the fused feature, yielding an asymptotic constant false alarm rate (CFAR) detector. Finally, the detection performance of this detector is comparatively analyzed using measured data.
(This article belongs to the Section Radar Sensors)
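
The tail-modeling step named in the abstract can be illustrated with a small sketch (not taken from the paper): fit a GEV model to the upper tail of a clutter-only fused-feature sample and read off a threshold for a target false-alarm rate. The feature distribution, block size, and Pfa below are hypothetical stand-ins.

```python
# Illustrative sketch (not the paper's detector): GEV tail fit plus a false-alarm threshold.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)
clutter_feature = rng.gamma(shape=2.0, scale=1.0, size=100_000)   # clutter-only fused feature (hypothetical)

# Work with block maxima so the GEV family is the appropriate tail model.
block_maxima = clutter_feature.reshape(-1, 100).max(axis=1)
shape, loc, scale = genextreme.fit(block_maxima)

pfa = 1e-3                                                         # desired per-block false-alarm rate
threshold = genextreme.ppf(1.0 - pfa, shape, loc=loc, scale=scale)
print(f"GEV shape = {shape:.3f}, detection threshold = {threshold:.3f}")

# A cell under test is declared a target when its fused feature exceeds the threshold.
print("empirical per-block false-alarm fraction:", (block_maxima > threshold).mean())
```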

54 pages, 22294 KB  
Article
Research on Risk Evolution Probability of Urban Lifeline Natech Events Based on MdC-MCMC
by Shifeng Li and Yu Shang
Sustainability 2025, 17(17), 7664; https://doi.org/10.3390/su17177664 - 25 Aug 2025
Viewed by 669
Abstract
Urban lifeline Natech events are coupled systems composed of multiple risks and entities with complex dynamic transmission chains. Predicting risk evolution probabilities is the core task in managing the safety of such events. First, the risk evolution mechanism is analyzed: urban lifeline Natech events exhibit spatial evolution characteristics, so the parallel and synergistic effects of risk evolution in the spatial dimension are dissected. Next, after fitting marginal probability distribution functions for natural hazard and urban lifeline risk evolution, a Multi-dimensional Copula (MdC) function for the joint probability distribution of risk evolution is constructed. Building upon the MdC function, a Markov Chain Monte Carlo (MCMC) model for predicting risk evolution probabilities is developed using the Metropolis–Hastings (M-H) algorithm and Gibbs sampling. Finally, taking the 2021 Zhengzhou '7·20' catastrophic rainstorm as a case study, joint probability distribution functions for risk evolution under rainfall–wind speed scenarios are fitted for the traffic, electric, communication, water supply, and drainage systems (including different risk transmission chains). Numerical simulations of the joint probability distributions for risk evolution are conducted, and visualizations of the joint probability predictions are generated.
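
As a rough illustration of the copula-plus-MCMC workflow described above (not the authors' MdC-MCMC code), the sketch below fits marginals for two hazard drivers, couples them with a Gaussian copula as a stand-in for the multi-dimensional copula, and samples the joint density with a random-walk Metropolis–Hastings step; the data, marginal families, and correlation value are assumed.

```python
# Illustrative sketch: marginal fits + Gaussian copula + random-walk Metropolis-Hastings sampling.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
rain = rng.gamma(2.0, 30.0, size=500)          # hypothetical rainfall record (mm)
wind = rng.weibull(2.0, size=500) * 10.0       # hypothetical wind-speed record (m/s)

a, loc_r, scale_r = stats.gamma.fit(rain, floc=0)
c_w, loc_w, scale_w = stats.weibull_min.fit(wind, floc=0)
rho = np.corrcoef(stats.norm.ppf(stats.gamma.cdf(rain, a, loc_r, scale_r)),
                  stats.norm.ppf(stats.weibull_min.cdf(wind, c_w, loc_w, scale_w)))[0, 1]

def log_joint(x):
    r, w = x
    if r <= 0 or w <= 0:
        return -np.inf
    u = stats.gamma.cdf(r, a, loc_r, scale_r)
    v = stats.weibull_min.cdf(w, c_w, loc_w, scale_w)
    z1, z2 = stats.norm.ppf(u), stats.norm.ppf(v)
    log_copula = (-0.5 * np.log(1 - rho**2)
                  - (rho**2 * (z1**2 + z2**2) - 2 * rho * z1 * z2) / (2 * (1 - rho**2)))
    return (log_copula
            + stats.gamma.logpdf(r, a, loc_r, scale_r)
            + stats.weibull_min.logpdf(w, c_w, loc_w, scale_w))

# Random-walk Metropolis-Hastings over (rainfall, wind speed).
x = np.array([rain.mean(), wind.mean()])
samples = []
for _ in range(20_000):
    prop = x + rng.normal(scale=[5.0, 1.0])
    if np.log(rng.uniform()) < log_joint(prop) - log_joint(x):
        x = prop
    samples.append(x)
samples = np.array(samples)[5_000:]            # discard burn-in
print("P(rain > 100 mm and wind > 15 m/s) ~", np.mean((samples[:, 0] > 100) & (samples[:, 1] > 15)))
```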

26 pages, 62819 KB  
Article
Low-Light Image Dehazing and Enhancement via Multi-Feature Domain Fusion
by Jiaxin Wu, Han Ai, Ping Zhou, Hao Wang, Haifeng Zhang, Gaopeng Zhang and Weining Chen
Remote Sens. 2025, 17(17), 2944; https://doi.org/10.3390/rs17172944 - 25 Aug 2025
Viewed by 529
Abstract
The acquisition of nighttime remote-sensing visible-light images is often accompanied by low-illumination effects and haze interference, resulting in significant image quality degradation and greatly affecting subsequent applications. Existing low-light enhancement and dehazing algorithms can handle each problem individually, but their simple cascade cannot effectively address unknown real-world degradations. Therefore, we design a joint processing framework, WFDiff, which fully exploits the advantages of Fourier–wavelet dual-domain features and innovatively integrates the inverse diffusion process through differentiable operators to construct a multi-scale degradation collaborative correction system. Specifically, in the reverse diffusion process, a dual-domain feature interaction module is designed, and the joint probability distribution of the generated image and real data is constrained through differentiable operators: on the one hand, a global frequency-domain prior is established by jointly constraining Fourier amplitude and phase, effectively maintaining the radiometric consistency of the image; on the other hand, wavelets are used to capture high-frequency details and edge structures in the spatial domain to improve the prediction process. On this basis, a cross-overlapping-block adaptive smoothing estimation algorithm is proposed, which achieves dynamic fusion of multi-scale features through a differentiable weighting strategy, effectively solving the problem of restoring images of different sizes and avoiding local inconsistencies. In view of the current lack of remote-sensing data for low-light haze scenarios, we constructed the Hazy-Dark dataset. Physical experiments and ablation experiments show that the proposed method outperforms existing single-task or simple cascade methods in terms of image fidelity, detail recovery capability, and visual naturalness, providing a new paradigm for remote-sensing image processing under coupled degradations.
(This article belongs to the Section AI Remote Sensing)
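
A minimal sketch of the Fourier-domain consistency idea mentioned above (not the WFDiff implementation): penalize amplitude and phase differences between a restored image and a reference. Tensor shapes, loss weights, and the training setup are assumptions.

```python
# Illustrative sketch: a Fourier amplitude/phase consistency term for image restoration training.
import torch

def fourier_amplitude_phase_loss(pred: torch.Tensor, target: torch.Tensor,
                                 w_amp: float = 1.0, w_phase: float = 0.1) -> torch.Tensor:
    """pred, target: (B, C, H, W) image batches."""
    pred_f = torch.fft.rfft2(pred, norm="ortho")
    targ_f = torch.fft.rfft2(target, norm="ortho")
    amp_loss = torch.mean(torch.abs(pred_f.abs() - targ_f.abs()))
    # Note: a production loss would handle 2*pi phase wrap-around explicitly.
    phase_loss = torch.mean(torch.abs(torch.angle(pred_f) - torch.angle(targ_f)))
    return w_amp * amp_loss + w_phase * phase_loss

# Usage: add the term to a pixel-space loss during training of the restoration network.
pred = torch.rand(2, 3, 64, 64, requires_grad=True)
target = torch.rand(2, 3, 64, 64)
loss = torch.nn.functional.l1_loss(pred, target) + fourier_amplitude_phase_loss(pred, target)
loss.backward()
print(float(loss))
```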

22 pages, 10891 KB  
Article
DNS Study of Freely Propagating Turbulent Lean-Premixed Flames with Low-Temperature Chemistry in the Broken Reaction Zone Regime
by Yi Zhang, Yinhu Kang, Xiaomei Huang, Pengyuan Zhang and Xiaolin Tang
Energies 2025, 18(16), 4357; https://doi.org/10.3390/en18164357 - 15 Aug 2025
Viewed by 349
Abstract
Modern high-efficiency engines operate under extreme superpressure, supercritical, and supersonic conditions that fall within the broken reaction zone regime. In this article, the propagation and heat/radical diffusion physics of a high-pressure dimethyl ether (DME)/air turbulent lean-premixed flame are investigated numerically by direct numerical simulation (DNS). A wide range of statistical and diagnostic methods, including Lagrangian fluid tracking, the joint probability density function (JPDF), and chemical explosive mode analysis (CEMA), is applied to reveal the local combustion modes and their dynamic evolution, as well as the roles of heat/mass transport and cool/hot flame interaction in turbulent combustion, which is beneficial to the design of high-performance engines. It is found that three-staged combustion, comprising cool-flame, warm-flame, and hot-flame fronts, is a distinctive behavior of DME flames under elevated-pressure, lean-premixed conditions. In the broken reaction zone regime, the reaction zone thickness increases remarkably, and the heat release rate (HRR) and fuel consumption rate in the cool-flame zone are increased by 16% and 19%, respectively. The diffusion effect not only enhances flame propagation but also suppresses the local HRR or fuel consumption. Strong turbulence interacting with diffusive transport is the underlying physics behind the enhancements at the cool- and hot-flame fronts. The dominant diffusive sub-processes are identified with the aid of the diffusion index.
(This article belongs to the Section I2: Energy and Combustion Science)
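
For readers unfamiliar with JPDF diagnostics, the sketch below (not the paper's post-processing code) estimates a joint PDF of two flame quantities, such as temperature and heat release rate, from sampled field values; the synthetic samples stand in for DNS cell data.

```python
# Illustrative sketch: empirical joint PDF of two flame diagnostics from sampled field data.
import numpy as np

rng = np.random.default_rng(2)
temperature = rng.normal(1200.0, 250.0, size=50_000)              # K (hypothetical)
hrr = np.exp(0.004 * temperature + rng.normal(0.0, 0.3, 50_000))  # heat release rate, arbitrary units

# density=True normalizes the counts so the histogram integrates to 1 over the plane.
jpdf, t_edges, h_edges = np.histogram2d(temperature, hrr, bins=[80, 80], density=True)

# The conditional mean of HRR at each temperature bin is a common reduction of the JPDF.
h_centers = 0.5 * (h_edges[:-1] + h_edges[1:])
row_mass = jpdf.sum(axis=1, keepdims=True)
cond_mean_hrr = (jpdf * h_centers).sum(axis=1, keepdims=True) / np.where(row_mass > 0, row_mass, 1)
print("peak JPDF value:", jpdf.max())
```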

37 pages, 414 KB  
Article
Comparisons Between Frequency Distributions Based on Gini’s Approach: Principal Component Analysis Addressed to Time Series
by Pierpaolo Angelini
Econometrics 2025, 13(3), 32; https://doi.org/10.3390/econometrics13030032 - 13 Aug 2025
Viewed by 481
Abstract
In this paper, time series of length T are seen as frequency distributions. Each distribution is defined with respect to a statistical variable having T observed values. A methodological system based on Gini's approach is put forward, so the statistical model through which time series are handled is a frequency distribution studied inside a linear system. In addition to the starting frequency distributions that are observed, other frequency distributions are treated. Thus, marginal distributions based on the notion of proportionality are introduced together with joint distributions. Both distributions are statistical models. A fundamental invariance property related to marginal distributions is made explicit in this research work, so one can focus on collections of marginal frequency distributions, identifying multiple frequency distributions. For this reason, the latter is studied via a tensor. As frequency distributions are practical realizations of nonparametric probability distributions over R, one passes from frequency distributions to discrete random variables. In this paper, a mathematical model that generates time series is put forward. It is a stochastic process based on subjective previsions of random variables. A subdivision of the exchangeability of variables of a statistical nature is shown, so a reinterpretation of principal component analysis that is based on the notion of proportionality also characterizes this research work.
15 pages, 4207 KB  
Article
Impact Analysis of Inter-Basin Water Transfer on Water Shortage Risk in the Baiyangdian Area
by Yuhang Shi, Lixin Zhang and Jinping Zhang
Water 2025, 17(15), 2311; https://doi.org/10.3390/w17152311 - 4 Aug 2025
Viewed by 393
Abstract
This study quantitatively assesses the water shortage risk (WSR) in the Baiyangdian area under the Inter-Basin Water Transfer (IBWT) project, focusing on the impact of water transfer on regional water security. The actual evapotranspiration (ETa) is calculated, and the correlation simulation using Archimedean copula functions is implemented in Python 3.7.1, with optimization using the sum of squares of deviations (OLS) and the AIC criterion. A joint distribution model between ETa and three water supply scenarios is constructed. Key findings include (1) ETa increased by 27.3% after the water transfer, far exceeding the slight increase in water supply before the transfer; (2) various Archimedean copulas effectively capture the dependence and joint probability distribution between water supply and ETa; (3) water shortage risk increased after the water transfer, with rainfall and upstream water unable to alleviate the problem in Baiyangdian; and (4) cross-basin water transfer reduced the risk, with an 8.90% reduction in the total probability of three key water resource scheduling combinations. This study establishes a copula-based framework for water shortage risk assessment, providing a scientific basis for water allocation strategies in ecologically sensitive areas affected by human activities.
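
A minimal sketch of the copula-selection step described above (not the paper's code): fit Clayton and Frank Archimedean copulas to pseudo-observations of water supply and ETa by maximum likelihood and keep the family with the smaller AIC; the data are synthetic placeholders.

```python
# Illustrative sketch: maximum-likelihood fit of two Archimedean copulas and AIC-based selection.
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import rankdata

rng = np.random.default_rng(3)
supply = rng.gamma(5.0, 20.0, size=300)
eta = 0.6 * supply + rng.normal(0.0, 15.0, size=300)    # hypothetical dependence on supply

# Pseudo-observations: empirical CDF values strictly inside (0, 1).
u = rankdata(supply) / (len(supply) + 1)
v = rankdata(eta) / (len(eta) + 1)

def clayton_logpdf(theta):
    return (np.log1p(theta) - (1 + theta) * (np.log(u) + np.log(v))
            - (2 + 1 / theta) * np.log(u**(-theta) + v**(-theta) - 1))

def frank_logpdf(theta):
    num = np.log(theta) + np.log1p(-np.exp(-theta)) - theta * (u + v)
    den = 2 * np.log((1 - np.exp(-theta)) - (1 - np.exp(-theta * u)) * (1 - np.exp(-theta * v)))
    return num - den

results = {}
for name, logpdf, bounds in [("Clayton", clayton_logpdf, (0.01, 20.0)),
                             ("Frank", frank_logpdf, (0.01, 30.0))]:
    fit = minimize_scalar(lambda th: -np.sum(logpdf(th)), bounds=bounds, method="bounded")
    results[name] = {"theta": fit.x, "aic": 2 * 1 + 2 * fit.fun}   # one copula parameter
for name, r in results.items():
    print(f"{name}: theta = {r['theta']:.2f}, AIC = {r['aic']:.1f}")
```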

32 pages, 12348 KB  
Article
Advances in Unsupervised Parameterization of the Seasonal–Diurnal Surface Wind Vector
by Nicholas J. Cook
Meteorology 2025, 4(3), 21; https://doi.org/10.3390/meteorology4030021 - 29 Jul 2025
Viewed by 272
Abstract
The Offset Elliptical Normal (OEN) mixture model represents the seasonal–diurnal surface wind vector for wind engineering design applications. This study upgrades the parameterization of OEN by accounting for changes in format of the global database of surface observations, improving performance by eliminating manual supervision and extending the scope of the model to include skewness. The previous coordinate transformation of binned speed and direction, used to evaluate the joint probability distributions of the wind vector, is replaced by direct kernel density estimation. The slow process of sequentially adding additional components is replaced by initializing all components together using fuzzy clustering. The supervised process of sequencing each mixture component through time is replaced by a fully automated unsupervised process using pattern matching. Previously reported departures from normal in the tails of the fuzzy-demodulated OEN orthogonal vectors are investigated by directly fitting the bivariate skew generalized t distribution, showing that the small observed skew is likely real but that the observed kurtosis is an artefact of the demodulation process, leading to a new Offset Skew Normal mixture model. The supplied open-source R scripts fully automate parametrization for locations in the NCEI Integrated Surface Hourly global database of wind observations.
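
The kernel density estimation step can be illustrated as follows (a sketch, not the paper's open-source R scripts, and written in Python for consistency with the other examples here): estimate the joint density of the orthogonal wind-vector components directly from hourly records; the records below are synthetic.

```python
# Illustrative sketch: direct 2-D kernel density estimation of the wind-vector components (u, v).
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(4)
speed = rng.weibull(2.0, size=5_000) * 6.0                      # m/s, hypothetical
direction = rng.vonmises(mu=np.pi / 4, kappa=1.0, size=5_000)   # radians, hypothetical
u_comp = speed * np.cos(direction)
v_comp = speed * np.sin(direction)

kde = gaussian_kde(np.vstack([u_comp, v_comp]))                 # bandwidth by Scott's rule

# Evaluate the joint density on a grid, e.g. to seed mixture-component initialization.
grid_u, grid_v = np.meshgrid(np.linspace(-20, 20, 101), np.linspace(-20, 20, 101))
density = kde(np.vstack([grid_u.ravel(), grid_v.ravel()])).reshape(grid_u.shape)
print("density integrates to ~", density.sum() * (0.4 ** 2))    # grid cell = 0.4 x 0.4 m/s
```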

20 pages, 2538 KB  
Article
Research on Long-Term Scheduling Optimization of Water–Wind–Solar Multi-Energy Complementary System Based on DDPG
by Zixing Wan, Wenwu Li, Mu He, Taotao Zhang, Shengzhe Chen, Weiwei Guan, Xiaojun Hua and Shang Zheng
Energies 2025, 18(15), 3983; https://doi.org/10.3390/en18153983 - 25 Jul 2025
Viewed by 362
Abstract
To address the challenges of high complexity in modeling the correlation of multi-dimensional stochastic variables and the difficulty of solving long-term scheduling models in continuous action spaces in multi-energy complementary systems, this paper proposes a long-term optimization scheduling method based on Deep Deterministic Policy Gradient (DDPG). First, an improved C-Vine Copula model is used to construct the multi-dimensional joint probability distribution of water, wind, and solar energy, and Latin Hypercube Sampling (LHS) is employed to generate a large number of water–wind–solar coupling scenarios, effectively reducing the model's complexity. Then, a long-term optimization scheduling model is established with the goal of maximizing the absorption of clean energy, and it is converted into a Markov Decision Process (MDP). Next, the DDPG algorithm, equipped with a dynamic noise adjustment mechanism, is employed to optimize the policy in continuous action spaces, yielding the optimal long-term scheduling strategy for the water–wind–solar multi-energy complementary system. Finally, using a water–wind–solar integrated energy base as a case study, comparative analysis demonstrates that the proposed method improves renewable energy absorption and the system's power generation efficiency by accurately quantifying the uncertainties of water, wind, and solar energy and precisely controlling the continuous action space during the scheduling process.
(This article belongs to the Section B: Energy and Environment)
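
As a simple stand-in for the scenario-generation step (not the paper's C-Vine model), the sketch below draws Latin Hypercube samples, imposes dependence through a Gaussian copula, and maps the result through assumed marginals for inflow, wind, and solar; all parameter values are hypothetical.

```python
# Illustrative sketch: correlated water-wind-solar scenarios via LHS and a Gaussian copula.
import numpy as np
from scipy import stats
from scipy.stats import qmc

corr = np.array([[1.0, -0.3, -0.2],
                 [-0.3, 1.0, 0.1],
                 [-0.2, 0.1, 1.0]])          # inflow, wind, solar correlation (assumed)
chol = np.linalg.cholesky(corr)

sampler = qmc.LatinHypercube(d=3, seed=5)
u = sampler.random(n=1_000)                  # stratified uniforms in (0, 1)^3
# The rotation partially relaxes strict per-dimension stratification; a vine-copula
# sampler would instead condition dimension by dimension.
z = stats.norm.ppf(u) @ chol.T               # impose Gaussian-copula dependence
u_dep = stats.norm.cdf(z)

inflow = stats.gamma.ppf(u_dep[:, 0], a=4.0, scale=250.0)      # m^3/s, hypothetical marginal
wind = stats.weibull_min.ppf(u_dep[:, 1], c=2.2, scale=8.0)    # m/s
solar = stats.beta.ppf(u_dep[:, 2], a=2.0, b=3.0) * 1000.0     # W/m^2

scenarios = np.column_stack([inflow, wind, solar])
print(scenarios.shape, scenarios.mean(axis=0))
```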

24 pages, 6142 KB  
Article
Variability of Summer Drought and Heatwave Events in Northeast China
by Rui Wang, Longpeng Cong, Ying Sun and Xiaotian Bai
Sustainability 2025, 17(14), 6569; https://doi.org/10.3390/su17146569 - 18 Jul 2025
Viewed by 398
Abstract
As global climate change intensifies, extreme climate events are becoming more frequent, presenting significant challenges to socioeconomic systems and ecosystems. Northeast China, a region highly sensitive to climate change, has been profoundly impacted by compound drought and heat extremes (CDHEs), affecting agriculture, society, and the economy. To evaluate the characteristics and evolution of summer CDHEs in this region, this study analyzed observational data from 81 meteorological stations (1961–2020) and developed a Standardized Temperature–Precipitation Index (STPI) using the Copula joint probability method. The STPI's effectiveness in characterizing compound drought and heat conditions was validated against historical records. Using the constructed STPI, this study conducted a comprehensive analysis of the spatiotemporal distribution of CDHEs. The Theil–Sen median trend analysis, Mann–Kendall trend tests, and the frequency of CDHEs were employed to examine drought and heatwave patterns and their influence on compound events. The findings demonstrated an increase in the severity of compound drought and heat events over time. Although the STPI exhibited a slight interannual decline, its values remained above −2.0, indicating the continued intensification of these events in the study area. Most of the stations showed a non-significant decline in the Standardized Precipitation Index and a significant rise in the Standardized Temperature Index, indicating that rising temperatures primarily drive the increasing severity of compound drought and heat events. The 1990s marked a turning point with a significant increase in the frequency, severity, and spatial extent of these events.
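
The general recipe behind a copula-based standardized index can be sketched as follows (the paper's exact STPI construction may differ): compute a joint "at least this dry and at least this hot" probability and map it through the inverse standard normal, as in SPI-style indices; the station series and marginal families are hypothetical.

```python
# Illustrative sketch: a standardized compound dry-hot index from a joint exceedance probability.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
years = np.arange(60)
precip = rng.gamma(2.0, 40.0, size=60)                          # summer precipitation (mm)
temp = 22.0 + 0.03 * years + rng.normal(0.0, 1.0, size=60)      # summer mean temperature (deg C)

p_dry = stats.gamma.cdf(precip, *stats.gamma.fit(precip, floc=0))   # P(precip <= observed)
p_hot = stats.norm.cdf(temp, *stats.norm.fit(temp))                 # P(temp <= observed)

# Empirical joint probability of a year being at least this dry AND at least this hot,
# standing in for the fitted copula used in the study.
joint = np.array([np.mean((p_dry <= pd) & (p_hot >= ph)) for pd, ph in zip(p_dry, p_hot)])
joint = np.clip(joint, 1e-3, 1 - 1e-3)

stpi = stats.norm.ppf(joint)    # more negative = rarer, i.e. more severe compound dry-hot year
trend = np.polyfit(years, stpi, 1)[0]
print(f"STPI trend: {trend:+.4f} per year (negative = compounding severity increasing)")
```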

22 pages, 2492 KB  
Article
VJDNet: A Simple Variational Joint Discrimination Network for Cross-Image Hyperspectral Anomaly Detection
by Shiqi Wu, Xiangrong Zhang, Guanchun Wang, Puhua Chen, Jing Gu, Xina Cheng and Licheng Jiao
Remote Sens. 2025, 17(14), 2438; https://doi.org/10.3390/rs17142438 - 14 Jul 2025
Viewed by 301
Abstract
To enhance the generalization of networks and avoid redundant training efforts, cross-image hyperspectral anomaly detection (HAD) based on deep learning has been gradually studied in recent years. Cross-image HAD aims to perform anomaly detection on unknown hyperspectral images after a single training process on the network, thereby improving detection efficiency in practical applications. However, the existing approaches may require additional supervised information or stacking of networks to improve model performance, which may impose high demands on data or hardware in practical applications. In this paper, a simple and lightweight unsupervised cross-image HAD method called Variational Joint Discrimination Network (VJDNet) is proposed. We leverage the reconstruction and distribution representation ability of the variational autoencoder (VAE), learning the global and local discriminability of anomalies jointly. To integrate these representations from the VAE, a probability distribution joint discrimination (PDJD) module is proposed. Through the PDJD module, the VJDNet can directly output the anomaly score mask of pixels. To further facilitate the unsupervised paradigm, a sample pair generation module is proposed, which is able to generate anomaly samples and background representation samples tailored for the cross-image HAD task. The experimental results show that the proposed method is able to maintain the detection accuracy with only a small number of parameters.

24 pages, 3798 KB  
Article
A Robust Tracking Method for Aerial Extended Targets with Space-Based Wideband Radar
by Linlin Fang, Yuxin Hu, Lihua Zhong and Lijia Huang
Remote Sens. 2025, 17(14), 2360; https://doi.org/10.3390/rs17142360 - 9 Jul 2025
Viewed by 257
Abstract
Space-based radar systems offer significant advantages for air surveillance, including wide-area coverage and extended early-warning capabilities. The integrated design of detection and imaging in space-based wideband radar further enhances its accuracy. However, in the wideband tracking mode, large aircraft targets exhibit extended characteristics, with measurements from the same target spanning multiple range resolution cells. Additionally, the nonlinear observation model and uncertain measurement noise characteristics under space-based long-distance observation substantially increase the tracking complexity. To address these challenges, we propose a robust aerial target tracking method for space-based wideband radar applications. First, we extend the observation model of the gamma Gaussian inverse Wishart probability hypothesis density filter to three-dimensional space by incorporating a spherical–radial cubature rule for improved nonlinear filtering. Second, variational Bayesian processing is integrated to enable the joint estimation of the target state and measurement noise parameters, and a recursive process is derived for both Gaussian and Student's t-distributed measurement noise, enhancing the method's robustness against noise uncertainty. Comprehensive simulations evaluating varying target extension parameters and noise conditions demonstrate that the proposed method achieves superior tracking accuracy and robustness.
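
The spherical–radial cubature rule mentioned above can be sketched in a few lines (not the paper's filter): propagate a Gaussian state through a nonlinear range/azimuth/elevation measurement using 2n equally weighted cubature points; the state layout and numbers are placeholders.

```python
# Illustrative sketch: third-degree spherical-radial cubature points for a nonlinear measurement.
import numpy as np

def cubature_points(mean, cov):
    n = mean.size
    sqrt_cov = np.linalg.cholesky(cov)
    unit = np.sqrt(n) * np.concatenate([np.eye(n), -np.eye(n)], axis=0)  # 2n unit directions
    return mean + unit @ sqrt_cov.T                                      # shape (2n, n)

def measurement(state):
    x, y, z = state[:3]                          # position part of the state (assumed layout)
    r = np.sqrt(x**2 + y**2 + z**2)
    return np.array([r, np.arctan2(y, x), np.arcsin(z / r)])  # range, azimuth, elevation

mean = np.array([3.0e5, 2.0e5, 1.0e4, -200.0, 150.0, 0.0])    # position (m) + velocity (m/s)
cov = np.diag([1e4, 1e4, 1e4, 25.0, 25.0, 25.0])

pts = cubature_points(mean, cov)
z_pts = np.array([measurement(p) for p in pts])
z_pred = z_pts.mean(axis=0)                      # equal weights 1/(2n)
innov_cov = np.cov(z_pts.T, bias=True)           # predicted measurement covariance (noise term omitted)
print("predicted measurement:", z_pred)
```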

44 pages, 523 KB  
Article
Compositional Causal Identification from Imperfect or Disturbing Observations
by Isaac Friend, Aleks Kissinger, Robert W. Spekkens and Elie Wolfe
Entropy 2025, 27(7), 732; https://doi.org/10.3390/e27070732 - 8 Jul 2025
Viewed by 416
Abstract
The usual inputs for a causal identification task are a graph representing qualitative causal hypotheses and a joint probability distribution for some of the causal model's variables when they are observed rather than intervened on. Alternatively, the available probabilities sometimes come from a combination of passive observations and controlled experiments. It also makes sense, however, to consider causal identification with data collected via schemes more generic than (perfect) passive observation or perfect controlled experiments. For example, observation procedures may be noisy, may disturb the variables, or may yield only coarse-grained specification of the variables' values. In this work, we investigate identification of causal quantities when the probabilities available for inference are the probabilities of outcomes of these more generic schemes. Using process theories (aka symmetric monoidal categories), we formulate graphical causal models as second-order processes that respond to such data collection instruments. We pose the causal identification problem relative to arbitrary sets of available instruments. Perfect passive observation instruments (those that produce the usual observational probabilities used in causal inference) satisfy an abstract process-theoretic property called marginal informational completeness. This property also holds for other (sets of) instruments. The main finding is that in the case of Markovian models, as long as the available instruments satisfy this property, the probabilities they produce suffice for identification of interventional quantities, just as those produced by perfect passive observations do. This finding sharpens the distinction between the Markovianity of a causal model and that of a probability distribution, suggesting a more extensive line of investigation of causal inference within a process-theoretic framework.
(This article belongs to the Special Issue Causal Graphical Models and Their Applications)
15 pages, 1529 KB  
Article
Peak Age of Information Optimization in Cell-Free Massive Random Access Networks
by Zhiru Zhao, Yuankang Huang and Wen Zhan
Electronics 2025, 14(13), 2714; https://doi.org/10.3390/electronics14132714 - 4 Jul 2025
Viewed by 390
Abstract
With the vigorous development of Internet of Things technologies, Cell-Free Radio Access Network (CF-RAN), leveraging its distributed coverage and single/multi-antenna Access Point (AP) coordination advantages, has become a key technology for supporting massive Machine-Type Communication (mMTC). However, under the grant-free random access mechanism, this network architecture faces the problem of information freshness degradation due to channel congestion. To address this issue, a joint decoding model based on logical grouping architecture is introduced to analyze the correlation between the successful packet transmission probability and the Peak Age of Information (PAoI) in both single-AP and multi-AP scenarios. On this basis, a global Particle Swarm Optimization (PSO) algorithm is designed to dynamically adjust the channel access probability to minimize the average PAoI across the network. To reduce signaling overhead, a PSO algorithm based on local topology information is further proposed to achieve collaborative optimization among neighboring APs. Simulation results demonstrate that the global PSO algorithm can achieve performance closely approximating the optimum, while the local PSO algorithm maintains similar performance without the need for global information. It is especially suitable for large-scale access scenarios with wide area coverage, providing an efficient solution for optimizing information freshness in CF-RAN.
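
A minimal particle swarm optimization loop of the kind described above can look as follows (a sketch, not the paper's algorithm or its PAoI expression): tune a common channel-access probability to minimize a placeholder age objective that is inversely proportional to the single-slot success probability of N contending devices.

```python
# Illustrative sketch: PSO over the access probability with a placeholder age objective.
import numpy as np

N_DEVICES = 50

def objective(p):
    p = np.clip(p, 1e-6, 1 - 1e-6)
    success = p * (1.0 - p) ** (N_DEVICES - 1)   # single-slot success probability per device
    return 1.0 / success                          # stand-in for average peak AoI

rng = np.random.default_rng(7)
n_particles, n_iter = 30, 100
w, c1, c2 = 0.7, 1.5, 1.5                         # inertia and acceleration coefficients

pos = rng.uniform(0.0, 1.0, n_particles)
vel = np.zeros(n_particles)
pbest, pbest_val = pos.copy(), objective(pos)
gbest = pbest[np.argmin(pbest_val)]

for _ in range(n_iter):
    r1, r2 = rng.uniform(size=n_particles), rng.uniform(size=n_particles)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0.0, 1.0)
    val = objective(pos)
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], val[improved]
    gbest = pbest[np.argmin(pbest_val)]

print(f"best access probability ~ {gbest:.4f} (analytic optimum 1/N = {1 / N_DEVICES:.4f})")
```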

16 pages, 662 KB  
Article
Augmenting Naïve Bayes Classifiers with k-Tree Topology
by Fereshteh R. Dastjerdi and Liming Cai
Mathematics 2025, 13(13), 2185; https://doi.org/10.3390/math13132185 - 4 Jul 2025
Viewed by 391
Abstract
The Bayesian network is a directed, acyclic graphical model that can offer a structured description for probabilistic dependencies among random variables. As powerful tools for classification tasks, Bayesian classifiers often require computing joint probability distributions, which can be computationally intractable due to potential full dependencies among feature variables. On the other hand, Naïve Bayes, which presumes zero dependencies among features, trades accuracy for efficiency and often underperforms. As a result, non-zero dependency structures, such as trees, are often used as more feasible probabilistic graph approximations; in particular, Tree Augmented Naïve Bayes (TAN) has been demonstrated to outperform Naïve Bayes and has become a popular choice. For applications where a variable is strongly influenced by multiple other features, TAN has been further extended to the k-dependency Bayesian classifier (KDB), where one feature can depend on up to k other features (for a given k ≥ 2). In such cases, however, the selection of the k parent features for each variable is often made through heuristic search methods (such as sorting), which do not guarantee an optimal approximation of the network topology. In this paper, the novel notion of k-tree Augmented Naïve Bayes (k-TAN) is introduced to augment Naïve Bayesian classifiers with k-tree topology as an approximation of Bayesian networks. It is proved that, under the Kullback–Leibler divergence measure, the k-tree topology approximation of a Bayesian classifier loses the minimum information when the topology is a maximum spanning k-tree whose edge weights are the mutual information between random variables conditional upon the class label. In addition, while finding a maximum spanning k-tree is in general NP-hard for fixed k ≥ 2, this work shows that the approximation problem can be solved in time O(n^(k+1)) if the spanning k-tree is also required to retain a given Hamiltonian path in the graph. Therefore, this algorithm can be employed to ensure efficient approximation of Bayesian networks by k-tree augmented Naïve Bayesian classifiers with a guaranteed minimum loss of information.
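
The k = 1 special case of the construction above is the classical TAN structure and is easy to sketch (this is not the paper's k-tree algorithm): weight feature pairs by class-conditional mutual information and keep a maximum spanning tree; the discrete data are synthetic.

```python
# Illustrative sketch: TAN-style structure learning via conditional mutual information + max spanning tree.
import numpy as np
import networkx as nx

rng = np.random.default_rng(8)
n, n_feat = 2_000, 5
y = rng.integers(0, 2, size=n)                          # class label
X = np.column_stack([(y + rng.integers(0, 3, size=n)) % 3 for _ in range(n_feat)])

def cond_mutual_info(xi, xj, c):
    """Empirical I(Xi; Xj | C) in nats for discrete variables."""
    mi = 0.0
    for cv in np.unique(c):
        mask = c == cv
        pc = mask.mean()
        joint, _, _ = np.histogram2d(xi[mask], xj[mask],
                                     bins=[np.unique(xi).size, np.unique(xj).size])
        joint /= joint.sum()
        pi, pj = joint.sum(axis=1, keepdims=True), joint.sum(axis=0, keepdims=True)
        nz = joint > 0
        mi += pc * np.sum(joint[nz] * np.log(joint[nz] / (pi @ pj)[nz]))
    return mi

g = nx.Graph()
for i in range(n_feat):
    for j in range(i + 1, n_feat):
        g.add_edge(i, j, weight=cond_mutual_info(X[:, i], X[:, j], y))

tree = nx.maximum_spanning_tree(g)                      # augmenting edges over the features
print(sorted(tree.edges(data="weight")))
```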

18 pages, 6585 KB  
Article
Research on the Risk of a Multi-Source Hydrological Drought Encounter in the Yangtze River Basin Based on Spatial and Temporal Correlation
by Jinbei Li and Hao Wang
Water 2025, 17(13), 1986; https://doi.org/10.3390/w17131986 - 1 Jul 2025
Viewed by 360
Abstract
Drought disasters have long brought a wide range of negative socio-economic impacts. Especially in large basins with many tributaries, once hydrological drought occurs synchronously in several tributaries, the hydrological drought condition in the mainstream is aggravated, leading to more serious losses. However, research on the probabilistic risk of simultaneous hydrological droughts across different areas of large watersheds is still lacking. In this study, the Standardized Runoff Index (SRI) was used to characterize hydrological drought, and the SRI sequence characteristics of each region were analyzed. Subsequently, a multiregional hazard encounter probability distribution model with an R-vine structure was constructed with the help of the vine copula function to study the risk pattern of simultaneous hydrological drought in multiple tributaries under environmental change. The model results showed that the joint probability for the four basins gradually decreased from 7.5% to 0.16% as the SRI threshold changed from ≤−0.5 to ≤−2.0, indicating that the likelihood of the compound disaster occurring jointly decreases as drought extremity increases. Meanwhile, the probability of hydrological drought in the three major basins showed significant spatial differences, with the risk ranking Dongting Lake Basin > Poyang Lake Basin > Han River Basin. The model constructed in this study reveals the pattern of disaster risk, provides theoretical support for measuring hydrological drought risk in multiple regions simultaneously, and is of great significance for predicting compound drought disaster risk.
(This article belongs to the Section Hydrology)
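
A common SRI construction of the kind used above can be sketched as follows (the paper's fitting choices may differ): fit a Gamma distribution to a runoff series and map its CDF through the inverse standard normal; the runoff record is synthetic.

```python
# Illustrative sketch: Standardized Runoff Index from a Gamma fit and the inverse normal transform.
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
runoff = rng.gamma(3.0, 800.0, size=40 * 12)            # monthly mean discharge (m^3/s), 40 years

a, loc, scale = stats.gamma.fit(runoff, floc=0)
sri = stats.norm.ppf(stats.gamma.cdf(runoff, a, loc, scale))

# Hydrological drought months under the thresholds discussed in the study.
for thr in (-0.5, -1.0, -1.5, -2.0):
    print(f"P(SRI <= {thr:+.1f}) ~ {(sri <= thr).mean():.3f}")
```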
