Search Results (1)

Search Parameters:
Keywords = Chaos-Integrated Synaptic-Memory Network (CISMN)

37 pages, 1654 KB  
Article
CISMN: A Chaos-Integrated Synaptic-Memory Network with Multi-Compartment Chaotic Dynamics for Robust Nonlinear Regression
by Yaser Shahbazi, Mohsen Mokhtari Kashavar, Abbas Ghaffari, Mohammad Fotouhi and Siamak Pedrammehr
Mathematics 2025, 13(9), 1513; https://doi.org/10.3390/math13091513 - 4 May 2025
Cited by 3 | Viewed by 2044
Abstract
Modeling complex, non-stationary dynamics remains challenging for deterministic neural networks. We present the Chaos-Integrated Synaptic-Memory Network (CISMN), which embeds controlled chaos across four modules—Chaotic Memory Cells, Chaotic Plasticity Layers, Chaotic Synapse Layers, and a Chaotic Attention Mechanism—supplemented by a logistic-map learning-rate schedule. Rigorous stability analyses (Lyapunov exponents, boundedness proofs) and gradient-preservation guarantees underpin our design. In experiments, CISMN-1 on a synthetic acoustical regression dataset (541 samples, 22 features) achieved R2 = 0.791 and RMSE = 0.059, outpacing physics-informed and attention-augmented baselines. CISMN-4 on the PMLB sonar benchmark (208 samples, 60 bands) attained R2 = 0.424 and RMSE = 0.380, surpassing LSTM, memristive, and reservoir models. Across seven standard regression tasks with 5-fold cross-validation, CISMN led on diabetes (R2 = 0.483 ± 0.073) and excelled in high-dimensional, low-sample regimes. Ablations reveal a scalability–efficiency trade-off: lightweight variants train in <10 s with >95% peak accuracy, while deeper configurations yield marginal gains. CISMN sustains gradient norms (~2300) versus LSTM collapse (<3), and fixed-seed protocols ensure <1.2% MAE variation. Interpretability remains challenging (feature-attribution entropy ≈ 2.58 bits), motivating future hybrid explanation methods. CISMN recasts chaos as a computational asset for robust, generalizable modeling across scientific, financial, and engineering domains.
(This article belongs to the Special Issue Advances in Machine Learning and Graph Neural Networks)
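One chaos-integrated component named in the abstract is a logistic-map learning-rate schedule. The sketch below is not the authors' code; it is a minimal illustration, assuming the map state is rescaled to modulate a base learning rate, with illustrative values for the map parameter r, the base rate, and the lower bound on the scaling.

```python
def logistic_map_lr_schedule(base_lr=1e-3, r=3.9, x0=0.5, num_steps=100,
                             min_scale=0.1):
    """Yield one learning rate per step, modulated by a logistic map.

    The map x_{t+1} = r * x_t * (1 - x_t) behaves chaotically for r near 4;
    its state is rescaled into [min_scale, 1] * base_lr so the learning rate
    stays bounded and never collapses to zero. All parameter values here are
    illustrative assumptions, not taken from the paper.
    """
    x = x0
    for _ in range(num_steps):
        x = r * x * (1.0 - x)                        # logistic-map update
        scale = min_scale + (1.0 - min_scale) * x    # keep the rate bounded
        yield base_lr * scale


if __name__ == "__main__":
    # Usage example: print the first few chaotically varying learning rates.
    for step, lr in zip(range(5), logistic_map_lr_schedule()):
        print(f"step {step}: lr = {lr:.6f}")
```

In a training loop, such a generator would simply replace a fixed or monotonically decayed learning rate, injecting bounded, deterministic variability into the optimizer without adding stochastic noise.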