Symmetry/Asymmetry in Neural Networks

A special issue of Symmetry (ISSN 2073-8994). This special issue belongs to the section "Computer".

Deadline for manuscript submissions: 30 November 2024 | Viewed by 3199

Special Issue Editors


Dr. Yufeng Tian
Guest Editor
College of Automation, Chongqing University, Chongqing, China
Interests: neural networks; stochastic systems; intelligent systems; robotics

Dr. Zhenghong Jin
Guest Editor
School of Mechanical and Aerospace Engineering, Nanyang Technological University, Singapore
Interests: control and optimization; source seeking; nonlinear systems; multi-agent systems; formation control

Special Issue Information

Dear Colleagues,

In the rapidly evolving field of Artificial Intelligence, the concepts of symmetry and asymmetry in neural networks have garnered significant attention. These concepts play a crucial role in the design, function, and performance of various types of neural networks, including static neural networks (SNNs), recurrent neural networks (RNNs), and deep learning architectures.

Symmetry in neural networks often refers to an architecture's ability to respond consistently to equivalent inputs regardless of their orientation or position, which enhances the network's generalization capabilities (a minimal sketch of this idea is given below). Conversely, introducing asymmetry, whether in data representations, network architectures, or learning algorithms, can lead to more specialized and efficient processing of complex and variable data sets.
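As a loose illustration of the symmetry idea above (not part of the call itself), the following Python/PyTorch sketch checks that a convolution with circular padding is translation-equivariant, and that adding global average pooling makes the resulting representation translation-invariant; the layer sizes and the shift amount are arbitrary choices for the demonstration.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
# Circular padding keeps shift-equivariance exact for circular shifts.
conv = nn.Conv2d(1, 4, kernel_size=3, padding=1, bias=False, padding_mode="circular")
gap = nn.AdaptiveAvgPool2d(1)  # global average pooling

x = torch.randn(1, 1, 16, 16)
x_shifted = torch.roll(x, shifts=2, dims=-1)  # the same input, shifted horizontally

# Equivariance: shifting the input shifts the feature maps by the same amount.
print(torch.allclose(conv(x_shifted), torch.roll(conv(x), shifts=2, dims=-1), atol=1e-5))
# Invariance: after global pooling, shifted and unshifted inputs give the same features.
print(torch.allclose(gap(conv(x_shifted)), gap(conv(x)), atol=1e-5))
```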

Dr. Yufeng Tian
Dr. Zhenghong Jin
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Symmetry is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2400 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • neural network model
  • stability analysis of neural networks
  • state estimation on neural networks
  • neural networks and deep learning
  • neural networks and their applications in robotics

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (3 papers)


Research

16 pages, 1771 KiB  
Article
Admissibility Analysis and Controller Design Improvement for T-S Fuzzy Descriptor Systems
by Han Yang, Shuanghong Zhang and Fanqi Yu
Symmetry 2024, 16(8), 992; https://doi.org/10.3390/sym16080992 - 5 Aug 2024
Viewed by 765
Abstract
In this paper, the stability analysis and controller design improvement of T-S fuzzy descriptor systems are studied. Firstly, by making full use of the theory of fuzzy membership functions and combining the design of a fuzzy Lyapunov function with an inequality relaxation technique, a stability condition with wider admissibility and less conservatism is proposed. The advantage of this method is that it does not require each fuzzy subsystem to be asymptotically stable, and the bound on the derivatives of the membership functions that can be exploited is maximized. Secondly, a PDC controller and a non-PDC controller are designed, and relaxed linear matrix inequality (LMI) conditions for the two controllers are constructed. Finally, numerical simulations and practical examples are given to demonstrate the effectiveness of the method studied in this paper; the results obtained are less conservative and have larger feasibility domains than previous methods.
(This article belongs to the Special Issue Symmetry/Asymmetry in Neural Networks)
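For readers who want to see what an LMI-based stability check of this kind looks like in code, the sketch below tests the classical common quadratic Lyapunov condition for a two-rule T-S fuzzy system using cvxpy. This is only a conservative baseline, not the paper's relaxed fuzzy-Lyapunov conditions, and the vertex matrices are made-up examples.

```python
import cvxpy as cp
import numpy as np

# Made-up vertex matrices of a two-rule T-S fuzzy system (illustrative only).
A1 = np.array([[-2.0, 1.0],
               [0.5, -3.0]])
A2 = np.array([[-1.5, 0.3],
               [0.2, -2.5]])

n = 2
P = cp.Variable((n, n), symmetric=True)
eps = 1e-6

# Common quadratic Lyapunov function V(x) = x' P x:
# require P > 0 and Ai' P + P Ai < 0 for every rule i.
constraints = [P >> eps * np.eye(n)]
for A in (A1, A2):
    constraints.append(A.T @ P + P @ A << -eps * np.eye(n))

problem = cp.Problem(cp.Minimize(0), constraints)
problem.solve()
print(problem.status)  # "optimal" means the LMIs are feasible
print(P.value)
```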

15 pages, 1606 KiB  
Article
A Symmetric Efficient Spatial and Channel Attention (ESCA) Module Based on Convolutional Neural Networks
by Huaiyu Liu, Yueyuan Zhang and Yiyang Chen
Symmetry 2024, 16(8), 952; https://doi.org/10.3390/sym16080952 - 25 Jul 2024
Viewed by 945
Abstract
In recent years, attention mechanisms have shown great potential in various computer vision tasks. However, most existing methods focus on developing more complex attention modules for better performance, which inevitably increases model complexity. To overcome the tradeoff between performance and complexity, this paper proposes efficient spatial and channel attention (ESCA), a symmetric, comprehensive, and efficient attention module. By analyzing the squeeze-and-excitation (SE), convolutional block attention module (CBAM), coordinate attention (CA), and efficient channel attention (ECA) modules, we abandon the dimension-reduction operation of the SE module, verify the negative impact of global max pooling (GMP) on the model, and apply a local cross-channel interaction strategy without dimension reduction to learn attention. Because we care not only about the channel features of an image but also about the spatial location of the target, while retaining the effectiveness of channel attention, we designed the symmetric ESCA module. The module's effectiveness is demonstrated on the ResNet-50 classification benchmark: with 26.26 M parameters and 8.545 G FLOPs, it introduces a mere 0.14% increase in FLOPs while achieving over a 6.33% improvement in Top-1 accuracy and over a 3.25% gain in Top-5 accuracy. We perform image classification and object detection tasks with ResNet, MobileNet, YOLO, and other architectures on popular datasets such as Mini ImageNet, CIFAR-10, and VOC 2007. Experiments show that ESCA achieves a large improvement in model accuracy at very small cost and performs well among similar models.
(This article belongs to the Special Issue Symmetry/Asymmetry in Neural Networks)
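For a concrete picture of the "local cross-channel interaction without dimension reduction" strategy mentioned in the abstract, here is a minimal ECA-style channel-attention sketch in PyTorch. It illustrates the underlying idea only, not the authors' ESCA implementation; the kernel size and tensor shapes are arbitrary.

```python
import torch
import torch.nn as nn

class ECAStyleChannelAttention(nn.Module):
    """Channel attention via local cross-channel interaction, with no
    dimension reduction (ECA-style); an illustration, not ESCA itself."""
    def __init__(self, kernel_size: int = 3):
        super().__init__()
        self.avg_pool = nn.AdaptiveAvgPool2d(1)          # squeeze to B x C x 1 x 1
        self.conv = nn.Conv1d(1, 1, kernel_size,
                              padding=kernel_size // 2, bias=False)
        self.sigmoid = nn.Sigmoid()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        y = self.avg_pool(x).view(b, 1, c)               # treat channels as a 1D sequence
        y = self.sigmoid(self.conv(y)).view(b, c, 1, 1)  # per-channel weights in (0, 1)
        return x * y                                     # reweight the feature maps

x = torch.randn(2, 64, 32, 32)
print(ECAStyleChannelAttention()(x).shape)  # torch.Size([2, 64, 32, 32])
```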

28 pages, 14344 KiB  
Article
Evaluation of Classification Performance of New Layered Convolutional Neural Network Architecture on Offline Handwritten Signature Images
by Yasin Ozkan and Pakize Erdogmus
Symmetry 2024, 16(6), 649; https://doi.org/10.3390/sym16060649 - 23 May 2024
Viewed by 838
Abstract
While there are many verification studies on signature images using deep learning algorithms in the literature, studies on the classification of signature images are lacking. Signatures are used as a means of identification for banking, security controls, symmetry, certificates, and contracts. In this study, the aim was to design network architectures that work very fast in areas that require only signature images. For this purpose, a new Si-CNN network architecture with existing layers was designed. Afterwards, a new loss function and classification layer (Si-CL) were designed, and a novel architecture using Si-CL as the classification layer in Si-CNN was proposed to increase performance; this architecture was called Si-CNN+NC (New Classification). Si-CNN and Si-CNN+NC were trained with two datasets. The first training dataset is the "C-Signatures" (Classification Signatures) dataset, which was created to test these networks; the second is the "Cedar" benchmark dataset. The numbers of classes and samples in the two datasets are symmetrical with each other. To compare the performance of the trained networks, four well-known pre-trained networks, GoogleNet, DenseNet201, Inceptionv3, and ResNet50, were also trained on the two datasets with transfer learning. The findings of the study show that the proposed network models can learn features from two different sets of handwritten signature images and achieve higher accuracy than the benchmark models. The test results show that the Si-CNN+NC network outperforms the others in terms of both accuracy and speed. Finally, Si-CNN and Si-CNN+NC were trained on the gold-standard MNIST dataset and again showed superior performance. Owing to this performance, Si-CNN and Si-CNN+NC can be used by signature experts as an aid in a variety of applications, including forgery detection and criminal investigations.
(This article belongs to the Special Issue Symmetry/Asymmetry in Neural Networks)
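As a hedged illustration of the transfer-learning baselines mentioned in the abstract (e.g., ResNet50 adapted to signature classes), the following PyTorch/torchvision sketch swaps the classification head of a pre-trained ResNet50. The number of classes is a placeholder, since the actual class counts of the C-Signatures and Cedar datasets are not given here.

```python
import torch.nn as nn
from torchvision import models

NUM_SIGNATURE_CLASSES = 30  # placeholder; set to the real number of signature classes

# Load an ImageNet-pretrained ResNet50 and freeze its feature extractor.
model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
for param in model.parameters():
    param.requires_grad = False

# Replace the final fully connected layer with a new signature-classification head.
model.fc = nn.Linear(model.fc.in_features, NUM_SIGNATURE_CLASSES)

# Only the new head's parameters are trainable at this point.
print([name for name, p in model.named_parameters() if p.requires_grad])  # ['fc.weight', 'fc.bias']
```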
