Search Results (26)

Search Parameters:
Keywords = binary decision diagram

33 pages, 2512 KB  
Article
Evolutionary Framework with Binary Decision Diagram for Multi-Classification: A Human-Inspired Approach
by Boyuan Zhang, Wu Ma, Zhi Lu and Bing Zeng
Electronics 2025, 14(15), 2942; https://doi.org/10.3390/electronics14152942 - 23 Jul 2025
Viewed by 504
Abstract
Current mainstream classification methods predominantly employ end-to-end multi-class frameworks. These approaches face inherent challenges, including high-dimensional feature space complexity, decision boundary ambiguity that escalates with increasing class cardinality, sensitivity to label noise, and limited adaptability to dynamic model expansion. Humans, however, tend to avoid these mistakes naturally. Research indicates that humans subconsciously employ a decision-making process favoring binary outcomes, particularly when responding to questions requiring nuanced differentiation. Intuitively, answering binary inquiries such as “yes/no” often proves easier than addressing “what/which” queries. Inspired by this hypothesis about human decision-making, we propose a decision paradigm named the evolutionary binary decision framework (EBDF), centered around binary classification and evolving from traditional multi-classifiers in deep learning. To facilitate this evolution, we leverage the top-N outputs of the traditional multi-class classifier to dynamically steer subsequent binary classifiers, constructing a cascaded decision-making framework that emulates the hierarchical reasoning of a binary decision tree. Theoretically, we provide a mathematical proof that once the binary classifiers surpass a certain performance threshold, our framework can outperform the traditional multi-classification framework. Furthermore, we conduct experiments with several prominent deep learning models across various image classification datasets. The experimental results indicate significant potential for our strategy to surpass the performance ceiling of multi-classification.
(This article belongs to the Special Issue Advances in Machine Learning for Image Classification)
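The top-N routing idea described in the abstract can be sketched in a few lines. This is a toy illustration only: the verifier function, class count, and probabilities below are invented for the example and are not the paper's models.

```python
import numpy as np

def cascaded_decision(probs, binary_verifier, n=2):
    """Take the top-n classes from a multi-class posterior, then let a
    binary "is it class c?" verifier arbitrate among them (EBDF-style sketch)."""
    top = np.argsort(probs)[::-1][:n]               # top-n candidate classes
    scores = {c: binary_verifier(c) for c in top}   # yes/no confidence per candidate
    return max(scores, key=scores.get)

# Toy verifier: pretend class 2 is confirmed with high confidence.
verifier = lambda c: 0.9 if c == 2 else 0.3
probs = np.array([0.05, 0.35, 0.40, 0.20])          # ambiguous multi-class output
print(cascaded_decision(probs, verifier))           # → 2
```

The point of the cascade is that the binary verifiers only ever see a short candidate list, so each one solves a much easier "yes/no" problem than the original K-way decision.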

20 pages, 4617 KB  
Article
Rapid Probabilistic Inundation Mapping Using Local Thresholds and Sentinel-1 SAR Data on Google Earth Engine
by Jiayong Liang, Desheng Liu, Lihan Feng and Kangning Huang
Remote Sens. 2025, 17(10), 1747; https://doi.org/10.3390/rs17101747 - 16 May 2025
Viewed by 1563
Abstract
Traditional inundation mapping often relies on deterministic methods that offer only binary outcomes (inundated or not) based on satellite imagery analysis. While widely used, these methods do not convey the level of confidence in inundation classifications or account for ambiguity and uncertainty, limiting their utility in operational decision-making and rapid-response contexts. To address these limitations, we propose a rapid probabilistic inundation mapping method that integrates local thresholds derived from Sentinel-1 SAR images with land cover data to estimate surface water probabilities. Tested on flood events across five continents, this approach proved both efficient and effective, particularly when deployed on the Google Earth Engine (GEE) platform. The performance metrics—Brier Scores (0.05–0.07), Logarithmic Loss (0.1–0.2), Expected Calibration Error (0.03–0.04), and Reliability Diagrams—demonstrated reliable accuracy. VV (vertical transmit, vertical receive) polarization, given appropriate samples, yielded strong results. The influence of different land cover types on performance was also observed. Unlike conventional deterministic methods, this probabilistic framework estimates inundation likelihood while accounting for variations in SAR signal characteristics across land cover types. Moreover, it enables users to refine local thresholds or integrate on-the-ground knowledge, providing greater adaptability than traditional methods.
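The Brier Score cited among the validation metrics is simply the mean squared gap between predicted probability and binary outcome. A minimal sketch, with made-up per-pixel values:

```python
import numpy as np

def brier_score(p, y):
    """Mean squared difference between predicted inundation probability p
    and the binary reference outcome y (1 = inundated). Lower is better."""
    p, y = np.asarray(p, float), np.asarray(y, float)
    return float(np.mean((p - y) ** 2))

p = [0.9, 0.8, 0.2, 0.1]   # hypothetical per-pixel water probabilities
y = [1,   1,   0,   0]     # hypothetical reference flood labels
print(brier_score(p, y))   # → 0.025
```

A perfectly calibrated, perfectly sharp map scores 0; the 0.05–0.07 range reported above indicates probabilities that track the reference labels closely.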

15 pages, 667 KB  
Article
An Innovative Linear Wireless Sensor Network Reliability Evaluation Algorithm
by Tao Ma, Huidong Guo and Xin Li
Sensors 2025, 25(1), 285; https://doi.org/10.3390/s25010285 - 6 Jan 2025
Viewed by 1285
Abstract
In recent years, wireless sensor networks (WSNs) have become a crucial technology for infrastructure monitoring, and evaluating network reliability is particularly important to ensuring dependable monitoring services. When monitoring linear structures such as railway bridges, sensor nodes are distributed linearly, forming what is known as a Linear Wireless Sensor Network (LWSN). Although existing evaluation methods, such as enumeration and Binary Decision Diagram (BDD)-based methods, can assess the reliability of various types of networks, their efficiency is relatively low. This paper therefore proposes a new reliability assessment method for LWSNs: we classify network states by the number of failed nodes at the network's ends and analyze the arrangement characteristics of nodes under each state. The method builds on the combinatorial patterns of nodes and uses the concept of integer partitions to count the total number of states at each performance level, applying probability formulas to assess network reliability. Compared to Multi-Valued Decision Diagram (MDD)-based evaluation algorithms, this method is suitable for large-scale LWSNs and offers lower time complexity.
(This article belongs to the Special Issue Wireless Sensor Networks for Health Monitoring)
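The combinatorial core the abstract leans on, counting integer partitions, fits in a short memoized recurrence. This is a generic sketch of partition counting, not the paper's state-classification algorithm:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def partitions(n, max_part=None):
    """Number of ways to write n as a sum of positive integers no larger
    than max_part (unrestricted when max_part is None)."""
    if max_part is None:
        max_part = n
    if n == 0:
        return 1
    if n < 0 or max_part == 0:
        return 0
    # Either use one part of size max_part, or forbid that size entirely.
    return partitions(n - max_part, max_part) + partitions(n, max_part - 1)

print(partitions(5))   # → 7 (5, 4+1, 3+2, 3+1+1, 2+2+1, 2+1+1+1, 1+1+1+1+1)
```

Counting partitions rather than enumerating individual node arrangements is what lets such a method scale to large networks: the state space is grouped by shape instead of walked state by state.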

38 pages, 36523 KB  
Article
Application of Machine Learning to Research on Trace Elemental Characteristics of Metal Sulfides in Se-Te Bearing Deposits
by Xiaoxuan Zhang, Da Wang, Huchao Ma, Saina Dong, Zhiyu Wang and Zhenlei Wang
Minerals 2024, 14(6), 538; https://doi.org/10.3390/min14060538 - 23 May 2024
Cited by 5 | Viewed by 2391
Abstract
This study focuses on exploring the indication and importance of selenium (Se) and tellurium (Te) in distinguishing different genetic types of ore deposits. Traditional views suggest that dispersed elements are unable to form independent deposits, but are hosted within deposits of other elements as associated elements. Based on this, the study collected trace elemental data of pyrite, sphalerite, and chalcopyrite in various types of Se-Te bearing deposits. The optimal end-elements for distinguishing different genetic type deposits were recognized by principal component analysis (PCA) and the silhouette coefficient method, and discriminant diagrams were drawn. However, support vector machine (SVM) calculation of the decision boundary shows low accuracy, revealing the limitations in binary discriminant visualization for ore deposit type discrimination. Consequently, two machine learning algorithms, random forest (RF) and SVM, were used to construct ore genetic type classification models on the basis of trace elemental data for the three types of metal sulfides. The results indicate that the RF classification model for pyrite exhibits the best performance, achieving an accuracy of 94.5% and avoiding overfitting errors. In detail, according to the feature importance analysis, Se exhibits higher Shapley Additive Explanations (SHAP) values in volcanogenic massive sulfide (VMS) and epithermal deposits, especially the latter, where Se is the most crucial distinguishing element. By comparison, Te shows a significant contribution to distinguishing Carlin-type deposits. Conversely, in porphyry- and skarn-type deposits, the contributions of Se and Te were relatively lower. In conclusion, the application of machine learning methods provides a novel approach for ore genetic type classification and discrimination research, enabling more accurate identification of ore genetic types and contributing to the exploration and development of mineral resources.
(This article belongs to the Special Issue Selenium, Tellurium and Precious Metal Mineralogy)
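The PCA step used here to pick discriminant end-elements reduces, in its simplest form, to an SVD of the mean-centred data matrix. A minimal sketch on synthetic data (the sample and element counts below are invented, not the study's dataset):

```python
import numpy as np

def pca(X, k=2):
    """Project samples onto the top-k principal components via SVD of the
    mean-centred data; returns the scores and the component loadings."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T, Vt[:k]

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 6))            # e.g. 50 sulfide analyses x 6 trace elements
scores, comps = pca(X, k=2)
print(scores.shape, comps.shape)        # → (50, 2) (2, 6)
```

The two score columns are what get plotted as the axes of a discriminant diagram; the loadings in `comps` show which elements drive each axis.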

16 pages, 533 KB  
Article
Dictionary Encoding Based on Tagged Sentential Decision Diagrams
by Deyuan Zhong, Liangda Fang and Quanlong Guan
Algorithms 2024, 17(1), 42; https://doi.org/10.3390/a17010042 - 18 Jan 2024
Viewed by 1972
Abstract
Encoding a dictionary into another representation means storing all of its words in a more efficient way, so that common dictionary operations, such as (1) searching for a word, (2) adding words, and (3) removing words, can be completed in less time. Binary decision diagrams (BDDs) are one of the best-known representations for such encodings and are widely popular due to their excellent properties. Recently, several studies have proposed encoding dictionaries into BDDs and some of their variants, and have shown this to be feasible. Hence, we further investigate encoding dictionaries into decision diagrams. Tagged sentential decision diagrams (TSDDs), a variant based on structured decomposition, exploit both the standard and zero-suppressed trimming rules. In this paper, we first show how Boolean functions can represent dictionary files, and then design an algorithm that encodes dictionaries into TSDDs with the help of tries, which greatly accelerates the encoding process, along with a decoding algorithm that restores TSDDs to dictionaries. Since TSDDs integrate two trimming rules, we expect them to represent dictionaries more effectively, and the experiments confirm this.
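The trie used as the encoding intermediary is the standard prefix tree. A minimal sketch (this is generic trie code, not the paper's implementation):

```python
class Trie:
    """Minimal prefix tree: shared prefixes are stored once, which is what
    makes the trie a natural staging structure before diagram encoding."""
    def __init__(self):
        self.children = {}
        self.is_word = False

    def insert(self, word):
        node = self
        for ch in word:
            node = node.children.setdefault(ch, Trie())
        node.is_word = True

    def __contains__(self, word):
        node = self
        for ch in word:
            node = node.children.get(ch)
            if node is None:
                return False
        return node.is_word

t = Trie()
for w in ["car", "cart", "cat"]:
    t.insert(w)
print("cart" in t, "ca" in t)   # → True False
```

Because "car", "cart", and "cat" share the prefix "ca", the trie stores it once; the diagram encoding then goes further by also sharing common suffix structure.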

30 pages, 3404 KB  
Article
Calculation of the System Unavailability Measures of Component Importance Using the D2T2 Methodology of Fault Tree Analysis
by John Andrews and Sally Lunt
Mathematics 2024, 12(2), 292; https://doi.org/10.3390/math12020292 - 16 Jan 2024
Viewed by 1889
Abstract
A recent development in Fault Tree Analysis (FTA), known as Dynamic and Dependent Tree Theory (D2T2), accounts for dependencies between the basic events, making FTA more powerful. The method uses an integrated combination of Binary Decision Diagrams (BDDs), Stochastic Petri Nets (SPN) and Markov models. Current algorithms enable the prediction of the system failure probability and failure frequency. This paper proposes methods which extend the current capability of the D2T2 framework to calculate component importance measures. Birnbaum’s measure of importance, the Criticality measure of importance, the Risk Achievement Worth (RAW) measure of importance and the Risk Reduction Worth (RRW) measure of importance are considered. This adds a vital ability to the framework, enabling the influence that components have on system failure to be determined and the most effective means of improving system performance to be identified. The algorithms for calculating each measure of importance are described and demonstrated using a pressure vessel cooling system.
(This article belongs to the Special Issue Reliability Analysis and Stochastic Models in Reliability Engineering)
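The four importance measures named above all derive from evaluating the system unavailability with a component forced failed or forced working. A sketch on a toy structure (the system function and probabilities below are invented; the paper's demonstration uses a pressure vessel cooling system):

```python
def Q(qA, qB, qC):
    """Unavailability of a toy system that fails if A fails, or if B and C
    both fail: Q = qA + qB*qC - qA*qB*qC (inclusion-exclusion)."""
    return qA + qB * qC - qA * qB * qC

qA, qB, qC = 0.01, 0.1, 0.2
Qsys = Q(qA, qB, qC)

# The four measures for component B:
birnbaum    = Q(qA, 1, qC) - Q(qA, 0, qC)   # sensitivity to B's state
criticality = birnbaum * qB / Qsys          # fraction of risk B contributes
raw         = Q(qA, 1, qC) / Qsys           # Risk Achievement Worth
rrw         = Qsys / Q(qA, 0, qC)           # Risk Reduction Worth
print(birnbaum, criticality, raw, rrw)
```

D2T2's contribution is computing these quantities when the basic events are dependent, where simply substituting 0/1 into an independent-events formula like `Q` above is no longer valid.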

23 pages, 4520 KB  
Article
Combinatorial Test Case Generation Based on ROBDD and Improved Particle Swarm Optimization Algorithm
by Shunxin Li, Yinglei Song and Yaying Zhang
Appl. Sci. 2024, 14(2), 753; https://doi.org/10.3390/app14020753 - 16 Jan 2024
Cited by 5 | Viewed by 1748
Abstract
In applications of software testing, the cause–effect graph method is often used to design test cases by analyzing combinations of inputs graphically. However, not all inputs affect the results equally, and approaches based on exhaustive testing are generally time-consuming and laborious. As a specification-based software-testing method, combinatorial testing aims to select a small but effective set of test cases from the large space of all possible combinations of input values for the software under test, generating test cases with high coverage and strong error-detection capability. In this paper, the reduced ordered binary decision diagram is used to simplify the cause–effect graph, reducing the number of both inputs and test cases and thereby saving testing cost. In addition, an improved particle swarm optimization algorithm is proposed to significantly reduce the computation time needed to generate test cases. Experiments on several systems show that the proposed method generates excellent results for test case generation.
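For reference, the baseline metaheuristic being improved here, particle swarm optimization, fits in a few lines. This is the textbook algorithm minimizing a toy function, not the paper's improved variant or its test-case encoding; the inertia and attraction coefficients are conventional defaults:

```python
import random

def pso(f, dim, n_particles=20, iters=100, seed=1):
    """Bare-bones PSO minimizing f over [-5, 5]^dim."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]              # each particle's best position
    gbest = min(pbest, key=f)                # swarm's best position
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
        gbest = min(pbest, key=f)
    return gbest

sphere = lambda x: sum(v * v for v in x)
best = pso(sphere, dim=3)
print(sphere(best))   # close to 0
```

In the test-generation setting, `f` would score a candidate test suite (coverage achieved, suite size), and the improvements proposed in the paper target exactly the cost of those repeated evaluations.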

13 pages, 1792 KB  
Article
Reliability and Service Life Analysis of Airbag Systems
by Hongyan Dui, Jiaying Song and Yun-an Zhang
Mathematics 2023, 11(2), 434; https://doi.org/10.3390/math11020434 - 13 Jan 2023
Cited by 5 | Viewed by 3446
Abstract
Airbag systems are an important part of a car’s safety protection system. To further improve system reliability, this paper analyzes the failure mechanism of automotive airbag systems and establishes a dynamic fault tree model. The dynamic fault tree model is transformed into a continuous-time Bayesian network by introducing a unit step function and an impulse function, from which the failure probability of the system is calculated. Finally, the system reliability and average life are calculated, analyzed, and compared with the sequential binary decision diagram method. The results show that the method obtains more accurate system reliability and effectively identifies the weak parts of the automotive airbag system, to a certain extent offsetting the computational burden that dynamic Bayesian networks face in solving system reliability problems with continuous failure processes.
(This article belongs to the Section E2: Control Theory and Mechanics)

22 pages, 26866 KB  
Article
A Kamm’s Circle-Based Potential Risk Estimation Scheme in the Local Dynamic Map Computation Enhanced by Binary Decision Diagrams
by Arvind Kumar and Hiroaki Wagatsuma
Sensors 2022, 22(19), 7253; https://doi.org/10.3390/s22197253 - 24 Sep 2022
Cited by 2 | Viewed by 3178
Abstract
Autonomous vehicles (AV) are a hot topic for safe mobility, which inevitably requires sensors to achieve autonomy, but relying too heavily on sensors will be a risk factor. A high-definition map (HD map) reduces the risk by giving geographical information if it covers dynamic information from moving entities on the road. Cooperative intelligent transport systems (C-ITS) are a prominent approach to solving the issue and local dynamic maps (LDMs) are expected to realize the ideal C-ITS. An actual LDM implementation requires a fine database design to be able to update the information to represent potential risks based on future interactions of vehicles. In the present study, we proposed an advanced method for embedding the geographical future occupancy of vehicles into the database by using a binary decision diagram (BDD). In our method, the geographical future occupancy of vehicles was formulated with Kamm’s circle. In computer experiments, sharing BDD-based occupancy data was successfully demonstrated in the ROS-based simulator with the linked list-based BDD. Algebraic operations in exchanged BDDs effectively managed future interactions such as data insertion and timing of collision avoidance in the LDM. This result opened a new door for the realization of the ideal LDM for safety in AVs.
(This article belongs to the Special Issue Sensors and Sensing for Automated Driving)
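Kamm's circle bounds the combined longitudinal and lateral acceleration a tire can transmit: ax² + ay² ≤ (μg)². A minimal sketch of that constraint and the worst-case displacement envelope it implies (the friction coefficient, speed, and horizon below are illustrative values, and this is not the paper's occupancy formulation):

```python
import math

def within_kamms_circle(ax, ay, mu=0.8, g=9.81):
    """True when the combined acceleration stays inside Kamm's friction
    circle, i.e. sqrt(ax^2 + ay^2) <= mu * g."""
    return math.hypot(ax, ay) <= mu * g

def occupancy_radius(v, t, mu=0.8, g=9.81):
    """Worst-case distance a vehicle at speed v can cover in t seconds:
    straight travel plus the quadratic term from maximal admissible
    acceleration at the friction limit."""
    return v * t + 0.5 * mu * g * t ** 2

print(within_kamms_circle(4.0, 6.0))        # |a| ≈ 7.21 m/s², limit ≈ 7.85 m/s²
print(occupancy_radius(10.0, 1.0))          # metres reachable within 1 s
```

The region swept out by all accelerations inside the circle is what gets encoded as a BDD and shared between vehicles in the LDM.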

28 pages, 4606 KB  
Article
Towards a Systematic Description of Fault Tree Analysis Studies Using Informetric Mapping
by Kai Pan, Hui Liu, Xiaoqing Gou, Rui Huang, Dong Ye, Haining Wang, Adam Glowacz and Jie Kong
Sustainability 2022, 14(18), 11430; https://doi.org/10.3390/su141811430 - 12 Sep 2022
Cited by 17 | Viewed by 6754
Abstract
Fault tree analysis (FTA) is an important safety-system-engineering method commonly used across industries to evaluate and improve the reliability and safety of complex systems. To capture the current state and development trends of FTA research and to point out its future directions, 1469 FTA-related articles were retrieved from the SCIE and SSCI databases. Informetric methods, including co-authorship, co-citation, and co-occurrence analysis, were adopted to analyze cooperation relationships, the research knowledge base, research hotspots, and frontiers in the FTA research field. The results show that China has the highest number of publications, and Loughborough University in England has the highest number among institutions. Dynamic fault tree analysis, fuzzy fault tree analysis and FTA based on binary decision diagrams are recognized as the knowledge bases of FTA studies. “Reliability Engineering and System Safety”, “Safety Science” and “Fuzzy Sets and Systems” are the core journals in this field. Fuzzy fault tree analysis, dynamic fault tree analysis based on Bayesian networks, and FTA combined with management factors appear to be both the main research hotspots and the frontiers. These findings can help scholars better grasp the current status and frontiers of FTA research to improve system reliability and safety.

23 pages, 795 KB  
Article
Storing Set Families More Compactly with Top ZDDs
by Kotaro Matsuda, Shuhei Denzumi and Kunihiko Sadakane
Algorithms 2021, 14(6), 172; https://doi.org/10.3390/a14060172 - 31 May 2021
Cited by 1 | Viewed by 3112
Abstract
Zero-suppressed Binary Decision Diagrams (ZDDs) are data structures for representing set families in a compressed form. With ZDDs, many valuable operations on set families can be done in time polynomial in ZDD size. In some cases, however, the size of ZDDs for representing large set families becomes too huge to store them in the main memory. This paper proposes top ZDD, a novel representation of ZDDs which uses less space than existing ones. The top ZDD is an extension of the top tree, which compresses trees, to compress directed acyclic graphs by sharing identical subgraphs. We prove that navigational operations on ZDDs can be done in time poly-logarithmic in ZDD size, and show that there exist set families for which the size of the top ZDD is exponentially smaller than that of the ZDD. We also show experimentally that our top ZDDs have smaller sizes than ZDDs for real data.
(This article belongs to the Special Issue Algorithms and Data-Structures for Compressed Computation)
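A ZDD represents a family of sets as a DAG in which a node whose high edge leads to the empty family is deleted (the zero-suppressed trimming rule). A toy sketch of that structure, built naively as nested tuples rather than a hash-consed DAG, just to make the rule concrete:

```python
def zdd(family, var=0, nvars=None):
    """Build a ZDD (as nested tuples) for a family of sets over variables
    0..nvars-1. Terminals: True is the family {∅}, False the empty family."""
    if nvars is None:
        nvars = max((max(s, default=-1) for s in family), default=-1) + 1
    if var == nvars:
        return bool(family)
    lo = zdd({s for s in family if var not in s}, var + 1, nvars)
    hi = zdd({frozenset(x for x in s if x != var) for s in family if var in s},
             var + 1, nvars)
    if hi is False:            # zero-suppressed trimming rule: drop this node
        return lo
    return (var, lo, hi)

def count(node):
    """Number of sets in the family a ZDD node represents."""
    if isinstance(node, bool):
        return 1 if node else 0
    _, lo, hi = node
    return count(lo) + count(hi)

family = {frozenset({0}), frozenset({0, 1}), frozenset({2})}
root = zdd(family)
print(count(root))   # → 3
```

A real ZDD additionally shares identical subgraphs through a unique table; the top ZDD proposed here compresses that shared DAG a second time, top-tree style.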

12 pages, 9350 KB  
Article
The Stoichiometry, Structure and Possible Formation of Crystalline Diastereomeric Salts
by Dorottya Fruzsina Bánhegyi and Emese Pálovics
Symmetry 2021, 13(4), 667; https://doi.org/10.3390/sym13040667 - 13 Apr 2021
Cited by 9 | Viewed by 4777
Abstract
Knowing the eutectic composition of the binary melting point phase diagrams of the diastereomeric salts formed during the given resolution, the achievable F (F = eeDia*Y) value can be calculated. The same value can also be calculated and predicted by knowing the eutectic compositions of the binary melting point phase diagrams of enantiomeric mixtures of the racemic compound or the resolving agent. An explanation was sought as to why and how the crystalline precipitated diastereomeric salt—formed in the solution between a racemic compound and the corresponding resolving agent—may be formed. According to our idea, the self-disproportionation of enantiomers (SDE) has a decisive role when the enantiomers form two nonequal ratios of conformers in solution. The self-organized enantiomers form supramolecular associations having M and P helicity, and double helices are formed. Between these double spirals, with the formation of new double spirals, a dynamic equilibrium is achieved and the salt crystallizes. During this process between acids and bases, chelate structures may also be formed. Acids appear to have a crucial impact on these structures. It is assumed that the behavior of each chiral molecule is determined by its own code. This code validates the combined effect of constituent atoms, bonds, spatial structure, charge distribution, flexibility and complementarity.
(This article belongs to the Collection Feature Papers in Chemistry)

13 pages, 2262 KB  
Article
A Method to Avoid Underestimated Risks in Seismic SUPSA and MUPSA for Nuclear Power Plants Caused by Partitioning Events
by Woo Sik Jung
Energies 2021, 14(8), 2150; https://doi.org/10.3390/en14082150 - 12 Apr 2021
Cited by 6 | Viewed by 2223
Abstract
Seismic probabilistic safety assessment (PSA) models for nuclear power plants (NPPs) have many non-rare events whose failure probabilities are proportional to the seismic ground acceleration. It has been widely accepted that minimal cut sets (MCSs) that are calculated from the seismic PSA fault tree should be converted into exact solutions, such as binary decision diagrams (BDDs), and that the accurate seismic core damage frequency (CDF) should be calculated from the exact solutions. If the seismic CDF is calculated directly from seismic MCSs, it is drastically overestimated. Seismic single-unit PSA (SUPSA) models have random failures of alternating operation systems that are combined with seismic failures of components and structures. Similarly, seismic multi-unit PSA (MUPSA) models have failures of NPPs that undergo alternating operations between full power and low power and shutdown (LPSD). Their failures for alternating operations are modeled using fraction or partitioning events in seismic SUPSA and MUPSA fault trees. Since partitioning events for one system are mutually exclusive, their combinations should be excluded in exact solutions. However, it is difficult to eliminate the combinations of mutually exclusive events without modifying PSA tools for generating MCSs from a fault tree and converting MCSs into exact solutions. If the combinations of mutually exclusive events are not deleted, seismic CDF is underestimated. To avoid CDF underestimation in seismic SUPSAs and MUPSAs, this paper introduces a process of converting partitioning events into conditional events, and conditional events are then inserted explicitly inside a fault tree. With this conversion, accurate CDF can be calculated without modifying PSA tools. That is, this process does not require any other special operations or tools. It is strongly recommended that the method in this paper be employed for avoiding CDF underestimation in seismic SUPSAs and MUPSAs.
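The overestimation the abstract warns about is easy to reproduce: summing MCS probabilities (the rare-event approximation) double-counts overlapping cut sets, while an exact solution, which is what a BDD computes symbolically, does not. A toy sketch with invented probabilities, using brute-force enumeration in place of a BDD:

```python
from itertools import product

def exact_prob(cutsets, p):
    """Exact top-event probability by enumerating all basic-event states;
    a BDD computes the same quantity symbolically (toy scale only)."""
    events = sorted({e for cs in cutsets for e in cs})
    total = 0.0
    for states in product([0, 1], repeat=len(events)):
        st = dict(zip(events, states))
        if any(all(st[e] for e in cs) for cs in cutsets):
            w = 1.0
            for e in events:
                w *= p[e] if st[e] else 1 - p[e]
            total += w
    return total

cutsets = [{"S", "A"}, {"S", "B"}]          # seismic event S appears in both cut sets
p = {"S": 0.5, "A": 0.4, "B": 0.4}          # non-rare probabilities, as in seismic PSA
mcs_sum = sum(p["S"] * p[x] for x in ("A", "B"))   # rare-event approximation
print(mcs_sum, exact_prob(cutsets, p))      # → 0.4 0.32: the MCS sum overestimates
```

With non-rare seismic events the gap is large, which is why the exact (BDD) solution is required; the paper's contribution addresses the opposite error, underestimation, that arises when mutually exclusive partitioning events are left in those exact solutions.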

17 pages, 459 KB  
Article
Using Extended Logical Primitives for Efficient BDD Building
by David Fernandez-Amoros, Sergio Bra, Ernesto Aranda-Escolástico and Ruben Heradio
Mathematics 2020, 8(8), 1253; https://doi.org/10.3390/math8081253 - 31 Jul 2020
Cited by 13 | Viewed by 3144
Abstract
Binary Decision Diagrams (BDDs) have been used to represent logic models in a variety of research contexts, such as software product lines, circuit testing, and plasma confinement, among others. Although BDDs have proven very useful, the main problem with this technique is that synthesizing BDDs can be a frustratingly slow or even unsuccessful process, due to its heuristic nature. We present an extension of propositional logic to tackle one recurring phenomenon in logic modeling, namely groups of variables related by an exclusive-or relationship, and also consider two further extensions: one in which at least n variables in a group are true, and another in which at most n are. We add XOR, atLeast-n and atMost-n primitives to logic formulas to reduce the size of the input, and present algorithms to efficiently incorporate these constructions into the building of BDDs. We prove, among other results, that the number of nodes created during the process for XOR groups is reduced from quadratic to linear for the affected clauses. The XOR primitive is tested against eight logical models, two from industry and six from Kconfig-based open-source projects. Results range from no negative effects in models without XOR relations to performance gains well into two orders of magnitude on models with an abundance of this kind of relationship.
(This article belongs to the Section E1: Mathematics and Computer Science)
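The linear behaviour for XOR groups comes from a classic fact: the ROBDD of x1 XOR ... XOR xn only needs to track the running parity, so it has one node at the first level and two at each later level, 2n - 1 in total. A small sketch that counts those nodes by collecting the distinct (level, parity) states (generic illustration, not the paper's construction algorithm):

```python
def xor_bdd_size(n):
    """Count the internal nodes of the ROBDD for the XOR of n variables.
    Each node is identified by (level, running parity); the naive walk
    below visits 2^n paths but the set keeps only the distinct states."""
    nodes = set()

    def walk(level, parity):
        if level == n:               # terminal reached: 1 iff parity is odd
            return
        nodes.add((level, parity))
        walk(level + 1, parity)      # low edge: variable false, parity kept
        walk(level + 1, 1 - parity)  # high edge: variable true, parity flips
    walk(0, 0)
    return len(nodes)

print([xor_bdd_size(n) for n in (2, 3, 8)])   # → [3, 5, 15], i.e. 2n - 1
```

A clause-by-clause encoding of the same XOR group generates quadratically many intermediate nodes, which is the overhead the dedicated XOR primitive eliminates.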

20 pages, 1237 KB  
Article
Reversible Circuit Synthesis Time Reduction Based on Subtree-Circuit Mapping
by Amjad Hawash, Ahmed Awad and Baker Abdalhaq
Appl. Sci. 2020, 10(12), 4147; https://doi.org/10.3390/app10124147 - 16 Jun 2020
Cited by 6 | Viewed by 3785
Abstract
Several works have addressed reducing the energy consumption of electrical circuits, and reversible circuit synthesis is considered one of the major approaches to reducing power consumption. The field uses a large number of proposed algorithms to minimize the overall synthesis cost (measured by line count and quantum cost), with little attention paid to synthesis time. However, because of the iterative nature of synthesis optimization algorithms, synthesis time cannot be neglected, especially for large-scale circuits that must be realized by cascades of reversible gates. Synthesis cost can be reduced with Binary Decision Diagrams (BDDs), which are considered a step forward in this field. Nevertheless, mapping each BDD node into a cascade of reversible gates during synthesis is time-consuming. In this work, we implement subtree-based mapping of BDD nodes to reversible gates, instead of the classical node-based algorithm, to effectively reduce the overall synthesis time. Using Depth-First Search (DFS), we convert an entire BDD subtree in one step into a cascade of reversible gates. A look-up table of all possible subtree combinations and their corresponding reversible gates is constructed from a comprehensive study of all possible BDD subtrees and serves as a reference during conversion; a hash key gives direct access to subtrees during the mapping process. The conducted experimental tests show a significant reduction in synthesis time (around 95% on average) while preserving the correctness of the algorithm in generating a circuit that realizes the required Boolean function.
(This article belongs to the Section Quantum Science and Technology)