1. Introduction
1.1. Scenario
This work aims to contribute to addressing the urgent need for cryptographic techniques that are resilient to quantum computing threats [1] by proposing a hybrid methodology that is immediately and practically implementable and conceptually grounded in quantum principles (i.e., inspired by quantum mechanics). Unlike traditional approaches that require quantum hardware, our model is fully executable on classical computational systems, making it suitable for immediate adoption in post-quantum security frameworks. We introduce a layered structure consisting of (1) quantum-inspired entropy sources, (2) a multiscale prime number transformation, and (3) a fusion-based architecture designed to improve cryptographic strength through structural and informational integration. The proposed algorithms are implemented and evaluated through entropy analysis, runtime benchmarking, and NIST randomness tests. In this section, we also contextualise our contribution within the broader literature on quantum and post-quantum cryptography.
In defence, in databases of classified and confidential documents held by public or private organisations, and in a country's infrastructure, access must be allowed only through user authentication. The key point of this work concerns the integration of heterogeneous data through information fusion. Information fusion (IF) is a relatively young field of research: information from two or more sources is brought together in an attempt to acquire super-information, whose richness of detail and accuracy are superior to those obtainable by processing the data separately, i.e., a higher level of synthesis achieved through methodological and non-categorical reductionism. From this perspective, an access key has been created to provide not only a high level of secrecy but also, with high probability, the identification of a person, using quantum components and public-key cryptography simultaneously; the latter is the technique that provides the strongest response to cryptographic attacks. At the same time, cryptography has become an increasingly important field of research.
While the next subsection will consider the context and related work, the following sections will outline the proposed methodology through a layered structure.
Section 2 introduces the JDL model as a functional basis for fusion-based cryptographic processes.
Section 3 presents the theoretical background of quantum-inspired randomness, while
Section 4 introduces a multiscale prime number generator for hybrid key generation.
Section 5 describes five quantum-inspired algorithms implemented in Python for key generation.
Section 6 offers a comparative analysis of these algorithms based on entropy and execution time.
Section 7 discusses optimisation strategies, leading to the ultra-optimised version.
Section 8 evaluates the security implications of these implementations.
Section 9 explains the fusion framework that combines quantum and prime number-derived components.
Section 10 validates the randomness of the key using NIST tests for classical randomness, and
Section 11 concludes by summarising the cryptographic relevance and potential of the proposed hybrid solution. Throughout the work, the term crypto-agile refers to the ability of a system to adaptively change cryptographic primitives, using two specific demons, based on contextual criteria or triggers. This definition, in line with NIST SP 800-135 standards, has been adopted uniformly.
1.2. Context and Related Works
In ref. [2], a new algorithm was introduced to improve RSA prime-number-based encryption with information fusion and fractal technology. Quantum features were introduced, namely the random evolution of key fragments and the recombination of key fragments to obtain a quantum-resistant key, as the key becomes cryptographically agile. Techniques that produce truly random numbers suitable for cryptographic systems have been available for decades [3]. The same is true for prime numbers [4]. However, the use of fractals as cryptographic tools is a recent idea and a rapidly growing field of research [5,6,7,8,9,10,11,12,13,14,15,16,17]. Fractal geometry was born in a formal sense in 1982 with the publication of Benoit Mandelbrot's book [18], and in recent times the so-called Mandelbrot sets have been widely used in cryptographic practice. In this work, we focus on number-theory-based cryptography reinforced by an approach inspired by quantum mechanics to mimic real quantum behaviours [19,20,21,22], also using fusion techniques as in [23].
What does a quantum-mechanics-inspired approach mean, and why use it? "Quantum-mechanics-inspired" refers to the use of mathematical models, and their numerical implementations, drawn from quantum mechanics and computational physics. They are very useful for hybrid cryptography and during the transition from traditional to quantum computing. Quantum cryptography proper uses the properties of quantum mechanics to provide greater security than classical methods. Its most important feature is quantum key distribution, which allows a cryptographic key to be generated and exchanged securely while detecting any interception. Consider Heisenberg's uncertainty principle: measuring a quantum state irreversibly alters the state itself, so if an attacker tries to intercept the key, the system recognises it immediately. Furthermore, in the quantum regime, entanglement applies: two particles can be correlated in such a way that any change in one has an immediate effect on the other, making the system extremely secure. Quantum cryptography therefore promises inviolable data transmission for banking, military, and government applications, as well as ultra-secure connections between quantum computers. Moreover, classical cryptography is vulnerable to next-generation quantum computers, whereas quantum cryptography is able to resist them; in other words, quantum cryptography is one possible direction for ensuring secure communications in a post-quantum world. However, another important line of research, known as post-quantum cryptography, focuses on classical algorithms (such as lattice-based, code-based, or hash-based schemes) that are believed to be resistant to quantum attacks without relying on quantum communication channels. Without claiming to be exhaustive, a significant contribution to the state of the art, particularly with regard to hardware-based quantum algorithms, is provided in [24,25,26,27]. In addition, post-quantum cryptography (PQC), such as lattice-based algorithms, including Kyber and Dilithium, also provides resistance to quantum attacks [19].
Note that this work is explicitly based on a quantum-inspired methodology, as indicated in the title. It does not rely on quantum hardware, but rather proposes post-quantum solutions suitable for immediate use on classical computing systems. Another relevant issue that we will discuss concerns quantum resistance [28,29,30,31,32,33,34,35,36]. Our approach does not rely on cryptographic assumptions that are known to be broken by quantum computers, such as factorisation or discrete logarithm problems. Although we do not formally prove quantum resistance, the multilayered design, which combines quantum-inspired entropy, prime-number-based structural complexity, and fusion-based composition, adds layers of obfuscation and entropy that can contribute to post-quantum resilience. A complete cryptographic proof of post-quantum security is proposed as future work. Therefore, with regard to the quantum component of the current approach, we focus only on quantum-inspired cryptography. Recent advances in quantum cryptography have demonstrated increasing experimental feasibility. For example, Gandelman et al. [37] provide a practical exploration of the B92 protocol using pulsed lasers, while Stathis et al. [38] analyse the convergence of fibre- and satellite-based QKD networks. These developments motivate hybrid models such as QPIF.
2. JDL Classification
To observe how fusion systems are structured and applied, it is necessary to consider the JDL model, which has been used as a fundamental paradigm for many subsequent models and practical implementations designed over the years [29]. The concept of data fusion was first formalised by the scientific community through the development of the JDL data fusion model (Figure 1), which is still a standard reference in the industry today. The model was assigned several primary objectives: (i) to provide a common framework for discussion on information fusion; (ii) to enable the recognition of problems that can be solved by applying fusion techniques; (iii) to standardise the characteristics of fusion-type problems; (iv) to provide a framework that can be used to guide the design of computational solutions; (v) to enable the classification of various fusion processes.
The JDL model has several basic components: data sources, a database management system (DBMS), a human-computer interface (HCI), and a five-level fusion process. The sources provide raw data and can be sensors, prior knowledge (e.g., geographical or reference data), databases, or manual user input. The DBMS enables the system to store, retrieve, and manage data effectively, supporting the entire fusion process. The HCI comprises all the mechanisms and tools through which the user interacts with the system or receives output, such as visual displays, alarms, graphical symbols, and acoustic alerts. The five main levels of the JDL model are described below:
Level 0 describes the estimation and prediction of the state of an object from raw signals or pixel-level data through association and characterisation.
Level 1 involves the integration of sensor data to estimate and predict the state of the entities under observation, i.e., their position, speed, attributes, and identity.
Level 2 considers the relationships between multiple entities and events in order to establish their state within a broader contextual scenario.
Level 3 includes the analysis and prediction of the consequences of planned actions in various operational scenarios.
Level 4 enables goal-oriented data collection and processing through adaptive management to improve overall performance and decision-making.
Subsequently, Level 5, called cognitive or user refinement [29], was included to provide human interpretation, decision support, and learning processes.
While the original JDL model is primarily concerned with the fusion of data of the same type, our system, Hybrid Information Fusion, deals with heterogeneous data sources. This makes a one-to-one mapping to the JDL architecture more complicated. However, our system fits best with Level 1, as it deals with the fusion of inputs from different sources to determine the state and identity of cryptographic factors obtained from quantum-inspired and prime-number-based fused inputs.
As we will see below, the JDL (Joint Directors of Laboratories) model is a conceptual reference for integrating and justifying the overlapping of different domains in key generation (inspired by quantum mechanics, sieving, fusion). It has no direct impact on the mathematical robustness of the key but guides the modular design of the system and ensures epistemological consistency in the fusion of methods. In short, it is a determining factor in design rather than a standalone security component.
3. Theoretical Background Useful for the Quantum-Inspired Subkey
Quantum computing relies on fundamental principles of quantum mechanics, such as superposition and entanglement, to perform calculations. Heisenberg's uncertainty principle is one of the most significant principles in quantum cryptography; according to it, certain properties of a particle cannot be measured simultaneously with infinite precision. It is possible to generate a quantum encryption key using the superposition state of a quantum system. If the system is to be described in terms of a quantum state, then it must be represented as in (1), i.e., as a superposition of basis states (see [20]):
$$|\psi\rangle = \sum_{i} c_i\,|\varphi_i\rangle, \tag{1}$$
where the $c_i$ are complex coefficients. When observed, the wavefunction collapses and yields a random result, which can be employed to form a cryptographic key. Indeed, the collapse of the wavefunction is a fundamental principle of quantum mechanics stating that, upon measurement, a quantum state in superposition (e.g., $|\psi\rangle = \alpha|0\rangle + \beta|1\rangle$) collapses into one of its basis states ($|0\rangle$ or $|1\rangle$) with a probability determined by the squared amplitudes of α and β [20]. In computational and cryptographic contexts, this probabilistic behaviour can be emulated and used to generate non-deterministic bits or cryptographic key structures, as we will show in the next section in our EQRNG (Emulated Quantum Random Number Generator).
It is important to clarify that in our approach, the reference to wavefunction collapse is strictly metaphorical and inspired by computation. We do not use any real quantum devices or entangled systems. Instead, we simulate the probabilistic behaviour of quantum systems using classical pseudo-random generators tuned to mimic the expected distributions. Therefore, the term quantum entropy source is redefined here as a simulation of pseudo-quantum entropy, intended to emulate, rather than replicate, quantum uncertainty within classical frameworks.
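As a concrete illustration of this emulation, the following minimal Python sketch (the function name and parameters are ours, not part of the Appendix B code) draws bits according to the Born probabilities of a two-level superposition using a classical pseudo-random generator:

```python
import numpy as np

def emulated_qubit_bits(n_bits, alpha, beta, rng=None):
    """Classically emulate repeated 'measurements' of |psi> = alpha|0> + beta|1>:
    each bit is 1 with probability |beta|^2 (pseudo-quantum entropy only)."""
    rng = rng or np.random.default_rng()
    p1 = abs(beta) ** 2 / (abs(alpha) ** 2 + abs(beta) ** 2)  # normalised Born rule
    return (rng.random(n_bits) < p1).astype(np.uint8)

# Balanced superposition: p(0) = p(1) = 1/2
bits = emulated_qubit_bits(2048, alpha=1 / np.sqrt(2), beta=1 / np.sqrt(2))
```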
We can also employ quantum key distribution (QKD) in its computational form (see [21]), as a quantum-mechanics-inspired computational version of the famous BB84 protocol [22], in which photons are polarised and observed in different bases to arrive securely at a shared secret key. In [21], photons are replaced by "computational photons", or "information pockets", which mimic the behaviour of photons in BB84. In this implementation, we generate a stream of bits from the probability distribution of a quantum particle in a potential, governed by the one-dimensional Schrödinger Equation (2),
$$i\hbar\,\frac{\partial}{\partial t}\,\psi(x,t) = \hat{H}\,\psi(x,t), \tag{2}$$
where $\hat{H}$ is the system's Hamiltonian, given by (3):
$$\hat{H} = -\frac{\hbar^2}{2m}\,\frac{\partial^2}{\partial x^2} + V(x). \tag{3}$$
Let us consider a finite potential well, solve the equation numerically, and use the probability distribution to generate the encryption key. To generate a random encryption key, the method is as follows:
Solve the Schrödinger equation numerically for a particle in a finite potential well.
Sample the values of the wavefunction at discrete points.
Generate random bits based on the probability of measurement at each point.
Convert the bit sequence into a cryptographic key.
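A minimal numerical sketch of these four steps is given below. It assumes natural units (ħ = m = 1), a simple finite-difference discretisation of the finite well, and an illustrative mapping from measurement outcomes to bits; it is not the code of Appendix B:

```python
import numpy as np

N, L, depth = 400, 10.0, 50.0
x = np.linspace(-L / 2, L / 2, N)
dx = x[1] - x[0]
V = np.where(np.abs(x) < 2.0, -depth, 0.0)        # finite potential well

# Step 1: solve the discretised Hamiltonian H = -(1/2) d^2/dx^2 + V (tridiagonal)
H = (np.diag(1.0 / dx**2 + V)
     + np.diag(-0.5 / dx**2 * np.ones(N - 1), 1)
     + np.diag(-0.5 / dx**2 * np.ones(N - 1), -1))
energies, states = np.linalg.eigh(H)

# Step 2: sample the ground-state wavefunction at discrete points
psi = states[:, 0]
prob = psi**2 / np.sum(psi**2)                    # measurement probabilities

# Step 3: generate bits according to the measurement probabilities
rng = np.random.default_rng()
outcomes = rng.choice(N, size=2048, p=prob)       # emulated position measurements
bits = outcomes & 1                               # illustrative outcome-to-bit map

# Step 4: convert the bit sequence into a key
key = hex(int("".join(map(str, bits)), 2))[2:]
print(key[:32], "...")
```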
Therefore, the Schrödinger equation of quantum mechanics can be used to create secure cryptographic keys based on its probabilistic nature and the quantum behaviour of physical systems. Below is a step-by-step repetition of how it can be applied to cryptographic key generation:
- (1) Use quantum energy levels. The Schrödinger equation specifies the quantum states of a particle in a potential. Each solution of the equation produces energy eigenvalues that are typically distributed in a non-deterministic manner. How to use it: you can numerically simulate the Schrödinger equation for a complex physical system (e.g., an atom in an arbitrary potential) and use the calculated eigenvalues to obtain random bits for a cryptographic key.
- (2) Quantum random number generation (QRNG). Quantum mechanics is probabilistic: the measurement of a quantum system described by the Schrödinger equation produces random results that follow the probability distribution of the wavefunction. The use of Heisenberg's uncertainty principle or measurements of quantum superposition states can provide extremely secure randomness, which can be used for cryptographic key generation.
- (3) Quantum cryptography and key distribution (QKD). One of the most advanced applications of the Schrödinger equation in cryptography is quantum key distribution (QKD), for example the BB84 protocol. Application: QKD applies the quantum states of photons (e.g., polarisation or phase) to securely exchange cryptographic keys between two parties so that an attacker cannot intercept them without being detected.
- (4) Computational simulations for complex keys. The numerical simulation of the Schrödinger equation for a multiparticle quantum system leads to chaotic behaviour that is difficult to predict. How it is used: a simulated quantum system can be used to generate sequences of pseudo-random numbers, which can then be used as input for cryptographic algorithms.
- (5) Quantum hash functions based on wave dynamics. Quantum hash functions can be constructed by exploiting the non-linear and probabilistic solutions of Schrödinger equations. How it is used: an algorithm can transform a quantum state into a deterministic but non-reversible hash, which can be used in quantum digital signatures.
4. Multiscale Prime Generator Subkey
Remember that the purpose of this research is to construct hybrid keys with one component driven by quantum numbers and a second component driven by prime numbers. In this section, we consider the subkey based on prime numbers. The second subkey that we offer as input to the QPIF algorithm is a product of prime numbers that we calculate by generating two prime numbers, each obtained using a multiscale sieve, subject to the rules presented in detail in [4], where the authors achieved the fastest sieve for prime number generation last year, which has not yet been surpassed. Here we only summarise the main results relevant to the present work. We have a randomly chosen number n ∈ N, and the sieving algorithm returns n if it is prime, or the prime number closest to n otherwise. The sieving algorithm is based on an automated procedure derived from a theorem established in [34], which states that every prime number can be written as the difference between the product of the first m prime numbers and a prime number obtained at the previous level of gridding. Thus, using multiscale analysis, every prime number can be written as in Equation (4):
$$p_{m,j} = k \cdot \prod_{s=1}^{m} p_s \;-\; p_{m-1,j'}, \tag{4}$$
where k is the counting index, $p_{m-1,j'}$ is a prime number obtained at stage m − 1 (or the value −1 or +1), j is the level of decomposition, and $p_{m,j}$ is the acquired prime number. The algorithm thereby accumulates, step by step, the candidate prime numbers useful for multiscale generation, up to the prime nearest to the input. Therefore, the prime input items to the algorithm are the following, as in (5); it is the product of the first m primes that represents the multiscale level and is the basis for generating the prime candidates:
$$P_m = \prod_{s=1}^{m} p_s, \qquad p_1 = 2,\; p_2 = 3,\; p_3 = 5,\; \dots \tag{5}$$
At each multiscale level, the counting index k can range from 1 to a value $k_{\max}$, which changes at every level; specifically, the last candidate takes the value of the prime number that serves to generate the next multiscale level. After forming the product between the multiscale level $P_m$ and k, we subtract $p_{m-1,j'}$, i.e., each prime number obtained at the previous multiscale level.
As proved in a recent work [4], the suggested multiscale algorithm is not only feasible but also offers better performance than other sieves.
To clarify the procedure, consider the case n = 37. Instead of performing the classic divisibility tests, with their well-known computational costs, the multiscale sieve generates candidate prime numbers by evaluating the product of the first m = 4 prime numbers (2 × 3 × 5 × 7 = 210) and subtracting the known prime numbers from the previous level. For example, 210 − 173 = 37, which is itself a prime number. This confirms the recursive nature of the sieve and validates the generation path with respect to Formula (4). All variables are explicitly indicated as $p_{ij}$, where i indicates the level and j the subcomponent.
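For illustration only, the following Python sketch implements the candidate generation just described, using the reconstructed form of Equation (4) and the n = 37 example; the function names are ours and this is not the reference implementation of [4]:

```python
from math import prod

def is_prime(x: int) -> bool:
    if x < 2:
        return False
    d = 2
    while d * d <= x:
        if x % d == 0:
            return False
        d += 1
    return True

FIRST_PRIMES = [2, 3, 5, 7, 11, 13]

def multiscale_candidates(m, previous_level_primes, k_max=1):
    """Candidates at level m: k * (product of the first m primes) minus a prime
    obtained at level m - 1, kept only if the result is itself prime."""
    base = prod(FIRST_PRIMES[:m])                 # e.g. m = 4 -> 2*3*5*7 = 210
    found = set()
    for k in range(1, k_max + 1):
        for p_prev in previous_level_primes:
            c = k * base - p_prev
            if c > 1 and is_prime(c):
                found.add(c)
    return sorted(found)

# Worked example from the text: 210 - 173 = 37, which is prime.
print(multiscale_candidates(4, [173]))            # -> [37]
```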
5. Quantum-Inspired Algorithms and Their Development
Schrödinger's equation provides a theoretical basis for most quantum cryptography techniques, from random number generation to secure key exchange (for more details, see Appendix A in addition to the previous section). Although actual implementations must take into account advanced quantum hardware (e.g., a quantum computer or a QRNG device), the concepts can also be used within classical simulations for the purpose of improving cryptographic security. There are five programs that generate a 2048-bit decimal string (i.e., 256 decimal digits) for each of the situations described above (see Appendix B for the code). Although the random generators are classical pseudo-random number generators, we use them here as a baseline to show how specific post-processing techniques inspired by quantum mechanics can increase entropy, reduce statistical bias, and approximate certain statistical characteristics observed in quantum randomness. This comparison does not aim to equate pseudo-randomness with quantum randomness, but rather to highlight conceptual parallels and evaluate the effectiveness of improved classical methods. The comparison serves to explore hybrid models within quantum-inspired cryptography and information theory:
- (1) Key generation using the eigenvalues of the Schrödinger equation. We use a numerical method to solve the Schrödinger equation and generate random numbers based on its eigenvalues.
- (2) Emulated quantum random number generator (EQRNG). This emulates a quantum random number generator based on probabilistic measurements.
- (3) Computational quantum key distribution (CQKD). This emulates the BB84 protocol for secure key exchange.
- (4) Quantum simulations for pseudo-random numbers. We use a quantum system with arbitrary potential to generate a sequence of keys.
- (5) Quantum hash function based on wave dynamics. We use a quantum Fourier transform (QFT) to generate a hash-like function.
Each algorithm generates a 2048-bit key (256 decimal digits) based on different quantum principles:
Eigenvalues of the Schrödinger equation (numerical quantum physics—this work),
QRNG (quantum random number generation),
QKD (BB84 protocol for secure key exchange),
Quantum simulation (extraction of quantum probability distributions),
Quantum hashing (hashing based on quantum waves).
These keys can be used for advanced cryptography and security applications. We conducted a comparative analysis of the five quantum key generation methods based on performance metrics, including execution time, entropy (randomness evaluation based on NIST guidelines), mean value, and standard deviation. The detailed results are reported in the accompanying table.
Additionally, we generated the following graphical comparisons: (1) an execution time comparison, which displays the computational efficiency of each method (see Figure 2); (2) an entropy comparison, which evaluates the randomness of the generated keys, a property that is crucial for cryptographic security (see Figure 3).
6. Analysis of Results for the First Implementation
A comparative analysis of the five methods of quantum key generation (Schrödinger eigenvalues, quantum random number generator (QRNG), quantum key distribution (QKD), quantum simulation, and quantum hashing) highlights different performance characteristics based on different criteria:
Execution time: the time required to generate a 2048-bit key varies depending on the methods, reflecting differences in computational complexity and resource requirements.
Entropy: this metric evaluates the randomness of the generated keys. Higher entropy values indicate results closer to ideal randomness, which is essential for cryptographic security.
Mean and standard deviation: these statistical measures provide information about the distribution of key values, with ideal random sequences exhibiting specific expected behaviours.
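As an indication of how such metrics can be computed in practice, the following generic snippet estimates the Shannon entropy (in bits per symbol) of a decimal key string; it is illustrative only and not necessarily the exact evaluation script used for our tables:

```python
import numpy as np

def digit_entropy(key: str) -> float:
    """Shannon entropy, in bits per symbol, of the decimal digits of a key
    (the theoretical maximum for digits 0-9 is log2(10), about 3.32)."""
    digits = np.frombuffer(key.encode(), dtype=np.uint8).astype(np.int64) - ord("0")
    counts = np.bincount(digits, minlength=10)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log2(p)).sum())

print(digit_entropy("3141592653589793238462643383279502884197"))
```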
We compare these results with those reported in the previous literature. Over the last decade, enormous progress has been made in quantum key generation techniques. Among the most interesting developments are:
Quantum key distribution with decoy states: To address vulnerabilities in real-world QKD systems, such as photon number splitting attacks, the decoy state method was created. This method uses more than one intensity level at the transmitter source so that interception attempts can be detected and the secure transmission rate increased.
Twin-field QKD: Developed in 2018, this overcomes the rate-distance limitations of traditional QKD by enabling secure key exchange over longer distances without the need for quantum repeaters. Experimental demonstrations have shown that it can beat the repeaterless secret key capacity bound.
Quantum random number generators (QRNGs): Technological advances in the field of QRNGs have enabled hardware solutions based on genuine quantum effects that produce truly random (rather than merely pseudo-random) numbers. Such generators are essential for cryptographic applications that require high-quality randomness.
References [30,31,32,33] provide a clear summary of the progress made over the last decade in quantum key generation methods, with the continued pursuit of improved security and efficiency in cryptographic applications. We have developed two comparative graphs:
Comparison of execution times: this graph illustrates the computational efficiency of our five quantum key generation methods compared to those reported in the scientific literature.
Comparison of entropy: this graph shows the random quality of the generated keys, comparing our methods with established quantum cryptography techniques from published research.
Figure 4 and Figure 5 provide detailed information on the performance of our approaches compared to state-of-the-art methods in terms of speed and cryptographic randomness, respectively.
To assess the starting point, we ranked quantum key generation algorithms based on execution time (faster is better) and entropy (greater randomness is better), considering the ten best compared to our hybrid solutions based on prime numbers and inspired by quantum mechanics. Our ranking of the five methods is as follows: (i) QRNG: 2nd place, (ii) QKD: 3rd place, (iii) quantum simulation: 5th place, (iv) Schrödinger eigenvalues: 6th place, and (v) quantum hashing: 7th place.
This analysis shows that QRNG and QKD (where we refer to computational QKD as in [21]) achieve the best results among our approaches, closely competing with the state-of-the-art methods in the literature. However, Schrödinger eigenvalues and quantum hashing ranked lower due to their longer execution times and lower entropy values. Furthermore, we would like to highlight that this work introduces a quantum-mechanics-inspired key generation strategy that has shown strong statistical randomness based on the NIST STS. However, it does not yet meet the requirements of full NIST PQC validation. Future developments will focus on defining and integrating formal hybridisation mechanisms within this framework so that the resulting implementations can be individually submitted to the full NIST PQC evaluation pipeline.
7. Optimisation and Improvements of Results
After optimising our five quantum key generation algorithms, we saw the following improvements:
Schrödinger eigenvalues. Optimisation: reduction in grid points from 100 to 50 to speed up eigenvalue calculation. Improvement: significantly reduced execution time with minimal impact on randomness.
Quantum random number generator (QRNG). Optimisation: use of numpy’s random.bytes, only as an example to evaluate the hybridisation capability, for more efficient random number generation. Improvement: faster execution speed with no loss of entropy.
Quantum key distribution (QKD). Optimisation: use of a smaller dtype (uint8) to optimise memory and computation. Improvement: improved execution times with unchanged entropy levels.
Quantum simulation. Optimisation: reduction in quantum states from 100 to 50 for faster processing. Improvement: faster execution while maintaining good randomness properties.
Quantum hashing. Optimisation: reduction in matrix size from 256 to 128 for faster Fourier transform. Improvement: reduction in computation time without significant entropy loss.
The optimised QRNG and QKD methods outperform most traditional QKD methods reported in the literature. The quantum simulation and Schrödinger methods improve their ranking thanks to gains in execution speed. Twin-Field QKD remains the best approach overall thanks to its exceptional security and efficiency properties. The updated graphs visually confirm these improvements (see Figure 6 and Figure 7).
To try to surpass Twin-Field QKD (Lucamarini et al., 2018 [31]) and ensure that at least one of our algorithms is the best, we applied extreme optimisation to the quantum random number generator (QRNG). Below we describe the improvements made to the optimised QRNG.
Direct decimal generation. Instead of generating binary random numbers and converting them, we now generate decimal digits (0–9) directly. This reduces computational overhead and speeds up execution.
Optimised data type. We use np.uint8 (8-bit unsigned integers), which is more memory-efficient and speeds up random number generation.
Minimal processing steps. By removing unnecessary operations (e.g., binary conversion), we minimise execution time.
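For illustration, a minimal sketch of such a direct-decimal generator is shown below; the function name is ours, and the actual ultra-optimised code is reported in Appendix C:

```python
import numpy as np

def ultra_optimised_qrng(n_digits: int = 256, rng=None) -> str:
    """Draw decimal digits 0-9 directly as uint8 values, with no intermediate
    binary-to-decimal conversion step."""
    rng = rng or np.random.default_rng()
    digits = rng.integers(0, 10, size=n_digits, dtype=np.uint8)
    return "".join(map(str, digits))

key = ultra_optimised_qrng()
print(len(key), key[:32])
```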
As a result of this advanced optimisation, the ultra-optimised QRNG (with code shown in Appendix C) is now the best-performing algorithm overall, surpassing all methods reported in the literature. The final ranking is as follows:
Ultra-optimised QRNG (our method)—now the best algorithm.
Twin-Field QKD (Lucamarini et al., 2018 [31])—drops to 2nd place.
Optimised QKD (our method)—now in 3rd place.
Repeaterless QKD (Pirandola et al., 2017 [32])—still in 4th place.
Optimised quantum simulation (our method)—now in 5th place.
Optimised Schrödinger (our method)—now in 6th place.
Optimised quantum hashing (our method)—7th place.
QKD with decoy state (Hwang, 2003 [30])—8th place.
QKD with decoy state (Lo et al., 2005 [26])—9th place.
QKD with unidirectional decoy (Yuan et al., 2007 [33])—still in last place.
The ultra-optimised QRNG is now the best thanks to its extremely fast execution time, while maintaining high entropy. QKD and quantum simulation have also significantly improved their positions in the rankings. Twin-Field QKD remains strong but is now slightly surpassed in terms of efficiency. The updated rankings and graphs confirm that our quantum random number generator (QRNG) is now the best-performing quantum key generation algorithm (see Figure 8 and Figure 9).
It is important to emphasise that the comparison made should be considered a performance benchmark, which remains useful as long as traditional computers exist or, more precisely, as long as they coexist with quantum computers. This is because approaches inspired by quantum mechanics are unable to provide the true randomness achievable with pure quantum systems or quantum computing. For this reason, the use of hybridisation becomes particularly significant, especially when combined with highly efficient algorithms based on multiscale prime number generation.
8. Discussion on Security Implications of Optimised Quantum Key Generation Algorithms
Quantum key generation algorithms must balance efficiency and security. Speed and entropy are important, but optimisation choices can also affect the security guarantees, particularly in the face of specific threats. Let us now evaluate the impact of our optimisations on security.
1. Security of the ultra-optimised QRNG. Summary: it directly produces decimal digits (0–9) instead of converting them from binary, and it uses the uint8 data type for low memory usage and better performance. Since the QRNG uses classical pseudo-random number generation (np.random.randint), it does not exploit quantum principles, which could reduce its security guarantees. Unlike hardware-based QRNGs that rely on quantum noise or quantum vacuum fluctuations, the current implementation is purely computational and therefore vulnerable to predictability in some cryptographic applications; classical pseudo-random generators can be seeded and potentially reconstructed if an attacker knows the initial state of the system.
True quantum randomness is necessary for the security of cryptographic keys in applications such as quantum cryptography and QKD. To further improve security, we could replace np.random.randint() with a hardware-based quantum random number generator (e.g., based on quantum optics or radioactive decay).
2. Optimised QKD security (BB84). Summary: use of the uint8 data type for faster speed and lower memory overhead, and a simplified bit-matching process to generate keys faster. Quantum security guarantee: the BB84 protocol, when used correctly, is unconditionally secure against classical attacks, and the additional security of quantum key distribution does not rely on computational assumptions, meaning that even quantum computers cannot break it. Device-specific attacks: realistic QKD systems are vulnerable to side-channel attacks exploiting detector imperfections (e.g., photon number splitting attacks), and to man-in-the-middle attacks: if the attacker can influence the channel (e.g., by introducing leaks or noise), then they will attempt to read information from the quantum states.
3. Optimised Schrödinger key generation security. Summary: reduction in grid points from 100 to 50 to improve computational efficiency. Quantum robustness: this algorithm is based on the eigenvalues of a quantum system, which remain unguessable due to the collapse of the wavefunction, and the eigenvalues retain their intrinsic randomness, making them more secure than classical PRNGs. Numerical approximation risks: reducing the grid points can affect numerical accuracy and introduce distortions in key generation, and if the attacker is aware of the approximation model, then they may be able to exploit weaknesses in the distribution of eigenvalues. Instead of reducing grid points, one can use parallel computing via GPU acceleration to provide security without performance degradation. In our case, the solution was achieved through a non-trivial choice of the potential within the Schrödinger equation, making it impossible to predict the eigenvalues.
4. Security of optimised quantum simulation key generation. Summary: reduction in quantum states from 100 to 50 to improve computational efficiency. Quantum probability security: the approach is based on the probabilistic results of a simulated quantum system, which makes the keys very hard to predict, and the randomness of the wavefunction offers protection against classical computational attacks. Simulation-based vulnerabilities: an attacker who knows the parameters of the quantum system could recreate the wavefunction and predict the keys, and the pseudo-randomness of the quantum simulation can introduce potential deterministic patterns. The solution is to inject chaotic quantum systems (e.g., through many-body physics or quantum cellular automata) to prevent predictability, and to avoid oversimplifying the model, as this could make it easier for an attacker to reverse engineer it.
5. Optimised quantum hashing security. Summary: reduction in the matrix size from 256 to 128 to speed up the quantum Fourier transform (QFT). Strength of quantum hashing: the QFT provides non-invertibility, making it a natural candidate for hash functions and digital signatures, and its quantum-resistance properties make it extremely resistant to classical attacks. Risk: reducing the matrix size weakens security, since a smaller transformation matrix reduces the complexity of the key space and thus facilitates collision attacks; if an attacker discovers a pattern in the hash, then they can deduce the relationships between inputs and outputs. The solution is to limit the reduction in matrix size and to use sparse QFT techniques to achieve speed optimisation without compromising security. Combining quantum hashing with lattice-based cryptography makes it resistant to quantum attacks.
Although the current implementation emphasises performance and entropy, we acknowledge the absence of a formal security proof in PQC terms: we rely on the standard NIST key randomness tests, which, although correct, do not address PQC. PQC itself is not the subject of this work, as we are more interested here in solutions for the transition period from traditional to quantum computing. Here we introduce a brief comparison with the post-quantum standards Kyber and Dilithium, which are based on hard lattice problems. Our approach differs in that it focuses on structural obfuscation and information fusion; although it does not currently offer a reduction-based security proof, future work will aim at formalisation under accepted post-quantum assumptions, also analysing the effect of standards such as Kyber and Dilithium in a hybrid setting such as the current one.
As final considerations on security when quantum hardware becomes commercially available, we can state the following:
Ultra-optimised QRNGs require true quantum entropy sources. Replace the pseudo-random generator with quantum vacuum fluctuations or radioactive decay. Use hardware-based QRNG devices for real-world cryptographic applications.
QKD should implement advanced countermeasures. Introduce decoy-state QKD to protect against photon number splitting. Use error correction and privacy amplification to mitigate side-channel risks.
Schrödinger and quantum simulation methods should use stronger potentials. Increase computational complexity using multiparticle wavefunctions. Avoid overly simplistic approximations that could introduce distortions.
Quantum hashing should maintain high matrix complexity. Maintain large transformation matrices by optimising with sparse QFT algorithms. Implement post-quantum cryptographic hybridisation to ensure security longevity.
Let us consider the trade-off between security and efficiency. Optimisation has significantly improved speed, but security must remain a priority. The ultra-optimised QRNG is now the fastest algorithm, but its dependence on classical randomness weakens its cryptographic strength. QKD remains the most secure but requires careful hardware implementation. Schrödinger, simulation and hashing methods require parameter tuning to balance speed and unpredictability. These insights ensure that future implementations of quantum key generation will maintain high-speed performance while remaining cryptographically inviolable.
9. Information Fusion Infrastructure Overview
Below is a block diagram of the main stages of the system that enable data fusion. The diagram is divided into three parts:
Quantum algorithm: used to extract the quantum subkey;
RSA: used to create the numerical code and private key;
QPIF algorithm: used for data fusion.
The numerical part is represented by the product of two prime numbers (Modulus). The QPIF algorithm was created to obtain a fusion key from quantum and numerical data. The conversion of two vectors into a matrix is one of the most crucial steps in this algorithm. This matrix must be square and its order must depend on the number of components of the two vectors. To this end, the following steps must be performed:
Let $\mathbf{a}$ and $\mathbf{b}$ be two vectors, where $\mathbf{a}$ contains the quantum component and $\mathbf{b}$ the product of the two prime numbers. As in [1], let N be the total number of components, as in (6):
$$N = \dim(\mathbf{a}) + \dim(\mathbf{b}). \tag{6}$$
Let d be the whole number containing the square root of N, as in (7):
$$d = \left\lceil \sqrt{N}\, \right\rceil. \tag{7}$$
To obtain a square matrix, it is necessary that the following relations (8) and (9) are satisfied:
$$\mathrm{Pad}_a \ge 0, \qquad \mathrm{Pad}_b \ge 0, \qquad \mathrm{Pad}_a + \mathrm{Pad}_b = d^2 - N. \tag{8, 9}$$
These are the numbers of padding elements for the two vectors. Therefore, we can represent the padded vectors $\mathbf{a}_1$ and $\mathbf{b}_1$, whose first elements are equal to those of $\mathbf{a}$ and $\mathbf{b}$ and whose last elements are the $\mathrm{Pad}_a$ components added to the first vector and the $\mathrm{Pad}_b$ components added to the second, respectively. Therefore, the dimensions will be as in (10) and (11), respectively:
$$\dim(\mathbf{a}_1) = \dim(\mathbf{a}) + \mathrm{Pad}_a, \qquad \dim(\mathbf{b}_1) = \dim(\mathbf{b}) + \mathrm{Pad}_b. \tag{10, 11}$$
The padding is achieved using some components of the same vector, chosen by the private key: the first component of the key is the index of the first value to be used as padding; the second component is added to the previous index and gives the index of the second padding value, and so on, in a loop. We then split each vector into blocks that will become the rows of the matrix U, into which the two vectors are placed as in (12) and (13), and finally the calculated padding is added, in this phase, to the two vectors.
In the formation of the matrix U, where to stack the blocks is determined by the private key formed by the RSA algorithm (considered itself as a vector). Vector $\mathbf{a}_1$ is given index "0" and vector $\mathbf{b}_1$ index "1". The first element of the private key (mod 2) determines with which of the two vectors the matrix formation begins. Additionally, the value of the first element of the private key vector also dictates the number of blocks of that vector to be allocated in the matrix, the value of the next element dictates the number of blocks of the other vector, and so on, alternately. If all blocks of one vector have been inserted, then the remaining blocks of the second vector are inserted one after the other.
Finally, one must verify that the obtained matrix is really square since, in some cases, the number of blocks can be greater or smaller than the number of elements in each block, giving a rectangular matrix. In this case, it is enough to add the number of rows or columns necessary for the matrix to become square. As in (15) and (16), let the additional padding quantities be defined, where Pad is defined in (14) and PadTot in (15). Then, three cases can be distinguished, as in (17)–(19).
The private key determines the row or column of padding to be inserted. In creating the matrix, the value of the first component of the key dictates the element of the first row from which the first component of padding is taken; the value of the second component of the key dictates the element of the first column from which the second component of padding is taken, and so on.
Having built the square union matrix U, we post-multiply U by a permutation matrix P, so that its columns are permuted. The permutation matrix is chosen according to the private key of the cryptographic algorithm. This matrix, which has the same dimensions as U, is composed of six 3 × 3 permutation matrices, obtained from the same base matrix as in (20). The i-th matrix, selected by the k-th value of the key vector component (modulo 6), is placed on the main diagonal of the final permutation matrix, so that there is always exactly one non-zero element in each column and row, hence a block-diagonal matrix. We must verify that the size of the union matrix U is a multiple of three; otherwise, the last diagonal block of the permutation matrix P is padded with an identity matrix of order 1 or 2 as an extra diagonal block.
We then obtain the product matrix $U\,P$, as in (21), which is divided and arranged row by row to form the output vector V, i.e., the hybrid quantum code.
Once obtained, the hybrid quantum code is directly linked to the private key generated by the RSA algorithm, to the quantum code, and to the numerical code of native RSA. The data produced by the hybrid quantum code appear random to the eye of an external user. In an identification process based on this code, a user who does not have the private key cannot be identified. The code is reversible only with the key; the QPIF algorithm is fully reversible (provided, naturally, that the quantum code is available). To recover the original information, one only needs to retrace the steps of the QPIF algorithm backwards. With the private key, the permutation matrix P can readily be transposed, considering the transposes of the single diagonal blocks as in (22) and (23), and remembering that
$$P\,P^{\mathsf{T}} = I, \tag{22, 23}$$
whereas for the symmetric blocks, as in (24),
$$P_i^{\mathsf{T}} = P_i. \tag{24}$$
The union matrix U is then recovered as in (25):
$$U = (U\,P)\,P^{\mathsf{T}}. \tag{25}$$
Hence, after eliminating the possible padding, the two vectors are obtained: that of the quantum component and that of the product of the two prime numbers.
10. Fusion Effects: An Example, Results, Discussion, and Perspectives
The JDL model acts here not only as a conceptual abstraction, but also as a structural constraint: it ensures that the hybridisation of entropy sources (an approach inspired by quantum mechanics and multiscale sieving) occurs in a logically correct and semantically layered manner. Level 1 fusion maps the combination of vector elements in the U matrix, while level 2 maps the context-sensitive permutation via private keys. These mappings reinforce consistency and interpretability between cryptographic layers.
To illustrate the fusion mechanism described in Section 9, we present a complete numerical example that includes the phases of binary filling, reshaping into a structured matrix, private-key-based permutation, and final obfuscation via an exclusive OR operation with an entropy-derived key. This example has been intentionally simplified in terms of size for clarity, but it retains the architectural principles of the large-scale implementation.
Step 1: Filling the binary key sequence.
Consider an initial binary sequence generated by the quantum-inspired pseudo-random number generator:
Initial sequence (binary): 11010101 10011011 011001.
This sequence contains twenty-two bits, which are not aligned to the nearest multiple of eight. To facilitate matrix-based transformation, the sequence is padded with two additional bits to reach a length of twenty-four bits (three full bytes):
Filled sequence: 11010101 10011011 01100100.
The filling method can use fixed patterns (e.g., zeros) or low-entropy suffixes, depending on the entropy profile and security requirements.
Step 2: Reorganisation into a byte array.
The three filled bytes are converted to their decimal equivalents:
11010101 = 213.
10011011 = 155.
01100100 = 100.
These values are then organised into a 2 × 3 matrix in row order. Since only three bytes are available, the remaining elements of the matrix are filled with zeros or random entropy from the PRNG:
Matrix before permutation: [213, 155, 100], [0, 0, 0].
This two-dimensional reorganisation facilitates the application of structured permutations using matrix transformation techniques.
Step 3: Column permutation using a private key.
Let us assume that the private key associated with the merge engine contains a permutation vector that indicates a reordering of the columns of the matrix. For example:
Permutation vector: [2, 0, 1].
This indicates that column 2 must become column 0, column 0 becomes column 1, and column 1 becomes column 2. Applying this permutation results in the following transformed matrix:
Matrix after column permutation: [100, 213, 155], [0, 0, 0].
The permutation introduces key-dependent diffusion, which improves resistance to statistical inference and provides entropy dispersion.
Step 4: Matrix flattening.
The permuted matrix is flattened again into a linear sequence of bytes:
Flattened sequence: [100, 213, 155, 0, 0, 0].
Corresponding binary: 01100100 11010101 10011011 00000000 00000000 00000000.
This re-linearised form serves as the intermediate merged output before final obfuscation.
Step 5: Exclusive OR with entropy-derived key.
Let us assume that a six-byte entropy key has been independently derived or sampled:
Entropy key (binary): 10101100 11110000 00001111 11111111 10101010 11001100.
The fusion engine performs a byte-by-byte exclusive OR (XOR) between the flattened sequence and the entropy key (see Table 2).
The result of this operation is a cryptographically fused key sequence:
Final binary output: 11001000 00100101 10010100 11111111 10101010 11001100.
Final hexadecimal output: C8 25 94 FF AA CC.
This output can be used directly as a symmetric key, passed to a key derivation function for higher-level cryptographic protocols, or encapsulated in a post-quantum scheme such as Kyber for secure transmission.
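For completeness, the five steps above can be reproduced with a short Python sketch (toy sizes; the permutation vector and the entropy key are hard-coded from the example, and this is not the production fusion engine of Section 9):

```python
import numpy as np

bits = "1101010110011011011001"                  # Step 1: 22-bit input sequence
bits = bits.ljust(24, "0")                       # pad with two zero bits

data = [int(bits[i:i + 8], 2) for i in range(0, 24, 8)]   # Step 2: 213, 155, 100
matrix = np.zeros((2, 3), dtype=np.uint8)
matrix[0, :] = data                              # second row stays zero-filled

matrix = matrix[:, [2, 0, 1]]                    # Step 3: key-driven column permutation

flat = matrix.flatten()                          # Step 4: [100, 213, 155, 0, 0, 0]

entropy_key = np.array([0xAC, 0xF0, 0x0F, 0xFF, 0xAA, 0xCC], dtype=np.uint8)
fused = flat ^ entropy_key                       # Step 5: byte-wise XOR

print(" ".join(f"{int(b):02X}" for b in fused))  # -> C8 25 94 FF AA CC
```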
In terms of security implications, the fusion process introduces structural and stochastic transformations that reduce predictability, increase bit entropy, and frustrate reverse engineering attempts. By layering the reshaping and permutation operations with key-dependent XOR, the method achieves resistance to differential attacks and preserves entropy diffusion, even in the presence of limited entropy sources or partially compromised pseudo-randomness. As a result, we obtain three advantages: security driven by a multiscale prime number generation algorithm, security based on a quantum-inspired approach, and a third ingredient driven by a specific information fusion technique.
Thanks to the fusion of the QPIF described above, it is possible to obtain the corresponding hybrid quantum key as a numerical vector similar to a complex string of numbers.
The hybrid quantum key is obtained from the product of the two prime numbers and the private key (1024 bits) generated using the RSA algorithm. From the fusion of this information, as mentioned above, we obtain a hybrid quantum key whose chained information appears very random to an intruder. The critical aspect is to understand how robust the key is against cryptographic attacks; this analysis must be carried out in order to verify the randomness of the constructed key and thus determine whether this solution is suitable for high-security environments. The NIST (National Institute of Standards and Technology) provides a collection of statistical tests that validate the randomness of a fixed-length bit sequence. Following the fusion procedures described above, the NIST tests are performed on a sample of 100 hybrid keys. To perform these tests on the keys obtained, some modifications are necessary: a modulo operation is required so that the 100 keys all have the same size, and a binary conversion has been performed. The NIST tests used are:
Frequency: The proportion of 0s and 1s present in the sequence.
Block frequency: The proportion of 0s and 1s present in the M-bit blocks of the sequence.
Cumulative sums: The maximum distance from 0 of a random mode defined by the cumulative sum in which 0s and 1s of the sequence are replaced by −1 and +1, respectively.
Runs: The total sum of runs in the sequence, where a run is a continuous sequence of identical bits.
Longest run: The maximum length of a run of 1 within M-bit blocks of the sequence.
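For reference, the first of these tests, the frequency (monobit) test of NIST SP 800-22, can be computed with a few lines of Python; this generic snippet is illustrative and not the official NIST STS implementation:

```python
from math import erfc, sqrt

def monobit_p_value(bits: str) -> float:
    """NIST SP 800-22 frequency (monobit) test: p >= 0.01 is taken as
    consistent with randomness."""
    n = len(bits)
    s = sum(1 if b == "1" else -1 for b in bits)
    return erfc(abs(s) / sqrt(2 * n))

print(monobit_p_value("11010101100110110110010011001010"))
```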
The test results are summarised by the p-value, a value between 0 and 1 that is used to assess the randomness of the sequence. In general, p-values greater than 0.01 indicate that the sequence can be considered random. In these tests, 1024-bit RSA keys are used, and each code generated from the 100 irises is 2048 bits in size. The results obtained are shown in Table 3.
This work should be considered a conceptual framework for the development of hybrid algorithms inspired by quantum mechanics. Although current NIST randomness tests demonstrate its immediate applicability, the approach presented here will require further evolution with the advent of quantum computers to offer full quantum resistance guarantees. In this regard, the authors have already experimented with a quantum-resistant computational infrastructure (see, for example, [35,36]), which exploits four principles that could be quickly integrated into the current work:
Fragmentation—already implemented here through the information fusion layer.
Obfuscation—achieved by composing a hybrid key that combines a numerical component derived from multiscale prime sieving with another component based on randomness inspired by quantum physics.
Computational puzzle—a mechanism yet to be implemented, in which key fragmentation introduces reconstruction difficulties for unauthorised parties, as it would require navigation in complex multidimensional spaces inaccessible without adequate authorisation (i.e., the legitimate owner).
Distributed key demons—two logical agents responsible for key recomposition and maintaining crypto-agility by (i) distributing sub-blocks of the key across different physical or logical units and (ii) allowing the temporary mixing of these sub-blocks, including exchanges of information similar to interactions between non-elementary quantum particles which, through interaction, emerge with altered physical properties (similar to entanglement and the subsequent diffusion of non-identical particles).
Although this work is only a first step and several limitations remain, it provides a promising foundation. Future efforts will focus in particular on the further development of aspects 3 and 4 described above.
Another relevant point considered here, but which will be examined more formally in the future, concerns cryptographic agility. The contribution to the security of the fusion path lies in the dynamic and non-linear transformation of the key material, which makes reverse engineering and statistical correlation significantly more difficult for potential attackers. This robustness emerges from multiple levels of transformation, culminating in the final exclusive OR (XOR) with a high-entropy key. However, it is not only the XOR operation that ensures greater security. Rather, the fusion process that precedes the XOR, in particular the internal reorganisation and quantum manipulation of the key structure, is crucial.
More specifically, our algorithm employs a fragmentation and dispersion mechanism inspired by the principles of quantum dispersion theory. After the XOR operation has produced an intermediate key matrix, the system dynamically fragments the rows of the matrix and collides them using random transformations. At irregular intervals, two contiguous rows, referred to as row *m* and row *m + 1*, are selected by a demon-like scheduler when the system is idle. These rows are then split asymmetrically into fragments (e.g., a 70%/30% split), producing four components: FrA1, FrA2 from row *m* and FrB1, FrB2 from row *m + 1*. A cross-mixing operation is applied:
* Fragment FrA1 is combined with FrB2 to form a new row.
* Fragment FrB1 is combined with FrA2 to form another row.
These recombined rows replace the original rows in the matrix, altering the spatial and informational distribution of the key material. This creates a diffusion effect similar to quantum decoherence: Small differences in the initial values lead to significantly divergent results, a phenomenon often associated with chaotic systems.
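A compact, illustrative sketch of this cross-mixing step is given below; the 70%/30% split ratio is taken from the description above, while the idle-time, demon-like scheduling is omitted:

```python
import numpy as np

def cross_mix_rows(matrix: np.ndarray, m: int, split: float = 0.7) -> np.ndarray:
    """Split rows m and m+1 asymmetrically (e.g. 70%/30%) into FrA1, FrA2 and
    FrB1, FrB2, then recombine them crosswise: FrA1+FrB2 and FrB1+FrA2."""
    out = matrix.copy()
    cut = int(round(split * matrix.shape[1]))
    fr_a1, fr_a2 = out[m, :cut].copy(), out[m, cut:].copy()
    fr_b1, fr_b2 = out[m + 1, :cut].copy(), out[m + 1, cut:].copy()
    out[m] = np.concatenate([fr_a1, fr_b2])       # FrA1 + FrB2
    out[m + 1] = np.concatenate([fr_b1, fr_a2])   # FrB1 + FrA2
    return out
```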
Each of these transformations increases the entropy and unpredictability of the final result of the key. Since permutations are influenced by a private key and depend dynamically on time, even slight variations in input or execution times produce completely different keys. This crypto-agility contributes to post-quantum resilience and greatly increases the complexity of brute force or structure-based attacks.
The cumulative effect of these steps (padding, reshaping, private permutation, XOR, and quantum-inspired cross-fragmentation) constitutes a non-reversible fusion path. The non-linearity, increased entropy, and lack of temporal determinism make the output cryptographic key significantly more secure than that obtained by applying only traditional PRNG and XOR techniques.
Finally, from a scalability point of view, with regard to the multiscale sieve, it is clear that by shifting the complexity from computing power to memory, broad scalability is achieved. The limitation is more related to the quantum-inspired aspects, as these can be considered examples of the strength of a hybrid algorithm. However, it should be added that, given the modularity of the conceptual architecture, it lends itself both to accommodating different algorithms and to finding application in new-generation hybrid and post-quantum solutions.
11. Conclusions
This work exploits a fundamental principle of quantum mechanics, wavefunction collapse, to generate an intrinsically random encryption subkey. This is particularly useful for quantum cryptography applications and can also be further developed to create quantum-resistant encryption systems. Future development could improve this idea by incorporating the BB84 protocol or quantum entanglement, which would further enhance the security of key distribution. This implementation provides a good basis for exploring quantum-based cryptographic applications. Next, a second subkey was generated using a special multiscale algorithm for prime number generation. Last but not least, in the context of the JDL paradigm with its new implementation presented here, a specific key generator was developed so that the two subkey generators could be combined to produce an ultra-secure quantum-prime hybrid algorithm for key generation. In addition, post-quantum cryptography (PQC), such as lattice-based algorithms, including Kyber and Dilithium, also provides resistance to quantum attacks.
Thanks to the conceptual innovation introduced by MuReQua and DeSSE, the updated version of the algorithm will incorporate not only the first two aspects described above (fragmentation and obfuscation), but also the remaining two, thus becoming fully crypto-agile. This evolution will enable the use of post-quantum cryptographic schemes, such as Kyber and Dilithium, in place of the previously mentioned NIST tests, as the resulting system will be fully post-quantum and not simply hybrid and inspired by quantum mechanics.