Classical Data in Quantum Machine Learning Algorithms: Amplitude Encoding and the Relation Between Entropy and Linguistic Ambiguity
Abstract
1. Introduction
1. Following Lorenz et al. [3], we train variational quantum circuits to learn the meanings of words and sentences in a dataset. We encode nouns on two qubits, in contrast to the one qubit of the original study, and we use an additional dataset closely related to the one used there.
2. We investigate the effect of amplitude-encoding classical data into the models trained in the previous step.
3. We then investigate the effect that classical data (introduced via amplitude encoding) has on the relation between the ambiguity of a sentence and the von Neumann entropy of the quantum state representing it.
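As a minimal illustration of the amplitude encoding referred to in steps 2 and 3: a classical vector is normalized and its entries are read as the amplitudes of a quantum state, so a four-dimensional embedding fits on two qubits. The function and the embedding below are a hypothetical sketch, not taken from the paper's code:

```python
import numpy as np

def amplitude_encode(x):
    """Normalize a classical vector so it can serve as the
    amplitude vector of a quantum state (amplitude encoding)."""
    x = np.asarray(x, dtype=float)
    norm = np.linalg.norm(x)
    if norm == 0:
        raise ValueError("cannot encode the zero vector")
    return x / norm

# A 4-dimensional word embedding fits on two qubits (2**2 amplitudes).
embedding = [0.3, -1.2, 0.5, 0.9]   # hypothetical word vector
state = amplitude_encode(embedding)
assert np.isclose(np.sum(state**2), 1.0)  # squared amplitudes sum to 1
```

Note that amplitude encoding discards the length of the classical vector; only its direction survives in the quantum state.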
2. Related Work
2.1. Quantum Natural Language Processing
2.2. Theoretical Foundations of QNLP
2.3. The DisCoCat Framework
2.4. Quantum Computing
2.5. From Linguistics to Quantum Circuits
2.6. Density Matrices
3. The Underlying Replication Task
1. NumPy model: uses the Python library NumPy (https://numpy.org (accessed on 14 August 2024)). Quantum circuits are converted to tensor networks, and the Simultaneous Perturbation Stochastic Approximation (SPSA) optimiser is used to estimate the gradient. The simulation is noiseless, and the model cannot be run on real quantum hardware.
2. pennylane model: uses the Python libraries pennylane (https://pennylane.ai/ (accessed on 12 August 2024)) and PyTorch (https://pytorch.org/ (accessed on 12 August 2024)). Both state-vector and density-matrix simulations can be performed. The pennylane model uses exact backpropagation, in contrast to the NumPy model's SPSA gradient estimation. This model can be run on real quantum hardware.
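The SPSA estimator used by the NumPy model can be sketched in a few lines: rather than backpropagating, it perturbs all parameters at once along a random ±1 direction and forms a gradient estimate from only two loss evaluations. The quadratic loss below is a toy stand-in for the circuit's training objective, not the models' actual loss:

```python
import numpy as np

rng = np.random.default_rng(0)

def spsa_gradient(loss, params, c=0.1):
    """One SPSA gradient estimate: two loss evaluations at
    symmetric perturbations along a random ±1 direction."""
    delta = rng.choice([-1.0, 1.0], size=params.shape)
    diff = loss(params + c * delta) - loss(params - c * delta)
    return diff / (2 * c) / delta   # elementwise; delta entries are ±1

# Toy quadratic loss standing in for the circuit's training loss.
loss = lambda p: np.sum((p - 1.0) ** 2)
params = np.zeros(4)
for _ in range(200):
    params -= 0.05 * spsa_gradient(loss, params)
# params now approaches the minimiser [1, 1, 1, 1]
```

The estimate is noisy but unbiased in expectation, which is why SPSA pairs naturally with noisy quantum hardware, where exact gradients are unavailable.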
3.1. Datasets
where the three dots indicate a missing word. By introducing non-ambiguous subjects to the dataset, the sentence can be disambiguated [14]:

Person prepares …
Person prepares … and chef does too.
3.2. The Experiment
3.2.1. Tket Model
- Discussion
3.2.2. NumPy and Pennylane Models
- Discussion
4. Amplitude Encoding
4.1. Pre-Study
4.1.1. Implementation
and nine nouns in the new dataset, where the words man and woman are replaced with the words chef and programmer. For the case of the two-dimensional encodings (onto one qubit), we discuss the encoding learned by the model using the Bloch sphere representation of the learned embeddings. We present the encodings for the three models: Tket, NumPy, and pennylane.

person, man, woman, dinner, meal, sauce, program, application, software
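For the one-qubit case, a normalized two-dimensional embedding can be placed on the Bloch sphere by converting its amplitudes into the usual polar and azimuthal angles. The helper below is a hypothetical sketch of this conversion, not the paper's plotting code:

```python
import numpy as np

def bloch_angles(state):
    """Convert a normalized one-qubit state [a, b] into Bloch-sphere
    angles (theta, phi), with |psi> = cos(theta/2)|0> + e^{i phi} sin(theta/2)|1>."""
    a, b = state
    theta = 2 * np.arccos(np.clip(abs(a), 0.0, 1.0))
    phi = np.angle(b) - np.angle(a)   # relative phase; global phase is dropped
    return theta, phi

# |+> = (|0> + |1>)/sqrt(2) lies on the equator: theta = pi/2, phi = 0.
theta, phi = bloch_angles(np.array([1.0, 1.0]) / np.sqrt(2))
```

Since global phase is physically irrelevant, two embeddings that differ only by an overall sign map to the same point on the sphere.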
Discussion and Further Investigation
4.2. Encoding Data on One Qubit
4.2.1. Implementation
- Discussion
4.3. Encoding Data on Two Qubits
4.3.1. Principal Component Analysis and Independent Component Analysis
4.3.2. Implementation—PCA
- Discussion
4.3.3. Implementation—ICA
- Discussion
5. Entropy, Fidelity, and Ambiguity
5.1. Approach
5.2. Disambiguation
5.3. Amplitude Encoding
5.4. Implementation
5.4.1. Results—Original Dataset
- Discussion
5.4.2. Results—New Dataset
- Discussion
6. Conclusions and Further Work
Author Contributions
Funding
Institutional Review Board Statement
Data Availability Statement
Conflicts of Interest
Appendix A. Lambek Calculus and Proof Nets
Appendix B. Plots
Appendix C. Dataset
woman prepares tasty dinner | woman cooks tasty sauce |
skillful man prepares dinner | skillful woman cooks sauce |
man prepares sauce | person cooks tasty sauce |
man cooks sauce | woman bakes meal |
skillful man bakes sauce | person bakes meal |
skillful woman bakes dinner | skillful woman cooks dinner |
man cooks meal | woman bakes sauce |
woman prepares meal | skillful man prepares sauce |
skillful man bakes dinner | woman cooks tasty meal |
man prepares meal | woman prepares tasty meal |
woman prepares sauce | woman prepares dinner |
skillful person prepares meal | skillful person bakes dinner |
skillful woman bakes meal | man bakes tasty meal |
person prepares tasty meal | man bakes tasty dinner |
skillful man cooks dinner | person cooks dinner |
skillful woman prepares meal | skillful woman bakes sauce |
skillful man bakes meal | woman cooks meal |
woman bakes dinner | skillful man cooks meal |
man cooks dinner | woman cooks tasty dinner |
woman cooks dinner | man bakes tasty sauce |
man prepares dinner | skillful person cooks sauce |
person prepares tasty sauce | skillful person bakes sauce |
skillful man cooks sauce | woman bakes tasty meal |
person cooks meal | person bakes tasty sauce |
person bakes dinner | man cooks tasty meal |
skillful person cooks meal | person cooks sauce |
man cooks tasty sauce | skillful person bakes meal |
man prepares tasty meal | man prepares tasty sauce |
person bakes tasty meal | person prepares dinner |
man bakes sauce | person cooks tasty dinner |
woman bakes tasty sauce | skillful person prepares sauce |
person prepares tasty dinner | woman bakes tasty dinner |
woman cooks sauce | skillful woman prepares software |
woman runs useful program | skillful person runs software |
skillful person prepares program | man prepares program |
skillful person prepares software | man prepares useful software |
woman debugs program | skillful woman runs application |
man debugs software | skillful woman debugs application |
person debugs software | woman runs useful software |
person debugs program | skillful woman debugs software |
skillful woman debugs program | person runs program |
person runs useful application | woman runs useful application |
woman runs application | man prepares software |
person prepares useful program | man debugs useful application |
person debugs useful application | woman prepares program |
man prepares useful application | man debugs useful software |
man prepares application | person debugs useful software |
person runs application | woman runs program |
skillful man prepares program | woman runs software |
skillful man debugs software | skillful man prepares software |
person prepares software | person runs software |
man debugs program | man runs software |
person prepares useful application | woman debugs software |
skillful man runs software | woman debugs application |
woman debugs useful program | skillful woman runs program |
person runs useful program | skillful person prepares application |
man prepares useful program | man runs program |
woman prepares software | person prepares useful software |
skillful person debugs program | person debugs application |
skillful person debugs software | skillful woman runs software |
person debugs useful program | man runs application |
woman debugs useful software | man runs useful application |
person prepares program | woman debugs useful application |
skillful woman prepares application | man debugs application |
woman prepares useful application | man debugs useful program |
References
- Sipio, R.D.; Huang, J.H.; Chen, S.Y.C.; Mangini, S.; Worring, M. The Dawn of Quantum Natural Language Processing. arXiv 2021, arXiv:cs.CL/2110.06510. [Google Scholar]
- Coecke, B.; Sadrzadeh, M.; Clark, S. Mathematical Foundations for a Compositional Distributional Model of Meaning. arXiv 2010, arXiv:cs.CL/1003.4394. [Google Scholar]
- Lorenz, R.; Pearson, A.; Meichanetzidis, K.; Kartsaklis, D.; Coecke, B. QNLP in Practice: Running Compositional Models of Meaning on a Quantum Computer. J. Artif. Intell. Res. 2023, 76, 1305–1342. [Google Scholar] [CrossRef]
- Gauderis, W.; Wiggins, G. Quantum Theory in Knowledge Representation: A Novel Approach to Reasoning with a Quantum Model of Concepts. Master’s Thesis, Vrije Universiteit Brussel, Ixelles, Belgium, 2023. [Google Scholar]
- Coecke, B.; de Felice, G.; Meichanetzidis, K.; Toumi, A. Foundations for Near-Term Quantum Natural Language Processing. arXiv 2020, arXiv:quant-ph/2012.03755. [Google Scholar]
- Meyer, F.; Lewis, M. Modelling Lexical Ambiguity with Density Matrices. In Proceedings of the 24th Conference on Computational Natural Language Learning, Online, 19–20 November 2020; Fernández, R., Linzen, T., Eds.; Association for Computational Linguistics: Stroudsburg, PA, USA, 2020; pp. 276–290. [Google Scholar] [CrossRef]
- Bruhn, S. Density Matrix Methods in Quantum Natural Language Processing. Ph.D. Thesis, Universität Osnabrück, Osnabrück, Germany, 2022. [Google Scholar]
- Hoffmann, T. Quantum Models for Word-Sense Disambiguation. Master’s Thesis, Chalmers University of Technology, Gothenburg, Sweden, 2021. [Google Scholar]
- Coecke, B. The Mathematics of Text Structure. arXiv 2020, arXiv:cs.CL/1904.03478. [Google Scholar]
- Eisinger, J.; Gauderis, W.; de Huybrecht, L.; Wiggins, G.A. Quantum Methods for Managing Ambiguity in Natural Language Processing. arXiv 2025, arXiv:2504.00040. [Google Scholar] [CrossRef]
- Balkir, E.; Sadrzadeh, M.; Coecke, B. Distributional Sentence Entailment Using Density Matrices. In Topics in Theoretical Computer Science; Hajiaghayi, M.T., Mousavi, M.R., Eds.; Springer: Cham, Switzerland, 2016; pp. 1–22. [Google Scholar]
- Piedeleu, R.; Kartsaklis, D.; Coecke, B.; Sadrzadeh, M. Open System Categorical Quantum Semantics in Natural Language Processing. arXiv 2015, arXiv:cs.CL/1502.00831. [Google Scholar]
- Carette, T.; Jeandel, E.; Perdrix, S.; Vilmart, R. Completeness of Graphical Languages for Mixed States Quantum Mechanics. arXiv 2019, arXiv:quant-ph/1902.07143. [Google Scholar] [CrossRef]
- Wijnholds, G.J. A Compositional Vector Space Model of Ellipsis and Anaphora. Ph.D. Thesis, Queen Mary University of London, London, UK, 2020. [Google Scholar]
- Barney, S.; Lewis, W.; Beach, J.; Berghof, O. The Etymologies of Isidore of Seville; Cambridge University Press: Cambridge, UK, 2006. [Google Scholar]
- Coppock, E. Gapping: In Defense of Deletion. In Chicago Linguistics Society; University of Chicago: Chicago, IL, USA, 2001; Volume 37. [Google Scholar]
- Lambek, J. Type Grammar Revisited. In Logical Aspects of Computational Linguistics; Lecomte, A., Lamarche, F., Perrier, G., Eds.; Springer: Berlin/Heidelberg, Germany, 1999; pp. 1–27. [Google Scholar]
- Wazni, H.; Lo, K.I.; McPheat, L.; Sadrzadeh, M. A Quantum Natural Language Processing Approach to Pronoun Resolution. arXiv 2022, arXiv:cs.CL/2208.05393. [Google Scholar]
- Fock, V. Konfigurationsraum und zweite Quantelung. Z. Phys. 1932, 75, 622–647. [Google Scholar] [CrossRef]
- Harris, Z.S. Distributional Structure. WORD 1954, 10, 146–162. [Google Scholar] [CrossRef]
- Rieser, H.M.; Köster, F.; Raulf, A.P. Tensor networks for quantum machine learning. Proc. R. Soc. A Math. Phys. Eng. Sci. 2023, 479, 20230218. [Google Scholar] [CrossRef]
- Orús, R. A practical introduction to tensor networks: Matrix product states and projected entangled pair states. Ann. Phys. 2014, 349, 117–158. [Google Scholar] [CrossRef]
- Lambek, J. The Mathematics of Sentence Structure. Am. Math. Mon. 1958, 65, 154–170. [Google Scholar] [CrossRef]
- Coecke, B.; Grefenstette, E.; Sadrzadeh, M. Lambek vs. Lambek: Functorial vector space semantics and string diagrams for Lambek calculus. Ann. Pure Appl. Log. 2013, 164, 1079–1100. [Google Scholar] [CrossRef]
- Bradley, T.D.; Lewis, M.; Master, J.; Theilman, B. Translating and Evolving: Towards a Model of Language Change in DisCoCat. Electron. Proc. Theor. Comput. Sci. 2018, 283, 50–61. [Google Scholar] [CrossRef]
- Wang-Mascianica, V.; Liu, J.; Coecke, B. Distilling Text into Circuits. arXiv 2023, arXiv:cs.CL/2301.10595. [Google Scholar]
- Coecke, B.; Wang, V. Grammar Equations. arXiv 2021, arXiv:cs.CL/2106.07485. [Google Scholar]
- Miranda, E.R.; Yeung, R.; Pearson, A.; Meichanetzidis, K.; Coecke, B. A Quantum Natural Language Processing Approach to Musical Intelligence. arXiv 2021, arXiv:quant-ph/2111.06741. [Google Scholar]
- Frobenius, F.G. Theorie der hyperkomplexen Größen. In Sitzungsberichte der Preußischen Akademie der Wissenschaften zu Berlin; Reichsdruckerei: Berlin, Germany, 1903. Available online: https://www.e-rara.ch/zut/doi/10.3931/e-rara-18860 (accessed on 15 February 2025).
- Carboni, A.; Walters, R. Cartesian bicategories I. J. Pure Appl. Algebra 1987, 49, 11–32. [Google Scholar] [CrossRef]
- Coecke, B.; Duncan, R. Interacting quantum observables: Categorical algebra and diagrammatics. New J. Phys. 2011, 13, 043016. [Google Scholar] [CrossRef]
- Sadrzadeh, M.; Clark, S.; Coecke, B. The Frobenius anatomy of word meanings I: Subject and object relative pronouns. J. Log. Comput. 2013, 23, 1293–1317. [Google Scholar] [CrossRef]
- Nielsen, M.A.; Chuang, I.L. Quantum Computation and Quantum Information: 10th Anniversary Edition; Cambridge University Press: Cambridge, UK, 2010. [Google Scholar]
- Grover, L.K. A fast quantum mechanical algorithm for database search. arXiv 1996, arXiv:quant-ph/9605043. [Google Scholar]
- Shor, P.W. Polynomial-Time Algorithms for Prime Factorization and Discrete Logarithms on a Quantum Computer. SIAM J. Comput. 1997, 26, 1484–1509. [Google Scholar] [CrossRef]
- Coecke, B.; Kissinger, A. Picturing Quantum Processes: A First Course in Quantum Theory and Diagrammatic Reasoning; Cambridge University Press: Cambridge, UK, 2017. [Google Scholar]
- Penrose, R. Applications of negative dimensional tensors. Comb. Math. Its Appl. 1971, 1, 221–244. [Google Scholar]
- Havlíček, V.; Córcoles, A.D.; Temme, K.; Harrow, A.W.; Kandala, A.; Chow, J.M.; Gambetta, J.M. Supervised learning with quantum-enhanced feature spaces. Nature 2019, 567, 209–212. [Google Scholar] [CrossRef]
- Truger, F.; Barzen, J.; Leymann, F.; Obst, J. Warm-Starting the VQE with Approximate Complex Amplitude Encoding. arXiv 2024, arXiv:2402.17378. [Google Scholar]
- McClean, J.R.; Boixo, S.; Smelyanskiy, V.N.; Babbush, R.; Neven, H. Barren plateaus in quantum neural network training landscapes. Nat. Commun. 2018, 9, 4812. [Google Scholar] [CrossRef]
- Ravi, G.S.; Gokhale, P.; Ding, Y.; Kirby, W.M.; Smith, K.N.; Baker, J.M.; Love, P.J.; Hoffmann, H.; Brown, K.R.; Chong, F.T. CAFQA: A classical simulation bootstrap for variational quantum algorithms. arXiv 2023, arXiv:2202.12924. [Google Scholar]
- Gacon, J.; Zoufal, C.; Carleo, G.; Woerner, S. Simultaneous Perturbation Stochastic Approximation of the Quantum Fisher Information. Quantum 2021, 5, 567. [Google Scholar] [CrossRef]
- Meichanetzidis, K.; Toumi, A.; de Felice, G.; Coecke, B. Grammar-aware sentence classification on quantum computers. Quantum Mach. Intell. 2023, 5. [Google Scholar] [CrossRef]
- Meichanetzidis, K.; Gogioso, S.; de Felice, G.; Chiappori, N.; Toumi, A.; Coecke, B. Quantum Natural Language Processing on Near-Term Quantum Computers. Electron. Proc. Theor. Comput. Sci. 2021, 340, 213–229. [Google Scholar] [CrossRef]
- Lewis, M. Compositional Hyponymy with Positive Operators. In Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP 2019), Varna, Bulgaria, September 2019; pp. 638–647. [Google Scholar] [CrossRef]
- Kirkpatrick, K.A. The Schrödinger-HJW Theorem. arXiv 2005, arXiv:quant-ph/0305068. [Google Scholar] [CrossRef]
- Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423. [Google Scholar] [CrossRef]
- Von Neumann, J. Mathematische Grundlagen der Quantenmechanik; Springer: Berlin/Heidelberg, Germany, 1932. [Google Scholar]
- Lund, K.; Burgess, C. Producing high-dimensional semantic spaces from lexical co-occurrence. Behav. Res. Methods Instrum. Comput. 1996, 28, 203–208. [Google Scholar] [CrossRef]
- Sivarajah, S.; Dilkes, S.; Cowtan, A.; Simmons, W.; Edgington, A.; Duncan, R. t|ket⟩: A retargetable compiler for NISQ devices. Quantum Sci. Technol. 2020, 6, 014003. [Google Scholar] [CrossRef]
- Kartsaklis, D.; Fan, I.; Yeung, R.; Pearson, A.; Lorenz, R.; Toumi, A.; de Felice, G.; Meichanetzidis, K.; Clark, S.; Coecke, B. lambeq: An Efficient High-Level Python Library for Quantum NLP. arXiv 2021, arXiv:2110.04236. [Google Scholar]
- Abu-Mostafa, Y.; Magdon-Ismail, M.; Lin, H. Learning from Data: A Short Course. 2012. Available online: https://amlbook.com/ (accessed on 15 February 2025).
- Cohen, J. A Coefficient of Agreement for Nominal Scales. Educ. Psychol. Meas. 1960, 20, 37–46. [Google Scholar] [CrossRef]
- Sasaki, Y. The truth of the F-measure. Teach Tutor Mater 2007, 1, 1–5. [Google Scholar]
- Mitarai, K.; Negoro, M.; Kitagawa, M.; Fujii, K. Quantum circuit learning. Phys. Rev. A 2018, 98, 032309. [Google Scholar] [CrossRef]
- Schuld, M.; Sweke, R.; Meyer, J.J. Effect of data encoding on the expressive power of variational quantum-machine-learning models. Phys. Rev. A 2021, 103, 032430. [Google Scholar] [CrossRef]
- Schuld, M.; Killoran, N. Quantum Machine Learning in Feature Hilbert Spaces. Phys. Rev. Lett. 2019, 122, 040504. [Google Scholar] [CrossRef] [PubMed]
- Kerenidis, I.; Prakash, A. Quantum Recommendation Systems. In Proceedings of the 8th Innovations in Theoretical Computer Science Conference (ITCS 2017), Berkeley, CA, USA, 9–11 January 2017; Leibniz International Proceedings in Informatics (LIPIcs). 2017; Volume 67, pp. 49:1–49:21. [Google Scholar] [CrossRef]
- Mikolov, T.; Sutskever, I.; Chen, K.; Corrado, G.S.; Dean, J. Distributed Representations of Words and Phrases and their Compositionality. In Advances in Neural Information Processing Systems; Burges, C., Bottou, L., Welling, M., Ghahramani, Z., Weinberger, K., Eds.; Curran Associates, Inc.: Red Hook, NY, USA, 2013; Volume 26. [Google Scholar]
- Mikolov, T.; Chen, K.; Corrado, G.; Dean, J. Efficient Estimation of Word Representations in Vector Space. arXiv 2013, arXiv:1301.3781. [Google Scholar]
- Yamada, I.; Asai, A.; Sakuma, J.; Shindo, H.; Takeda, H.; Takefuji, Y.; Matsumoto, Y. Wikipedia2Vec: An Efficient Toolkit for Learning and Visualizing the Embeddings of Words and Entities from Wikipedia. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: System Demonstrations, Online, 16–20 November 2020; Association for Computational Linguistics: Stroudsburg, PA, USA, 2020; pp. 23–30. [Google Scholar]
- Greenacre, M.; Groenen, P.J.F.; Hastie, T.; D’Enza, A.I.; Markos, A.; Tuzhilina, E. Principal component analysis. Nat. Rev. Methods Prim. 2022, 2, 100. [Google Scholar] [CrossRef]
- Sammut, C.; Webb, G.I. Encyclopedia of Machine Learning, 1st ed.; Springer Publishing Company, Incorporated: New York, NY, USA, 2011. [Google Scholar]
- Comon, P. Independent component analysis, A new concept? Signal Process. 1994, 36, 287–314. [Google Scholar] [CrossRef]
- Lambek, J. On the calculus of syntactic types. Proc. Symp. Appl. Math. 1961, 12, 166–178. [Google Scholar] [CrossRef]
- Buszkowski, W. Lambek Grammars Based on Pregroups. In Proceedings of the Logical Aspects of Computational Linguistics, Le Croisic, France, 27–29 June 2001; de Groote, P., Morrill, G., Retoré, C., Eds.; Springer: Berlin/Heidelberg, Germany, 2001; pp. 95–109. [Google Scholar]
- Wijnholds, G.J. Coherent Diagrammatic Reasoning in Compositional Distributional Semantics. In Proceedings of the Logic, Language, Information, and Computation, London, UK, 18–21 July 2017; Kennedy, J., de Queiroz, R.J., Eds.; Springer: Berlin/Heidelberg, Germany, 2017; pp. 371–386. [Google Scholar]
- Moot, R. Partial Orders, Residuation, and First-Order Linear Logic. arXiv 2020, arXiv:2008.06351. [Google Scholar]
| Model | Qubits | Dataset | Accuracy | F₁-Score | |
|---|---|---|---|---|---|
| pennylane | 1 | original | 0.97 | 1.00 | 1.00 |
| pennylane | 1 | new | 0.97 | 1.00 | 1.00 |
| NumPy | 1 | original | 0.90 | 0.87 | 0.94 |
| NumPy | 1 | new | 0.90 | 0.87 | 0.94 |
| pennylane | 2 | original | 1.00 | 1.00 | 1.00 |
| pennylane | 2 | new | 1.00 | 1.00 | 1.00 |
| NumPy | 2 | original | 0.82 | 0.63 | 0.81 |
| NumPy | 2 | new | 0.80 | 0.60 | 0.79 |
| Original Dataset | Average Entropy | Average Fidelity |
|---|---|---|
| Sentence type (1): subject prepares object | | |
| Forget subject | 0.825 | 0.507 |
| Forget object | 0.592 | 0.475 |
| Forget subject, amplitude-encoded | 0.210 | 0.516 |
| Forget object, amplitude-encoded | 0.960 | 0.493 |
| Sentence type (2): subject prepares object and verb* it | | |
| Forget subject | 0.302 | 0.706 |
| Forget object | 0.144 | 0.707 |
| Forget subject, amplitude-encoded | 0.0198 | 0.848 |
| Forget object, amplitude-encoded | 0.222 | 0.836 |
| New Dataset | Average Entropy | Average Fidelity |
|---|---|---|
| Sentence type (1): subject prepares object | | |
| Forget subject | 0.652 | 0.488 |
| Forget object | 0.287 | 0.428 |
| Forget subject, amplitude-encoded | 0.0686 | 0.594 |
| Forget object, amplitude-encoded | 0.936 | 0.492 |
| Sentence type (2): subject prepares object and verb* it | | |
| Forget subject | 0.0110 | 0.990 |
| Forget object | 0.0392 | 0.971 |
| Forget subject, amplitude-encoded | 0.0812 | 0.628 |
| Forget object, amplitude-encoded | 0.267 | 0.694 |
| New Dataset | Average Entropy | Average Fidelity |
|---|---|---|
| Sentence type (3): subject prepares object, subject* does too | | |
| Forget subject | 0.443 | 0.730 |
| Forget object | 0.217 | 0.486 |
| Forget subject, amplitude-encoded | 2.26 × 10⁻¹⁵ | 0.579 |
| Forget object, amplitude-encoded | 0.648 | 0.492 |
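The "Forget subject/object" rows above correspond to tracing out one subsystem of the sentence state and computing the von Neumann entropy of what remains. Both operations can be sketched in a few lines; the Bell-state example is illustrative only, not a sentence state from the paper:

```python
import numpy as np

def partial_trace_first(rho):
    """Trace out the first qubit of a two-qubit density matrix
    ('forgetting' that subsystem, e.g. the subject of a sentence)."""
    rho = rho.reshape(2, 2, 2, 2)          # axes: (q1, q2, q1', q2')
    return np.einsum("ijik->jk", rho)      # sum over the first subsystem

def von_neumann_entropy(rho):
    """S(rho) = -sum_i lambda_i log2 lambda_i over the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]           # drop numerical zeros
    return float(-np.sum(evals * np.log2(evals)))

# A Bell state is maximally entangled: forgetting one qubit leaves
# the maximally mixed state, whose entropy is exactly 1 bit.
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
rho = np.outer(bell, bell.conj())
reduced = partial_trace_first(rho)
S = von_neumann_entropy(reduced)  # = 1.0
```

On this reading, a near-zero entry in the tables (such as 2.26 × 10⁻¹⁵ above) indicates that the discarded subsystem was essentially unentangled with the rest of the sentence state.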
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Eisinger, J.; Gauderis, W.; Huybrecht, L.d.; Wiggins, G.A. Classical Data in Quantum Machine Learning Algorithms: Amplitude Encoding and the Relation Between Entropy and Linguistic Ambiguity. Entropy 2025, 27, 433. https://doi.org/10.3390/e27040433