TDLearning: Trusted Distributed Collaborative Learning Based on Blockchain Smart Contracts
Abstract
1. Introduction
- We use smart contracts to define and encapsulate collaborative behaviours, relationships and specifications between distributed-collaboration nodes.
- Building on the smart-contract collaboration architecture and on transfer learning, we propose a feature-fusion-based model-fusion approach that replaces the direct sharing of local data resources with collaborative training of distributed models.
2. Related Work
2.1. Replacing the Central Server in Distributed Learning
2.2. Recording Distributed-Collaboration Processes
2.3. Distributed-Collaborative-Learning Mechanism Based on Blockchain Smart Contracts
3. Method
3.1. Architecture of TDLearning
- Tp uses its local data to develop pretrained models.
- Mv verifies all pretrained models by using the standard validation set provided by To.
- To receives the validated pretrained models and completes model fusion with its local data to develop the global model.
- The role-authority-management module addresses the trust issue for collaborative-learning nodes and maintains a list of trusted collaborative nodes through the AMC. In this mechanism, the prerequisite for participating in collaboration is identity registration and authentication, and role-based authority management assigns authority to collaborative nodes by role. Binding identity information to authority encourages collaborative nodes to act honestly, and the traceability and tamper-evident nature of the blockchain strongly deters malicious behaviour.
- Trusted-database construction addresses the data-availability and collaborative-data-quality issues. The DRC maintains an index of registered data, which enables on-chain retrieval of data. To achieve keyword retrieval of blockchain data, the traditional solution is to synchronise data off-chain for retrieval; however, synchronisation cycles and the fact that the security of off-chain data cannot be guaranteed inevitably reduce data availability, hence the need for secure on-chain retrieval in this mechanism. The DEC introduces a fuzzy comprehensive evaluation method that uses the information in the data and the collaborative behaviour of the data nodes as indicators to evaluate data quality. The DMC completes data registration and maintains a list of trusted data to build a trusted database by invoking the DRC and DEC.
- The collaborative-task-management and collaboration-incentive-allocation modules address the credibility and fairness of collaboration. The TMC controls the process of task execution by differentiating the status of collaborative tasks and maintaining task lists. At the same time, to ensure the quality of pretrained models and the fairness of collaboration, we introduce a verification mechanism for the validity of pretrained models, in which the smart contract determines whether multiple model verifiers have reached a consensus on model verification. Model-verification consensus ensures verification fidelity and pretrained-model validity and provides incentive-weighting metrics for collaborative-incentive allocation. Once consensus is reached, the TIC allocates incentives according to predetermined rules and the model-verification results. In summary, the collaboration mechanism uses smart contracts to implement interaction control for distributed-collaborative-learning tasks and thereby guarantee collaborative trustworthiness.
3.2. Task Flow of TDLearning
- Su deploys the relevant smart contracts, designates a set of Mv and initialises their credibility ratings.
- Each collaborating node registers its identity information through the AMC, and Su assigns each node the corresponding role authority based on the requested identity information.
- Tp registers the local data summary via the DMC. The DMC invokes the DRC and DEC to index the newly registered data and initialise its quality evaluation.
- To registers and initialises collaborative tasks via the TMC, which specifies the task overview, pretrained model structure, standard test sets and Mv credibility threshold, and pledges task incentives to the TMC. To uses the smart contract to perform secure on-chain searches and queries the quality ratings of relevant datasets to find high-value trusted data.
- To specifies the data nodes involved in the task, i.e., the task participants, while the supervisor specifies the model verifiers for the task based on the Mv credibility threshold requirement. After the above steps are completed, if the TMC determines that sufficient collaboration incentives have been deposited, the collaboration task is formally published.
- Each Tp develops a pretrained model locally by using its private data and the specified model structure. Once the pretrained model is developed, Tp sends it to the specified Mv via on-chain encrypted transmission.
- Each Mv completes pretrained-model verification locally with the standard test set specified by To and uploads the encrypted verification results to the TMC.
- The TMC determines whether a consensus on pretrained-model verification has been reached based on the verification results reported by the Mv. When consensus is reached, the pretrained model is sent to To via on-chain encryption. At the same time, the TMC creates the TIC, the TIC completes the collaboration-incentive allocation, and both the Tp and the Mv involved in the collaboration are automatically incentivised by the smart contract (a sketch of one possible consensus-and-allocation rule follows this list).
- To invokes the DEC to update the collaborative data-quality evaluation. At the same time, To invokes the AMC to update the credibility evaluation of the Mv. If a collaborative node suspects an Mv of malicious behaviour, it can request a trust-arbitration vote on that Mv, and the AMC updates the Mv credibility evaluation based on the arbitration result.
- To completes pretrained-model fusion to develop the global model, and the collaborative task ends.
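The consensus check and incentive split referenced above are defined on-chain by the TMC and TIC (Algorithms 4 and 5 below). The following is a minimal off-chain Python sketch of one plausible instantiation; the median-agreement rule, the tolerance and the 80/20 Tp/Mv split are illustrative assumptions, not the contracts' actual logic.

```python
# Minimal Python sketch of a model-verification consensus check and incentive split.
# The median-agreement rule and the 80/20 Tp/Mv split are illustrative assumptions.
from statistics import median

def verification_consensus(reports: dict[str, float], tolerance: float = 2.0) -> bool:
    """reports maps each Mv address to the accuracy it measured for a pretrained model."""
    mid = median(reports.values())
    agreeing = sum(1 for acc in reports.values() if abs(acc - mid) <= tolerance)
    return 2 * agreeing > len(reports)                 # a strict majority must agree

def allocate_incentives(pledge: float, verified_acc: dict[str, float],
                        verifiers: list[str], verifier_share: float = 0.2) -> dict[str, float]:
    """Reward each Tp in proportion to its verified accuracy; split the rest evenly among Mv."""
    payouts = {mv: pledge * verifier_share / len(verifiers) for mv in verifiers}
    total_acc = sum(verified_acc.values())
    for tp, acc in verified_acc.items():
        payouts[tp] = pledge * (1 - verifier_share) * acc / total_acc
    return payouts

# Example: three verifiers agree on one participant's verified accuracy
reports = {"mv1": 91.8, "mv2": 92.1, "mv3": 92.4}
if verification_consensus(reports):
    print(allocate_incentives(100.0, {"tp1": 92.1}, list(reports)))
```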
3.3. Smart Contract Design
3.3.1. Role-Authority Management
Algorithm 1. Authority Management (AMC)
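The AMC listing itself is not reproduced here. The following is a minimal off-chain Python sketch of the role-authority logic it is described as implementing — identity registration, supervisor-only role assignment and a trusted-node list; the role names, field names and deauthorisation behaviour are illustrative assumptions rather than the contract's exact interface.

```python
# Minimal off-chain Python sketch of the role-authority logic handled by the AMC.
# Roles, field names and the deauthorisation rule are illustrative assumptions.
from dataclasses import dataclass, field

ROLES = {"supervisor", "task_owner", "task_participant", "model_verifier"}

@dataclass
class Node:
    address: str
    name: str
    role: str = "unassigned"
    legal: bool = False          # becomes True after the supervisor authorises the node
    credibility: int = 0

@dataclass
class AMC:
    supervisor: str
    nodes: dict[str, Node] = field(default_factory=dict)
    trusted: set[str] = field(default_factory=set)     # list of trusted collaborative nodes

    def register(self, address: str, name: str) -> None:
        self.nodes[address] = Node(address, name)

    def authorise(self, caller: str, address: str, role: str, credibility: int) -> None:
        assert caller == self.supervisor and role in ROLES, "only Su assigns role authority"
        node = self.nodes[address]
        node.role, node.legal, node.credibility = role, True, credibility
        self.trusted.add(address)

    def deauthorise(self, caller: str, address: str) -> None:
        assert caller == self.supervisor
        self.trusted.discard(address)
        self.nodes[address].legal = False
        self.nodes[address].credibility = 0
```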
3.3.2. Trusted Database Construction
Algorithm 2. Data registration and retrieval (DMC and DRC)
- Data-summary preprocessing. When the blockchain listens for a data-registration event, the DRC calls propressString() and stem() to traverse Dsummary and complete the preprocessing, which includes special-character substitution, word segmentation and stemming; from these steps, the DRC builds the dictionary.
- Building the inverted index. The smart contract iterates through the dictionary and updates the inverted list PostingList and the inverted-index table invertedIndex whenever the dictionary entry lexicon[i] is non-empty and is not a stop word excluded from retrieval. The inverted list stores a list of data indexes together with word frequencies. The inverted-index table is a set of key–value pairs in which the key is a word and the value is that word's inverted list. Through these steps, the smart contract establishes a keyword dictionary and an inverted-index entry for each word. In addition, because blockchain data are stored permanently, the data-retrieval contract maintains a stop-word list: when data related to a keyword should no longer be searchable, the keyword is added to this list (a minimal sketch of this index construction follows the list).
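As referenced above, a minimal Python sketch of the index construction: the preprocessing and the crude suffix-stripping stemmer stand in for propressString() and stem(), and the posting-list layout is an assumption — the authoritative logic is the DRC itself.

```python
# Minimal off-chain Python sketch of the inverted-index construction described above.
# Preprocessing and the stemmer are simplified stand-ins for propressString()/stem().
import re
from collections import defaultdict

STOP_WORDS: set[str] = set()      # add a keyword here to make related data unsearchable

def preprocess(summary: str) -> list[str]:
    words = re.sub(r"[^a-z0-9 ]", " ", summary.lower()).split()   # substitute special chars, segment
    return [w.rstrip("s") for w in words]                          # crude stand-in for stemming

def build_inverted_index(summaries: dict[str, str]) -> dict[str, dict[str, int]]:
    """Map each word to a posting list: {data index: word frequency}."""
    inverted_index: dict[str, dict[str, int]] = defaultdict(dict)
    for data_index, summary in summaries.items():
        for word in preprocess(summary):
            if word in STOP_WORDS:
                continue
            posting = inverted_index[word]
            posting[data_index] = posting.get(data_index, 0) + 1
    return inverted_index

def search(index: dict[str, dict[str, int]], keyword: str) -> list[str]:
    """Return data indexes containing the keyword, most frequent first."""
    terms = preprocess(keyword)
    posting = index.get(terms[0], {}) if terms else {}
    return sorted(posting, key=posting.get, reverse=True)
```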
3.3.3. Data-Quality Evaluation
Algorithm 3. Data evaluation (DEC)
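The DEC applies a fuzzy comprehensive evaluation over the static and dynamic factors listed later in the factor table. The sketch below shows the generic method — a weight vector over the indicators, a membership matrix over evaluation grades and the synthesis B = W · R; the weights, membership functions and grade set are illustrative assumptions, not the DEC's actual parameters.

```python
# Minimal Python sketch of a fuzzy comprehensive evaluation in the spirit of the DEC.
# The indicator set follows the paper's static/dynamic factors; weights, membership
# functions and the grade set are illustrative assumptions.
FACTORS = ["data_size", "label_completeness", "computing_power",
           "tasks_completed", "completion_time", "node_credits"]
WEIGHTS = [0.25, 0.20, 0.10, 0.20, 0.10, 0.15]          # assumed indicator weights, sum to 1
GRADES  = ["poor", "fair", "good", "excellent"]          # assumed evaluation grade set

def membership(score: float) -> list[float]:
    """Map a normalised indicator score in [0, 1] to a membership vector over GRADES."""
    centres = [0.125, 0.375, 0.625, 0.875]
    dists = [max(0.0, 1.0 - abs(score - c) / 0.25) for c in centres]   # triangular memberships
    total = sum(dists) or 1.0
    return [d / total for d in dists]

def fuzzy_evaluate(scores: dict[str, float]) -> str:
    """B = W . R ; return the grade with the largest combined membership."""
    R = [membership(scores[f]) for f in FACTORS]                       # membership matrix |F| x |G|
    B = [sum(w * R[i][j] for i, w in enumerate(WEIGHTS)) for j in range(len(GRADES))]
    return GRADES[max(range(len(GRADES)), key=B.__getitem__)]

# Example: a medium-sized, well-labelled dataset from a reliable node
print(fuzzy_evaluate({"data_size": 0.6, "label_completeness": 0.9, "computing_power": 0.5,
                      "tasks_completed": 0.7, "completion_time": 0.4, "node_credits": 0.8}))
```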
3.3.4. Collaborative Task Management and Collaboration-Incentive Allocation
Algorithm 4. Collaborative task management (TMC)
Algorithm 5. Collaboration-incentive allocation (TIC)
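To make the task life cycle concrete, the following Python sketch mirrors the status-driven control that the TMC is described as performing — registration, incentive pledging, publication only once sufficient incentives are deposited, and a consensus flag that triggers TIC creation. The status names and the publication condition are illustrative assumptions, not the contract's exact states.

```python
# Minimal off-chain Python sketch of the task life cycle managed by the TMC.
# Status names and the publication condition are illustrative assumptions.
from enum import Enum, auto

class TaskStatus(Enum):
    REGISTERED = auto()     # To has registered the task and its requirements
    PUBLISHED = auto()      # sufficient incentives pledged; Tp and Mv assigned
    VERIFYING = auto()      # pretrained models under verification
    CONSENSUS = auto()      # model-verification consensus reached; TIC created
    CLOSED = auto()         # incentives allocated, global model developed

class TMC:
    def __init__(self, owner: str, required_pledge: float):
        self.owner, self.required_pledge = owner, required_pledge
        self.pledged = 0.0
        self.status = TaskStatus.REGISTERED
        self.participants: list[str] = []
        self.verifiers: list[str] = []

    def pledge(self, caller: str, amount: float) -> None:
        assert caller == self.owner and self.status is TaskStatus.REGISTERED
        self.pledged += amount

    def publish(self, participants: list[str], verifiers: list[str]) -> None:
        assert self.pledged >= self.required_pledge, "insufficient collaboration incentives"
        self.participants, self.verifiers = participants, verifiers
        self.status = TaskStatus.PUBLISHED

    def start_verification(self) -> None:
        self.status = TaskStatus.VERIFYING

    def record_consensus(self, reached: bool) -> None:
        if reached:
            self.status = TaskStatus.CONSENSUS   # in the paper, the TMC now creates the TIC
```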
3.4. Model-Fusion Method Based on Feature Fusion
- Submodel pretraining: each data node i (i ∈ [1, n]) develops pretrained submodel i by using its local data.
- Model fusion and fine-tuning: the task-publishing node uses its local data to perform feature extraction in parallel with the n pretrained submodels and then uses an attention mechanism to fuse the extracted submodel features into global features, which are finally fed to the classifier to obtain the global model (a sketch of this fusion follows).
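As noted above, a PyTorch sketch of the fusion step: n frozen pretrained submodels extract features in parallel, a small attention network weights and sums them into a global feature, and a classifier head is fine-tuned on the task owner's local data. The particular attention form (a two-layer scoring MLP with softmax over submodels) is an assumption; each submodel is assumed to expose its penultimate-layer features (e.g., the 128-unit fully connected layer of CNN-MNIST) rather than its softmax output.

```python
# A PyTorch sketch of attention-based feature fusion over n frozen pretrained submodels.
# The scoring-MLP attention variant and the frozen-extractor setup are assumptions.
import torch
import torch.nn as nn

class AttentionFusionModel(nn.Module):
    def __init__(self, submodels: list[nn.Module], feat_dim: int, num_classes: int):
        super().__init__()
        self.submodels = nn.ModuleList(submodels)
        for p in self.submodels.parameters():        # freeze pretrained feature extractors
            p.requires_grad_(False)
        self.score = nn.Sequential(nn.Linear(feat_dim, 64), nn.Tanh(), nn.Linear(64, 1))
        self.classifier = nn.Linear(feat_dim, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feats = torch.stack([m(x) for m in self.submodels], dim=1)   # (B, n, feat_dim)
        attn = torch.softmax(self.score(feats).squeeze(-1), dim=1)   # (B, n) attention weights
        fused = (attn.unsqueeze(-1) * feats).sum(dim=1)              # (B, feat_dim) global feature
        return self.classifier(fused)
```

During fine-tuning, only the scoring network and the classifier receive gradients, which matches the idea of fusing fixed submodel features rather than retraining the submodels themselves.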
4. Formal Security Analysis of Smart Contracts
4.1. Preliminary
4.1.1. Formal Concepts for CPN
4.1.2. CPN-Based Formal Modelling and Verification
4.2. Attribute Modelling and the Top-Layer Model Design
4.2.1. Modelling Smart Contract Attributes
4.2.2. Top-Layer Model Design
4.3. Design and Analysis of Authority Layer and Validation Layer
4.3.1. Formal Modelling
4.3.2. Formal Verification
- Step 1: [Init_V>. The Init_V transition fires and the contract status is updated. At this point, E(VInfo, Init_2)<add, r, s, i, c> = 1’(“0x3001”, 0, 0, “v1”, 80); that is, the arc pointing to the Init_V transition binds the authority information of verifier v1, which is to be authorised. After the Init_V transition fires, the status of v1 is updated to legal.
- Step 2: [V_Auth>. The V_Auth transition fires and the contract status is updated. At this point, E(V_Nodes, V_Auth)<add, r, s, i, c> = 1’(“0x3001”, 0, 1, “v1”, 80), E(v_times, V_Auth)<times> = 1 and E(admin, V_Auth)<upack> = 1’(“0x1001”, 1, 1, “admin”). After the V_Auth transition fires, the contract assigns the model-verifier role authority to v1.
- Step 3: [V_As>. The V_As transition fires and the contract status is updated. At this point, E(co_verifiers, V_As)<add, r, s, i, c> = 1’(“0x3001”, 3, 1, “v1”, 80). After the V_As transition fires, the contract adds v1 to the list of legitimate model verifiers, V_List.
- Step 4: [Validation_V>. The Validation_V transition fires and the contract status is updated. At this point, E(Verifiers, Validation_V)<vmap> = 1’(“0x3001”, 80) and E(V_nodes, Validation_V)<v_addlist> = 1’[(“0x3001”, 3, 1, “v1”, 80)]. These bindings are used to determine whether v1 holds legal role authority.
- Step 5: [Validation_U>. The Validation_U transition fires and the contract status is updated. At this point, E(verifier, Validation_U)<vmap> = 1’(“0x3001”, 80) and E(V_nodes, Validation_U)<require_2> = “access”; that is, the two arcs pointing to the Validation_U transition bind, respectively, the verifier v1 applying for an on-chain operation and the corresponding application request. After the Validation_U transition fires, the contract checks whether the credibility value of v1 exceeds the required threshold; here the threshold is met, so v1 may perform the on-chain operation it requested.
- Step 6: [Cancel_V>. The Cancel_V transition fires and the contract status is updated. At this point, E(v_node, Cancel_V)<vpack> = 1’(“0x3001”, 3, 1, “v1”, 80); that is, the arc pointing to the Cancel_V transition binds the authority information of the model verifier suspected of malicious behaviour.
- Step 7: [Deau_V>. The Deau_V transition fires and the contract status is updated. At this point, E(V_Malicious, Deau_V)<m> = m2 and E(cancel_v, Deau_V)<vpack> = 1’(“0x3001”, 3, 1, “v1”, 80); that is, the contract confirms the presence of malicious behaviour by verifier v1. After the Deau_V transition fires, the contract removes v1 from the list of legitimate verifiers and initialises its authority information to (“0x3001”, 0, 0, “v1”, 0).
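The property exercised by this firing sequence — only a registered, authorised verifier whose credibility meets the threshold may perform the requested on-chain operation, and a verifier confirmed as malicious loses all authority — can be replayed informally in Python as below. This is only an illustration of the verified behaviour; the actual analysis is performed on the CPN model in CPN Tools, and the threshold value of 60 is an assumption (the text only states that v1's credibility of 80 meets it).

```python
# Informal Python replay of the transition sequence verified above; field order follows
# the token (address, role, status, id, credibility) used in the bindings.
verifier = {"addr": "0x3001", "role": 0, "status": 0, "id": "v1", "cred": 80}
v_list: list[dict] = []

def v_auth(v):            # Steps 1-2: Init_V / V_Auth — supervisor grants Mv role authority
    v["status"] = 1
    v["role"] = 3

def v_as(v):              # Step 3: V_As — add v1 to the legitimate-verifier list
    v_list.append(v)

def validation(v, threshold=60, request="access"):   # Steps 4-5: role and credibility checks
    return v in v_list and v["status"] == 1 and v["cred"] >= threshold and bool(request)

def deauthorise(v):       # Steps 6-7: Cancel_V / Deau_V — confirmed malicious behaviour
    v_list.remove(v)
    v.update(role=0, status=0, cred=0)

v_auth(verifier); v_as(verifier)
assert validation(verifier)           # v1 may perform the requested on-chain operation
deauthorise(verifier)
assert not validation(verifier)       # after deauthorisation v1 loses all authority
```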
4.4. Design and Analysis of Consensus Layer
4.4.1. Formal Modelling
4.4.2. Formal Verification
4.5. Design and Analysis of Arbitration Layer
4.5.1. Formal Modelling
4.5.2. Formal Verification
5. Experiment and Case Study
5.1. Platform
5.2. Model-Fusion Validation
5.2.1. Experimental Scenario Setup
5.2.2. Experimental Conclusions and Analysis
5.3. Case Study
5.3.1. Task-Information-Initialisation Phase
5.3.2. Task-Execution Phase
5.3.3. Trustworthiness and Fairness
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
Role | Collaborative Behaviours |
---|---|
Supervisor—Su | (1) Smart contract deployment and collaborative node authority management. (2) Collaboration task access audit and collaboration node on-chain behaviour monitoring. (3) Initiation of smart-contract-function execution, such as data-quality assessment, contract incentive, etc. |
Task Owner—To | (1) To issues global-model-development tasks, specifying pretrained model structures as well as standard test sets. (2) To registers on-chain tasks via smart contracts and completes the incentive funding pledge. (3) After receiving all the validated pretrained models, To completes global model fusion and fine tuning locally. |
Task Participant—Tp | (1) Tp registers local private data by using a data-management contract, which participates in the collaborative-learning task as a data node. (2) Tp uses the specified model structure to develop pretrained models locally by using the private data. (3) After the pretrained model is verified, Tp receives automatic motivation from the smart contract. |
Model Verifier—Mv | (1) Mv completes pretrained model verification based on the standard test set given by the task owner and uploads the validation results via the smart contract. (2) After the smart contract reaches a consensus on the pretrained model validation, Mv receives an automatic incentive from the smart contract. |
| Notation | Description |
|---|---|
|  | Collaborative node Ethereum address |
|  | Node identity and authority information |
| / | Node role/status |
| / | Node name/summary |
| / | Data hash/availability status |
| / | Data name/summary |
|  | Collaborative task name |
| / | Task summary/incentive benchmark |
|  | Cryptographic numbers for model verification |
|  | The number of specified by To |
|  | Credibility for |
|  | Credibility threshold for |
|  | Model-verification thresholds |
Initial Static Factors | Dynamic Behaviour Factors |
---|---|
Data size: | Number of tasks completed: |
Sample label completeness: | Completion time for this task: |
Node computing power: | Node Credits: |
Step | Simulations |
---|---|
1 | [Start Verification>, smart contract initialises consensus flag. |
2 | [Send Model>, Tp sends pretrained models to all model verifiers. |
3 | [Send Model>, smart contracts specify the cryptographic number for model verifier v1. |
4 | [Send Model>, smart contracts specify the cryptographic number for model verifier v2. |
5 | [Send Model>, smart contracts specify the cryptographic number for model verifier v3. |
6 | [Verify>, smart contract verifies the encryption verification result of v1. |
7 | [Verify>, smart contract verifies the encryption verification result of v2. |
8 | [Verify>, smart contract verifies the encryption verification result of v3. |
9 | [Consensus>, smart contract determines that a model-verification consensus has been reached, modifies the consensus flag bit and creates TIC. |
Step | Simulations |
---|---|
1 | , smart contract creates trust arbitration. |
2 | , task participants participate in the vote of confidence in arbitration. |
3 | , task participants confirm suspicious behaviour of the model verifier. |
4 | , task participant 1 selects trust model verifier. |
5 | , task participant 2 selects trust model verifier. |
6 | , task participant 3 selects trust model verifier. |
7 | , smart contract adds model verifier with updated credibility value to list. |
8 | , smart contracts remove old model verifier information from the list. |
Layer/Hyperparameter | CNN-MNIST | CNN-CIFAR10 |
---|---|---|
Convolution layer 1 | 16 5 × 5 kernels | 64 5 × 5 kernels |
Pooling layer 1 | 2 × 2 max pooling | 2 × 2 max pooling |
Convolution layer 2 | 32 5 × 5 kernels | 64 5 × 5 kernels |
Pooling layer 2 | 2 × 2 max pooling | 2 × 2 max pooling |
Fully connected layer | 128 units (with ReLU activation) | 512 units (with ReLU activation) |
Output layer | Softmax | Softmax |
Optimiser | Adam | Adam |
Learning rate | 0.001 | 0.001 |
Batch size | 64 | 64 |
Max local-training epochs | 20 | 20 |
Max fine-tuning epochs | 20 | 20 |
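For concreteness, a PyTorch sketch of the CNN-MNIST column of the table above. The table does not specify padding, stride or activations after the convolution layers, so 5 × 5 convolutions with padding 2 ("same"), ReLU after each convolution and a resulting 32 × 7 × 7 feature map are assumed; softmax is folded into the cross-entropy loss as usual.

```python
# A PyTorch sketch of the CNN-MNIST column above; padding and post-convolution ReLUs
# are assumptions, since the table only lists kernel counts, pooling and the FC layer.
import torch.nn as nn

class CNNMnist(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5, padding=2), nn.ReLU(),   # convolution layer 1
            nn.MaxPool2d(2),                                          # pooling layer 1
            nn.Conv2d(16, 32, kernel_size=5, padding=2), nn.ReLU(),  # convolution layer 2
            nn.MaxPool2d(2),                                          # pooling layer 2
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 7 * 7, 128), nn.ReLU(),                    # fully connected layer
            nn.Linear(128, num_classes),                              # softmax applied in the loss
        )

    def forward(self, x):
        return self.classifier(self.features(x))
```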
Collaborative Nodes | To | Tp 1 | Tp 2 | Tp 3 | Tp 4 |
---|---|---|---|---|---|
Amount | 1000/10,000 | 600/3000 | 600/3000 | 600/3000 | 600/3000 |
Label | | | | | |
Collaborative Nodes | To | Tp 1 | Tp 2 | Tp 3 | Tp 4 |
---|---|---|---|---|---|
Amount | 1000/10,000 | 600/3000 | 600/3000 | 600/3000 | 600/3000 |
Label | | | | | |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).