High-Precision Prediction of Total Nitrogen Based on Distance Correlation and Machine Learning Models—A Case Study of Dongjiang River, China
Abstract
1. Introduction
2. Materials and Methods
2.1. Research Areas and Data Collection
2.2. Data Preprocessing
2.3. Feature Selection
- Input Data Configuration: The time step, input-layer dimension, and output-layer dimension are set according to the input data. In this study, data points are recorded at 4-h intervals, and the goal is to predict the total nitrogen concentration for the next 4 h from the past 24 h of water quality data. The time step is therefore set to 6 (24 h ÷ 4 h), the output feature dimension is 1, and the input feature dimension equals the number of input variables;
- LSTM Neuron Configuration: The number of neurons in each LSTM layer was tuned through extensive experimentation: the first LSTM hidden layer contains 64 neurons and the second contains 16;
- Dropout Layer Addition: To mitigate overfitting, a Dropout layer is added after the LSTM layer, with a dropout rate of 0.1;
- Dense Layer Implementation: The output Dense layer is configured with 1 unit, and the activation function is set to ReLU;
- Optimizer Selection: The Adam optimizer is used with a learning rate of 0.001;
- Training Configuration: The number of epochs is set to 256, and the batch size is also 256. A suitable batch size yields stable gradient estimates, while a sufficient number of epochs allows the model to fit the data effectively. These settings are gathered into the sketch after this list.
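Pulling these settings together, the following is a minimal configuration sketch, assuming a TensorFlow/Keras implementation (the authors' code is not published); the framework choice, variable names, and the number of input features shown are illustrative assumptions.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

TIME_STEPS = 6    # past 24 h of data at 4-h intervals
N_FEATURES = 3    # illustrative: e.g., conductivity, CODMn, total phosphorus

model = models.Sequential([
    layers.Input(shape=(TIME_STEPS, N_FEATURES)),
    layers.LSTM(64, return_sequences=True),   # first LSTM hidden layer (64 neurons)
    layers.LSTM(16),                          # second LSTM hidden layer (16 neurons)
    layers.Dropout(0.1),                      # dropout rate 0.1 to mitigate overfitting
    layers.Dense(1, activation="relu"),       # TN concentration for the next 4 h
])

model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.001), loss="mse")

# X_train: (samples, TIME_STEPS, N_FEATURES), y_train: (samples, 1), both normalized
# model.fit(X_train, y_train, epochs=256, batch_size=256)
```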
2.4. Water Quality Prediction Model
2.4.1. LSTM Basic Structure and Principles
2.4.2. The Basic Structure and Principles of CNN
- Convolutional Layer: The preprocessed time series data are first fed into the convolutional layer, the core component of the CNN, where convolutional kernels (filters) slide over the input in a window-wise fashion; within each window, the data are multiplied element-wise by the kernel and the results are summed to produce the output;
- Pooling Layer: The pooling layer reduces the dimensionality of the convolutional output and retains the most salient features. Common operations include max pooling and average pooling: max pooling outputs the maximum value within a time window, while average pooling outputs the mean value of that window. Pooling reduces the data volume, improves computational efficiency, and makes the model more robust to small temporal shifts;
- Fully Connected Layer: After several convolutional and pooling layers, the extracted feature maps are flattened into a one-dimensional vector. This vector is passed to the fully connected layer, which multiplies it by a weight matrix and adds a bias term to produce the final output (a minimal sketch of these three operations follows this list).
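To make the three layer types concrete, here is a minimal 1-D CNN sketch in Keras operating on a water quality window; the filter count, kernel size, and pooling size are illustrative assumptions rather than values reported in the paper.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

TIME_STEPS, N_FEATURES = 6, 3   # illustrative: 24-h window, 3 input variables

cnn_block = models.Sequential([
    layers.Input(shape=(TIME_STEPS, N_FEATURES)),
    # Convolutional layer: kernels slide over the time axis; within each window
    # the inputs are multiplied element-wise by the kernel and summed.
    layers.Conv1D(filters=32, kernel_size=2, padding="same", activation="relu"),
    # Pooling layer: max pooling keeps the largest response in each window,
    # halving the temporal dimension and reducing the data volume.
    layers.MaxPooling1D(pool_size=2),
    # Fully connected layer: flatten the feature maps to a 1-D vector, then
    # apply a weight matrix and bias to produce the output.
    layers.Flatten(),
    layers.Dense(1),
])
cnn_block.summary()
```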
2.4.3. Bidirectional Long Short-Term Memory (Bi-LSTM)
2.4.4. Attention Mechanism
2.4.5. At-CBiLSTM Model
2.5. Evaluation Indicators for Prediction Results
3. Results
3.1. DCC Feature Selection Results
3.2. The Training Results of the LSTM Model with Different Input Schemes
3.3. Performance Prediction of Models
3.4. Impact of Time Step Length on Model Performance
3.5. Comparative Analysis of Model Performance
4. Discussion
4.1. Key Findings on Model Performance
4.2. Feature Selection and Model Optimization
4.3. Model Applicability and Future Directions
4.4. Innovations and Implications for Water Quality Management
5. Conclusions
- Automated nonlinear feature selection: The Distance Correlation Coefficient (DCC) method quantified nonlinear dependencies between TN and the other water quality parameters, identifying conductivity, CODMn, and total phosphorus as the dominant drivers (a minimal sketch of the DCC computation follows this list);
- Hybrid spatiotemporal architecture: The Attention-Convolutional-Bidirectional LSTM (At-CBiLSTM) model integrated CNN-based spatial pattern extraction, bidirectional temporal modeling, and adaptive attention mechanisms. This approach achieved a 22.4% RMSE reduction (0.045 vs. 0.058) compared to conventional Bi-LSTM models;
- Real-time optimization: Systematic time-step analysis identified a 3-day input window as optimal (MAE = 0.032, MSE = 0.005, MAPE = 0.218, RMSE = 0.045), balancing prediction accuracy with the computational efficiency required for deployable monitoring systems.
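As a reference for the DCC screening step, below is a minimal NumPy sketch of the sample distance correlation (following the standard Székely–Rizzo definition); it is not the authors' code, and the `features`/`tn` names in the usage comment are hypothetical.

```python
import numpy as np

def distance_correlation(x: np.ndarray, y: np.ndarray) -> float:
    """Sample distance correlation between two 1-D variables."""
    x = np.asarray(x, dtype=float).reshape(-1, 1)
    y = np.asarray(y, dtype=float).reshape(-1, 1)

    # Pairwise Euclidean distance matrices.
    a = np.abs(x - x.T)
    b = np.abs(y - y.T)

    # Double centering: subtract row and column means, add back the grand mean.
    A = a - a.mean(axis=0) - a.mean(axis=1, keepdims=True) + a.mean()
    B = b - b.mean(axis=0) - b.mean(axis=1, keepdims=True) + b.mean()

    # Squared distance covariance and variances (V-statistics).
    dcov2_xy = (A * B).mean()
    dcov2_xx = (A * A).mean()
    dcov2_yy = (B * B).mean()

    denom = np.sqrt(dcov2_xx * dcov2_yy)
    return float(np.sqrt(dcov2_xy / denom)) if denom > 0 else 0.0

# Hypothetical usage: rank candidate inputs against the TN series.
# scores = {name: distance_correlation(col, tn) for name, col in features.items()}
```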
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
Index | Unit | Mean | Standard Deviation | Minimum | Median | Maximum |
---|---|---|---|---|---|---|
Temperature | ℃ | 24.143 | 5.438 | 0.330 | 24.144 | 35.537 |
pH | - | 7.159 | 0.482 | 5.370 | 7.110 | 9.840 |
Dissolved Oxygen | mg/L | 6.908 | 2.221 | 0.533 | 7.036 | 22.588 |
CODMn | mg/L | 2.737 | 1.432 | 0.250 | 2.850 | 24.292 |
Ammonia Nitrogen | mg/L | 0.387 | 0.540 | 0.025 | 0.151 | 4.505 |
Total Phosphorus (TP) | mg/L | 0.087 | 0.086 | 0.005 | 0.057 | 1.418 |
TN | mg/L | 5.226 | 3.612 | 0.133 | 5.193 | 18.230 |
Conductivity | | 341.310 | 246.410 | 0.002 | 213.100 | 1133.100 |
Turbidity | NTU | 27.064 | 42.838 | 0.030 | 15.686 | 775.600 |
Water Quality Class | TN Concentration (mg/L) | Designated Use |
---|---|---|
Class I | ≤0.2 | Protected areas for source water and national reserves |
Class II | ≤0.5 | Centralized drinking water sources, fishery waters |
Class III | ≤1.0 | General drinking water and recreational uses |
Class IV | ≤1.5 | Industrial and agricultural water |
Class V | >1.5 | Poor-quality water for limited uses |
Range (Absolute Value) | Degree |
---|---|
0.8–1.0 | Extremely strong correlation |
0.6–0.79 | Good correlation |
0.4–0.59 | Moderate correlation |
0.2–0.39 | Weak correlation |
0.0–0.19 | Extremely weak or no correlation |
Scheme | Input Variables | Degree of Correlation |
---|---|---|
1 | Conductivity | Extremely strong correlation |
2 | Conductivity, CODMn | Extremely strong and good correlation |
3 | Conductivity, CODMn, total phosphorus | Extremely strong and good correlation |
4 | Conductivity, CODMn, total phosphorus, ammonia nitrogen | Extremely strong, good, and moderate correlation |
5 | Conductivity, CODMn, total phosphorus, ammonia nitrogen, pH | Extremely strong, good, moderate, and weak correlation |
6 | Conductivity, CODMn, total phosphorus, ammonia nitrogen, pH, dissolved oxygen | Extremely strong, good, moderate, and weak correlation |
7 | Conductivity, CODMn, total phosphorus, ammonia nitrogen, pH, dissolved oxygen, water temperature | Extremely strong, good, moderate, weak, and extremely weak correlation |
Days | Error Indicator | LSTM | Bi-LSTM | CNN-LSTM | Attention-LSTM | At-CBiLSTM |
---|---|---|---|---|---|---|
1 | MAE | 0.079 | 0.076 | 0.064 | 0.068 | 0.062 |
1 | MSE | 0.017 | 0.016 | 0.014 | 0.015 | 0.012 |
1 | MAPE | 0.510 | 0.501 | 0.441 | 0.453 | 0.415 |
1 | RMSE | 0.129 | 0.117 | 0.098 | 0.110 | 0.089 |
2 | MAE | 0.078 | 0.065 | 0.061 | 0.063 | 0.051 |
2 | MSE | 0.015 | 0.015 | 0.013 | 0.014 | 0.011 |
2 | MAPE | 0.472 | 0.452 | 0.422 | 0.441 | 0.405 |
2 | RMSE | 0.087 | 0.082 | 0.072 | 0.077 | 0.067 |
3 | MAE | 0.074 | 0.064 | 0.054 | 0.060 | 0.032 |
3 | MSE | 0.012 | 0.011 | 0.007 | 0.010 | 0.005 |
3 | MAPE | 0.393 | 0.375 | 0.355 | 0.365 | 0.218 |
3 | RMSE | 0.062 | 0.058 | 0.055 | 0.057 | 0.045 |
4 | MAE | 0.078 | 0.069 | 0.060 | 0.065 | 0.053 |
4 | MSE | 0.033 | 0.026 | 0.016 | 0.024 | 0.015 |
4 | MAPE | 0.480 | 0.385 | 0.345 | 0.351 | 0.305 |
4 | RMSE | 0.076 | 0.072 | 0.060 | 0.064 | 0.054 |
5 | MAE | 0.093 | 0.085 | 0.072 | 0.083 | 0.063 |
5 | MSE | 0.036 | 0.030 | 0.026 | 0.028 | 0.016 |
5 | MAPE | 0.585 | 0.505 | 0.459 | 0.475 | 0.335 |
5 | RMSE | 0.081 | 0.076 | 0.068 | 0.072 | 0.061 |
6 | MAE | 0.130 | 0.115 | 0.082 | 0.098 | 0.080 |
6 | MSE | 0.044 | 0.038 | 0.034 | 0.036 | 0.025 |
6 | MAPE | 0.637 | 0.575 | 0.545 | 0.560 | 0.395 |
6 | RMSE | 0.099 | 0.094 | 0.074 | 0.088 | 0.069 |