Low Tensor Rank Constrained Image Inpainting Using a Novel Arrangement Scheme
Abstract
1. Introduction
- First, we developed a novel rearrangement, named the quarter arrangement (QA) scheme, that permutes an image into three flexible forms of data. The first flexible QA scheme permutes an image into an unfolding matrix (with a low matrix rank structure). The second and third flexible QA schemes permute a color image into a balanced order-3 tensor (with a low tubal rank structure) and a higher-order tensor (with a low TT rank structure), respectively. Because the schemes are designed to exploit the internal structural similarity of the original data as much as possible, the rearranged data retain the corresponding low-rank structure (a block-rearrangement sketch in this spirit is given after this list).
- Second, based on the above QA scheme, we developed three image inpainting models that exploit the unfolding matrix rank, the tensor tubal rank, and the TT multi-rank of the rearranged data, respectively, to solve the image inpainting problem.
- Lastly, three efficient ADMM algorithms were developed to solve the above three models. Experimental comparisons with numerous closely related image inpainting methods demonstrate the superior performance of our methods.
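To make the rearrangement idea concrete, the sketch below shows a block-based image-to-tensor permutation in the spirit of ket augmentation (Section 2.1): a 2^n x 2^n image is split into nested 2 x 2 blocks, each level becoming one tensor mode of size 4. This is an illustrative assumption, not the exact QA scheme proposed in this paper; the function name `ket_augment` and the power-of-two size restriction are ours, and QA differs in how it groups pixels and in producing three flexible target forms.

```python
import numpy as np

def ket_augment(image):
    """Rearrange a 2**n x 2**n image into an order-n tensor with modes of size 4.

    Each mode pairs one row bit with one column bit, so every mode indexes a
    2 x 2 spatial block at one resolution level (a KA-style rearrangement).
    """
    n = int(np.log2(image.shape[0]))
    assert image.shape == (2 ** n, 2 ** n), "illustration assumes a square power-of-two image"
    t = image.reshape([2] * n + [2] * n)                  # split row and column indices into n bits each
    order = [axis for k in range(n) for axis in (k, n + k)]
    t = np.transpose(t, order)                            # interleave row/column bits level by level
    return t.reshape([4] * n)                             # merge each (row bit, column bit) pair into one mode

# Example: a 256 x 256 image becomes an order-8 tensor of size 4 x 4 x ... x 4.
img = np.random.rand(256, 256)
print(ket_augment(img).shape)                             # (4, 4, 4, 4, 4, 4, 4, 4)
```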
2. Related Work
2.1. Ket Augmentation
2.2. T-SVD Decomposition
2.3. Tensor Train Decomposition
3. Methods
3.1. Quarter Arrangement
3.2. Method 1: The Low Unfolding Matrix Rank-Based Method
Algorithm 1. The algorithm for solving model (3)
Input: the observed data, the maximum number of iterations, and the convergence condition.
Initialization: initialize the variables, with the low-rank estimate obtained by solving the matrix completion problem (11); set t = 0.
While the convergence condition is not met and the iteration limit is not reached, do
  The first flexible QA scheme: turn the image into an order-N tensor and then unfold it.
  Solve (5)–(10) for the optimal variables, where * denotes the optimal solution.
  Update the Lagrange multipliers; set t = t + 1.
End while
Output: the recovered image.
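Since the subproblems (5)–(10) are not reproduced above, the minimal sketch below only illustrates the flavor of the low matrix rank update in Method 1: a singular value thresholding (SVT) step on the unfolding matrix, alternated with a data-consistency step on the observed pixels. The parameters `tau` and `n_iter` and the hard data-consistency step are illustrative choices under our own assumptions, not the authors' exact ADMM updates.

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: proximal operator of tau times the nuclear norm."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    s = np.maximum(s - tau, 0.0)
    return (U * s) @ Vt

def matrix_completion(observed, mask, tau=5.0, n_iter=200):
    """Minimal SVT-style loop: low-rank shrinkage, then restore the observed entries."""
    X = observed.copy()
    for _ in range(n_iter):
        X = svt(X, tau)                      # low-rank (nuclear norm) proximal step
        X[mask] = observed[mask]             # data-consistency step on observed entries
    return X
```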
3.3. Method 2: The Low Tubal Rank-Based Method
Algorithm 2. The algorithm for solving model (13)
Input: the observed data, the maximum number of iterations, and the convergence condition.
Initialization: initialize the variables; set t = 0.
While the convergence condition is not met and the iteration limit is not reached, do
  QA scheme: turn the image into the balanced order-3 tensor.
  Update each variable of model (13) and the Lagrange multipliers in turn; set t = t + 1.
End while
Output: the recovered image.
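The core low tubal rank operation in Method 2 can be illustrated by tensor singular value thresholding: take the FFT along the third (tube) mode, threshold the singular values of every frontal slice in the Fourier domain, and transform back. The sketch below shows only this standard t-SVD-based shrinkage; the threshold `tau` and the surrounding ADMM bookkeeping of Algorithm 2 (auxiliary variables and multipliers) are omitted.

```python
import numpy as np

def tensor_svt(T, tau):
    """Tubal-rank shrinkage: SVT on every frontal slice in the Fourier domain (t-SVD based)."""
    Tf = np.fft.fft(T, axis=2)                       # FFT along the tube (third) dimension
    for k in range(T.shape[2]):
        U, s, Vh = np.linalg.svd(Tf[:, :, k], full_matrices=False)
        s = np.maximum(s - tau, 0.0)                 # shrink the singular values of this slice
        Tf[:, :, k] = (U * s) @ Vh
    return np.real(np.fft.ifft(Tf, axis=2))          # back to the original domain
```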
3.4. Method 3: The Low TT Rank-Based Method
Algorithm 3. The algorithm for solving model (16)
Input: the observed data, the maximum number of iterations, and the convergence condition.
Initialization: initialize the variables by the LMaFit method [47].
For n = 1 to N − 1 do
  Set t = 0.
  While the convergence condition is not met and the iteration limit is not reached, do
    QA scheme: permute the image into an order-N tensor.
    Update each variable of model (16) and the Lagrange multipliers in turn; set t = t + 1.
  End while
End for
Output: the recovered image.
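For Method 3, the TT multi-rank is defined through the canonical matricizations of the rearranged tensor: the mode-(1..n) unfolding groups the first n modes as rows and the remaining modes as columns, and the TT multi-rank collects the ranks of these unfoldings for n = 1, ..., N − 1. The sketch below computes these unfoldings and a numerical rank profile; it is a notational illustration with our own helper names, not the nuclear norm minimization performed by Algorithm 3.

```python
import numpy as np

def tt_unfolding(T, n):
    """Mode-(1..n) canonical matricization used by the TT multi-rank: group the first n modes as rows."""
    rows = int(np.prod(T.shape[:n]))
    return T.reshape(rows, -1)

def tt_rank_profile(T, tol=1e-8):
    """TT multi-rank estimate: numerical rank of each canonical unfolding."""
    return [np.linalg.matrix_rank(tt_unfolding(T, n), tol=tol) for n in range(1, T.ndim)]

# Example: an order-4 tensor has three canonical unfoldings and hence three TT ranks.
X = np.random.rand(4, 4, 4, 4)
print(tt_rank_profile(X))
```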
4. Experimental Results and Analyses
4.1. Analyses of the Three Flexible QA Schemes
4.2. Analyses of the Methods Exploiting Both Low Rankness and Sparsity
4.3. Analyses of TTLR and TTLRTV Methods
4.4. Missing Ratio of 90%
4.5. Runtime and Complexity Analysis
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Pendu, M.; Jiang, X.; Guillemot, C. Light field inpainting propagation via low rank matrix completion. IEEE Trans. Image Process. 2018, 27, 1989–1993. [Google Scholar] [CrossRef] [PubMed]
- Yu, Y.; Peng, J.; Yue, S. A new nonconvex approach to low-rank matrix completion with application to image inpainting. Multidim. Syst. Signal Process. 2018, 30, 145–174. [Google Scholar]
- Gong, X.; Chen, W.; Chen, J. A low-rank tensor dictionary learning method for hyperspectral image denoising. IEEE Trans. Signal Process. 2020, 68, 1168–1180. [Google Scholar] [CrossRef]
- Su, X.; Ge, H.; Liu, Z.; Shen, Y. Low-Rank tensor completion based on nonconvex regularization. Signal Process. 2023, 212, 109157. [Google Scholar] [CrossRef]
- Ma, S.; Ai, J.; Du, H.; Fang, L.; Mei, W. Recovering low-rank tensor from limited coefficients in any ortho-normal basis using tensor-singular value decomposition. IET Signal Process. 2021, 19, 162–181. [Google Scholar] [CrossRef]
- Liu, Y.; Long, Z.; Zhu, C. Image completion using low tensor tree rank and total variation minimization. IEEE Trans. Multimedia. 2018, 21, 338–350. [Google Scholar] [CrossRef]
- Gong, W.; Huang, Z.; Yang, L. Accurate regularized Tucker decomposition for image restoration. Appl. Math. Model. 2023, 123, 75–86. [Google Scholar] [CrossRef]
- Long, Z.; Liu, Y.; Chen, L.; Zhu, C. Low rank tensor completion for multiway visual data. Signal Process. 2019, 155, 301–316. [Google Scholar] [CrossRef]
- Kolda, T.; Bader, B. Tensor decompositions and applications. SIAM Rev. 2009, 51, 455–500. [Google Scholar] [CrossRef]
- Kilmer, M.; Braman, K.; Hao, N.; Hoover, R.C. Third-order tensors as operators on matrices: A theoretical and computational framework with applications in imaging. SIAM J. Matrix Anal. Appl. 2013, 34, 148–172. [Google Scholar] [CrossRef]
- Semerci, O.; Hao, N.; Kilmer, M.; Miller, E.L. Tensor-based formulation and nuclear norm regularization for multienergy computed tomography. IEEE Trans. Image Process. 2014, 23, 1678–1693. [Google Scholar] [CrossRef] [PubMed]
- Zhou, P.; Lu, C.; Lin, Z.; Zhang, C. Tensor factorization for low-rank tensor completion. IEEE Trans. Image Process. 2018, 27, 1152–1163. [Google Scholar] [CrossRef] [PubMed]
- Oseledets, I.; Tyrtyshnikov, E. TT-cross approximation for multidimensional arrays. Linear Algebra Appl. 2010, 432, 70–88. [Google Scholar] [CrossRef]
- Oseledets, I. Tensor-train decomposition. SIAM J. Sci. Comput. 2011, 33, 2295–2317. [Google Scholar] [CrossRef]
- Hackbusch, W.; Kuhn, S. A new scheme for the tensor representation. J. Fourier Anal. Appl. 2009, 15, 706–722. [Google Scholar] [CrossRef]
- Zhao, Q.; Zhou, G.; Xie, S.; Zhang, L.; Cichocki, A. Tensor ring decomposition. arXiv 2016, arXiv:1606.05535. [Google Scholar]
- Zheng, Y.; Huang, T.; Zhao, X.; Zhao, Q.; Jiang, T. Fully-connected tensor network decomposition and its application to higher-order tensor completion. Proc. AAAI Conf. Artif. Intell. 2021, 35, 11071–11078. [Google Scholar] [CrossRef]
- Zhang, Z.; Aeron, S. Exact tensor completion using T-SVD. IEEE Trans. Signal Process. 2017, 65, 1511–1526. [Google Scholar] [CrossRef]
- Lu, C.; Feng, J.; Chen, Y.; Liu, W.; Lin, Z.; Yan, S. Tensor robust principal component analysis with a new tensor nuclear norm. IEEE Trans. Pattern Anal. Mach. Intell. 2020, 42, 925–938. [Google Scholar] [CrossRef]
- Du, S.; Xiao, Q.; Shi, Y.; Cucchiara, R.; Ma, Y. Unifying tensor factorization and tensor nuclear norm approaches for low-rank tensor completion. Neurocomputing 2021, 458, 204–218. [Google Scholar] [CrossRef]
- Chen, Y.; Hsu, C.; Liao, H. Simultaneous tensor decomposition and completion using factor priors. IEEE Trans. Pattern Anal. Mach. Intell. 2014, 36, 577–591. [Google Scholar] [CrossRef] [PubMed]
- Xue, J.; Zhao, Y.; Liao, W.; Chan, J.C.-W.; Kong, S.G. Enhanced sparsity prior model for low-rank tensor completion. IEEE Trans. Neural. Netw. Learn. Syst. 2019, 31, 4567–4581. [Google Scholar] [CrossRef]
- Liu, J.; Musialski, P.; Wonka, P.; Ye, J. Tensor completion for estimating missing values in visual data. IEEE Trans. Pattern Anal. Mach. Intell. 2013, 35, 208–220. [Google Scholar] [CrossRef] [PubMed]
- Qin, M.; Li, Z.; Chen, S.; Guan, Q.; Zheng, J. Low-Rank Tensor Completion and Total Variation Minimization for Color Image Inpainting. IEEE Access 2020, 8, 53049–53061. [Google Scholar] [CrossRef]
- Yokota, T.; Zhao, Q.; Cichocki, A. Smooth PARAFAC decomposition for tensor completion. IEEE Trans. Signal Process. 2016, 64, 5423–5436. [Google Scholar] [CrossRef]
- Yokota, T.; Hontani, H. Simultaneous visual data completion and denoising based on tensor rank and total variation minimization and its primal-dual splitting algorithm. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 3732–3740. [Google Scholar]
- Li, L.; Jiang, F.; Shen, R. Total Variation Regularized Reweighted Low-rank Tensor Completion for Color Image Inpainting. In Proceedings of the 25th IEEE International Conference on Image Processing (ICIP), Athens, Greece, 7–10 October 2018; pp. 2152–2156. [Google Scholar]
- Zhao, Q.; Zhang, L.; Cichocki, A. Bayesian CP factorization of incomplete tensors with automatic rank determination. IEEE Trans. Pattern Anal. Mach. Intell. 2015, 37, 1751–1763. [Google Scholar] [CrossRef]
- Wang, X.; Philip, L.; Yang, W.; Su, J. Bayesian robust tensor completion via CP decomposition. Pattern Recognit. Lett. 2022, 163, 121–128. [Google Scholar] [CrossRef]
- Zhu, Y.; Wang, W.; Yu, G.; Wang, J.; Tang, L. A Bayesian robust CP decomposition approach for missing traffic data imputation. Multimed Tools Appl. 2022, 81, 33171–33184. [Google Scholar] [CrossRef]
- Cui, G.; Zhu, L.; Gui, L.; Zhao, Q.; Zhang, J.; Cao, J. Multidimensional clinical data denoising via Bayesian CP factorization. Sci. China Technol. 2020, 63, 249–254. [Google Scholar] [CrossRef]
- Liu, Y.; Zhao, X.; Song, G.; Zheng, Y.; Ng, M.K.; Huang, T. Fully-connected tensor network decomposition for robust tensor completion problem. Inverse Probl. Imaging 2024, 18, 208–238. [Google Scholar] [CrossRef]
- Li, X.P.; Wang, Z.; Shi, Z.; So, H.C.; Sidiropoulos, N.D. Robust tensor completion via capped Frobenius norm. IEEE Trans. Neural Netw. Learn. Syst. 2024, 35, 9700–9712. [Google Scholar] [CrossRef] [PubMed]
- Bengua, J.; Phien, H.; Tuan, H.; Do, M.N. Efficient tensor completion for color image and video recovery: Low-Rank Tensor Train. IEEE Trans. Image Process. 2017, 26, 2466–2479. [Google Scholar] [CrossRef]
- Ma, S.; Du, H.; Hu, J.; Wen, X.; Mei, W. Image inpainting exploiting tensor train and total variation. In Proceedings of the 12th International Congress on Image and Signal Processing, BioMedical Engineering and Informatics (CISP-BMEI), Suzhou, China, 19–21 October 2019; pp. 1–5. [Google Scholar]
- Ma, S. Video inpainting exploiting tensor train and sparsity in frequency domain. In Proceedings of the IEEE 6th International Conference on Signal and Image Processing (ICSIP), Nanjing, China, 22–24 October 2021; pp. 1–5. [Google Scholar]
- Ma, S.; Du, H.; Mei, W. Dynamic MR image reconstruction from highly undersampled (k, t)-space data exploiting low tensor train rank and sparse prior. IEEE Access 2020, 8, 28690–28703. [Google Scholar] [CrossRef]
- Latorre, J. Image Compression and Entanglement. Available online: https://arxiv.org/abs/quant-ph/0510031 (accessed on 3 March 2018).
- Kilmer, M.; Martin, C. Factorization strategies for third-order tensors. Linear Algebra Appl. 2011, 435, 641–658. [Google Scholar] [CrossRef]
- Martin, C.; Shafer, R.; Larue, B. An order-p tensor factorization with applications in imaging. SIAM J. Sci. Comput. 2013, 35, A474–A490. [Google Scholar] [CrossRef]
- Oseledets, I. Compact matrix form of the d-dimensional tensor decomposition. In Proceedings of the International Symposium on Nonlinear Theory and Its Applications, Sapporo, Japan, 19–21 October 2009. [Google Scholar]
- Lingala, S.; Hu, Y.; Dibella, E.; Jacob, M. Accelerated dynamic MRI exploiting sparsity and low-rank structure: K-t SLR. IEEE Trans. Med. Imaging. 2011, 30, 1042–1054. [Google Scholar] [CrossRef] [PubMed]
- Recht, B.; Fazel, M.; Parrilo, P. Guaranteed minimum-rank solutions of linear matrix equations via nuclear norm minimization. SIAM Rev. 2010, 52, 471–501. [Google Scholar] [CrossRef]
- Signoretto, M.; Cevher, V.; Suykens, J. An SVD-free approach to a class of structured low rank matrix optimization problems with application to system identification. In Proceedings of the IEEE Conference on Decision and Control (CDC), Firenze, Italy, 10–13 December 2013. no. EPFL-CONF-184990. [Google Scholar]
- Liu, H.; Xiong, R.; Zhang, X.; Zhang, Y. Nonlocal gradient sparsity regularization for image restoration. IEEE Trans. Circuits Syst. Video Technol. 2017, 27, 1909–1921. [Google Scholar] [CrossRef]
- Feng, X.; Li, H.; Li, J.; Du, Q. Hyperspectral Unmixing Using Sparsity-Constrained Deep Nonnegative Matrix Factorization with Total Variation. IEEE Trans. Geosci. Remote Sens. 2018, 56, 6245–6257. [Google Scholar] [CrossRef]
- Wen, Z.; Yin, W.; Zhang, Y. Solving a low-rank factorization model for matrix completion by a nonlinear successive over-relaxation algorithm. Math. Program. Comput. 2012, 4, 333–361. [Google Scholar] [CrossRef]
- Ai, J.; Ma, S.; Du, H.; Fang, L. Dynamic MRI Reconstruction Using Tensor-SVD. In Proceedings of the 14th IEEE International Conference on Signal Processing, Beijing, China, 12–16 August 2018; pp. 1114–1118. [Google Scholar]
- Wang, Z.; Bovik, A.; Sheikh, H.; Simoncelli, E. Image quality assessment: From error visibility to structural similarity. IEEE Trans. Image Process. 2004, 13, 600–612. [Google Scholar] [CrossRef] [PubMed]
- Ko, C.; Batselier, K.; Yu, W.; Wong, N. Fast and Accurate Tensor Completion with Total Variation Regularized Tensor Trains. IEEE Trans. Image Process. 2020, 29, 6918–6931. [Google Scholar] [CrossRef]
Abbreviations | Full Terms |
---|---|
t-SVD | tensor singular value decomposition |
TT | tensor train |
TV | total variation |
KA | ket augmentation |
QA | quarter arrangement |
Symbols | Notations and Definitions |
---|---|
fiber | A vector defined by fixing every index but one of a tensor. |
slice | A matrix defined by fixing all but two indices of a tensor. |
frontal slice | A slice of an order-3 tensor obtained by fixing the third index. |
mode-n matrix | The matrix resulting from unfolding a tensor by reshaping its mode-n fibers into the columns of the matrix. |
f-diagonal tensor | An order-3 tensor is called f-diagonal if each of its frontal slices is a diagonal matrix [10]. |
orthogonal tensor | A tensor is called orthogonal if its t-product with its transpose, in either order, equals the identity tensor, i.e., the tensor whose first frontal slice is the identity matrix and whose other frontal slices are all zero. |
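As a quick illustration of this notation (not code from the paper), the NumPy snippet below extracts a frontal slice and a mode-1 fiber from an order-3 tensor and builds a mode-n unfolding whose columns are the mode-n fibers; the helper name `unfold` and the 0-based mode indexing are our own conventions.

```python
import numpy as np

T = np.arange(24).reshape(3, 4, 2)     # an order-3 tensor of size 3 x 4 x 2

frontal_slice = T[:, :, 0]             # frontal slice: fix the third index
mode1_fiber = T[:, 0, 0]               # mode-1 fiber: fix every index but the first

def unfold(tensor, mode):
    """Mode-n unfolding (0-based `mode`): the mode-n fibers become the columns."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

print(unfold(T, 0).shape)              # (3, 8): one column per mode-1 fiber
print(unfold(T, 1).shape)              # (4, 6): one column per mode-2 fiber
```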
PSNR (dB)/SSIM of different color images under different missing patterns, with and without the QA rearrangement.

Rearrangement | Methods | House (Random 50%) | Lena (Lines) | Airplane (Random Line) | Boats (Random 80%) |
---|---|---|---|---|---|
Without rearrangement | MatrixLR | 9.38/0.8970 | 13.34/0.5850 | 7.118/0.1308 | 19.18/0.5680 |
Without rearrangement | TTLR | 28.61/0.871 | 13.34/0.585 | 7.11/0.130 | 19.25/0.519 |
Without rearrangement | tSVDLR | 32.30/0.932 | 13.34/0.585 | 7.11/0.130 | 21.60/0.707 |
Without rearrangement | UnfoldingLR | 7.83/0.093 | 13.34/0.585 | 7.11/0.130 | 6.32/0.102 |
With rearrangement | TTLR | 30.21/0.9251 | 31.79/0.9559 | 25.77/0.8796 | 21.44/0.7144 |
With rearrangement | tSVDLR | 29.79/0.8989 | 31.20/0.9561 | 18.91/0.8386 | 21.34/0.6879 |
With rearrangement | UnfoldingLR | 32.58/0.9416 | 33.45/0.9771 | 28.75/0.9464 | 23.46/0.8139 |
PSNR (dB)/SSIM of different color images under different missing patterns: comparison of other methods and our methods.

Group | No. | Methods | House (Random 50%) | Peppers (Text) | Lena (Lines) | Airplane (Random Line) | Baboon (Blocks) | Boats (Random 80%) |
---|---|---|---|---|---|---|---|---|
Other methods | 1 | STDC | 32.04/0.9300 | 33.61/0.9813 | 28.56/0.8995 | 23.49/0.7756 | 27.01/0.9293 | 21.88/0.7340 |
Other methods | 2 | HaLRTC | 32.07/0.9423 | 25.84/0.9496 | 13.34/0.5850 | 19.94/0.6334 | 28.04/0.9397 | 20.56/0.6858 |
Other methods | 3 | FBCP | 26.41/0.8701 | NAN | 14.56/0.5242 | 10.25/0.1954 | 18.71/0.5546 | 20.91/0.6947 |
Other methods | 4 | TMac-TTKA | 23.18/0.8113 | 29.47/0.9681 | 29.93/0.9462 | 20.82/0.7521 | 28.04/0.9429 | 8.83/0.1229 |
Other methods | 5 | SPCTV | 29.56/0.9133 | 23.38/0.9154 | 16.02/0.6107 | 18.58/0.6894 | 24.21/0.9144 | 20.98/0.7254 |
Other methods | 6 | LRTV | 30.93/0.9382 | 36.98/0.9945 | 34.07/0.9724 | 26.82/0.9228 | 27.10/0.9319 | 21.62/0.7541 |
Our methods | 1 | TTLRTV | 33.02/0.9579 | 37.27/0.9945 | 34.94/0.9823 | 28.82/0.9561 | 29.46/0.9559 | 22.37/0.7487 |
Our methods | 2 | tSVDLRTV | 32.20/0.9550 | 37.49/0.9950 | 34.70/0.9818 | 28.03/0.9507 | 29.56/0.9574 | 22.86/0.8021 |
Our methods | 3 | UnfoldingLRTV | 35.61/0.9689 | 37.72/0.9952 | 34.87/0.9821 | 29.55/0.9639 | 29.59/0.9556 | 25.43/0.8863 |
PSNR (dB)/SSIM of different color images under different missing patterns: methods exploiting low rankness, TV, and their combination.

No. | Methods | House (Random 50%) | Peppers (Text) | Lena (Lines) | Airplane (Random Line) | Baboon (Blocks) | Boats (Random 80%) |
---|---|---|---|---|---|---|---|
1 | MatrixLR | 9.38/0.8970 | 33.23/0.9814 | 13.34/0.5850 | 7.118/0.1308 | 27.62/0.9343 | 19.18/0.5680 |
2 | TV | 29.70/0.8816 | 34.14/0.9913 | 29.21/0.9107 | 22.85/0.8463 | 23.18/0.9066 | 20.32/0.6103 |
3 | TTLR | 30.21/0.9251 | 34.86/0.9892 | 31.79/0.9559 | 25.77/0.8796 | 25.42/0.9239 | 21.44/0.7144 |
4 | tSVDLR | 29.79/0.8989 | 33.86/0.9840 | 31.20/0.9561 | 18.91/0.8386 | 28.03/0.9373 | 21.34/0.6879 |
5 | UnfoldingLR | 32.58/0.9416 | 36.86/0.9938 | 33.45/0.9771 | 28.75/0.9464 | 22.22/0.9238 | 23.46/0.8139 |
6 | TTLRTV | 33.02/0.9579 | 37.27/0.9945 | 34.94/0.9823 | 28.82/0.9561 | 29.46/0.9559 | 22.37/0.7487 |
7 | tSVDLRTV | 32.20/0.9550 | 37.49/0.9950 | 34.70/0.9818 | 28.03/0.9507 | 29.56/0.9574 | 22.86/0.8021 |
8 | UnfoldingLRTV | 35.61/0.9689 | 37.72/0.9952 | 34.87/0.9821 | 29.55/0.9639 | 29.59/0.9556 | 25.43/0.8863 |
PSNR (dB) and SSIM at a missing ratio of 90%.

Methods | CFN-RTC | TTC | TTLR | tSVDLR | UnfoldingLR |
---|---|---|---|---|---|
PSNR | 19.65 | 8.70 | 19.05 | 20.54 | 19.44 |
SSIM | 0.4507 | 0.1745 | 0.4688 | 0.4862 | 0.4455 |

Methods | RNC-FCTN | TTCTV | TTLRTV | tSVDLRTV | UnfoldingLRTV |
---|---|---|---|---|---|
PSNR | 8.79 | 8.75 | 21.46 | 22.11 | 21.65 |
SSIM | 0.0929 | 0.1817 | 0.5483 | 0.6007 | 0.5706 |
Runtime (s) of different methods under different missing patterns.

Methods | House (Random 50%) | Lena (Lines) | Airplane (Random Lines) | Boats (Random 80%) |
---|---|---|---|---|
MatrixLR | 4.95 | 0.17 | 0.16 | 5.01 |
STDC | 5.43 | 5.13 | 5.17 | 5.16 |
HaLRTC | 8.00 | 0.88 | 0.84 | 6.84 |
FBCP | 188.32 | 86.45 | 132.09 | 219.33 |
SPCTV | 19.25 | 16.37 | 16.03 | 17.69 |
LRTV | 19.08 | 20.17 | 21.04 | 21.05 |
TTLRTV | 145.5 | 143.2 | 142.6 | 142.3 |
tSVDLRTV | 15.23 | 15.07 | 15.17 | 15.14 |
UnfoldingLRTV | 9.49 | 8.53 | 8.69 | 8.72 |