**6. Conclusions and Future Work**

In this paper, we presented a fast image quality metric based on statistical features of the sign–magnitude transform for estimating the quality of iris images acquired by handheld devices in visible light. The resulting quality score can be used to decide whether an input iris sample should be enrolled in a dataset or rejected, in which case a new sample is captured; this filtering improves both the speed and the recognition rate of the reference iris recognition system.
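The enroll-or-recapture decision described above can be sketched as a simple threshold test on the quality score. The sketch below is illustrative only: `quality_score` is a hypothetical placeholder (here, a normalized local-contrast statistic), not the proposed sign–magnitude metric, and the threshold value is an assumption.

```python
import numpy as np

def quality_score(iris_image: np.ndarray) -> float:
    """Hypothetical stand-in for the proposed quality metric.

    Uses the ratio of standard deviation to mean intensity as a crude
    contrast proxy; the actual metric in the paper is based on
    statistical features of the sign-magnitude transform.
    """
    img = iris_image.astype(np.float64)
    return float(img.std() / (img.mean() + 1e-9))

def enroll_or_recapture(iris_image: np.ndarray, threshold: float = 0.25) -> str:
    """Enroll the sample if its quality score passes the threshold;
    otherwise request that a new sample be captured."""
    return "enroll" if quality_score(iris_image) >= threshold else "recapture"
```

In a deployed system the threshold would be tuned on a validation set so that rejected samples are predominantly those that would degrade recognition accuracy.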

We conducted extensive experiments to demonstrate these improvements, using three performance measures of iris recognition accuracy on three large datasets acquired in unconstrained environments in visible light. The experiments showed that the proposed approach improved the accuracy of the reference iris recognition system.

However, we would like to point out that including quality filtering in an iris recognition system increases its computational cost, and some iris images may be rejected unnecessarily. Such rejections can be caused by errors in the quality metric, by an overly conservative quality threshold, or by quality factors related to subject covariates. In future work, we will propose a metric for iris image quality assessment that takes all of these factors into account. We also plan to develop an algorithm that monitors criteria such as recognition performance, the time and number of photos required per person, and customer satisfaction, in order to dynamically adapt the quality-filtering threshold for optimal performance.
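The threshold-adaptation idea outlined above could take many forms; the following is a minimal sketch of one possible rule, not an algorithm from the paper. All criteria, target values, and step sizes are illustrative assumptions.

```python
def adapt_threshold(threshold: float,
                    rejection_rate: float,
                    captures_per_person: float,
                    target_rejection: float = 0.05,
                    max_captures: float = 3.0,
                    step: float = 0.01) -> float:
    """Adjust the quality-filtering threshold from monitored criteria.

    If too many samples are rejected or users need too many capture
    attempts, lower the threshold (more permissive); otherwise raise it
    slightly to favor recognition accuracy. Constants are illustrative.
    """
    if rejection_rate > target_rejection or captures_per_person > max_captures:
        threshold -= step  # too much friction: be more permissive
    else:
        threshold += step  # acceptable friction: be stricter
    return max(0.0, min(1.0, threshold))  # keep threshold in [0, 1]
```

In practice such a rule would be evaluated periodically over a window of recent transactions, with the monitored criteria (e.g., rejection rate) estimated from system logs.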

It may also be promising to examine the use of the proposed quality metric for assessing the quality of other biometric images, such as facial images and near-infrared (NIR) biometric images.

**Author Contributions:** Conceptualization, M.J. and M.P.; investigation, M.J.; methodology, M.J., M.P. and D.S.; validation, M.J. and D.S.; writing—original draft preparation, M.J.; writing—review and editing, M.J., M.P. and D.S.; visualization, M.J., M.P. and D.S.; supervision, M.P. and D.S. All authors have read and agreed to the published version of the manuscript.

**Funding:** This research was partially funded by the Exzellenzstrategie des Bundes und der Länder (the Excellence Strategy of the German Federal and State Governments), the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation)—Project-ID 251654672—TRR 161 and the Research Council of Norway within project no. 221073 HyPerCept–Color and quality in higher dimensions.

**Acknowledgments:** The authors thank Jon Yngve Hardeberg, Katrin Franke, and Sokratis Katsikas for their helpful discussions.

**Conflicts of Interest:** The authors declare no conflict of interest.
