Article

A Texture-Hidden Anti-Counterfeiting QR Code and Authentication Method

1 School of Electronic Information, Wuhan University, Wuhan 430072, China
2 School of Cyber Science and Engineering, Wuhan University, Wuhan 430072, China
3 School of Artificial Intelligence, Hubei Business College, Wuhan 430079, China
* Author to whom correspondence should be addressed.
Sensors 2023, 23(2), 795; https://doi.org/10.3390/s23020795
Submission received: 30 November 2022 / Revised: 3 January 2023 / Accepted: 6 January 2023 / Published: 10 January 2023
(This article belongs to the Section Intelligent Sensors)

Abstract

This paper designs a texture-hidden QR code to prevent the illegal copying of QR codes, which otherwise lack anti-counterfeiting ability. Combining random texture patterns with a refined QR code, the code is not only capable of regular encoding but also has a strong anti-copying capability. Based on the proposed code, a quality assessment algorithm (MAF) and a dual feature detection algorithm (DFDA) are also proposed. The MAF is compared with several current no-reference algorithms and achieves 95% and 96% accuracy for blur type and blur degree, respectively. The DFDA is compared with various texture and corner methods, achieves accuracy, precision, and recall of up to 100%, and also performs well on datasets attacked by reduction and cutting. Experiments on self-built datasets show that the code designed in this paper has excellent feasibility and anti-counterfeiting performance.

1. Introduction

Anti-counterfeiting techniques can prevent or identify counterfeits to a certain extent, and traditional techniques can be classified into the following categories: visual anti-counterfeiting, electronic identification anti-counterfeiting, electronic code anti-counterfeiting, and texture anti-counterfeiting. Visual anti-counterfeiting mainly uses laser holography [1,2,3,4], special inks [5,6,7,8], temperature changes [9,10], security lines [11], chemical substances [12], and so on. These rely on special materials or unique formulations that are not only costly but also lose their security once the relevant information is leaked. Electronic identification anti-counterfeiting [13] commonly includes radio frequency, magnetic recording, integrated circuit cards, etc. These techniques work in conjunction with data management systems, but their reliance on specialized devices limits their generality, application scenarios, and scope. Texture anti-counterfeiting exploits the uniqueness of texture: digital images of texture anti-counterfeiting marks [14,15] are captured by high-definition photography and then uploaded, numbered, and saved in a database. Consumers take photos with their mobile phones and retrieve the corresponding images from the database for human-eye comparison, but this lacks objective and intelligent means of identification. Traditional techniques thus suffer from various problems, such as excessive cost, poor user experience, and poor objectivity.
Different degrees of random toner adsorption occur in a digital graphic during printing, ensuring its physical non-replicability. The special design of the graphic can further amplify the distortion phenomenon, which can be identified by the algorithm. QR codes are widely used due to their strong coding ability and large information capacity. As QR codes are composed of a large number of black and white blocks, they are insensitive to the noise and distortion generated during printing. Therefore, the fusion of anti-counterfeiting graphics with QR codes is gaining increasing attention. At present, the fusion can be summarized as physical unclonable function (PUF) [16,17,18,19], watermark [20,21,22,23], copy detection pattern (CDP) [24,25,26,27,28], and halftone [29,30,31,32]. Although the above methods can achieve certain anti-counterfeiting effects, there are still some aspects that can be improved.
For example, [16] utilizes natural texture features and printing micro-features, then calculates the feature similarity between the code to be tested and the sample code through a feature extraction algorithm; however, this requires high-precision printing and capturing equipment. The authors in [20] adopt an improved discrete wavelet transform and singular value decomposition algorithm to hide a digital watermark in the 2D code. In [21], the researchers embed specific random micro-textures into a 2D code and then convert them into a security layer. Any degradation process due to counterfeiting will change the statistics of the micro-textures and thus achieve anti-counterfeiting. However, watermarking algorithms show some robustness only to relatively low-intensity attacks; under high-intensity attacks, the watermark information is altered or significantly lost. A new 2LQR code is proposed in [24] to replace the black blocks of the QR code with black and white modules of the same size that have anti-copy ability. This system includes public and private storage levels; the public level ensures that the decoding program can decode, and the private level uses the sensitivity of the modules to the print-capture process to distinguish real from fake codes. However, this code introduces visible artifacts, since the naked eye can easily spot the specific module structure, and it also requires a high-precision capture device, which is not conducive to commercial generalization. The authors in [29] propose a two-dimensional code anti-replication scheme based on a spectral and spatial bar code channel model. Two sets of spectral and spatial features are extracted from the channel model and identified in a cascaded combination. However, the feature extraction process of this method is relatively complicated.
In [30], the researchers add a double security authentication to the position of embedded information and then reduce the interval distance through fourth-order modulation. In this way, it is easier to create differences after printing and capturing, thus achieving the anti-replication effect. Figure 1 shows some representative anti-counterfeiting patterns.
To address the above issues, this paper designs a texture-hidden QR code based on the codec mechanism of QR codes, using Gaussian distribution and information-hiding technology. The licensee generates a digital image of the code, uses an officially authorized printer to legally print the authentic code, and pastes it on goods or documents for circulation. A forger has no access to the original digital image, so counterfeiting involves scanning a printed code, reprinting it, and pasting the fake onto a product or document. Users photograph the code with their mobile phones and upload it for authentication. The code has abundant texture details and specific frequency characteristics, which improve the anti-replication capability while maintaining the generality of QR codes. An efficient quality assessment algorithm is also proposed to address possible blur during authentication. The proposed method describes the DFT low-frequency region of a code and computes and compares its magnitude and azimuthal features to determine the presence and degree of blur. Finally, this paper proposes a dual feature detection algorithm that includes both a decodability analysis and DFT spectral features. The specially designed anti-counterfeiting textures show different diffusion patterns after printing: in fake codes, details are glued together in large quantities, the code points are destroyed, and decoding fails. To ensure accuracy, this paper calculates the spectral similarity between the tested code and the sample code and further compares the eigenvalues in four feature regions. The contributions of this paper are as follows.
(1)
This paper designs a texture-hidden anti-counterfeiting QR code to solve the problem of the easily illegally copied QR code.
(2)
An effective quality assessment algorithm is proposed to judge the type and degree of blur.
(3)
The proposed dual-feature detection algorithm is shown to cope with different forgery means, capture devices, and attack scenarios.
The paper is structured as follows: Section 2 introduces the design process of anti-counterfeiting QR codes, Section 3 describes the quality assessment algorithm and the dual feature detection algorithm for the code, Section 4 presents the experimental effects in detail, and Section 5 concludes the paper.

2. The Design of Anti-Counterfeiting QR Codes

During the printing process of a digital pattern, toner is scattered randomly. Genuine codes only need to be printed once, while forged codes need to be printed at least twice, which causes severe distortion. Exploiting this mechanism, this paper designs a texture-hidden QR code based on Gaussian distribution and information hiding to achieve an anti-copying effect. The code generation process is shown in Figure 2 and consists of three stages. The first stage is the generation of an anti-counterfeiting texture. Random patterns generated from the Gaussian distribution function exhibit strong texture properties. Bilinear interpolation is used so that the gray-level variation is moderately continuous, with more irregular gray-level differences. Halftone operations are then performed using error diffusion to bring the frequency of the digital code closer to the sampling frequency of scanning and printing devices and to amplify signal aliasing during replication. The second stage is the refinement of the QR code. The semantic decoding of a QR code first detects the position detection region and then the black and white code blocks. In this paper, the code points outside the position detection region and the calibration region are reduced, which does not affect decoding. The third stage is the fusion of the specially designed texture and the refined QR code at the code points. The anti-counterfeiting code visually hides the code points and has specific frequency characteristics, thus improving the anti-copy capability while maintaining the generality of the QR code.
Step 1: Generate Gaussian random textures
A random matrix is built from a Gaussian random generation function and visualized as a fine random texture pattern. The pattern and its enlarged detail are shown in Figure 3. Random patterns have strong texture characteristics and, at the same time, differ from part to part, which increases the difficulty of counterfeiting. If the random variable $g(t)$ follows a Gaussian distribution with expected value $\mu$ and standard deviation $\sigma$, its probability density function is:
$$g(t) = \frac{1}{\sigma \sqrt{2\pi}}\, e^{-\frac{(t - \mu)^2}{2 \sigma^2}}$$
where $\mu$ determines the location of the distribution and $\sigma$ determines its scale; the samples $g(t)$ are reshaped into the matrix $G(x, y)$.
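Step 1 can be sketched with NumPy as follows; $\mu = 120$ follows the parameter choice made later in Section 4.1.2, while $\sigma = 40$, the 580 × 580 size, and the fixed seed are illustrative assumptions of this sketch:

```python
import numpy as np

def gaussian_texture(size=580, mu=120, sigma=40, seed=0):
    """Draw a size x size matrix G(x, y) from N(mu, sigma^2), clipped to 8-bit gray.

    mu = 120 follows Section 4.1.2; sigma = 40 and the seed are illustrative."""
    rng = np.random.default_rng(seed)
    g = rng.normal(loc=mu, scale=sigma, size=(size, size))
    return np.clip(g, 0, 255).astype(np.uint8)

G = gaussian_texture()
```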
Step 2: Use bilinear interpolation operation
Bilinear interpolation performs linear interpolation in the two directions separately. Interpolation has a low-pass filtering effect, which is anti-aliasing and effectively reduces some of the visual distortion caused by image scaling. Figure 4 shows the bilinearly interpolated texture pattern and its enlarged detail. It can be seen that continuously varying grayscale levels are formed between pixels in the image details, with smooth patterns and increased low-frequency information. Bilinear interpolation uses the distances to the four nearest known pixels as reference weights and interpolates linearly twice to obtain the pixel value of the current point. It can be expressed as follows:
$$W(x, y) \approx \frac{y_2 - y}{y_2 - y_1} R_1 + \frac{y - y_1}{y_2 - y_1} R_2$$
$$R_1 \approx \frac{x_2 - x}{x_2 - x_1}\, G(x_1, y_1) + \frac{x - x_1}{x_2 - x_1}\, G(x_2, y_1)$$
$$R_2 \approx \frac{x_2 - x}{x_2 - x_1}\, G(x_1, y_2) + \frac{x - x_1}{x_2 - x_1}\, G(x_2, y_2)$$
where $G(x_1, y_1)$, $G(x_2, y_1)$, $G(x_1, y_2)$, and $G(x_2, y_2)$ are the values at the four known points, and the fractional coefficients are proportional weights: closer points carry more weight.
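The interpolation equations above can be sketched as a vectorized NumPy routine; the upscaling factor and the clamping at the image border are implementation choices of this sketch, not specified by the paper:

```python
import numpy as np

def bilinear_upscale(G, factor=4):
    """Upscale G by linear interpolation along x (R1, R2) and then along y (W).

    The weights are the normalized distances to the four nearest known pixels."""
    h, w = G.shape
    H, W = h * factor, w * factor
    # Map each output coordinate back into the source grid.
    ys = np.linspace(0, h - 1, H)
    xs = np.linspace(0, w - 1, W)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]   # weight toward the y1 row
    wx = (xs - x0)[None, :]   # weight toward the x1 column
    g = G.astype(float)
    # R1, R2: interpolate along x on the two bracketing rows, then along y.
    r1 = (1 - wx) * g[np.ix_(y0, x0)] + wx * g[np.ix_(y0, x1)]
    r2 = (1 - wx) * g[np.ix_(y1, x0)] + wx * g[np.ix_(y1, x1)]
    return (1 - wy) * r1 + wy * r2
```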
Step 3: Adopt halftone treatment
Since printing devices can only print black and white, a continuous-tone grayscale image must be converted into a binary halftone image before printing; the unit coverage of small black and white dots that the human eye cannot resolve simulates the gray-level changes of the image, so that the binary image looks as close as possible to the original. In this paper, the error diffusion method proposed in [33] is adopted for the halftone processing of the texture image: every pixel in the image and its neighboring pixels are processed, and the error (the difference between the actual output and the original image) generated at a pixel is distributed to the surrounding pixels in fixed proportions. After the above operation, $H(x, y)$ is obtained. Figure 5 shows the texture pattern and its enlarged detail after the halftone treatment.
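A minimal sketch of error-diffusion halftoning, assuming the classic Floyd–Steinberg kernel (the paper's reference [33] may use a different kernel or scan order):

```python
import numpy as np

def error_diffusion_halftone(img, threshold=128):
    """Binarize with error diffusion (Floyd-Steinberg kernel assumed here).

    The quantization error at each pixel is pushed to the right and lower
    neighbors with weights 7/16, 3/16, 5/16, 1/16."""
    f = img.astype(float).copy()
    h, w = f.shape
    out = np.zeros((h, w), dtype=np.uint8)
    for y in range(h):
        for x in range(w):
            old = f[y, x]
            new = 255.0 if old >= threshold else 0.0
            out[y, x] = int(new)
            err = old - new
            if x + 1 < w:
                f[y, x + 1] += err * 7 / 16
            if y + 1 < h:
                if x - 1 >= 0:
                    f[y + 1, x - 1] += err * 3 / 16
                f[y + 1, x] += err * 5 / 16
                if x + 1 < w:
                    f[y + 1, x + 1] += err * 1 / 16
    return out
```

Because the errors are redistributed rather than discarded, the local mean gray level of the binary output tracks that of the input, which is exactly the property the halftone stage relies on.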
Step 4: Refine QR code
Figure 6 shows the QR code generated with the Zxing open-source library and its refined version. First, the horizontal pixel values of the position detection area of the QR code are measured and divided according to the 1:1:3:1:1 ratio to obtain the minimum module size. Then, the position and size of the calibration pattern are calculated from its 1:1:1:1:1 ratio characteristics. Finally, all areas of the QR code except the position detection area and the calibration pattern are refined as follows:
$$RE(x, y) = \begin{cases} R(QR(x, y)), & QR\ \text{is data} \\ QR(x, y), & \text{otherwise} \end{cases}$$
$$R(QR(x, y)) = \begin{cases} QR(x - a : x + a,\; y - a : y + a), & QR\ \text{is the local center area} \\ 255, & \text{otherwise} \end{cases}$$
where $QR(x, y)$ represents the value of the original QR code and $2a + 1$ is the reduced local pixel size.
Step 5: Combine texture patterns and refined QR code
Figure 7 shows the combined anti-counterfeiting texture code and its details. The code points in the code data region are minor and nearly integrated into the anti-counterfeiting background. Since the position detection area is complete and the original QR code has a maximum error correction capability of 30%, the semantics of the anti-counterfeiting code can still be decoded by the code scanning device. The position detection area, calibration area, and black and white code points of the refined QR code are unchanged, while the rest of the code is replaced with Gaussian random texture patterns. The formula for this is as follows:
$$C(x, y) = \begin{cases} RE(x, y), & RE(x, y)\ \text{is data} \\ H(x, y), & \text{otherwise} \end{cases}$$
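The fusion step can be sketched as a masked combination of the two images; treating a code-point pixel as the black value 0 is an assumption of this sketch:

```python
import numpy as np

def fuse(refined_qr, texture, data_value=0):
    """Fusion C(x, y): keep refined code points where RE(x, y) is data
    (assumed here to be black pixels of value 0) and fill everything else
    with the halftoned texture H(x, y).

    refined_qr and texture are same-shape uint8 arrays."""
    refined_qr = np.asarray(refined_qr)
    texture = np.asarray(texture)
    return np.where(refined_qr == data_value, refined_qr, texture)
```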

3. The Proposed Authenticity Approach

3.1. The Capturing Process of Anti-Counterfeiting Codes

Figure 8 shows the capturing process of anti-counterfeiting codes. Anti-counterfeiting codes are generated, printed, and attached to product packages by manufacturers. Counterfeiters do not have access to the original digital image and can only scan a high-quality printed code, which is then printed a second time after a series of image treatments. Consumers scan anti-counterfeiting codes with their mobile phones or other devices, which then authenticate them.
As shown in Figure 9, the authentication process of the proposed anti-counterfeiting code consists of a quality assessment and dual-feature authentication. In the first stage, the image quality of the anti-counterfeiting code is evaluated. Only anti-counterfeiting codes that meet the sharpness criteria are tested; if they do not meet the criteria, the consumer is prompted to re-shoot. The second stage consists of a decodability analysis and DFT spectral feature detection. The former identifies whether the anti-counterfeiting code under test can be deciphered. If it cannot be decoded, it is directly judged as a fake. If it can be decoded, it is sent to the latter, which uses the decoded QR code semantics as an index to find the corresponding sample code in the sample database and computes and compares the DFT spectral properties. If the similarity is greater than a threshold, the code is judged to be genuine; otherwise, it is judged to be a fake. The combination of decodability and spectral features improves detection efficiency while ensuring accuracy.

3.2. Quality Assessment Algorithm

The captured anti-counterfeiting codes may be blurred due to various factors such as environment, equipment, and user habits, typically manifesting as defocus blur or motion blur. A clear code is crucial for subsequent authentication. This paper transforms the code into the DFT frequency domain and calculates the magnitude and azimuthal characteristics of the low-frequency region to determine the type and degree of blur. Defocus blur arises when the image plane does not coincide with the detector's receptive plane, so the lens fails to focus. Motion blur is caused by relative motion between the scene and the camera capturing it. In both cases, the texture and edges of the image become blurred, and the general expression is:
$$q(x, y) = p(x, y) * h(x, y) + n(x, y)$$
where $q(x, y)$ is the output image; $p(x, y)$ is the input image; $h(x, y)$ is the point spread function; $n(x, y)$ is the noise; and $*$ is the convolution operation.
The main difference between the two blurs lies in $h(x, y)$: in defocus blur, $h(x, y)$ is symmetrically distributed about the center of a circle, while in motion blur, $h(x, y)$ is asymmetric and related to both the angle and the length of the motion.
For defocus blur, $h(x, y)$ can be expressed as:
$$h(x, y) = \begin{cases} \dfrac{4}{\pi r^2}, & x^2 + y^2 \le \dfrac{r^2}{4} \\ 0, & \text{otherwise} \end{cases}$$
where $r$ is the radius of the blur spot.
For motion blur, $h(x, y)$ can be expressed as:
$$h(x, y) = R(\theta) + D$$
where $R(\theta)$ is a rotation matrix representing the rotation angle $\theta$, and $D$ is the displacement of the movement.
The proposed anti-counterfeiting code is characterized by slow, continuous grayscale variations between pixels as well as rich low-frequency information. Therefore, the quality of the anti-counterfeiting code is evaluated based on DFT spectral image features. Neglecting noise, the degradation model can be written in the frequency domain as:
$$Q(u, v) = P(u, v)\, H(u, v)$$
where $P(u, v)$, $Q(u, v)$, and $H(u, v)$ are the 2D DFTs of $p(x, y)$, $q(x, y)$, and $h(x, y)$, respectively.
Figure 10 shows one clear code and four blurred codes, together with their DFT spectrograms. The low-frequency region of a clear code is concentrated in an approximate rectangle. For defocus-blurred codes, the larger the blur radius, the blurrier the code appears to the naked eye, the more regular the circle formed by the low-frequency region of its Fourier spectrum, and the smaller the spectrum radius. For motion-blurred codes, the low-frequency region of the spectrum shows elliptical fringes, with a fringe width inversely proportional to the displacement length and a fringe direction perpendicular to the displacement direction. From the blurred images above, it can be seen that the shape of the low-frequency region is strongly characterized by its magnitude and azimuthal features, as shown in Figure 11. The quality assessment algorithm, MAF, is described in detail below:
Step 1: Transform to the DFT domain and calculate the low-frequency extents in four directions. (1) Perform the DFT on the anti-counterfeiting code, (2) move the DC component to the center of the spectrum, (3) normalize the spectrum amplitude, (4) threshold it to obtain a binarized spectrogram, and (5) calculate $C_X$, $C_Y$, $C_{X'}$, and $C_{Y'}$ (the extents of the white area in the spectrogram) along the X-, Y-, X'-, and Y'-axes.
Step 2: Determine how blurred the image is and calculate the quality score of the code according to the following formula:
$$MAF = MF + AF$$
$$MF = \frac{C_X + C_Y + C_{X'} + C_{Y'}}{4}$$
$$AF = \min\left(C_X, C_Y, C_{X'}, C_{Y'}\right)$$
where $MF$ and $AF$ describe the magnitude feature and the azimuthal feature of the low-frequency region of the anti-counterfeiting code, respectively. The $MAF$ is the quality score, which is proportional to the sharpness of the code. If the score is greater than the threshold $k$, the code is determined to be clear; otherwise, it is determined to be blurred.
Step 3: Determine the type of blurred code by using the following relation:
$$pro_1 = \frac{C_X}{C_Y}, \qquad pro_2 = \frac{C_{X'}}{C_{Y'}}$$
where $pro_1$ and $pro_2$ are the ratios of the low-frequency extents on the two pairs of axes, and the type of blur is judged from their values. The $dif$ is the error tolerance: if $pro_1 > 1 - dif$ and $pro_2 > 1 - dif$, the code has defocus blur; otherwise, it has motion blur.
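Steps 1–3 of the MAF can be sketched as follows; the log-magnitude normalization, the 0.5 binarization level, and the ray-counting along the four directions are illustrative choices of this sketch, not the paper's exact settings:

```python
import numpy as np

def ray_count(binary, cy, cx, dy, dx):
    """Count low-frequency (True) spectrum pixels along a ray from the center."""
    h, w = binary.shape
    count, y, x = 0, cy, cx
    while 0 <= y < h and 0 <= x < w:
        count += int(binary[y, x])
        y += dy
        x += dx
    return count

def maf_score(code, amp_thresh=0.5):
    """MAF sketch: MF + AF computed from the binarized, centered DFT spectrum."""
    # Step 1: DFT, shift DC to the center, normalize, and threshold.
    spec = np.log1p(np.abs(np.fft.fftshift(np.fft.fft2(code.astype(float)))))
    binary = spec / spec.max() > amp_thresh
    cy, cx = binary.shape[0] // 2, binary.shape[1] // 2
    # C_X, C_Y along the axes; C_X', C_Y' along the 45-degree directions.
    c = [ray_count(binary, cy, cx, 0, 1),
         ray_count(binary, cy, cx, -1, 0),
         ray_count(binary, cy, cx, -1, 1),
         ray_count(binary, cy, cx, 1, 1)]
    mf = sum(c) / 4    # magnitude feature MF
    af = min(c)        # azimuthal feature AF
    return mf + af, c
```

The returned `c` values can then be fed into the Step 3 ratios ($pro_1 = C_X / C_Y$, $pro_2 = C_{X'} / C_{Y'}$) for the blur-type decision.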

3.3. Dual Feature Detection Algorithm

3.3.1. Decodability Analysis

In this paper, random texture patterns are combined with refined QR codes to generate anti-counterfeiting codes. Due to the integration of QR codes, anti-counterfeiting codes do not require other special position symbols or marked information and can be directly decoded by a decoding program.
Printing or capturing comes with channel distortion and noise, and the decodability varies after the code is passed through different channels. For genuine codes that have been printed and captured only once, the resulting distortion and noise are small, and codes can still be recognized by the decoding device. For illegal channels that undergo multiple replicas, more severe distortions and noise are generated, with obvious forgery traces, and substantial texture details are lost. Stochastic diffusion phenomena exhibit different diffusion patterns, which can cause decoding to fail. Therefore, the decodability of the code to be tested can be used as an evaluation metric, and those codes that cannot be decoded are identified as fakes.
In the actual process of counterfeiting, counterfeiters resort to high-quality enlarged copies. On the one hand, consumers generally do not know the intended size of the code; the original size is generally small, and a copy enlarged by up to two times will not show a particularly noticeable difference. On the other hand, product packaging includes not only single outer packages but also boxed packages, and the use of enlarged anti-counterfeiting codes on whole-box packaging is not against the law and can deceive consumers. As shown in Figure 12, the texture details of the genuine code are rich, while those of the counterfeit code are glued together; such a code cannot be decoded by a decoding device and can therefore be directly identified as a fake. Enlarged copies reconstruct texture details better than equal copies, so some of these codes can be decoded and the first-level feature fails. Therefore, to ensure accuracy, decoded codes must also be sent to the second-level detection, the DFT spectral feature.

3.3.2. DFT Spectral Feature

The code can be regarded as a 2D discrete signal. The Discrete Fourier Transform (DFT) decomposes the signal into several sinusoidal plane waves and converts the code from the spatial domain to the frequency domain. It can be represented by the following equation:
$$F(u, v) = \sum_{x=0}^{M-1} \sum_{y=0}^{N-1} f(x, y)\, e^{-j 2\pi \left(\frac{u x}{M} + \frac{v y}{N}\right)}$$
where $x$ and $y$ are spatial-domain coordinates; $u$ and $v$ are frequency-domain coordinates, with $u = 0, \ldots, M - 1$ and $v = 0, \ldots, N - 1$; and $F(u, v)$ are the transform coefficients of $f(x, y)$. $F(u, v)$ can also be expressed as:
$$F(u, v) = \left|F(u, v)\right| e^{j \phi(u, v)}$$
$$\phi(u, v) = \arctan \frac{I(u, v)}{R(u, v)}$$
$$\left|F(u, v)\right| = \left[R^2(u, v) + I^2(u, v)\right]^{1/2}$$
where $\phi(u, v)$ is the transformed phase angle, and $R(u, v)$ and $I(u, v)$ denote, respectively, the real and imaginary parts of $F(u, v)$. $F(u, v)$ is periodic and symmetric, so the Fourier spectral magnitude is concentrated at the origin:
$$F(u, v) = F(u + A, v) = F(u, v + B) = F(u + A, v + B)$$
$$\left|F(u, v)\right| = \left|F(-u, -v)\right|$$
where $A$ and $B$ are the periods in the two directions.
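The periodicity and symmetry relations can be checked numerically: for a real image, indexing the DFT magnitude at $(-u, -v)$ modulo the transform size reproduces the magnitude at $(u, v)$:

```python
import numpy as np

# Conjugate symmetry of the DFT of a real image: |F(u, v)| = |F(-u, -v)|,
# with indices taken modulo the transform size (periodicity).
rng = np.random.default_rng(1)
f = rng.random((8, 8))        # a real-valued "image"
F = np.fft.fft2(f)
mag = np.abs(F)
neg = (-np.arange(8)) % 8     # the index -u (and -v) modulo 8
sym = mag[np.ix_(neg, neg)]   # |F(-u, -v)|
print(np.allclose(mag, sym))  # True for any real-valued input
```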
The frequency of an image represents its grayscale gradient, i.e., the difference between a point and its neighborhood in the grayscale image. Figure 13 shows the spectrogram of a genuine anti-counterfeiting code, and Figure 14 shows the spectrograms of counterfeit codes at different scales. Regions where the grayscale varies gradually are represented by low-frequency coefficients and mainly include contours. Regions where the grayscale changes sharply are represented by high-frequency coefficients and mainly include noise and edges. The important feature points of the genuine code are clear and symmetric, while those of the counterfeit code essentially vanish.
In this paper, the macro-structure and micro-structure information of the code are extracted for authentication. The Normalized Correlation (NC) measures the overall spectral similarity between the code to be tested $T$ and the corresponding sample code $S$, yielding an objective evaluation score:
$$NC = \frac{\sum_{x=1}^{M} \sum_{y=1}^{N} T(x, y)\, S(x, y)}{\sqrt{\sum_{x=1}^{M} \sum_{y=1}^{N} T(x, y)\, T(x, y)}\; \sqrt{\sum_{x=1}^{M} \sum_{y=1}^{N} S(x, y)\, S(x, y)}}$$
where $M$ and $N$ are the numbers of rows and columns of the spectrogram; the $NC$ lies between 0 and 1.
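The NC formula translates directly into NumPy:

```python
import numpy as np

def normalized_correlation(T, S):
    """NC between the spectrogram T of the tested code and the sample S."""
    T = np.asarray(T, dtype=float)
    S = np.asarray(S, dtype=float)
    num = np.sum(T * S)
    den = np.sqrt(np.sum(T * T)) * np.sqrt(np.sum(S * S))
    return float(num / den)
```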
A larger $NC$ indicates a higher similarity between two images. However, the overall features of some counterfeit codes are not significantly different from the genuine ones, while the differences in local features are more pronounced. Thus, this paper proposes the Pixel Ratio ($PR$) over four characteristic regions extracted from the spectrogram. The higher the proportion of low frequencies in the characteristic regions, the more likely the code is genuine. The $PR$ describes this feature according to the following formula:
$$PR = \frac{\sum_{w=1}^{4} \sum_{x=1}^{d} \sum_{y=1}^{d} C_w(x, y)}{4 d^2 \cdot 256}, \qquad w = 1, 2, 3, 4$$
where $w$ indexes the characteristic regions; $C_w(x, y)$ is the pixel value of the $w$-th characteristic region, ranging from 0 to 255; and $d$ is the pixel side length of a characteristic region; the $PR$ lies between 0 and 1.
Authentication is most effective when the two metrics ($NC$ and $PR$) are combined, denoted $NC\_PR$ here. Only if $NC \ge k_1$ and $PR \ge k_2$ is the code judged to be genuine; otherwise, it is judged to be counterfeit. Both $k_1$ and $k_2$ are thresholds whose values can be selected within a certain range.
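The PR computation and the combined NC_PR decision can be sketched as follows; the thresholds $k_1$ and $k_2$ are illustrative values, since the paper selects them experimentally:

```python
import numpy as np

def pixel_ratio(regions):
    """PR over the four d x d characteristic regions C_w of the spectrogram.

    regions: a list of four equally sized blocks (pixel values 0-255)."""
    d = regions[0].shape[0]
    total = sum(float(np.sum(r)) for r in regions)
    return total / (4 * d * d * 256)

def nc_pr_decision(nc, pr, k1=0.85, k2=0.5):
    """NC_PR rule: genuine only if both thresholds are met.

    k1 = 0.85 and k2 = 0.5 are illustrative, not the paper's tuned values."""
    return nc >= k1 and pr >= k2
```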

4. Experimental Results

4.1. The Design Parameters of Anti-Counterfeiting Codes

The design parameters of the anti-counterfeiting code include the size of the code points, the expectation value of the random distribution, and the size of the physical print. Proper parameters can make the decoding of the counterfeit code fail to the greatest extent, provided that the decoding of the genuine code is guaranteed. This paper experimentally tests the effect of different parameters on the decoding performance and finally selects the parameters.

4.1.1. Size of Code Points

After selecting version number 3 and error correction level H to generate the original QR code, the modules of the QR code need to be refined into code points. The larger the code points, the more visible they are, the faster the decoding, and the smaller the area left for the texture pattern. Thus, it is necessary to choose an appropriate code point size. Figure 15 shows anti-counterfeiting codes with code point sizes of 2 × 2, 3 × 3, 4 × 4, 5 × 5, 6 × 6, and 7 × 7 and their enlarged details.
Five genuine codes are generated for each of the code point sizes 2 × 2, 3 × 3, 4 × 4, 5 × 5, 6 × 6, and 7 × 7; the expectation of the Gaussian distribution is 120, the physical print size is 1.2 cm, and the pixel size is 580 × 580. After printing, the same printer is used for equal and enlarged copying to obtain 15 counterfeit codes per group, for a total of 30 genuine codes and 90 counterfeit codes in six groups. The decoding experiment is shown in Figure 16. It can be seen that when the pixel sizes of the code points are 2 × 2, 3 × 3, and 4 × 4, not all genuine codes can be decoded, so these sizes cannot be applied to the anti-counterfeiting code. When the pixel sizes of the code points are 5 × 5, 6 × 6, and 7 × 7, all genuine codes can be fully decoded, and the decoding rates of the fake codes are 13%, 53%, and 100%, respectively. Therefore, balancing the decoding robustness of genuine codes against the decoding fragility of counterfeit codes, the code point size is selected as 5 × 5 pixels.

4.1.2. Expectation of Gaussian Distribution

In the generation stage of the anti-counterfeiting texture patterns, a random distribution function is used to generate detailed random texture patterns. The expectation ($\mu$) of the Gaussian random distribution determines the mean pixel value of the texture patterns, which affects both their visual appearance and the decoding performance after combination with the QR code. The higher the $\mu$, the higher the pixel mean and the whiter the visual appearance; the lower the $\mu$, the lower the pixel mean and the darker the pattern tone. Figure 17 shows the codes with a $\mu$ of 80, 100, 120, and 140.
Five codes are generated for each of the Gaussian expectations 80, 100, 120, and 140. Among them, the QR code version and error correction level are 3 and H, the code point size is 5 × 5, the print size is 1.2 cm, and the pixel size is 580 × 580. After printing, the same printer is used to make equal and enlarged copies, obtaining 15 fake codes per group. In total, there are 20 authentic and 60 counterfeit codes. The decoding experiments were then performed, and the results are shown in Figure 18. As can be seen, only $\mu$ values of 120 and above guarantee a 100% decoding rate for the genuine codes. When $\mu$ is 120, the decoding rate of the fake codes is 13%; when $\mu$ is 140, the decoding rate of the fake codes is 20%, and the image appears overly white, making the code points obvious and poorly hidden. Therefore, to balance the visual hiding of the code points against the gap between the genuine and fake decoding rates, $\mu$ is selected as 120 in this paper.

4.1.3. Size of Physical Print

After the anti-counterfeiting code is generated, it is printed and made into a label. Different print sizes affect the decoding performance of the code. The larger the physical print size, the more complete the details and the easier the code is to decode; similarly, illegally copied counterfeit codes also become easier to decode. Figure 19 shows the codes and their enlarged details for the print sizes 0.8 cm, 1 cm, 1.2 cm, 1.5 cm, and 2 cm. DPI is an important parameter describing the printing accuracy of a printer: the higher the DPI, the more dots are printed within an inch, and the higher the printing accuracy. Printers currently on the market are generally 600 DPI or 1200 DPI. Different print sizes correspond to different pixel sizes, formulated as follows:
$$Pix = DPI \times unit$$
where $unit$ is the actual physical print size (expressed in inches).
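The pixel-size formula can be checked with a small helper; note that $unit$ must be converted from centimeters to inches (1 in = 2.54 cm):

```python
# Pix = DPI x unit, with unit the physical print size converted to inches.
def pixels_for_print(dpi, size_cm):
    """Pixel side length for a code printed size_cm wide at a given DPI."""
    return round(dpi * size_cm / 2.54)

# The 580 x 580 px / 1.2 cm pairing used in the experiments corresponds to
# about 580 / (1.2 / 2.54) ~ 1228 DPI, i.e. a 1200 DPI-class printer.
```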
Five codes with print sizes of 0.8 cm, 1 cm, 1.2 cm, 1.5 cm, and 2 cm are generated and printed. Among them, the QR code version and error correction level are 3 and H, and the code point sizes are all 5 × 5; the pixel sizes are therefore 348 × 348, 464 × 464, 580 × 580, 696 × 696, and 928 × 928, respectively. After that, the same printer is used to make equal and enlarged copies, resulting in 15 fake codes per group. The five groups contain a total of 25 genuine codes and 75 fake codes. The decoding results are shown in Figure 20, where it can be seen that the decoding accuracy of the genuine codes is not guaranteed when the print size is below 1.2 cm, while the decoding failure of the counterfeit codes is not guaranteed when the print size is 1.5 cm or larger. Therefore, the print size of 1.2 cm is chosen in this paper to maximize the decoding performance gap between genuine and fake codes.

4.2. Datasets and Evaluation Metrics

4.2.1. Datasets

In this paper, three datasets are constructed: the genuine and counterfeit codes dataset, the attacked codes dataset, and the quality assessment codes dataset. The first is used to test the performance of the algorithm, the second includes genuine codes with different cutting proportions and sizes to test the robustness of the algorithm against attacks, and the third is used to test the performance of the proposed quality assessment algorithm.
(1)
Genuine and counterfeit codes dataset
Generating genuine and counterfeit paper codes: a computer generates 10 electronic codes with different semantics, which are printed as 10 genuine paper codes. After that, ten printers of different brands or models, including the official one, are used to make equal-ratio copies, and eight printers are used to make 1.5 times and 2 times enlarged copies of the original codes, resulting in 260 counterfeit paper codes. Collecting sample codes: a portable microscope camera (Anyty) is used to capture the 10 genuine paper codes. Collecting genuine and counterfeit codes: six mobile phones of different brands or models photograph each genuine paper code five times in different environments, and each counterfeit paper code once. The dataset consists of 10 groups, each using a different 16-bit random number as the code semantics. Each group contains 1 sample code, 30 genuine codes, and 156 counterfeit codes (60 equal-ratio copies, 48 copies at 1.5 times, and 48 at 2 times). The dataset information is shown in Table 1.
(2)
Attacked codes dataset
Since counterfeit codes would not be recognized as genuine after an attack anyway, only the genuine codes are attacked to check the authentication algorithm. The 300 genuine codes are reduced to 12%, 24%, 36%, and 48% of the original code size, and are also cut to different degrees, with cut areas of 1%, 3%, 5%, 7%, and 9% of the original code. Figure 21 shows examples of codes with different reduced proportions and cut ratios. The dataset information is shown in Table 2.
(3)
Quality assessment codes dataset
Different degrees of defocus blur (1, 2, 3, 4, and 5) and motion blur (1, 2, 3, 4, and 5) are added to 300 clear digital genuine codes to test the performance of the quality assessment algorithm in this paper. The dataset is presented in Table 3. In the experiments, the printer information used for printing and copying is shown in Table 4, and the smartphone information used for collecting is shown in Table 5.
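The reduction and cut attacks used to build the attacked codes dataset can be simulated with simple array operations; the corner placement of the cut, the white fill, and the nearest-integer subsampling are illustrative assumptions (a production pipeline would use proper interpolated resizing):

```python
import numpy as np

def cut_attack(code: np.ndarray, ratio: float, fill: int = 255) -> np.ndarray:
    """Blank out a square region covering `ratio` of the code's area.

    The text only fixes the cut area as a percentage; cutting a corner and
    filling with white are assumptions for illustration.
    """
    out = code.copy()
    side = int(round(code.shape[0] * np.sqrt(ratio)))
    out[:side, :side] = fill
    return out

def reduce_attack(code: np.ndarray, ratio: float) -> np.ndarray:
    """Naive area reduction to roughly `ratio` of the original by striding.

    Plain subsampling keeps the sketch dependency-free; interpolated
    resizing would better match a real capture pipeline.
    """
    step = int(round(1 / np.sqrt(ratio)))
    return code[::step, ::step]
```

For example, a 9% cut on a 100 × 100 code blanks a 30 × 30 corner, and a reduction to 25% of the area halves each side.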

4.2.2. Evaluation Metrics

To evaluate the performance of the anti-counterfeiting codes and algorithms, three evaluation metrics are formulated as follows:
Accuracy = (TP + TN) / (TP + TN + FP + FN)
Precision = TP / (TP + FP)
Recall = TP / (TP + FN)
where TP denotes the number of genuine codes judged to be genuine, FN the number of genuine codes judged to be counterfeit, FP the number of counterfeit codes judged to be genuine, and TN the number of counterfeit codes judged to be counterfeit.
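The three metrics follow directly from the four counts; a minimal helper (treating genuine as the positive class, as defined above):

```python
def classification_metrics(tp: int, tn: int, fp: int, fn: int):
    """Accuracy, precision, and recall with genuine codes as positives."""
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return accuracy, precision, recall
```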
For the methods that need to be trained and tested, the following experiments randomly select half of the Table 1 dataset as the training set; the testing set differs per experiment and is specified where each experiment is described. The final evaluation metric is the average over five such random splits.

4.3. Quality Assessment Algorithm Experiments

The dataset shown in Table 3 is used to validate the effectiveness of the proposed quality assessment algorithm, which is compared with several common algorithms: Variance, Information Entropy, Tenengrad Gradient, and Fuzzy Probability. The results are shown in Table 6. In this paper, the optimal threshold is chosen to minimize intra-class variance and maximize inter-class variance. Variance, Information Entropy, and Fuzzy Probability are almost ineffective: the scores of clear and blurry images overlap heavily. The Tenengrad Gradient separates them to some extent, but still with considerable overlap. The proposed method outperforms the others in accuracy, precision, and recall, while the differences among the other four algorithms are small.
Meanwhile, the proposed quality assessment method can further distinguish blur types with 95% accuracy on blurred codes. The specific results are shown in Table 7. The time consumption for processing one code is shown in Figure 22.
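The threshold-selection rule described above (smaller intra-class variance, larger inter-class variance) is Otsu's criterion applied to a one-dimensional score distribution; a dependency-free sketch:

```python
def otsu_threshold(scores):
    """Pick the cut that minimizes weighted intra-class variance of the scores.

    Minimizing intra-class variance is equivalent to maximizing inter-class
    variance (Otsu's criterion), matching the selection rule in the text.
    """
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    best_t, best_v = None, float("inf")
    for t in sorted(set(scores))[:-1]:  # candidate cuts between score values
        lo = [s for s in scores if s <= t]
        hi = [s for s in scores if s > t]
        v = (len(lo) * var(lo) + len(hi) * var(hi)) / len(scores)
        if v < best_v:
            best_t, best_v = t, v
    return best_t
```

Applied to the MAF scores of clear versus blurred codes, this yields the operating threshold used for classification.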

4.4. Dual Feature Detection Algorithm Experiments

4.4.1. Decodability Analysis

Judging whether the code can be decoded correctly is the first part of the authenticity process, which quickly screens out counterfeit codes. In this paper, the genuine and counterfeit codes dataset (Table 1) is decoded with a QR code decoding program, with the results shown in Table 8. The genuine codes can all be successfully decoded, while the equal-ratio counterfeit codes have a decoding rate of only 5.83%. However, enlarged copies reconstruct more detail, and their decoding rates are significantly higher: 17.50% for the 1.5 times copies and up to 22.50% for the 2 times copies. Thus, successful decoding alone does not prove a code is genuine, and further checks are needed to ensure accuracy.
QR codes have error correction capability: up to 30% of the area can be covered and the code is still correctly decoded. In this paper, patterns with anti-counterfeiting capability are incorporated into QR codes while preserving a degree of this error correction capability. Figure 23a shows the decoding results for codes at different reduced proportions: the smaller the proportion, the lower the decoding accuracy, and the proposed codes remain fully decodable down to 48% of the original size. As Figure 23b shows, codes with a cut area of up to 5% are still guaranteed to decode 100%. In the actual decoding process, image sizes also vary with the differences among mobile devices.
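The decodability screen above can be expressed as a thin gate around any QR decoder (for example, OpenCV's `cv2.QRCodeDetector().detectAndDecode`); the decoder is injected as a callable here so the sketch stays library-agnostic, and the optional expected-payload comparison is an assumption beyond what the text specifies:

```python
from typing import Callable, Optional

def stage1_screen(image, decode: Callable[[object], Optional[str]],
                  expected: Optional[str] = None) -> bool:
    """Reject a code that fails to decode (or decodes to the wrong payload).

    Passing this stage is necessary but not sufficient: some enlarged
    copies also decode, so stage 2 (DFT spectral features) must still run.
    """
    payload = decode(image)
    if not payload:
        return False
    return expected is None or payload == expected
```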

4.4.2. DFT Spectral Feature

The comparison of DFT spectral features is the second stage of the discrimination procedure and ensures accuracy. The test uses two metrics, NC and PR, and a code is identified as genuine only if both metrics reach their thresholds. Figure 24a,b show scatter plots of the two metrics for genuine and counterfeit codes. The two classes are clearly separated overall, but there is some overlap in the middle region, so appropriate thresholds must be chosen experimentally. Figure 24c,d show the accuracy for genuine and counterfeit codes under different thresholds: accuracy is high when the NC is in the range 0.88 to 0.90 and the PR is in the range 0.4 to 0.5. With the thresholds selected, the algorithm is evaluated further, with the results shown in Table 9. Dual feature detection (NC_PR) achieves 100% accuracy, precision, and recall, better than using either the NC value or the PR value alone.
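The dual-threshold decision can be sketched as below. The zero-mean normalized-correlation formula for NC is an assumption (this section does not spell it out), PR is taken as an externally computed score, and the thresholds 0.89 and 0.45 are illustrative picks from the reported high-accuracy ranges:

```python
import numpy as np

def normalized_correlation(a: np.ndarray, b: np.ndarray) -> float:
    """Zero-mean normalized correlation of two spectra (assumed NC form)."""
    a = a.astype(float) - a.mean()
    b = b.astype(float) - b.mean()
    return float((a * b).sum() / np.sqrt((a ** 2).sum() * (b ** 2).sum()))

def dual_feature_verdict(nc: float, pr: float,
                         nc_th: float = 0.89, pr_th: float = 0.45) -> bool:
    """Genuine only if BOTH metrics clear their thresholds.

    0.89 and 0.45 sit inside the reported high-accuracy ranges
    (NC: 0.88-0.90, PR: 0.4-0.5); the exact values used are not stated.
    """
    return nc >= nc_th and pr >= pr_th
```

Requiring both metrics to pass is what removes the overlap region seen in the scatter plots: a copy that happens to score well on one metric is still rejected by the other.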

4.4.3. Robustness of Dual Feature Detection Algorithm

(1)
No attacked test
To further verify the superiority of the proposed dual feature algorithm, the genuine and counterfeit codes dataset in Table 1 is used to compare it with several existing popular texture methods and corner methods. For the methods that need to be trained, half of the Table 1 dataset is randomly selected for training and the remaining half is used as the testing set. As can be seen in Table 10, the accuracy, precision, and recall of the proposed method all reach 100%. The identification performance of the corner methods (SIFT [34], SURF [35], and BRISK [36]) is comparatively mediocre. Most of the representative texture algorithms (LBP [37], CLBP [38], CLBC [39], ECLBP [40], COV_LBPD [41], MRELBP [42], JRLP [43], MCDR [44], RALBGC [45], LDEP [46], and LGONBP [47]) achieve accuracies above 99%, with some close to or at 100%, whereas 2LQR [24] performs poorly on this task. In terms of time to process an image of the same size, the proposed DFDA needs 0.25 s, placing it in the middle of the compared methods: the shortest time is 0.01 s for [24], the longest is 146.3 s for [46], followed by 17.08 s for [34], and the rest are within 2 s.
(2)
Attacked test
This paper experimentally verifies the robustness of the proposed anti-counterfeiting code and the corresponding algorithm against the abrasion that may occur during use. The dataset in Table 2 is used as the test set. Table 11 shows the results of the different methods for different reduced proportions. The methods in [34,35,36,38,39,40,46,47] stay above 90% when the size is 12% or 24% of the original image; at 36% of the original size, only [34,38,39,40,46,47] remain above 90%; and at 48%, only [40] does (94.39%). The proposed method maintains 100% accuracy across the entire reduction range from 12% to 48%, a significant advantage. Table 12 shows the results of the different methods under different cut ratio attacks. The methods in [34,35,36,38,39,42,45] maintain more than 90% accuracy under a 1% cut attack, the methods in [34,35,36,45] under a 3% cut attack, and the methods in [34,36] under a 5% cut attack. The proposed method stays at 100% for cut ratios from 1% to 5%, significantly outperforming the other methods.

5. Conclusions

In this paper, a texture-hidden anti-counterfeiting QR code and a related authentication scheme for mobile devices are proposed. The authentication scheme includes a quality assessment algorithm and dual feature detection. In the quality assessment tests, the proposed algorithm is compared with several common no-reference algorithms and its superiority is verified. In the authenticity tests, the proposed dual feature method is compared with various texture and corner methods; it achieves an accuracy, precision, and recall of up to 100% and also performs well on attacked datasets with reduction and cutting. In future work, it is necessary to expand the dataset (attacks at various physical sizes, more advanced forgery techniques, and complex application scenarios) to evaluate the practicality of the codes and authentication scheme more comprehensively and to achieve convenient pre-sale anti-counterfeiting measures.
The dataset used in this paper is captured by smartphones under indoor conditions, but code acquisition in practice will also encounter abnormal conditions, such as insufficient illumination and severe shaking. How to efficiently restore such codes and authenticate them requires further work. In addition, given the variety of available forgery devices, how to authenticate codes using only genuine samples is an important research direction.

Author Contributions

Conceptualization, T.W., H.Z., C.Y. and J.J.; methodology, T.W. and H.Z.; software, T.W., H.Z. and C.Y.; validation, T.W.; formal analysis, T.W., H.Z. and J.J.; investigation, T.W. and H.Z.; resources, H.Z. and J.J.; data curation, T.W., C.Y. and J.J.; writing—original draft preparation, T.W. and H.Z.; writing—review and editing, T.W., C.Y. and H.Z.; visualization, T.W., H.Z., C.Y. and J.J.; supervision, H.Z.; project administration, H.Z.; funding acquisition, H.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This research is supported by the National Key Research and Development Program of China (Grant No. 2020YFF0304902) and the Hubei Provincial Natural Science Foundation of China (Grant No. 2022CFB529).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Liu, J.; Wen, H. Optical scanning tilt holography. IEEE Trans. Ind. Inform. 2019, 15, 6139–6145. [Google Scholar] [CrossRef]
  2. Lee, I.H.; Li, G.; Lee, B.Y. Selective photonic printing based on anisotropic Fabry-Perot resonators for dual-image holography and anti-counterfeiting. Opt. Express 2019, 27, 24512–24523. [Google Scholar] [CrossRef] [PubMed]
  3. Zhang, Y.; Poon, T.; Tsang, P.W.M.; Wang, R.; Wang, L. Review on feature extraction for 3-D incoherent image processing using optical scanning holography. IEEE Trans. Ind. Inform. 2019, 15, 6146–6154. [Google Scholar] [CrossRef]
  4. Hu, Y.; Zhang, T.; Wang, C.; Liu, K.; Sun, Y.; Li, L.; Lv, C.; Liang, Y.; Jiao, F.; Zhao, W.; et al. Flexible and biocompatible physical unclonable function anticounterfeiting label. Adv. Funct. Mater. 2021, 31, 2102108–2102116. [Google Scholar] [CrossRef]
  5. Kumar, P.; Singh, S.; Gupta, B.K. Future prospects of luminescent nanomaterial based security inks: From synthesis to anti-counterfeiting applications. Nanoscale 2016, 8, 14297–14340. [Google Scholar] [CrossRef]
  6. Zuo, M.; Qian, W.; Li, T. Full-color tunable fluorescent and chemiluminescent supramolecular nanoparticles for anti-counterfeiting inks. ACS Appl. Mater. Interfaces 2018, 10, 39214–39222. [Google Scholar] [CrossRef]
  7. Chen, L.; Chen, Y.; Fu, H.G. Reversible emitting anti-counterfeiting ink prepared by anthraquinone-modified supramolecular polymer. Adv. Sci. 2020, 7, 2000803. [Google Scholar] [CrossRef]
  8. Xu, J.; Zhang, B.; Jia, L. Dual-mode, color-tunable, lanthanide-doped core-shell nanoarchitectures for anti-counterfeiting inks and latent fingerprint recognition. ACS Appl. Mater. Interfaces 2019, 11, 35294–35304. [Google Scholar] [CrossRef]
  9. Wang, C.; Jin, Y.; Yuan, L.; Wu, H.; Ju, G.; Li, Z.; Liu, D.; Lv, Y.; Chen, L.; Hu, Y. A spatial/temporal dual-mode optical thermometry platform based on synergetic luminescence of Ti4+-Eu3+ embedded flexible 3D micro-rod arrays: High-sensitive temperature sensing and multi-dimensional high-level secure anti-counterfeiting. Chem. Eng. J. 2019, 374, 992–1004. [Google Scholar] [CrossRef]
  10. Liu, H.; Song, W.; Chen, X.; Mei, J.; Zhang, Z.; Su, J. Temperature-responsive molecular liquids based on dihydrophenazines for dynamic multicolor-fluorescent anti-counterfeiting and encryption. Mater. Chem. Front. 2021, 5, 2294–2302. [Google Scholar] [CrossRef]
  11. Krishna, G.; Pooja, G.; Ram, B.; Radha, V.; Rajarajeswari, P. Recognition of fake currency note using convolutional neural networks. Int. J. Innov. Technol. Explor. Eng. 2019, 8, 58–63. [Google Scholar]
  12. Takhar, S.S.; Liyanage, K. Framework for a chemical substance reporting system. In Proceedings of the International Conference on Industrial Technology and Management (ICITM), Oxford, UK, 7–9 March 2018; pp. 18–20. [Google Scholar]
  13. Lee, W.H.; Chou, C.M.; Wang, S.W. An NFC anti-counterfeiting framework for ID verification and image protection. Mobile Netw. Appl. 2016, 21, 646–655. [Google Scholar] [CrossRef]
  14. Zheng, Z.; Xu, B.; Ju, J.; Guo, Z.; You, C.; Lei, Q.; Zhang, Q. Circumferential local ternary pattern: New and efficient feature descriptors for anti-counterfeiting pattern identification. IEEE Trans. Inf. Forensics Secur. 2022, 17, 970–981. [Google Scholar] [CrossRef]
  15. Lin, Y.; Zhang, H.; Feng, J.; Shi, B.; Zhang, M.; Han, Y.; Wen, W.; Zhang, T.; Qi, Y.; Wu, J. Unclonable micro-texture with clonable micro-shape towards rapid, convenient, and low-cost fluorescent anti-counterfeiting labels. Small 2021, 17, 2100244. [Google Scholar] [CrossRef] [PubMed]
  16. Yan, Y.; Zou, Z.; Xie, H.; Gao, Y.; Zheng, L. An IoT-based anticounterfeiting system using visual features on QR code. IEEE Internet Things J. 2021, 8, 6789–6799. [Google Scholar] [CrossRef]
  17. Joshi, S.; Khanna, N. Single classifier-based passive system for source printer classification using local texture features. IEEE Trans. Inf. Forensics Secur. 2018, 13, 1603–1614. [Google Scholar] [CrossRef] [Green Version]
  18. Lu, W.; Chen, J.; Zhang, J.; Huang, J.; Weng, J. Secure halftone image steganography based on feature space and layer embedding. IEEE Trans. Cybern. 2022, 52, 5001–5014. [Google Scholar] [CrossRef]
  19. Frank, T.; Liu, J.; Gat, S.; Haik, O.; Bat Mor, O.; Roth, I.; Allebach, J.P.; Yitzhaky, Y. A machine learning approach to design of aperiodic, clustered-dot halftone screens via direct binary search. IEEE Trans. Image Process. 2022, 31, 5498–5512. [Google Scholar] [CrossRef]
  20. Pan, J.S.; Sun, X.-X.; Chu, S.-C.; Abraham, A.; Yan, B. Digital watermarking with improved SMS applied for QR code. Eng. Appl. Artif. Intell. 2021, 97, 104049–104061. [Google Scholar] [CrossRef]
  21. Nguyen, H.P.; Retraint, F.; Morain-Nicolier, F.; Delahaies, A. A watermarking technique to secure printed matrix barcode—Application for anti-counterfeit packaging. IEEE Access 2019, 7, 131839–131850. [Google Scholar] [CrossRef]
  22. Xie, R.; Hong, C.; Zhu, S.; Tao, D. Anti-counterfeiting digital watermarking algorithm for printed QR barcode. Neurocomputing 2015, 167, 625–635. [Google Scholar] [CrossRef]
  23. Nguyen, H.P.; Delahaies, A.; Retraint, F.; Nguyen, D.H.; Pic, M.; MorainNicolier, F. A watermarking technique to secure printed QR codes using a statistical test. In Proceedings of the IEEE Global Conference on Signal and Information Processing, Montreal, QC, Canada, 14–16 November 2017; pp. 288–292. [Google Scholar]
  24. Tkachenko, I.; Puech, W.; Destruel, C.; Strauss, O.; Gaudin, J.-M.; Guichard, C. Two-level QR code for private message sharing and document authentication. IEEE Trans. Inf. Forensics Secur. 2016, 11, 571–583. [Google Scholar] [CrossRef]
  25. Cui, Z.; Li, W.; Yu, C.; Yu, N. A new type of two-dimensional anticounterfeit code for document authentication using neural networks. In Proceedings of the ICCSP International Conference on Cryptography, Security and Privacy, Nanjing, China, 10–12 January 2020; pp. 68–73. [Google Scholar]
  26. Tkachenko, I.; Puech, W.; Strauss, O.; Destruel, C.; Gaudin, J.M. Printed document authentication using two level QR code. In Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Shanghai, China, 20–25 March 2016; pp. 2149–2153. [Google Scholar]
  27. Chaban, R.; Taran, O.; Tutt, J.; Belousov, Y.; Pulfer, B.; Holotyak, T.; Voloshynovskiy, S. Printing variability of copy detection patterns. In Proceedings of the 2022 IEEE International Workshop on Information Forensics and Security (WIFS), Shanghai, China, 12–16 December 2022. [Google Scholar]
  28. Tutt, J.; Taran, O.; Chaban, R.; Pulfer, B.; Belousov, Y.; Holotyak, T.; Voloshynovskiy, S. Mathematical model of printing-imaging channel for blind detection of fake copy detection patterns. In Proceedings of the 2022 IEEE International Workshop on Information Forensics and Security (WIFS), Shanghai, China, 12–16 December 2022. [Google Scholar]
  29. Chen, C.; Li, M.; Ferreira, A.; Huang, J.; Cai, R. A copy-proof scheme based on the spectral and spatial barcoding channel models. IEEE Trans. Inf. Forensics Secur. 2020, 15, 1056–1071. [Google Scholar] [CrossRef]
  30. Xie, N.; Zhang, Q.; Chen, Y.; Hu, J.; Luo, G.; Chen, C. Low-cost anti-copying 2D barcode by exploiting channel noise characteristics. IEEE Trans. Multimed. 2021, 23, 3752–3767. [Google Scholar] [CrossRef]
  31. Patil, V.; Kundu, S. Realizing robust, lightweight strong PUFs for securing smart grids. IEEE Trans. Consum. Electr. 2022, 68, 5–12. [Google Scholar] [CrossRef]
  32. Qureshi, M.; Munir, A. PUF-RAKE: A PUF-based robust and lightweight authentication and key establishment protocol. IEEE Trans. Depend. Secur. 2022, 19, 2457–2475. [Google Scholar] [CrossRef]
  33. Hersch, D.R. Spectral neugebauer-based color halftone prediction model accounting for paper fluorescence. Appl. Opt. 2014, 53, 5380–5390. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  34. Lowe, D.G. Distinctive image features from scale-invariant keypoints. Int. J. Comput. Vis. 2004, 60, 91–110. [Google Scholar] [CrossRef]
  35. Bay, H.; Ess, A.; Tuytelaars, T. Speeded-up robust features (SURF). Comput. Vis. Image Underst. 2008, 110, 346–359. [Google Scholar] [CrossRef]
  36. Leutenegger, S.; Chli, M.; Siegwart, R.Y. BRISK: Binary robust invariant scalable keypoints. In Proceedings of the 2011 International Conference on Computer Vision, Barcelona, Spain, 6–13 November 2011. [Google Scholar]
  37. Ojala, T.; Pietikäinen, M.; Harwood, D. A comparative study of texture measures with classification based on feature distributions. Pattern Recogn. 1996, 29, 51–59. [Google Scholar] [CrossRef]
  38. Guo, Z.; Zhang, L.; Zhang, D. A completed modeling of local binary pattern operator for texture classification. IEEE Trans. Image Process. 2010, 19, 1657–1663. [Google Scholar] [PubMed]
  39. Zhao, Y.; Huang, S.; Jia, W. Completed local binary count for rotation invariant texture classification. IEEE Trans. Image Process. 2012, 4492–4497. [Google Scholar] [CrossRef] [PubMed]
  40. Sima, J.; Dong, Y.; Wang, T.; Zheng, L.; Pu, J. Extended contrast local binary pattern for texture classification. Int. J. New Technol. Res. 2018, 4, 15–20. [Google Scholar]
  41. Hong, X.; Zhao, G.; Pietikainen, M.; Chen, X. Combining LBP difference and feature correlation for texture description. IEEE Trans. Image Process. 2014, 23, 2557–2568. [Google Scholar] [CrossRef] [PubMed]
  42. Liu, L.; Lao, S.; Fieguth, P.W.; Guo, Y.; Wang, X.; Pietikäinen, M. Median robust extended local binary pattern for texture classification. IEEE Trans. Image Process. 2016, 25, 1368–1381. [Google Scholar] [CrossRef] [PubMed]
  43. Wang, T.; Dong, Y.; Yang, C.; Wang, L.; Liang, L.; Zheng, L.; Pu, J. Jumping and refined local pattern for texture classification. IEEE Access 2018, 6, 64416–64426. [Google Scholar] [CrossRef]
  44. Dong, Y.; Feng, J.; Yang, C.; Wang, X.; Zheng, L.; Pu, J. Multi-scale counting and difference representation for texture classification. Visual Comput. 2017, 34, 1315–1324. [Google Scholar] [CrossRef]
  45. Khadiri, I.E.; Kas, M.; Merabet, Y.E. Repulsive-and-attractive local binary gradient contours: New and efficient feature descriptors for texture classification. Inform. Sci. 2018, 467, 634–653. [Google Scholar] [CrossRef]
  46. Dong, Y.; Wang, T.; Yang, C.; Zheng, L.; Song, B.; Wang, L.; Jin, M. Locally directional and extremal pattern for texture classification. IEEE Access 2019, 22, 87931–87942. [Google Scholar] [CrossRef]
  47. Song, T.; Feng, J.; Luo, L.; Gao, C.; Li, H. Robust texture description using local grouped order pattern and non-local binary pattern. IEEE Trans. Circuits Syst. Video Technol. 2021, 31, 189–202. [Google Scholar] [CrossRef]
Figure 1. Anti-counterfeiting patterns related to QR codes.
Figure 2. The generation process of a texture-hidden anti-counterfeiting QR code.
Figure 3. Gaussian random texture and its enlarged detail.
Figure 4. Bilinear interpolated texture and its enlarged detail.
Figure 5. Halftone texture and its enlarged detail.
Figure 6. Original QR code and its refined QR code.
Figure 7. Anti-counterfeiting QR code and its local enlarged detail.
Figure 8. The capturing process of anti-counterfeiting codes.
Figure 9. The authentication process of an anti-counterfeiting code.
Figure 10. A clear code, four blurred codes, and their DFT spectrograms ((a) is the clear code; (b) and (c) are the defocused blur codes with a radius of 3 and 5, respectively; and (d) and (e) are the motion blur codes with displacements of 10 and 30, respectively).
Figure 11. DFT spectrum of an anti-counterfeiting code.
Figure 12. Genuine code and counterfeit codes in different scales and their enlarged details ((a) is the genuine code and its enlarged detail; (b–d) are the counterfeit codes in different scales and their enlarged details).
Figure 13. Spectrograms of a genuine anti-counterfeiting code.
Figure 14. Spectrograms of counterfeit codes at different scales.
Figure 15. Anti-counterfeiting codes with different code point sizes and their enlarged details.
Figure 16. Decoding accuracy of codes in different code point sizes.
Figure 17. Anti-counterfeiting codes with different μ.
Figure 18. Decoding accuracy of codes with different μ values.
Figure 19. Anti-counterfeiting codes with different sizes of physical print and their enlarged details.
Figure 20. Decoding accuracy of codes of different physical print sizes.
Figure 21. Genuine codes with different reduced proportions and cut ratios ((a) has no attack; (b–e) have cut attack; and (f–j) have a reduced attack).
Figure 22. The time consumptions of image quality assessment algorithms for processing one code.
Figure 23. Decoding accuracy of codes on the Table 2 dataset.
Figure 24. Thresholds of DFT spectral feature ((a,b) are scatter plots of NC and PR for the two hundred genuine and counterfeit codes; (c,d) show the accuracies in different thresholds).
Table 1. Genuine and counterfeit codes dataset.

         Sample Codes   Genuine Codes   Counterfeit Codes
                                        1 Time   1.5 Times   2 Times
Number   10             300             600      480         480
Table 2. Attacked codes dataset.

         Reduced Proportions             Cut Ratios
         12%    24%    36%    48%        1%     3%     5%     7%     9%
Number   300    300    300    300        300    300    300    300    300
Table 3. Quality assessment codes dataset.

         Clear   Defocus Blur   Motion Blur
Number   300     1500           1500
Table 4. Printer-related brand, model, and DPI information.

Brand          Model               Printing DPI   Replicating DPI
Ricoh Aficio   MP9001              1200 × 1200    600 × 600
Ricoh Aficio   MP6001              1200 × 1200    1200 × 1200
Ricoh Aficio   MP7001              1200 × 1200    600 × 600
Ricoh Aficio   MP9002              1200 × 1200    600 × 600
Ricoh Aficio   Pro907EX            1200 × 1200    600 × 600
Fuji-Xerox     DC-C7550I           2400 × 2400    2400 × 2400
Fuji-Xerox     Apeosport-VC3375    1200 × 2400    1200 × 2400
Fuji-Xerox     Apeosport-IVC5570   1200 × 2400    1200 × 2400
Epson          WF-M21000a          600 × 2400     600 × 600
Epson          L15168              4800 × 2400    600 × 600
Table 5. Smartphone-related brand, model, and camera pixel information.

Brand    Model        Camera Pixels
HUAWEI   Mate40 Pro   50 million
HUAWEI   Nova5 Pro    48 million
REDMI    K30          64 million
APPLE    XR           12 million
APPLE    13           12 million
APPLE    8Plus        12 million
Table 6. Results of the image quality assessment algorithms on the Table 3 dataset.

Methods               Intra-Class Variance   Inter-Class Variance   Accuracy (%)   Precision (%)   Recall (%)
Variance              78.04                  10.67                  72.67          56.72           76
Information Entropy   0.06                   0.01                   57.67          41.40           65
Tenengrad Gradient    11.52                  17.44                  88.33          75.60           96
Fuzzy Probability     0.01                   0.01                   76.67          62.30           76
MAF                   512.91                 1099.39                96             91.80           97
Table 7. Results of blur types.

Method   Defocus Blur Correctly Determined   Motion Blur Correctly Determined   Accuracy (%)
MAF      91                                  97                                 95
Table 8. Decoding accuracy of codes on the Table 1 dataset.

Codes            Total Number   Decoding Number   Decoding Rate (%)
Genuine          300            300               100
1 time copy      600            35                5.83
1.5 times copy   480            84                17.50
2 times copy     480            108               22.50
Table 9. Accuracy, precision, and recall of the DFT spectral feature on the Table 1 dataset.

Methods   Accuracy (%)   Precision (%)   Recall (%)
NC        98.78          87.88           100
PR        93.42          59.54           79.31
NC_PR     100            100             100
Table 10. Accuracy ratios, precision ratios, recall ratios, and time of different methods from the Table 1 dataset.

Methods         Accuracy (%)   Precision (%)   Recall (%)   Time (s)
2LQR [24]       24.05          6.32            80.54        0.01
SIFT [34]       97.30          91.25           98.65        17.08
SURF [35]       95.37          87.76           57.33        1.62
BRISK [36]      97.28          87.61           76.74        0.46
LBP [37]        99.25          100             88.89        0.09
CLBP [38]       99.54          93.10           100          0.18
CLBC [39]       99.79          96.42           100          0.17
ECLBP [40]      99.50          98.08           94.44        0.17
COV_LBPD [41]   99.50          98.43           100          1.33
MRELBP [42]     100            100             100          1.64
JRLP [43]       100            100             100          0.61
MCDR [44]       99.88          100             98.15        1.75
RALBGC [45]     99.87          100             98.23        1.31
LDEP [46]       100            100             100          146.3
LGONBP [47]     99.88          100             98.15        1.45
DFDA            100            100             100          0.25
Table 11. Recall ratios of different methods on the Table 2 reduction dataset.

Methods         12%     24%     36%     48%
2LQR [24]       78.24   76.29   73.15   70.54
SIFT [34]       95.83   95.83   91.66   87.37
SURF [35]       95.12   90.33   89.56   0
BRISK [36]      97.02   91.45   76.23   0
LBP [37]        56.07   42.99   57.00   37.38
CLBP [38]       100     99.06   96.26   34.57
CLBC [39]       100     100     96.26   33.64
ECLBP [40]      98.13   99.06   99.06   94.39
COV_LBPD [41]   39.71   29.90   28.97   21.02
MRELBP [42]     62.61   62.61   40.18   39.53
JRLP [43]       7.47    5.75    0       0
MCDR [44]       34.57   0       0       0
RALBGC [45]     80.37   70.09   66.63   50.46
LDEP [46]       100     100     99.06   69.19
LGONBP [47]     92.52   90.34   90.05   84.56
DFDA            100     100     100     100
Table 12. Recall ratios of different methods on the Table 2 cut dataset.

Methods         1%      3%      5%      7%      9%
2LQR [24]       79.15   75.27   66.81   56.24   48.71
SIFT [34]       95.64   93.64   90.64   89.64   85.64
SURF [35]       92.67   91.45   88.76   80.70   76
BRISK [36]      93.25   92.25   92.05   75.25   23.25
LBP [37]        58.87   42.05   40.33   15.21   10.23
CLBP [38]       93.45   83.17   66.35   16.82   14.68
CLBC [39]       92.52   77.57   46.72   38.31   16.82
ECLBP [40]      21.49   7.47    5.60    3.73    2.80
COV_LBPD [41]   34.57   18.22   19.62   17.75   16.35
MRELBP [42]     91.58   82.54   79.43   73.83   66.35
JRLP [43]       0       0       0       0       0
MCDR [44]       0       0       0       0       0
RALBGC [45]     98.13   91.58   89.98   77.57   75.70
LDEP [46]       85.04   81.30   54.20   28.03   11.21
LGONBP [47]     89.71   88.45   87.85   66.35   26.26
DFDA            100     100     100     86.56   75.32