**3. Hand Vein Recognition Based on Block Mutual Information**

#### *3.1. Mutual Information Calculation*

Calculating the correlation of different bit planes and finding the best match is an important issue in this research. The correlation between different bit planes indicates the similarity of their contents, and their correlation can be characterized by mutual information [14].

For a discrete random variable *X*, let *p*(*x*) be the probability that *X* takes the value *x*. The entropy *H*(*X*), which describes the uncertainty of *X*, is expressed as:

$$H(X) = -\sum\_{x \in X} p(x) \log p(x) \tag{3}$$
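As a minimal sketch (not from the paper, and with a function name of our own choosing), Equation (3) can be evaluated directly from a probability vector with NumPy, using base-2 logarithms so the result is in bits:

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(X) = -sum_x p(x) log2 p(x), as in Eq. (3)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # terms with p(x) = 0 contribute nothing to the sum
    return -np.sum(p * np.log2(p))

# A fair coin is maximally uncertain for two outcomes: H = 1 bit
print(entropy([0.5, 0.5]))  # → 1.0
# A deterministic outcome carries no uncertainty: H = 0
print(entropy([1.0]))       # → 0.0 (printed as -0.0)
```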

Mutual information is introduced to measure the amount of information that one random variable contains about another; it characterizes how closely two random variables are related. For two random variables *X* and *Y* with probability distributions *p*(*x*) and *p*(*y*), respectively, the mutual information between them is expressed as:

$$I(X;Y) = \sum\_{x \in X} \sum\_{y \in Y} p(x,y) \log \frac{p(x,y)}{p(x)p(y)}\tag{4}$$
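A small sketch of Equation (4) for discrete variables (our own illustration, not the paper's code): given a joint probability table *p*(*x*, *y*), the marginals and the double sum follow directly.

```python
import numpy as np

def mutual_information_discrete(p_xy):
    """I(X;Y) from a joint probability table p(x, y), as in Eq. (4), in bits."""
    p_xy = np.asarray(p_xy, dtype=float)
    p_x = p_xy.sum(axis=1, keepdims=True)  # marginal p(x)
    p_y = p_xy.sum(axis=0, keepdims=True)  # marginal p(y)
    mask = p_xy > 0                        # skip zero-probability cells
    return np.sum(p_xy[mask] * np.log2(p_xy[mask] / (p_x * p_y)[mask]))

# Perfectly dependent binary variables: I(X;Y) = H(X) = 1 bit
print(mutual_information_discrete([[0.5, 0.0], [0.0, 0.5]]))  # → 1.0
# Independent variables: the ratio p(x,y)/(p(x)p(y)) is 1, so I(X;Y) = 0
print(mutual_information_discrete([[0.25, 0.25], [0.25, 0.25]]))  # → 0.0
```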

Mutual information of images denotes the correlation between images [15], and it can be expressed as:

$$I(A;B) = \sum\_{a \in K\_a} \sum\_{b \in K\_b} p(a,b) \log \frac{p(a,b)}{p(a)p(b)}\tag{5}$$

In Equation (5), *p*(*a*) and *p*(*b*) are the gray-level probability distributions of image *A* and image *B*, respectively, *p*(*a*, *b*) is their joint probability distribution, and *Ka* and *Kb* are the sets of gray levels of the two images. The larger *I*(*A*; *B*) is, the higher the correlation between the two images.
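In practice, *p*(*a*, *b*) in Equation (5) is typically estimated from the joint gray-level histogram of the two images. The following is a minimal sketch under that assumption (not the paper's implementation); coarse 16-level histograms are used here so the estimate stays stable for small images:

```python
import numpy as np

def mutual_information(A, B, bins=16):
    """I(A;B) of two equally sized gray images, Eq. (5), estimated
    from the joint gray-level histogram (bins approximates K_a, K_b)."""
    joint, _, _ = np.histogram2d(A.ravel(), B.ravel(), bins=bins)
    p_ab = joint / joint.sum()               # joint distribution p(a, b)
    p_a = p_ab.sum(axis=1, keepdims=True)    # marginal p(a)
    p_b = p_ab.sum(axis=0, keepdims=True)    # marginal p(b)
    mask = p_ab > 0                          # avoid log(0)
    return np.sum(p_ab[mask] * np.log2(p_ab[mask] / (p_a * p_b)[mask]))

rng = np.random.default_rng(0)
img = rng.integers(0, 256, (64, 64))
noise = rng.integers(0, 256, (64, 64))
print(mutual_information(img, img))    # high: an image fully predicts itself
print(mutual_information(img, noise))  # much smaller for unrelated images
```

Note that with many histogram bins and few pixels the estimate is biased upward for independent images, which is why a modest bin count is used here.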
