*3.1. Clustering Net*

The cluster net aims to cluster the heterogeneous fields *κ*(*x*, *s*); however, the resulting clusters should inherit the properties of the solutions corresponding to *κ*(*x*, *s*), i.e., heterogeneous fields grouped in the same cluster should have similar corresponding solution properties. This similarity will be measured by the adversary net, which will be introduced in Section 3.3. We hence design the network shown in Figure 4.

**Figure 4.** Cluster network.

The input of the network is the set of local heterogeneous fields, parametrized by the random variable *s* ∈ Ω and arranged as *X* ∈ R<sup>*m*×*d*</sup>, where *m* is the number of samples and *d* is the dimension of one local heterogeneous field. The output of the network is the multiscale basis (the first GMsFEM basis), which represents the solution corresponding to the coefficient *κ*(*x*, *s*). This is a generative network with an autoencoder structure. The dimension-reduction function *F*(*X*) can be interpreted as a kind of kernel method that maps the input data to a new space in which the data are easier to separate; equivalently, it learns a suitable metric for the subsequent K-means clustering. We perform the K-means clustering algorithm in the latent space *F*(*X*). *G*(·) then maps the latent-space data to the space of multiscale basis functions. This can be viewed as a generative process in which the basis is reconstructed from the extracted features. The detailed algorithm is as follows (see Figure 5 for an illustration):
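The encode, cluster, and decode pipeline can be sketched in code. The following is a minimal illustration only, not the paper's implementation: the trained deep encoder *F* and decoder *G* are replaced by fixed random linear maps (`W_enc`, `W_dec` are stand-ins), and K-means is implemented directly on the latent codes. All names and the toy data are assumptions for the sake of the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(X, W_enc):
    """Stand-in for the encoder F: map fields X (m, d) to latent codes (m, k).
    A trained deep network would replace this single tanh layer."""
    return np.tanh(X @ W_enc)

def decode(Z, W_dec):
    """Stand-in for the decoder G: map latent codes back to the space of
    multiscale basis functions (here simply R^d)."""
    return Z @ W_dec

def kmeans(Z, n_clusters, n_iter=50):
    """Plain K-means on the latent codes Z (m, k)."""
    centers = Z[rng.choice(len(Z), n_clusters, replace=False)]
    for _ in range(n_iter):
        # Assign each code to its nearest center, then update the centers.
        dists = np.linalg.norm(Z[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for c in range(n_clusters):
            if np.any(labels == c):
                centers[c] = Z[labels == c].mean(axis=0)
    return labels, centers

# Toy data: m = 60 local heterogeneous fields of dimension d = 8,
# drawn from two distinct distributions (hypothetical samples of kappa(x, s)).
m, d, k = 60, 8, 3
X = np.vstack([rng.normal(0.0, 1.0, (m // 2, d)),
               rng.normal(5.0, 1.0, (m // 2, d))])

W_enc = rng.normal(size=(d, k))   # stand-in for the trained F
W_dec = rng.normal(size=(k, d))   # stand-in for the trained G

Z = encode(X, W_enc)              # latent representation F(X)
labels, _ = kmeans(Z, n_clusters=2)
basis = decode(Z, W_dec)          # reconstructed basis G(F(X))
print(basis.shape)                # (60, 8): one basis vector per sample
```

In the actual method, `W_enc` and `W_dec` correspond to deep networks trained jointly (with the adversary net of Section 3.3 shaping the similarity measure), so the latent space is learned rather than fixed.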

**Figure 5.** Deep learning algorithm.

Steps illustrated in Figure 5:

