*2.1. The Experimental Unit Testing and Calibration*

Since the experimental unit described in [31] had been significantly modified, we conducted preliminary tests before the main measurements to check the stability of temperature control and the sealing of all unit components. The tests were carried out at 313 K: the autoclave was filled with CO2 to a pressure of 20 MPa, after which the temperature and pressure were monitored for 3 days. The maximum deviation of the autoclave temperature from the pre-set value over the whole period was ±0.4 K. The pressure decrease after 72 h was about 0.15 MPa, which, according to the NIST Chemistry WebBook data [43], corresponds to a CO2 leakage of about 0.20 g (the calculated initial mass was 121.77 g). We considered these results satisfactory and proceeded with the main experiments.
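As a quick sanity check, the leakage figures above imply the following relative loss and average leak rate (a minimal sketch using only the values quoted in the text):

```python
# Rough check of the tightness-test figures reported above.
# All input values are taken directly from the text.

initial_mass_g = 121.77   # calculated initial CO2 mass in the autoclave
leaked_mass_g = 0.20      # mass loss inferred from the 0.15 MPa drop
test_hours = 72.0         # duration of the tightness test

leak_fraction = leaked_mass_g / initial_mass_g          # ~0.0016
leak_rate_mg_per_h = leaked_mass_g * 1000 / test_hours  # ~2.8 mg/h

print(f"relative loss: {leak_fraction:.2%}")
print(f"average leak rate: {leak_rate_mg_per_h:.1f} mg/h")
```

That is, the system lost roughly 0.16% of its CO2 charge over the 3-day test, which supports treating the unit as sufficiently sealed for the main experiments.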

To calculate the concentrations of TODGA in the autoclave, we had to measure its effective volume. The geometric volume of the autoclave was 150 mL, but it also contained the magnetic stir bar, the vial, and the support. The effective volume was measured by filling the autoclave step by step with distilled water using 100–1000 μL and 1–10 mL Thermo Scientific Lite (Lenpipet Thermo Scientific, Moscow, Russia) calibrated mechanical pipettes. The measurements were repeated three times. Taking the pipette error into account, the effective volume of the autoclave was 145 ± 1 mL.
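With the effective volume known, the molar concentration of a fully dissolved TODGA sample follows from its mass. A minimal sketch; note that the molar mass below is our computed value for TODGA (C36H72N2O3) and is an assumption, not a figure from the text:

```python
# Molar concentration of TODGA in the autoclave from the dissolved mass.
# M_TODGA is an assumed value (computed for C36H72N2O3); the effective
# volume of 145 mL is the value measured above.

M_TODGA = 580.98   # g/mol, assumed molar mass of TODGA
V_EFF_ML = 145.0   # mL, measured effective volume of the autoclave

def concentration_mol_per_L(mass_g: float) -> float:
    """Concentration (mol/L) of a fully dissolved TODGA sample."""
    return (mass_g / M_TODGA) / (V_EFF_ML / 1000.0)

# Illustrative sample mass (not from the paper):
print(f"{concentration_mol_per_L(0.1):.5f} mol/L")
```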

The TODGA retention time under the selected analysis conditions was 1.13 ± 0.02 min. The calibration tests showed high reproducibility of the results obtained on the considered experimental unit. In all the tests, the sample from the autoclave was injected at least twice (Figure 1) to exclude accidental errors. Since the TODGA concentration in the autoclave decreased after each sample collection, the area of each subsequent peak was 0.3–0.5% smaller than that of the previous one. Taking this into account, we were able to estimate the approximate rate of TODGA sample dissolution under the experimental conditions. Increasing the dissolution time from 10 to 60 min did not increase the chromatographic response, indicating that the TODGA samples dissolved completely in less than 10 min.

**Figure 1.** Example of two-time injection of a TODGA sample from the autoclave.

By analyzing a number of TODGA samples, we plotted a calibration dependence of the amount of the substance introduced into the autoclave on the chromatographic peak area (Figure 2). The dependence equation takes the form:

$$n = \mathit{Resp} \cdot 7.8804 \cdot 10^{-8} \tag{1}$$

where *Resp* is the chromatographic response (peak area).
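Equation (1) maps a peak area directly to an amount of substance; a minimal sketch (the response value in the usage line is illustrative, not a measured datum):

```python
# Convert a chromatographic peak area into moles of TODGA using
# calibration Equation (1): n = Resp * 7.8804e-8.

CAL_SLOPE = 7.8804e-8  # mol per unit of peak area, from Equation (1)

def moles_from_response(resp: float) -> float:
    """TODGA amount (mol) corresponding to a chromatographic response."""
    return CAL_SLOPE * resp

# Illustrative response value (not from the paper):
n = moles_from_response(500.0)
print(f"{n * 1e6:.2f} umol")  # amount expressed in micromoles
```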

**Figure 2.** Calibration dependence of TODGA content in the autoclave on the chromatograph response (the dashed line is linear approximation of calibration points).

Most of the calibration points were obtained at a pressure of 20 MPa. However, to confirm the validity of the calibration over an extended CO2 density range, we obtained several points at pressures from 10 to 30 MPa. When the pressure in the system was reduced to 10 MPa, nonreproducible errors appeared in the analyzed points, which are characteristic of incomplete dissolution of the substance. This effect was observed even at relatively low TODGA concentrations. For this reason, the points obtained at 10 MPa were not used to construct the calibration dependence.

The calibration experiments confirmed the high accuracy of the measurements made on the modified experimental unit. The maximum absolute deviation of an experimental value (*n<sub>i</sub>*) from the value calculated from the calibration (*n̂<sub>i</sub>*) was 2.2 μmol, and the determination coefficient was *R*<sup>2</sup> = 0.9998. The root mean square error (RMSE) of the approximation was calculated as follows:

$$\text{RMSE} = \sqrt{\frac{\sum_{i} \left(n_i - \hat{n}_i\right)^2}{N}} \tag{2}$$

where *N* is the number of experimental points. The RMSE value was 0.0012 mmol.
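The goodness-of-fit metrics reported above can be computed as follows (a minimal sketch of Equation (2) and *R*<sup>2</sup>; the data points in the usage lines are illustrative placeholders, not the actual calibration measurements):

```python
# Goodness-of-fit metrics for a calibration line: RMSE as defined in
# Equation (2) and the coefficient of determination R^2.
import math

def rmse(n_meas: list[float], n_fit: list[float]) -> float:
    """Root mean square error between measured and fitted amounts."""
    residuals = [(a - b) ** 2 for a, b in zip(n_meas, n_fit)]
    return math.sqrt(sum(residuals) / len(residuals))

def r_squared(n_meas: list[float], n_fit: list[float]) -> float:
    """Coefficient of determination R^2 of the fit."""
    mean = sum(n_meas) / len(n_meas)
    ss_res = sum((a - b) ** 2 for a, b in zip(n_meas, n_fit))
    ss_tot = sum((a - mean) ** 2 for a in n_meas)
    return 1.0 - ss_res / ss_tot

# Illustrative data (mmol), not from the paper:
measured = [0.010, 0.050, 0.100, 0.200]
fitted = [0.011, 0.049, 0.101, 0.199]
print(f"RMSE = {rmse(measured, fitted):.4f} mmol")
print(f"R^2  = {r_squared(measured, fitted):.4f}")
```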
