Article

Artificial Intelligence Modeling of Materials’ Bulk Chemical and Physical Properties

by
Jerry A. Darsey
Center of Molecular Design and Development, University of Arkansas at Little Rock, Little Rock, AR 72204, USA
Crystals 2024, 14(10), 866; https://doi.org/10.3390/cryst14100866
Submission received: 29 August 2024 / Revised: 23 September 2024 / Accepted: 27 September 2024 / Published: 30 September 2024
(This article belongs to the Special Issue The Application of AI and Machine Learning for Energy Material Design)

Abstract

Energies of the atomic and molecular orbitals of one- and two-atom systems from the fourth and fifth periods of the periodic table were calculated using ab initio quantum mechanical methods. The energies of selected occupied and unoccupied orbitals surrounding the highest occupied and lowest unoccupied orbitals (HOMOs and LUMOs) of each system were used as input for our artificial intelligence (AI) software. Using this software, correlations were established between the orbital parameters and selected chemical and physical properties of bulk materials composed of these elements, and from these correlations the materials’ bulk properties were predicted. The Q2 correlations for the single-atom predictions of first ionization potential, melting point, and boiling point were 0.3589, 0.4599, and 0.1798, respectively. The corresponding Q2 correlations using orbital parameters describing two-atom systems increased to 0.8551, 0.8207, and 0.7877, respectively. The accuracy in predicting materials’ bulk properties thus increased up to four-fold by using two atoms instead of one. We also present results of property predictions for molecules relevant to energy systems.

1. Introduction

The volume of information available for any given chemical or physical system is becoming increasingly unmanageable; we are in danger of being “swamped” by this profusion of data. Artificial intelligence (AI) is one of the few techniques that gives us any chance of sorting through this voluminous amount of data and extracting meaningful information about the system under study, and the artificial neural network is one of its more powerful methods. Being able to predict a material’s bulk physical characteristics before attempting synthesis would be very advantageous: by choosing a target bulk property, a researcher can determine combinations of atoms yielding that property using predictions based on small nanocluster systems. Predicting the physical properties of molecular systems would also be very beneficial, especially in systems relevant to energy production. Most molecules and atoms can be modeled and optimized with software such as HyperChem 5.01 [1] and the Gaussian09 [2] quantum mechanical program. The scope of this research is to use the output of a program such as Gaussian09 to train the AI. The training set consists of inputs calculated with Gaussian09 and selected bulk properties as the outputs. This training set teaches our AI software correlations between the calculated input parameters and the bulk properties of a material. Once a correlation is found, the AI can predict bulk material properties for any atom or combination of atoms composing the material, provided the energies of the atomic or molecular orbitals are known from ab initio SCF/DFT calculations.

2. Artificial Intelligence

Artificial intelligence (AI) via artificial neural networks (ANNs) is a form of parallel computing. The main structure of an ANN is a highly interconnected network of simple processing elements known as nodes. A popular configuration for a supervised, feed-forward, back-propagating neural network has three main layers. The first layer, called the input layer, receives all of the input data. Next is a hidden layer, whose function is to perform the desired mapping between the input and output layers. Finally, there is the output layer, where the network gives its predicted value. There are generally multiple input nodes connected to multiple hidden nodes, which in turn connect to one or more output nodes (see Figure 1). In our study, we use the 20 lowest unoccupied and 20 highest occupied molecular orbitals surrounding the orbital gap. Each input node’s value is adjusted by a weight before reaching the hidden layer, where the products of inputs and weights are summed. This sum is transformed by a nonlinear activation function, which scales the summation to an appropriate nodal value. If a hidden node’s value is less than a specified threshold, the node remains inactive and does not contribute to the next layer. If it exceeds the threshold, it is adjusted by another weight and summed with all other adjusted hidden nodes that met the threshold criterion. The new summation is transformed to produce the output layer’s value, where the prediction is compared to the correct value. The resulting errors are back-propagated through the network to the input layer, and the weights are then adjusted. This forward and backward training process continues until convergence is achieved. It is this feature of propagating the error backwards through the network that gives the backpropagation neural network its name.
Our AI software is not programmed in the conventional sense. Instead, it learns from a large set of examples by trial and error; such networks are known as supervised networks, and the trial-and-error phase is called training. The most popular general type of neural network is a three-layer, feed-forward, fully connected network of nodes trained with a backpropagation algorithm. For additional information, see references [3,4,5] and references therein.
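The training scheme described above can be sketched in a few lines. The following is a minimal illustration, not the NASA Nets code: a three-layer feed-forward network trained by backpropagation on synthetic data, with a smooth sigmoid in place of a hard threshold and with the network size and toy target being our own choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy training set: 40 examples, 4 input features, one smooth target.
X = rng.uniform(-1.0, 1.0, size=(40, 4))
y = np.sin(X.sum(axis=1, keepdims=True))

# Three-layer, fully connected, feed-forward network: 4 -> 8 -> 1.
W1 = rng.normal(0.0, 0.5, size=(4, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, size=(8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
losses = []
for epoch in range(2000):
    # Forward pass: weighted sums squashed by the activation function.
    h = sigmoid(X @ W1 + b1)      # hidden-layer nodal values
    out = h @ W2 + b2             # output node (linear)
    err = out - y
    losses.append(float(np.mean(err ** 2)))

    # Backward pass: propagate the output error toward the input layer.
    g_out = 2.0 * err / len(X)
    gW2, gb2 = h.T @ g_out, g_out.sum(axis=0)
    g_h = (g_out @ W2.T) * h * (1.0 - h)   # sigmoid derivative
    gW1, gb1 = X.T @ g_h, g_h.sum(axis=0)

    # Weight adjustment (gradient descent).
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2
```

After training, the mean-squared error recorded in `losses` decreases steadily, which is the "convergence" referred to above.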

3. Methods

The molecules analyzed were designed and modeled using HyperChem 5.01 molecular modeling software. This program was used to construct the atoms of the periodic table from element 19 through 35 and 37 through 53. The spin multiplicity of the lowest-energy ground state, as determined from term symbols, was used for the geometry optimizations of these elements. Gaussian09 was then used to perform Density Functional Theory (DFT) single-point energy calculations on the structurally optimized atoms. In Gaussian09, the LanL2DZ basis set and the B3PW91 functional were used, since most of the elements analyzed are relatively heavy; the LanL2DZ basis set incorporates parameters that account for the relativistic effects of these heavier elements.
Gaussian09 was used to perform Self-Consistent-Field Molecular Orbital (SCF/MO) calculations to obtain the energies of the lowest unoccupied atomic orbitals (LUAOs) and the highest occupied atomic orbitals (HOAOs). After the Gaussian09 calculations were completed, these orbital energies were collected [3,4].
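As an illustration of this data-collection step, a short script along the following lines can pull orbital eigenvalues from a Gaussian-style log. The helper names and the sample text are our own, and real logs can differ in layout (very wide values can even run together without spaces), so treat this as a sketch rather than a robust parser.

```python
import re

def orbital_energies(log_text):
    """Collect orbital eigenvalues (hartrees) from a Gaussian-style log.

    Lines of the form 'Alpha  occ. eigenvalues -- ...' list occupied
    orbitals; 'Alpha virt. eigenvalues -- ...' list unoccupied ones.
    Assumes values are whitespace-separated (not always true in real logs).
    """
    occ, virt = [], []
    for line in log_text.splitlines():
        m = re.search(r"Alpha\s+(occ|virt)\. eigenvalues --\s*(.*)", line)
        if m:
            vals = [float(v) for v in m.group(2).split()]
            (occ if m.group(1) == "occ" else virt).extend(vals)
    return occ, virt

def ann_inputs(occ, virt, n=20):
    """Return the n highest occupied and n lowest unoccupied energies,
    i.e., the orbitals bracketing the HOMO-LUMO gap, for use as ANN input."""
    return sorted(occ)[-n:], sorted(virt)[:n]

# Fabricated two-line example, purely illustrative:
sample = (" Alpha  occ. eigenvalues --   -1.90000  -0.95000  -0.40000\n"
          " Alpha virt. eigenvalues --    0.05000   0.20000   0.75000\n")
occ, virt = orbital_energies(sample)
homo_side, lumo_side = ann_inputs(occ, virt, n=2)
```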
The artificial intelligence program employed was Nets [6], developed by the Artificial Intelligence Division of the NASA Johnson Space Center. The AI models created with Nets were used to discern correlations between the chemical and physical properties of bulk materials and the HOMO and LUMO energies. Once the network had been taught how to respond to a set of specific examples, the weights were stored and the network was tested for the accuracy of its predictions. The backpropagation network processed the inputs and compared the resulting outputs to the experimental values; output errors were propagated back through the system, causing it to readjust the weights. This process was repeated, continually adjusting the weights, until the desired results were achieved. For additional detail on these methods, see references [5,7,8,9]. The first set of results involves modeling the bulk properties of atomic systems; the second involves predicting the bulk properties of various molecules, including molecules involved in energy systems. For clarification, when referring to one-atom systems we use LUAO and HOAO; for multi-atom systems we use LUMO and HOMO.

4. Results

Atomic Systems

The predictive value of each model was assessed by plotting the experimental value of the property on the horizontal axis and the ANN-predicted value on the vertical axis. An optimal prediction would yield a best-fit line with Q2 = 1.0 and slope = 1.0, where Q2 is a measure of the correlation of the data. Statistical revision was performed on each plot by removing any prediction whose error exceeded twice the standard deviation of the error population. The results presented in this section are drawn from Neil Mitchell’s Master’s thesis [10].
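Taking Q2 as the squared correlation between experimental and predicted values (one common reading of the statistic), the plot statistics and the two-standard-deviation outlier screen described above can be sketched as follows; the function names are our own.

```python
import numpy as np

def q2(experimental, predicted):
    """Squared correlation between experimental and predicted values
    (one common reading of Q2; Q2 = 1.0 is a perfect fit)."""
    r = np.corrcoef(experimental, predicted)[0, 1]
    return r ** 2

def drop_outliers(experimental, predicted):
    """Remove points whose prediction error exceeds twice the standard
    deviation of the error population, as in the statistical revision."""
    err = predicted - experimental
    keep = np.abs(err) < 2.0 * np.std(err)
    return experimental[keep], predicted[keep]

# Small illustrative data set with one gross outlier (last point).
exp = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
pred = exp + np.array([0.1, -0.1, 0.05, -0.05, 0.0, 5.0])
exp_kept, pred_kept = drop_outliers(exp, pred)
```

On this toy data, the screen removes the single gross outlier and the remaining points correlate almost perfectly.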
Figure 2a,b represent the ability of the ANN to predict the bulk materials’ first ionization potential based on data describing one- and two-atom systems from which the bulk materials are composed. Figure 2a represents predictions made using orbital data obtained from a single atom of the corresponding bulk material; Figure 2b represents predictions made using orbital data obtained from two atoms of the material, modeled as homonuclear diatomic molecules. Note that the Q2 value for the single-atom model was 0.3589, while the Q2 for the diatomic system increased significantly to 0.8551.
Figure 3a,b represent the ability of the AI to predict the melting point of bulk materials using orbital data obtained from single atoms and homonuclear diatomic clusters. Figure 3a represents predictions made using atomic orbital parameters; Figure 3b, those made using diatomic orbital parameters. A similar increase in Q2 was observed for the melting point predictions: the single-atom series yielded a Q2 of 0.4599, in contrast with a Q2 of 0.8207 for the diatomic series.
Melting and boiling points are purely physical properties and are not as obviously related to the electronic properties of the atom as the ionization potential is. Nevertheless, a significant ability to predict the melting and boiling points was observed.
Figure 4a,b represent the ability of the AI to predict the boiling point of bulk materials using orbital data of single atoms and homonuclear diatomic clusters. Figure 4a represents boiling point predictions made using single-atom orbital parameters, resulting in a Q2 of 0.1798; Figure 4b represents predictions made using diatomic orbital parameters, with a Q2 of 0.7877.

5. Summary

5.1. Atomic Systems

The ability to predict bulk properties of materials using parameters from very small “chunks” of the material is very beneficial. The parameters used to train the artificial neural network can be obtained accurately and inexpensively from DFT calculations on these small atomic systems, whereas the same calculations are computationally very expensive, or impossible, for large bulk systems. The procedures outlined in this paper allow researchers to decrease costs while increasing their predictive capabilities.
The modeling of chemical and physical properties of bulk systems using artificial intelligence has been successfully accomplished. Energy values for the atomic and diatomic HOMOs and LUMOs of each of the 34 elements studied from the fourth and fifth periods of the periodic table were used as input to an artificial neural network. The experimentally determined first ionization potentials, melting points, and boiling points of the homogeneous bulk materials composed of these 34 elements were found to correlate with the orbital inputs. Up to a four-fold increase in the predictive capability of this technique was observed in going from a single-atom to a diatomic system for the orbital parameters. We anticipate a continued increase in predictive ability for slightly larger systems, while maintaining the computational advantages of using small clusters. Our future goal is to apply this procedure to molecular systems and to determine the minimum number of molecules required to predict their bulk properties.

5.2. Molecular Systems

In this part of our study, we trained our AI software on 66 different molecules for which the boiling point (B.P.), melting point (M.P.), refractive index (R.I.), density (D.), and dipole moment (D.M.) were provided, and then asked the trained AI to predict the molecular weights of an additional four compounds. We used the feed-forward backpropagation learning algorithm implemented in software developed by NASA and modified by us [11]. Figure 5 illustrates the AI network used in the training and prediction of molecular weights (M.W.) from these five physical properties.

6. Results

Molecular Systems

The results shown in this section are drawn from Ashish Soman’s Master’s thesis [12]. We trained on 66 compounds and tested on 4 compounds at a time. The compounds used for training and testing are listed in Table 1. The five physical properties used for training were melting point (M.P.), boiling point (B.P.), density (D.), refractive index (R.I.), and dipole moment (D.M.), taken from the 76th edition of the CRC Handbook [13]. We varied the compounds used for training and testing. Table 1 lists the molecular weights of the 70 compounds used for training and testing. Table 2 shows the results: in RUN 1, the neural network was trained on organic compounds 1 through 14, 16 through 23, 25 through 47, 49 through 63, and 65 through 70, with compounds 15, 24, 48, and 64 used to test the training of the network.
In RUN 2, the neural network was trained on organic compounds from 1 through 12, 14 through 27, 29 through 55, and 57 through 69 with testing on compounds 13, 28, 56, and 70. In this case, the percent error for the molecular weight ranged from 0.55% to 7.00% with an average error of 3.61%.
In RUN 3, the neural network was trained on organic compounds from 1 through 6, 8 through 18, 20 through 51, 53 through 67, and 69 and 70 with testing on compounds 7, 19, 52, and 68. It was noted that the percent error for the molecular weight ranged from 2.04% to 7.96% with an average error of 4.42%.
In RUN 4, the neural network was trained on organic compounds from 1 through 7, 9 through 22, 24 through 32, 34 through 56, and 58 through 70 with testing on compounds 8, 23, 33, and 57. In this case, the percent error for the molecular weight ranged from 0.32% to 5.00% with an average error of 3.59%.
In RUN 5, the neural network was trained on organic compounds from 1 through 29, 31 through 44, 46 through 59, and 61 through 69 with testing on compounds 30, 45, 60, and 70. In this case, the percent error for the molecular weight ranged from 7.18% to 17.63% with an average error of 11.30%.
In RUN 6, the neural network was trained on compounds from 1 through 5, 7 through 21, 23 through 49, 51 through 54, and 56 through 70 with testing on compounds 6, 22, 50, and 55. In this case, the percent error for the molecular weight ranged from 0.80% to 5.81% with an average error of 3.71%. It should be noted that the overall average error in prediction of the molecular weights of the six runs reported was 4.97%. A correlation plot is shown in Figure 6. Table 3 shows the experimental values used for the AI training [13].
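The leave-four-out bookkeeping used in RUNs 1 through 6 can be sketched as below. The ANN itself is replaced here by an ordinary least-squares stand-in on synthetic data, purely to show the split-train-test-percent-error loop; the actual runs used the NASA Nets network, and all names are our own.

```python
import numpy as np

def leave_four_out(X, y, test_idx, fit, predict):
    """Train on all compounds except those in test_idx and report the
    percent errors on the held-out compounds, as in RUNs 1-6."""
    test_idx = np.asarray(test_idx)
    mask = np.ones(len(X), dtype=bool)
    mask[test_idx] = False
    model = fit(X[mask], y[mask])
    pred = predict(model, X[test_idx])
    pct_err = 100.0 * np.abs(pred - y[test_idx]) / y[test_idx]
    return pct_err, float(pct_err.mean())

# Stand-in model: ordinary least squares in place of the ANN, purely to
# exercise the bookkeeping.
def fit_ols(X, y):
    return np.linalg.lstsq(np.c_[X, np.ones(len(X))], y, rcond=None)[0]

def predict_ols(w, X):
    return np.c_[X, np.ones(len(X))] @ w

# Synthetic data: 70 "compounds", five properties each, exactly linear.
rng = np.random.default_rng(1)
X = rng.uniform(0.0, 1.0, size=(70, 5))
y = X @ np.array([2.0, 1.0, 3.0, 0.5, 1.5]) + 10.0

# RUN 1 held out compounds 15, 24, 48, and 64 (0-based indices below).
errs, avg = leave_four_out(X, y, [14, 23, 47, 63], fit_ols, predict_ols)
```

Because the synthetic target is exactly linear, the stand-in model recovers it and the average percent error is essentially zero; with the real ANN and real data, the runs above gave average errors of roughly 3 to 11 percent.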
Another important point is that we made numerous runs in which one or more of the physical properties was eliminated from the input data. In all of these runs we were unable to produce accurate predictions of the molecular weights; all five physical properties were required to produce results with an average error below 5%. We also made many additional runs beyond the six reported; for brevity, we included only the runs with the smallest average error, the largest average error, and four runs with intermediate average errors. Lastly, molecular weight was chosen as the training target, although in principle other properties could have been used; convenience and availability were the primary reasons for this choice.
Prediction of the molecular weight is also of interest because it is a characteristic property of a compound that can be deduced from the molecular formula. In this project, we attempted to associate this molecular property with boiling point, melting point, density, refractive index, and dipole moment, which are all bulk properties of a compound.
Due to the limited resources of the library and the lack of a wider database, only 70 compounds were used for training and testing. Attempts were also made to include molecular weight in the training set and predict the melting point or boiling point; in either case the network could predict the corresponding property, but the results were poor. The dipole moment and refractive index could not be predicted from training on the remaining five properties, possibly because the values on which the network was trained were very large compared to the testing data.
It is not mandatory to use 28 nodes in the hidden layer. Since the network was trained on 66 organic compounds, it is better to keep the number of hidden nodes below half the number of data sets. With a greater number of hidden nodes, say 40, 50, or 60, the network may memorize the training patterns instead of learning an association. We want the network to learn an implicit functional relationship between the input data (five physical properties) and the output (molecular weight), rather than memorize the examples.
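As a rough illustration of why an oversized hidden layer invites memorization, one can count the trainable parameters of a fully connected three-layer network; the helper below is our own, comparing the 5-28-1 topology used here with a hypothetical 5-60-1 one against the 66 training examples.

```python
def n_parameters(n_in, n_hidden, n_out):
    """Weights plus biases in a fully connected three-layer network."""
    return (n_in * n_hidden + n_hidden) + (n_hidden * n_out + n_out)

small = n_parameters(5, 28, 1)   # the 5-28-1 topology used in this work
large = n_parameters(5, 60, 1)   # an oversized hidden layer
```

With only 66 training compounds, the larger network has far more adjustable parameters per example, making rote memorization of the training patterns much easier than generalization.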

7. Summary

Molecular Systems

We found that a neural network could successfully make an association between the five physical properties of melting point, boiling point, density, refractive index, and dipole moment and the molecular weight of the various organic molecules. We also noted that it was not able to make this association when one or more of these properties were eliminated from the input data. This implies that there is a functional relation between all of these properties and molecular weight. We also believe that hidden within these properties there is probably geometrical information about these molecules (i.e., conformational or spatial isomers). Future studies will attempt to extract this information.
By analyzing the weight space, we believe we can extract a mathematical relationship between these five physical properties and molecular weight. If so, one could work backwards and solve for any one of the properties when the other five are known. For an introduction to the theory of neural network computing, see references [14,15,16,17,18].
Table 2 presents the results of the AI predictions of the molecular weights from the five physical properties (melting point, boiling point, density, refractive index, and dipole moment).
The following cross-validation plot (Figure 6) illustrates the results of the prediction of the molecular weights of various organic molecules using the physical properties of melting point, boiling point, refractive index, density, and dipole moment.
NOTE: In Table 3, the pressure at which the B.P. was determined is approximately one atmosphere for all compounds. The density is relative to water and otherwise has units of g/mL. A superscript indicates the temperature of the liquid and a subscript indicates the temperature of the water (i.e., 4 °C) to which the density is referred; compounds without superscripts were measured with the liquid at 20 °C. The refractive index is reported for the D line of the spectrum, with the temperature of determination given as a superscript [13].
The significance of these values may involve some ambiguity because of the possibility of different conformations or spatial isomers.

Funding

This research received no external funding.

Data Availability Statement

Data are contained within the article.

Conflicts of Interest

The author declares no conflict of interest.

References

  1. HyperChem, version 5.01; Hypercube, Inc.: Waterloo, ON, Canada, 1994. Available online: http://www.hypercubeusa.com/News/PressRelease/NewPolicyOct1997/tabid/397/Default.aspx (accessed on 29 August 2024).
  2. Frisch, M.J.; Trucks, G.W.; Schlegel, H.B.; Scuseria, G.E.; Robb, M.A.; Cheeseman, J.R.; Scalmani, G.; Barone, V.; Mennucci, B.; Petersson, G.A.; et al. Gaussian, version 09; Gaussian, Inc.: Wallingford, CT, USA, 2009.
  3. Haykin, S. Neural Networks: A Comprehensive Foundation, 2nd ed.; Prentice Hall: Hoboken, NJ, USA, 1999.
  4. Käser, S.; Vazquez-Salazar, L.I.; Meuwly, M.; Töpfer, K. Neural network potentials for chemistry: Concepts, applications and prospects. Digit. Discov. 2023, 2, 28–58.
  5. Sumpter, B.G.; Getino, C.; Noid, D.W. Theory and Applications of Neural Computing in Chemical Science. Annu. Rev. Phys. Chem. 1994, 45, 439–481.
  6. Shelton, R.; Baffes, P.T. Nets Back-Propagation, version 4.0; Software Technology Branch, NASA Johnson Space Center: Houston, TX, USA, 1989. Available online: https://ntrs.nasa.gov/api/citations/19920012191/downloads/19920012191.pdf (accessed on 29 August 2024).
  7. Darsey, J.A.; Sumpter, B.G.; Noid, D.W. Correlating Physical Properties of Both Polymeric and Energetic Materials and their Organic Precursors of Polymers Using Artificial Neural Networks. Int. J. Smart Eng. Syst. Des. 2000, 2, 283–298.
  8. Tu, X.; Geesman, E.; Wang, K.; Compadre, C.; Buzatu, D.; Darsey, J. Prediction of the partition coefficient of dillapiol and its derivatives by use of molecular simulation and artificial neural networks. Chim. Oggi 2002, 10, 51–54.
  9. Lane, T.R.; Harris, J.; Urbina, F.; Ekins, S. Comparing LD50/LC50 Machine Learning Models for Multiple Species. J. Chem. Health Saf. 2023, 30, 83–97.
  10. Mitchell, N. A Systematic Approach for Developing Feed-Forward/Back-Propagating Neural Network Models for Predicting Bulk Chemical and Physical Properties of Transition Metals. Master’s Thesis, University of Arkansas at Little Rock, Little Rock, AR, USA, 2007.
  11. Baffes, P.T. NNETS Program, version 2.0; Johnson Space Center Report No. 23366; NASA: Houston, TX, USA, 1989.
  12. Soman, A. Application of Artificial Intelligence to Chemical Systems. Master’s Thesis, University of Arkansas at Little Rock, Little Rock, AR, USA, 1993.
  13. Lide, D.R. CRC Handbook of Chemistry and Physics, 76th ed.; CRC Press: Cleveland, OH, USA, 1977.
  14. Eberhart, R.C.; Dobbins, R.W. Neural Network PC Tools: A Practical Guide; Academic Press: London, UK, 1990.
  15. Hertz, J.; Krogh, A.; Palmer, R.G. Introduction to the Theory of Neural Computation; Addison-Wesley: Redwood City, CA, USA, 1991.
  16. Wasserman, P.D. Neural Computing: Theory and Practice; Van Nostrand Reinhold: New York, NY, USA, 1989.
  17. Zurada, J.M. Introduction to Artificial Neural Systems; West Publishing Company: New York, NY, USA, 1992.
  18. Darsey, J.A.; Noid, D.W.; Wunderlich, B.; Tsoukalas, L. Neural-net extrapolations of heat capacities of polymers to low temperatures. Makromol. Chem. Rapid Commun. 1991, 12, 325.
Figure 1. Architecture of an ANN backpropagation network with three active layers.
Figure 2. Experimental vs. predicted first ionization potentials for (a) single-atom series and (b) diatomic series.
Figure 3. Experimental vs. predicted melting points for (a) single-atom series and (b) diatomic series.
Figure 4. Experimental vs. predicted boiling points for (a) single-atom series and (b) diatomic series.
Figure 5. Topology of network used in simulation of molecular weights (M.W.) of various molecules where M.P. is melting point, B.P. is boiling point, D. is density, R.I. is refractive index, and D.M. is dipole moment.
Figure 6. Cross-validation plot of the results of prediction of molecular weights.
Table 1. Molecular weights of various molecules.
No.  Compound Name  Mol. Weight (u)
1  Dichloro-fluoro-methane  102.92
2  Dibromo-methane  173.85
3  Nitro-methane  61.04
4  Pentachloro-ethane  202.3
5  Chloro-ethylene  62.5
6  Ethanal  44.05
7  Chloro-ethane  64.52
8  Fluoro-ethane  48.06
9  Iodo-ethane  155.97
10  Acetyl amine  59.07
11  Dimethyl sulfoxide  78.13
12  Dimethyl-amine  45.09
13  Propyne  40.07
14  2-chloro-propene  76.53
15  Propene  42.08
16  2,2-dichloro-propane  112.99
17  1-propanol  60.11
18  Trimethyl-amine  59.11
19  Furan  68.08
20  Thiophene  84.14
21  1,2-butadiene  54.09
22  Butanal  72.12
23  Cyclopentene  68.13
24  Pyridine  79.10
25  Bromo-benzene  157.02
26  Nitro-benzene  123.11
27  Phenol  94.11
28  p-chloro-toluene  126.59
29  Toluene  92.15
30  o-xylene  106.17
31  Dibutyl-ether  130.23
32  Quinoline  129.16
33  Isoquinoline  129.16
34  Phenyl-benzene  154.21
35  Tribromo-methane  252.75
36  Iodo-methane  141.94
37  Ethanethiol  62.13
38  Propanone  58.08
39  Butane  58.13
40  Dipropyl-ether  102.18
41  Fluoro-methane  34.03
42  1,1-dichloro-ethane  98.96
43  1,1-difluoro-ethane  66.05
44  2-propanol  60.11
45  1-nitro-propane  89.09
46  2-chloro-propane  78.54
47  Aniline  93.13
48  Butanal  72.12
49  m-dichloro-benzene  147.01
50  m-fluoro-toluene  110.13
51  Ethane  30.07
52  Propadiene  40.07
53  Propene  42.08
54  Acetylene  26.04
55  2-chloro-ethanol  80.52
56  1,3-cyclohexadiene  80.14
57  1-Hexyne  82.15
58  1,4-dichloro-butane  127.03
59  Ethanoic acid  60.05
60  1,3-dichloro-propane  112.99
61  2-chloro-2-methyl-propane  92.57
62  m-chloro-nitrobenzene  157.56
63  p-chloro-nitrobenzene  157.56
64  1,3-cyclopentadiene  66.10
65  1,3-butadiene  54.09
66  4-chloro-phenol  128.56
67  1,3-cyclohexadiene  80.14
68  Phenyl-methanol  108.15
69  Acetophenone  120.16
70  p-fluoro-nitrobenzene  141.10
Table 2. Results of predictions of molecular weights.
Compound Name  Experimental M.W. (u)  Predicted M.W. (u)
RUN 1:
Propene  42.08  40.833
Pyridine  79.10  81.36
Butanal  72.12  73.10
1,3-cyclopentadiene  66.10  62.32
RUN 2:
Propyne  40.07  40.85
p-chloro-toluene  126.59  127.3
1,3-cyclohexadiene  80.14  85.82
p-fluoro-nitrobenzene  141.10  133.59
RUN 3:
Chloro-ethane  64.52  63.20
Furan  68.08  73.50
Propadiene  40.07  38.70
Phenyl-methanol  108.15  112.88
RUN 4:
Fluoro-ethane  48.06  47.90
Cyclopentene  68.13  71.11
Isoquinoline  129.16  135.72
1-Hexyne  82.15  78.29
RUN 5:
o-xylene  106.17  113.8
1-nitro-propane  89.09  99.35
1,3-dichloro-propane  112.99  96.06
p-fluoro-nitrobenzene  141.10  128.5
RUN 6:
Ethanal  44.05  42.10
Butanal  72.12  72.70
m-fluoro-toluene  110.13  103.73
2-chloro-ethanol  80.52  83.60
Table 3. Physical properties of the organic compounds.
No.  Compound Name  M.P. (°C)  B.P. (°C)  D (g/cc)  R.I.  D.M. (debyes)
1Dichloro-fluoro-methane−135.09.01.405 91.3724 91.29
2Dibromo-methane−52.5597.02.49701.5420 901.43
3Nitro-methane−28.50100.81.13711.3817 203.46
4Pentachloro-ethane−29.00162.01.67961.5025 200.92
5Chloro-ethylene−153.8−13.40.19061.3700 201.45
6Ethanal−121.020.800.78 181.3316 202.69
7Chloro-ethane−136.412.270.89781.3676 202.05
8Fluoro-ethane−143.2−37.70.00221.2656 201.94
9Iodo-ethane−108.072.301.93581.5133 201.91
10Acetyl amine82.30221.20.99 851.4278 783.76 i
11Dimethyl sulfoxide18.45189.01.10141.4770 203.96
12Dimethyl-amine−93.007.400.680 01.3500 171.03
13Propyne−101.5−23.20.7 −501.386 −400.78
142-chloro-propene−137.422.650.90171.3973 201.66
15Propene−185.2−47.40.51931.357 −700.37
162,2-dichloro-propane−33.8069.301.11201.4148 202.27
171-propanol−126.597.400.80351.3850 201.68 i
18Trimethyl-amine−117.22.870.671 01.3631 00.61
19Furan−85.6531.360.95141.4214 20.66
20Thiophene−38.2584.161.06491.5289 200.55
211,2-butadiene−136.210.850.676 01.421 1.30.40
22Butanal−99.0075.700.81701.3843 202.72 i
23Cyclopentene−135.144.240.77201.4225 200.20
24Pyridine−42.00115.50.98191.5095 202.19
25Bromo-benzene−30.82156.01.49501.5597 201.70
26Nitro-benzene5.7210.81.20371.5562 204.22
27Phenol43.070.861.05761.5418 411.45
28p-chloro-toulene7.5162.01.06971.5150 202.21
29Toulene−95.0110.60.86691.4961 200.36
30o-xylene−25.18144.40.88021.5055 200.62
31Dibutyl-ether−95.30142.00.76891.3992 201.17 i
32Quinoline−15.60238.11.09291.6268 202.29
33Isoquinoline26.50243.31.09861.6148 202.73
34Phenyl-benzene71.00255.90.86601.5880 770.00
35Tribromo-methane8.30149.52.88991.5976 200.99
36Iodo-methane−66.4542.402.27901.5382 201.62
37Ethanethiol−144.435.000.83911.4310 201.58 i
38Propanone−95.3556.200.78991.3588 202.88
39Butane−138.4−0.500.601 01.354 −19<0.05
40Dipropyl-ether−12291.000.73601.3809 201.21 i
41Fluoro-methane−141.8−78.40.8 −601.1727 201.85
421,1-dichloro-ethane−16.9857.281.17571.4164 202.06
431,1-difluoro-ethane−117.0−24.70.95001.301 −722.07
442-propanol−89.5082.400.78551.3776 201.66 i
451-nitro-propane−108.0130.51.01 241.4016 203.56 i
462-chloro-propane−117.235.740.86171.3777 202.17
47Aniline−6.30184.11.02171.5863 201.53
48Butanal−99.075.70.81701.3843 202.72 i
49m-dichloro-benzene−24.7173.01.28841.5459 201.72
50m-fluoro-toulene−87.7116.00.99861.4691 201.86
51Ethane−183.3−88.60.57201.0377 00.00
52Propadiene−136.0−34.51.78701.41680.00
53Propene−185.3−47.40.51931.357 −700.37
54Acetylene−80.8−84.00.6 −321.0005 00.00
552-chloro-ethanol−67.5128.01.20021.4419 201.78 i
561,3-cyclohexadiene−89.080.500.84051.4755 200.44
571-Hexyne−131.971.300.71551.3989 200.83 i
581,4-dichloro-butane−37.3153.91.14081.4542 202.22 i
59Ethanoic acid16.604117.91.04921.3716 201.74
601,3-dichloro-propane−99.5120.41.18781.4487 202.1 i
612-chloro-2-methyl-propane−25.452.00.84201.3857 202.13
62m-chloro-nitrobenzene24.00235.01.34 501.5374 803.73
63p-chloro-nitrobenzene83.6242.01.3 901.538 1002.83
641,3-cyclopentadiene−97.240.000.80211.4440 200.42
651,3-butadiene−108.91−4.410.62111.429 −250.00
664-chloro-phenol43.20219.81.27 401.5579 402.11
671,3-cyclohexadiene−89.080.50.84051.4755 200.44
68Phenyl-methanol−15.3205.31.04191.5396 201.71
69Acetophenone20.5202.01.02811.5372 203.02
70p-fluoro-nitrobenzene27.0206.01.33001.5316 202.87