Article

Prediction of Surface Roughness of 304 Stainless Steel and Multi-Objective Optimization of Cutting Parameters Based on GA-GBRT

1 School of Mechanical Engineering, Guizhou University, Guiyang 550025, China
2 School of Mining and Civil Engineering, Liupanshui Normal College, Liupanshui 553004, China
3 Key Laboratory of Advanced Manufacturing Technology of Ministry of Education, Guizhou University, Guiyang 550025, China
* Author to whom correspondence should be addressed.
Appl. Sci. 2019, 9(18), 3684; https://doi.org/10.3390/app9183684
Submission received: 16 August 2019 / Revised: 16 August 2019 / Accepted: 28 August 2019 / Published: 5 September 2019

Abstract

Establishing and controlling a prediction model of machined surface quality is the basis of sustainable manufacturing. An ensemble learning algorithm, the gradient boosting regression tree, is incorporated into surface roughness modeling. To address the high time cost and the tendency to fall into local optima when grid search or the conjugate gradient method is used to obtain the hyperparameters of the ensemble learning algorithm, a genetic algorithm is employed to search for the optimal hyperparameters during training, and a genetic-gradient boosting regression tree (GA-GBRT) algorithm is developed. The goodness of fit is taken as the fitness function of the genetic algorithm and combined with k-fold cross-validation to optimize the initial model parameters of the gradient boosting regression tree. The GA-GBRT model is compared with optimized artificial neural network (ANN) and support vector regression (SVR) models on cutting experiments of 304 stainless steel with a micro-groove tool, and a genetic algorithm multi-objective optimization model targeting the highest cutting efficiency and the best surface quality is constructed from the GA-GBRT model. The response relationship reveals the non-linear interaction between the cutting parameters and the surface roughness of 304 stainless steel machined by the micro-groove tool. The multi-objective optimization results indicate that cutting efficiency can be enhanced by increasing the cutting speed and depth of cut within a small range of surface quality variation. The GA-GBRT model is validated as reliable for predicting surface roughness and optimizing cutting parameters with turning and milling data.


1. Introduction

Surface roughness is regarded as a significant indicator of the integrity of machined surfaces and has attracted widespread attention in recent decades. Nevertheless, owing to an incomplete understanding of the cutting mechanism, achieving the desired surface roughness in practice still relies, to a large extent, on human experience. Surface roughness can also be adjusted by special processing methods, such as low-temperature cutting and heat-assisted cutting, but these methods can cause processing distortion. In contrast, the processing error resulting from combining different processing parameters is significantly smaller than that introduced by such special processing methods [1]. An unreasonable selection of process parameters or tools not only increases production costs but also degrades surface quality. Artificial intelligence models make it possible to construct a nonlinear model of the process parameters and response values quickly. They also facilitate the optimization of process parameters to satisfy the production efficiency requirements of an enterprise, thereby controlling surface quality and extending the service life of parts [2]. Moreover, a prediction model of relatively high precision plays a significant role in exploring the influence of innovative tools and new processes on surface integrity [3].
At present, various advanced modeling techniques and simulation methods have been widely applied in material processing, such as multiple regression [4], the response surface method (RSM) [5], convolutional neural networks (CNN) [6], artificial neural networks (ANN) [7], support vector machines (SVM) [8], and the adaptive neuro-fuzzy inference system (ANFIS) [9]. Applying Grey–Taguchi-based response surface methodology (GT-RSM), Adalarasan et al. [10] optimized the plasma arc cutting parameters of 304 stainless steel. For high-speed machining (HSM) of stainless steel with a coated cemented carbide tool, Hamdan et al. [11] adopted the Taguchi optimization method to obtain a minimum cutting force and an excellent surface roughness, and analyzed the optimization process with the assistance of signal-to-noise ratio (S/N) response analysis and Pareto analysis of variance (ANOVA). Nevertheless, such methods achieve good accuracy only within a small range of process parameters [12,13]. Patole et al. [14] employed the response surface method to predict the surface roughness and cutting force of AISI 4340 under minimum quantity lubrication with nanofluids. Upadhyay et al. [15] constructed a multivariate regression model of surface roughness with a maximum prediction error of 7.45%. Polynomial-based regression methods can quantify the relationship between input parameters and response variables when a nonlinear relationship between the cutting parameters and surface roughness is established [16]. In recent years, however, supervised learning methods have achieved higher accuracy and better reliability in surface roughness prediction modeling [17]. Mia and Dhar [7] proposed an average surface roughness prediction model based on artificial neural networks and revealed the effect of high pressure coolant on surface roughness in hard turning. Kovac et al. [18] combined fuzzy logic and regression analysis to examine the influence of machining parameters on the surface roughness of milling. Xie et al. [8] incorporated the specific cutting energy consumption (SCEC) and took PSO-SVM as the prediction model, achieving high prediction accuracy. Jurkovic et al. [17] compared three machine learning methods (SVR, multiple regression, and ANN) for the high-speed turning process, in which multiple regression was identified as the best at predicting surface roughness; nevertheless, with a larger data set and a reasonably optimized structure, ANN becomes superior to multiple regression. In many cases, a model may be appropriate for a fixed data set but ill-suited to other data.
Many machine learning algorithms rely on a single learner, which makes them susceptible to over-fitting. To improve the overall generalization ability, researchers have therefore combined individual models into ensemble algorithms, typical examples being random forest (RF) and the gradient boosting regression tree (GBRT). Agrawal et al. [4] discovered that the random forest method is superior to the multiple regression model in predicting the surface roughness of turning. Abu-Mahfouz et al. [19] treated surface roughness prediction as a classification problem using SVM and compared it with k-nearest neighbors, decision trees, and random forest classifiers, finding that classifiers based on simplified features and SVM are effective. Models based on supervised regression methods, however, can capture the nonlinear interaction between cutting parameters and surface roughness more effectively. Pimenov et al. [20] employed the random forest method to establish a surface roughness model with tool wear, machining time, and cutting power as inputs, which maximized the machining efficiency of the tool; they concluded that random forest (RF) is more accurate than the standard multilayer perceptron (MLP) and regression trees. As an ensemble algorithm, GBRT combines multiple groups of basic learners to enhance robustness to noise; it removes the need to normalize the data during learning and minimizes the prediction error, its training is of low time complexity, and its prediction process is fast [21]. At present, the GBRT algorithm has been shown to be superior to least squares, neural networks, random forest, and other methods in various fields such as traffic [22], energy [23], agricultural engineering [24], and industry [25]. Despite this, little of the literature indicates that GBRT has been applied to the modeling of surface roughness.
As the complexity of modeling technology increases, models place higher requirements on both the convergence speed and the optimization ability of intelligent algorithms. The GBRT model contains several hyperparameters that affect its predictive performance. In a large number of studies, grid search (GS) or the conjugate gradient (CG) method has been used for parameter optimization. However, the GS algorithm produces good results only when the parameter range is sufficiently large and the step size is sufficiently small, and its run time is extremely long. The drawbacks of the CG algorithm are threefold: its optimization result depends heavily on the initial value, the number of iterations is difficult to determine, and it is prone to falling into a local optimum. Chandrasekaran et al. [26] reviewed a range of computational tools, such as artificial neural networks, fuzzy logic, genetic algorithms (GA), the ant colony algorithm (ACO), and particle swarm optimization (PSO), which can be applied to model and optimize machining processes according to their complexity. The emergence of evolutionary algorithms addresses the problem of hyperparameter optimization, effectively avoiding local optima and helping to ensure that a globally optimal combination of parameters is found.
The theoretical model relating a new process or a new cutting tool to the machined surface quality is complex and can only be constructed under idealized conditions. Zhang et al. [27] proposed a visual roughness measurement method based on transfer kernel learning to solve the roughness observation problem of ground surfaces. Surface roughness prediction and optimization also present a new challenge when high hardness steels are turned under minimum quantity coolant lubrication; Mia et al. [28] addressed this problem using the least squares support vector machine (LS-SVM) method. Li et al. [29] integrated six different machine learning algorithms to predict the surface roughness of three-dimensionally printed samples. In recent years, artificial intelligence has been applied to analyze and explore various new processes and technologies, owing to its superior performance in constructing relational models of inputs and outputs.
In the machining of ductile 304 stainless steel, the soft ferrite phase lengthens the tool-chip contact on the rake face, resulting in more friction. The resulting sharp rise in temperature severely affects the tool life and the surface quality of the workpiece [11]. Reasonable control of the contact characteristics of the rake face is therefore conducive to improving the tool life and the quality of the machined surface [30]. The newly designed micro-groove tool is based on the temperature field morphology: a new micro-groove, smaller in scale than an ordinary groove, is added to the original commercial grooved tool. As demonstrated by previous research and experiments on the machining of a variety of materials, the micro-groove tool can effectively reduce the cutting temperature, change the distribution of the cutting energy, and improve the tool life and machining efficiency [31,32]. The complex rake face shape changes the stress state of the deformation zone, which can reduce the plastic deformation energy (EPD) and potentially enhance the machinability (PMA). As indicated by Astakhov [33], high-performance tool geometries can be designed with the assistance of finite element models of metal cutting, and the design concept of the micro-groove tool in this paper conforms to that method. However, for a tool of a given shape, when one or more of the machining conditions change, a tool that originally produced desirable test results may no longer perform satisfactorily. Therefore, for new processes and new tools, analyzing the impact of the parameters and setting the cutting parameters reasonably can achieve good surface quality and a high material removal rate within a certain range [13].
To determine the impact of the self-developed micro-groove cutter on the surface roughness of 304 stainless steel and to guide the selection of cutting parameters, a surface roughness prediction model based on GA-GBRT is proposed in this paper. Firstly, the initial parameters of the model are optimized with a genetic algorithm to enhance the generalization ability of GBRT. Secondly, a comparison is drawn with the prediction results of existing ANN and SVR models, confirming the validity of the GA-GBRT roughness prediction. Finally, based on GA-GBRT, a genetic algorithm multi-objective optimization function is used to solve the optimization model, and the impact of the cutting parameters on the average surface roughness of AISI 304 cut with the micro-groove tool is investigated.

2. Research Methodology

2.1. Gradient Boosting Regression Tree

The gradient boosting regression tree is an ensemble algorithm with a strong learning strategy. Although boosting was originally aimed at classification problems, it has been successfully applied to regression [21]. The model is built additively:
F_m(x) = F_{m-1}(x) + h_m(x)    (1)
where h_m(x) represents the basis function of the weak learner. In GBRT, the basis function h_m is a small regression tree of fixed size, so the GBRT model F_m(x) can be considered the sum of m small regression trees. As illustrated in Figure 1, a new tree is generated at each iteration m. The simple tree is determined by the deviation (i.e., the gradient) between the experimental measurements and the prediction of all previous models, and the regression tree is then incorporated into the GBRT model.
The algorithm aims to estimate the response Y_{i,t+k} corresponding to the training set, which means that a better h_m needs to satisfy the formula:
F_m(x_{i,t}) = F_{m-1}(x_{i,t}) + h_m(x_{i,t}) = Y_{i,t+k}    (2)
This can be converted to
h_m(x_{i,t}) = Y_{i,t+k} - F_{m-1}(x_{i,t})    (3)
h_m in Equation (1) is a model for fitting the residual r_{m,i,t} = Y_{i,t+k} - F_{m-1}(x_{i,t}) at iteration m. The current residual can be expressed as the negative gradient of the squared-error loss function:
-\frac{\partial}{\partial F_{m-1}(x_{i,t})} \left[ \frac{1}{2} (Y_{i,t+k} - F_{m-1}(x_{i,t}))^2 \right] = Y_{i,t+k} - F_{m-1}(x_{i,t})    (4)
It can be concluded from Equation (4) that h_m fits the negative gradient of the squared loss function. In addition, Equation (4) shows that gradient boosting with the least-squares loss is a gradient descent algorithm; by substituting different loss functions and their gradients for the squared error, the method generalizes to other losses. Gradient boosting and gradient boosting regression trees are described in detail in the literature [21,34]. At each boosting iteration, a regression tree is fitted to the current residual, and adding enough regression trees to the final model can make the training error arbitrarily small. To avoid overfitting, a simple regularization strategy is adopted that scales the contribution of each regression tree by a factor V:
F_m(x) = F_{m-1}(x) + V h_m(x),  V ∈ [0, 1]    (5)
The parameter V is known as the learning rate, as it reduces the step size during gradient descent. The interaction between V and the number of iterations M is significant: a smaller V requires more iterations, and hence more basis functions, for the training error to converge. Empirical evidence suggests that a small V achieves better prediction accuracy. Hastie et al. [21] recommended setting the learning rate to a small value and stopping M early; the interaction between V and M is discussed in the literature [34].
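As a concrete illustration of the additive model in Equations (1) and (5), the short sketch below fits a gradient boosting regression tree with scikit-learn (the library whose default parameters are cited in Section 4.1); the data file name and column names are hypothetical placeholders, not the authors' actual files.

```python
# Minimal GBRT sketch using scikit-learn; the CSV name and column names are
# illustrative placeholders for the cutting-parameter data (v, f, ap -> Ra).
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

data = pd.read_csv("roughness_data.csv")              # hypothetical file
X = data[["v", "f", "ap"]].values                     # cutting speed, feed rate, depth of cut
y = data["Ra"].values                                 # average surface roughness

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=10, random_state=0)

# M = n_estimators, D = max_depth, V = learning_rate (the shrinkage factor in Equation (5))
gbrt = GradientBoostingRegressor(n_estimators=100, learning_rate=0.1,
                                 max_depth=3, random_state=0)
gbrt.fit(X_train, y_train)
print("test MSE:", mean_squared_error(y_test, gbrt.predict(X_test)))
```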

2.2. Artificial Neural Networks

The network structure is illustrated in Figure 2. The inputs are the feed rate, cutting speed, and depth of cut, and the output is the average surface roughness. In a neural network model, the number of nodes, the number of layers, and their relationship determine the prediction accuracy [7]. Therefore, in developing the ANN model, all candidate architectures must be constructed and tested, and the ANN structure is then selected according to the RMSE evaluation criterion. Neural networks with a 3-n-1 structure were trained using the default Levenberg–Marquardt (LM), Bayesian regularization (BR), and scaled conjugate gradient (SCG) algorithms in the toolbox, and the performance of each network was evaluated by RMSE.
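The architecture search described above can be emulated in Python as follows; note that scikit-learn's MLPRegressor does not provide the MATLAB LM, BR, and SCG trainers used in this study, so the sketch below (with L-BFGS and assumed hidden-node counts) is only an illustrative analogue of ranking 3-n-1 structures by cross-validated RMSE.

```python
# Illustrative 3-n-1 architecture comparison by cross-validated RMSE.
# L-BFGS replaces the LM/BR/SCG trainers of the MATLAB toolbox, which scikit-learn lacks.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPRegressor

def rmse_for_hidden_nodes(X, y, n_hidden):
    net = MLPRegressor(hidden_layer_sizes=(n_hidden,), activation="tanh",
                       solver="lbfgs", max_iter=5000, random_state=0)
    neg_mse = cross_val_score(net, X, y, cv=5, scoring="neg_mean_squared_error")
    return float(np.sqrt(-neg_mse.mean()))

# X (n_samples x 3: v, f, ap) and y (Ra) are assumed to be loaded already, e.g.:
# scores = {n: rmse_for_hidden_nodes(X, y, n) for n in (5, 10, 15)}
```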

2.3. Support Vector Regression

Although Vapnik [21] proposed the support vector machine (SVM), based on linear regression function estimation (as shown in Equation (6)), in the late 1960s, it is only recently that SVM has attracted more attention owing to its promising applications in data-driven fields. Capable of excellent supervised learning performance, SVM is a binary classification method based on multidimensional feature vectors. It was originally developed as a linear classifier, was later extended to nonlinear classification, and was eventually extended to regression problems.
f(x) = w \cdot \varphi(x) + b    (6)
f(x) is a linear regression function, where w denotes the slope of the regression and b its offset. \varphi(x) indicates a kernel-induced mapping that lifts the input space x into a higher-dimensional or infinite-dimensional feature space, and f(x) is obtained by minimizing Equation (7):
\frac{1}{2} w^T w + \frac{1}{n} \sum_{i=1}^{n} C(f(x_i), y_i)    (7)
where \frac{1}{2} w^T w describes the complexity of the model, C represents a penalty factor, and y and n refer to the target values and the number of samples, respectively.
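Equations (6) and (7) correspond to the following minimal epsilon-SVR sketch with an RBF kernel in scikit-learn; the feature scaling step and the parameter values are illustrative choices, not those reported later in Section 4.3.

```python
# Minimal RBF-kernel SVR sketch; scaling the inputs (v, f, ap) is a common, optional choice.
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

svr = make_pipeline(StandardScaler(),
                    SVR(kernel="rbf", C=1e3, gamma=0.1, epsilon=0.01))
# svr.fit(X_train, y_train)
# y_pred = svr.predict(X_test)
```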

3. Microgroove Tool Preparation and Data Collection

3.1. Microgroove Tool Preparation

As shown in Figure 3a, a simulation experiment was first conducted in the 3D cutting simulation module of the material forming software (DEFORM V11.2). In the finite element simulation, the temperature field on the rake face is the focus of this paper. The initial temperature was set to 20 °C, and, in order to reach the steady-state temperature quickly and extract the temperature field, the heat transfer coefficient was set to 2000 W/(m² K). Friction is a significant factor affecting the cutting force, cutting temperature, and specific cutting energy. This paper applies the simple shear friction model of Deform-3D [35], \tau = \mu \tau_0, where \tau is the shear stress, \mu the friction coefficient, and \tau_0 the shear yield stress. Because of the low thermal conductivity of difficult-to-machine materials, ductile fracture and saw-toothed chips form under certain strain-stress conditions [36,37,38,39,40,41], for which Deform's default normalized Cockcroft and Latham [42] fracture criterion is applied in this paper: \int_0^{\bar{\varepsilon}_f} (\sigma^* / \bar{\sigma})\, d\bar{\varepsilon} = C, where \sigma^* is the maximum principal tensile stress, \bar{\varepsilon}_f the fracture strain, and C a material constant; the effective stress and effective strain are denoted by \bar{\sigma} and \bar{\varepsilon}, respectively. To avoid a separation criterion and severe mesh distortion near the edge radius, the Lagrangian remeshing technique is applied. The flow stress model is taken from the 304 stainless steel (machining-AMTC) data in the DEFORM-3D material library.
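For reference, the normalized Cockcroft–Latham criterion used above can be evaluated numerically from a stress–strain history by the trapezoidal rule; the arrays in the sketch below are hypothetical placeholders rather than simulation output.

```python
# Numerical evaluation of the normalized Cockcroft-Latham damage integral
# D = integral_0^eps_f (sigma_max / sigma_eff) d(eps_bar); fracture is predicted when D >= C.
import numpy as np

eps_bar   = np.linspace(0.0, 1.2, 200)           # effective plastic strain history (illustrative)
sigma_max = np.full_like(eps_bar, 900.0)         # maximum principal tensile stress, MPa (illustrative)
sigma_eff = 800.0 + 150.0 * eps_bar              # effective (flow) stress, MPa (illustrative)

ratio = sigma_max / sigma_eff
damage = np.sum(0.5 * (ratio[1:] + ratio[:-1]) * np.diff(eps_bar))   # trapezoidal rule
print("normalized Cockcroft-Latham damage:", round(float(damage), 3))
```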
In addition, the temperature field data and coordinate data of the region of highest rake face temperature were extracted in post-processing. The filtered temperature field point cloud was reverse-located on the MATLAB data processing platform, and the spatial position of the point cloud in the initial state of the tool was restored, thus determining a reasonable temperature boundary. The data file was imported into the UG 3D modeling platform, and the temperature field mesh surface was obtained according to the non-uniform rational B-spline (NURBS) surface modeling principle to establish the micro-groove geometry. Owing to the high concentration of mechanical and thermal loads near the cutting edge, the tool-chip interface is subjected to tremendous pressure and high temperature; accordingly, the shape of the micro-groove was determined and its strength was checked in ANSYS. Finally, the micro-groove tool [31,32] was produced by powder compaction. Figure 3b,c show 500× magnified micro-morphologies of the flank wear area of the conventional and micro-groove tools. As demonstrated by the results, the flank wear of the conventional tool is more severe than that of the micro-groove tool, and the adhesion of micro-chips formed on the flank of the conventional tool is more obvious.

3.2. Surface Roughness Data Collection

A series of dry turning tests were conducted on a C2-6136 HK CNC lathe. The micro-groove cutter was prepared on the basis of the 304 stainless steel cutting tool supplied by Zigong Cemented Carbide Co., Ltd.; the preparation process and the geometry of the tool rake face are illustrated in Figure 3a, and the working angles of the tool geometry are listed in Table 1. The coating material was AlCrN. The workpiece was a 42 mm diameter AISI 304 stainless steel bar. Table 2 presents the basic performance parameters of the tool material and the workpiece material, and Table 3 gives the chemical composition of the workpiece. The turning test platform and the roughness testing process are shown in Figure 4.
As shown in Figure 4, the surface roughness of the AISI 304 stainless steel bar was measured with a MAHR benchtop probe roughness test platform, and the roughness data in Table 4 were obtained. Measurements were taken at three separate locations spaced approximately 120° apart, and the average roughness value was recorded; any anomalous reading was excluded from the average. For the surface roughness analysis, the experimental cutting parameters were designed so that the cutting speed v ranged from 90 m/min to 210 m/min, the feed rate f from 0.05 mm/rev to 0.17 mm/rev, and the cutting depth ap from 1 to 2 mm.

4. Model Prediction Results and Analysis

4.1. GA-GBRT Model

From the perspective of artificial intelligence, the task of predicting surface roughness can be transformed into a supervised regression problem. The input variables are the cutting parameters (cutting speed v, feed rate f, and depth of cut ap), and each sample has a corresponding target value (surface roughness Ra). The surface roughness regression problem based on GBRT is addressed by the following procedure:
Input: training dataset T = {(x_1, y_1), (x_2, y_2), ..., (x_N, y_N)}, x_i ∈ X ⊆ R^n, y_i ∈ Y ⊆ R, i = 1, 2, ..., N; maximum number of iterations M; loss function L(y, f(x)).
Output: regression tree f(x).
1. Initialization: f_0(x) = arg min_c Σ_{i=1}^{N} L(y_i, c)
2. For m = 1, 2, ..., M:
 2.1 For i = 1, 2, ..., N, compute the negative gradient r_{mi} = -[∂L(y_i, f(x_i)) / ∂f(x_i)]_{f(x) = f_{m-1}(x)}
 2.2 Fit a regression tree to r_{mi} to obtain the leaf node regions R_{mj}, j = 1, 2, ..., J, of the m-th tree
 2.3 For j = 1, 2, ..., J: c_{mj} = arg min_c Σ_{x_i ∈ R_{mj}} L(y_i, f_{m-1}(x_i) + c)
 2.4 Update the model: f_m(x) = f_{m-1}(x) + Σ_{j=1}^{J} c_{mj} I(x ∈ R_{mj})
3. Output the final model: f(x) = f_M(x) = Σ_{m=1}^{M} Σ_{j=1}^{J} c_{mj} I(x ∈ R_{mj})
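For the squared-error loss, where the negative gradient in step 2.1 is simply the residual, the listing above reduces to the short loop below; this is an illustrative re-implementation built on scikit-learn decision trees, not the authors' code.

```python
# Minimal from-scratch gradient boosting for squared-error loss, mirroring steps 1-3 above.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def gbrt_fit(X, y, M=100, V=0.1, max_depth=3):
    f0 = float(np.mean(y))                       # step 1: constant minimizing the squared loss
    pred, trees = np.full(len(y), f0), []
    for _ in range(M):                           # step 2: boosting iterations
        residual = y - pred                      # 2.1: negative gradient of the squared loss
        tree = DecisionTreeRegressor(max_depth=max_depth).fit(X, residual)  # 2.2-2.3
        pred += V * tree.predict(X)              # 2.4: shrunken update, as in Equation (5)
        trees.append(tree)
    return f0, trees

def gbrt_predict(model, X, V=0.1):
    f0, trees = model
    return f0 + V * sum(tree.predict(X) for tree in trees)
```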
Although the GBRT method can automatically identify nonlinear interactions through decision tree learning, and its success is reported in many studies, it has several drawbacks. Firstly, GBRT has no explicit regularization; a regularization effect is achieved through the learning rate, the number of iterations, and the maximum depth, but the interaction of these parameters in regularization is unclear. Secondly, the learning rate acts as an implicit regularizer through the step size, and avoiding over-fitting may require a smaller step size, yet a small learning rate makes the computation expensive. Thirdly, regression tree learning is treated as a black box. In this paper, a genetic algorithm is used to optimize the model parameters on the training data, improving the generalization ability of the GBRT model while achieving higher test accuracy.
Figure 5 presents the schematic diagram of the GA-GBRT prediction model. Based on the prediction accuracy of GBRT, the appropriate initial parameters (number of boosting iterations M, maximum depth D of the individual regression estimators, and learning rate V) are determined by combining a genetic algorithm with k-fold cross-validation. The performance evaluation index is the goodness of fit R^2 between the predicted values ŷ_i of GBRT and the observed values y_i. As indicated in Figure 6, the k-fold cross-validation strategy is adopted to construct and validate the model. In this paper, to prevent over-fitting of the prediction algorithm, k is set to 5, which also keeps the computing time low. The 50 groups of training data were split into five groups at random; in each fold, four sets of data were used for training and the remaining group for testing. The average over all k folds was taken as the fitness value of the GBRT model for selecting the optimal parameters (M, D, V), where R^2 is defined as follows:
R^2 = \frac{\sum_{i=1}^{n} (y_i^{fit} - \bar{y})^2}{\sum_{i=1}^{n} (y_i - \bar{y})^2}    (8)
where y_i^{fit} represents the fitted value of the i-th target. R^2 ranges between 0 and 1, with values close to 1 indicating a superior performance of the regression model.
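Note that Equation (8) is the explained-variance form of the goodness of fit, which differs from the 1 − SS_res/SS_tot definition used by scikit-learn's r2_score; a direct implementation of the paper's definition is sketched below.

```python
# Goodness of fit exactly as defined in Equation (8).
import numpy as np

def goodness_of_fit(y_obs, y_fit):
    y_bar = np.mean(y_obs)
    return float(np.sum((y_fit - y_bar) ** 2) / np.sum((y_obs - y_bar) ** 2))
```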
Of the collected data, 50 groups were selected as the training set and 10 groups as the test set. The use of the GA-GBRT prediction module is illustrated in Figure 5. The main steps were as follows (a code sketch of this loop is given after the list):
Step 1: The turning experimental data was collected, and the training group data was used to train the key parameters of the GBRT model.
Step 2: Parameter coding and population initialization: a chromosome sequence was randomly generated for the number of boosting iterations M, the maximum depth D of the individual regression estimator, and the learning rate V. The initial population was set to 30 and the maximum number of generations to 50.
Step 3: The fitness value of the goodness of fit of each individual was calculated.
Step 4: If the number of cycles failed to reach the maximum number of iterations, the population was selected, crossed, and mutated to produce a new generation of populations, and the GBRT model training continued.
Step 5: Steps 3 and 4 were repeated until the maximum number of generations was reached or the maximum number of iterations was exceeded, yielding the optimal model parameters.
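A compact sketch of Steps 1-5 is given below: a simple genetic algorithm searches over (M, D, V) with the 5-fold cross-validated goodness of fit as the fitness. The search bounds, operator probabilities, and the use of scikit-learn's built-in r2 scorer (instead of Equation (8)) are assumptions of this illustration, not details reported by the authors.

```python
# Illustrative GA over (M, D, V) with 5-fold cross-validated fitness; not the authors' code.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
BOUNDS = [(10, 200), (1, 6), (0.01, 0.3)]        # assumed search ranges for M, D, V

def random_individual():
    return [rng.integers(*BOUNDS[0]), rng.integers(*BOUNDS[1]), rng.uniform(*BOUNDS[2])]

def fitness(ind, X, y):
    M, D, V = int(ind[0]), int(ind[1]), float(ind[2])
    model = GradientBoostingRegressor(n_estimators=M, max_depth=D,
                                      learning_rate=V, random_state=0)
    return cross_val_score(model, X, y, cv=5, scoring="r2").mean()

def ga_search(X, y, pop_size=30, generations=50, p_mut=0.2):
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        scores = np.array([fitness(ind, X, y) for ind in pop])
        parents = [pop[i] for i in np.argsort(scores)[-pop_size // 2:]]        # selection
        children = []
        while len(children) < pop_size:
            a, b = rng.choice(len(parents), size=2, replace=False)
            child = [parents[a][k] if rng.random() < 0.5 else parents[b][k]    # uniform crossover
                     for k in range(3)]
            if rng.random() < p_mut:                                           # mutation
                k = rng.integers(3)
                child[k] = random_individual()[k]
            children.append(child)
        pop = children
    best = max(pop, key=lambda ind: fitness(ind, X, y))
    return {"M": int(best[0]), "D": int(best[1]), "V": float(best[2])}
```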
The GA-GBRT algorithm relies on different combinations of the three main parameters (M, D, V). The maximum fitness is reached at the 20th generation, where the best individual attains R^2 = 0.968 with the optimal parameters M = 82, D = 1, and V = 0.14; the number of iterations is lower than the default value of 100. As mentioned above, terminating the iterations early yields higher accuracy, which conforms to the previous analysis. The default parameters recommended by scikit-learn are M = 100, D = 3, and V = 0.1, with least squares as the default loss function. The iteration error of the default-parameter model on the training and test sets is shown in Figure 7a, and that of the optimized model in Figure 7b. As the figure reveals, the unoptimized model converges poorly on the test set, with an MSE of 0.0097, whereas the optimized model behaves on the test set as it does on the training set, with the MSE reaching 0.0076. Therefore, after optimization by the genetic algorithm, the generalization performance of the algorithm is improved compared with the default-parameter GBRT. The parameters obtained by training are fed into the GBRT model to obtain the GBRT predictor, and the surface roughness of the test group is predicted, as indicated in Figure 8. The error increases with the feed, since a higher feed increases machine tool vibration, which not only increases the surface roughness but also destabilizes the experimental error. To reduce this error, each roughness value was measured three times and averaged in the experiment, which is also the main means of reducing this error source when surface roughness is measured in the literature [5,6,7].

4.2. Parameter Optimization of ANN Model

Neural network models with three different structures and three different training methods were developed. As shown in Table 5, the accuracy of these models varies; therefore, for this particular case of average surface roughness prediction, the optimal structure and the best training algorithm were selected, and the corresponding error analysis of the prediction results was conducted. According to the RMSE selection criterion, the 3-10-1 network trained with BR was the preferred architecture and training algorithm for surface roughness prediction. The comparison of the three algorithms is consistent with the results obtained by Mia [7]. The surface roughness of the 10 sets of test data was predicted using BR, as shown in Figure 9. The prediction error increased with the feed, which was caused by the cutting force and machine tool vibration.

4.3. Parameter Optimization of SVR Model

As the machining process is non-linear, a radial basis function (RBF) kernel is selected to enhance the performance of the support vector regression model. The parameter gamma can be treated as the reciprocal of the radius of influence of the samples selected as support vectors. A lower C makes the decision surface smoother, while a higher C gives the model the freedom to select more samples as support vectors. Grid search is a relatively simple approach to determining the gamma and C values of the SVR model. Figure 10 presents a heat map of the goodness of fit determined by cross-validation, showing the effect of gamma and C on the radial basis function; it can be seen that the smaller training errors follow an irregular band-like distribution. With the upper and lower limits of the search (the search interval) and the transition interval set, a set of values for the most accurate gamma and C can be found. The kernel coefficient gamma of the radial basis function is searched in [1 × 10−6, 10], and the penalty parameter C of the error term in [0, 1 × 106]. The grid search yields the optimal parameters gamma = 1 × 10−5 and C = 1 × 106, at which the goodness of fit (R^2) reaches 0.840. Ten sets of test data were predicted using SVR, as shown in Figure 11; the prediction error is evident at both small and large feeds.
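The grid search described here can be reproduced with scikit-learn's GridSearchCV; the grid below is an illustrative discretization of the quoted gamma and C ranges, not the exact grid used in the paper.

```python
# Grid search over gamma and C for the RBF-kernel SVR using 5-fold cross-validated R^2.
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVR

param_grid = {"gamma": np.logspace(-6, 1, 8),    # spans [1e-6, 10]
              "C": np.logspace(0, 6, 7)}         # spans [1, 1e6]
search = GridSearchCV(SVR(kernel="rbf"), param_grid, cv=5, scoring="r2")
# search.fit(X_train, y_train)
# print(search.best_params_, search.best_score_)
```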

4.4. Comparison and Analysis of Prediction Performance

Figure 12 compares the correlation between predicted and measured values for the models trained by the GA-GBRT, BR, and SVR algorithms on the test set, where the correlation is quantified by the correlation coefficient R. The solid line denotes the fitted values, and the broken line indicates where the actual value equals the predicted value. R ranges between 0 and 1: a value of 1 signifies complete correlation, a value of 0 indicates no relationship between the measured and predicted values, and values close to 1 indicate a close correlation. As revealed by Figure 12, on the 10 sets of test data the R value of GA-GBRT is 0.9922, that of BR is 0.9785, and that of SVR is 0.9811, so the models of all three algorithms are reliable. Basheer et al. [43] used a neural network to predict the average surface roughness and obtained an R value of 0.977, which is basically consistent with the BR model here. GA-GBRT produces the best fit, with R reaching 0.9922.
The performance of the models was assessed using four evaluation indexes, namely MAPE, RMSE, CV, and MAD. CV is defined as the ratio of the root mean square error to the mean and is expressed by Equation (9):
CV = \frac{\sqrt{\sum_{i=1}^{N} (y_i - \hat{y}_i)^2 / N}}{\bar{y}} \times 100    (9)
MAPE, the mean absolute percentage error, is calculated by Equation (10) and evaluates the accuracy of the regression model. RMSE, the root mean square error, is given by Equation (11), and MAD, the mean absolute deviation, by Equation (12).
MAPE = \frac{1}{N} \sum_{i=1}^{N} \frac{|y_i - \hat{y}_i|}{y_i} \times 100    (10)

RMSE = \sqrt{\frac{1}{N} \sum_{i=1}^{N} (y_i - \hat{y}_i)^2}    (11)

MAD = \frac{1}{N} \sum_{i=1}^{N} |\hat{y}_i - y_i|    (12)
where ŷ_i and y_i represent the predicted value and the actual observed value, respectively, and ȳ and N indicate the average of the observed values and the total number of samples, respectively. In this paper, RMSE is taken as the major evaluation indicator; if the RMSE does not distinguish the models statistically, the other three indicators are considered.
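For completeness, Equations (9)-(12) translate directly into the following helper functions (y_obs and y_pred are assumed to be one-dimensional arrays of measured and predicted Ra values):

```python
# Direct implementations of the evaluation indexes in Equations (9)-(12).
import numpy as np

def rmse(y_obs, y_pred):
    return float(np.sqrt(np.mean((y_obs - y_pred) ** 2)))

def cv_percent(y_obs, y_pred):
    return float(rmse(y_obs, y_pred) / np.mean(y_obs) * 100)

def mape(y_obs, y_pred):
    return float(np.mean(np.abs(y_obs - y_pred) / y_obs) * 100)

def mad(y_obs, y_pred):
    return float(np.mean(np.abs(y_pred - y_obs)))
```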
The results of the four evaluation indicators for the five prediction models are presented in Figure 13, which shows that GA-GBRT attains the minimum value of all four indexes, MAD, MAPE, RMSE, and CV. Therefore, GA-GBRT exhibits the highest prediction accuracy, followed by BR, SVR, LM, and SCG. Because GBRT combines multiple sets of basic learners, it is robust to noise in the training data, the prediction error is minimized, and the data do not need to be normalized during training. In this paper, GA is applied to optimize the initial model parameters of GBRT; as indicated in Figure 7, the optimized model shows strong generalization performance on both the training set and the test set. Its prediction accuracy is superior to that of the other four machine learning algorithms, which makes it better suited to the prediction and optimization of surface roughness.
Table 6 compares the evaluation indicators of the five prediction models on a published milling surface roughness data set [9]. The RMSE of the GA-GBRT prediction model reaches 0.0889, followed by the BR model. The results of the five algorithms on the experimental data and the literature data are consistent, which suggests that all five algorithms are broadly applicable to the prediction of surface roughness, while GA-GBRT achieves the best accuracy.

4.5. Multi-Objective Optimization of Cutting Parameters Based on GA_GBRT

Surface roughness has a major impact on part surface performance, including wear resistance, fatigue strength, lubrication, friction, and optical properties. Artificial intelligence models make it possible to construct high-precision prediction models quickly, and they are conducive to optimizing process parameters so that enterprise production efficiency requirements are met while the surface quality is controlled and the service life of parts is extended. The higher the model accuracy, the more reliable the optimized surface roughness value.
The genetic algorithm is considered ideal for cutting parameter optimization owing to its excellent global optimization ability. To determine the cutting parameters that minimize surface roughness, the GA-GBRT model is written as R_a = f(v, f, a_p) = f(x), and the optimization variable lies in the Euclidean space x = (v, f, a_p) ∈ E^3. The feasible region is:
R_1 = { x ∈ E^3 | 90 ≤ v ≤ 210, 0.05 ≤ f ≤ 0.18, 1 ≤ a_p ≤ 2 }
The material removal rate to be maximized is MRR = 1000 v f a_p; the multi-objective optimization function is then
min(R_a) = f_1(v, f, a_p) = f_1(x)
max(MRR) = f_2(v, f, a_p) = f_2(x)
x ∈ E^3
With the highest cutting efficiency and the best surface quality as the two objectives, a genetic algorithm multi-objective optimization routine is applied to solve the model. As shown in Figure 14, the Pareto optimal front shows an overall trend in which the roughness value continually increases as the material removal rate rises. In area A, the material removal rate increases significantly with a rise in roughness; a lower roughness can be obtained in this area and the surface precision of the workpiece is improved, but the material removal rate, and hence the cutting efficiency, is excessively low. The recommended parameters for this case are shown in area A of Table 7. In region C, the material removal rate changes relatively slowly while the roughness grows more quickly; a high material removal rate and hence a high cutting efficiency can be maintained, but the roughness value is large. The recommended parameters are shown in area C of Table 7. In region B, the material removal rate rises almost linearly with roughness, and the surface roughness and material removal rate reach desirable values simultaneously; the recommended parameters are shown in area B of Table 7.
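The bi-objective problem of minimizing Ra while maximizing MRR can also be posed with an off-the-shelf multi-objective GA; the sketch below uses NSGA-II from the third-party pymoo library (0.6-style imports) wrapped around a fitted roughness model, as an illustrative alternative to the genetic algorithm toolbox used by the authors. The name gbrt refers to a previously fitted GA-GBRT predictor and is an assumption of this sketch.

```python
# Bi-objective cutting-parameter optimization (min Ra, max MRR) with NSGA-II from pymoo.
import numpy as np
from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.core.problem import ElementwiseProblem
from pymoo.optimize import minimize

class CuttingProblem(ElementwiseProblem):
    def __init__(self, model):
        super().__init__(n_var=3, n_obj=2,
                         xl=np.array([90.0, 0.05, 1.0]),     # lower bounds on v, f, ap
                         xu=np.array([210.0, 0.18, 2.0]))    # upper bounds on v, f, ap
        self.model = model

    def _evaluate(self, x, out, *args, **kwargs):
        v, f, ap = x
        ra = self.model.predict([[v, f, ap]])[0]             # objective 1: surface roughness
        mrr = 1000.0 * v * f * ap                            # objective 2: material removal rate
        out["F"] = [ra, -mrr]                                # pymoo minimizes, so MRR is negated

# res = minimize(CuttingProblem(gbrt), NSGA2(pop_size=50), ("n_gen", 100), seed=1)
# pareto_Ra, pareto_MRR = res.F[:, 0], -res.F[:, 1]
```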
The optimal solutions for the three variables in regions A, B, and C are shown in Table 7. Combined with the main effect analysis in Figure 15, it can be concluded that the cutting speed exerts little effect on the surface roughness. When the required surface roughness is low, a higher cutting speed can therefore be selected to maximize cutting efficiency. The influence of the cutting depth on the surface roughness is similar to that of the cutting speed, so the cutting depth can be increased to improve cutting efficiency while keeping the surface quality within a small range of variation.

4.6. Effect of Cutting Parameters on Surface Roughness

The response relationship between the average surface roughness and the cutting speed, feed rate, and depth of cut was analyzed. The three three-dimensional plots in Figure 16a–c show the roughness response under dry cutting conditions.
When the feed rate is below 0.1 mm/rev, the surface of AISI 304 stainless steel machined by the micro-groove tool has a smoother finish. According to previous research, the temperature [31] and energy distribution [32] of the micro-groove tool during cutting are improved, so the cutting force and friction coefficient are reduced. Cutting force and friction cause tool vibration, which is one of the leading causes of roughness increase [44]; therefore, the reduced friction coefficient improves the average surface roughness. The micro-groove design proposed in this paper is based on the temperature field morphology obtained by the finite element method, and the rake face shape is designed for the given workpiece material and cutting parameters, so the machined surface roughness can be improved within a certain range of parameters. As reported in the literature [30], cutting tools with different rake face structures influence the surface quality differently; Gurbuz explained that an improved rake face structure can reduce the energy consumed in the cutting process, and the resulting reduction in cutting force is conducive to improving the machined surface quality within certain cutting parameters.
On the contrary, when the feed rate exceeds 0.1 mm/rev, the surface roughness produced by the micro-groove tool deteriorates while the traditional turning tool performs better. As indicated by Astakhov [33], cutting tools with satisfactory test results may not perform well under different machining conditions. Figure 16 shows that the micro-groove tool yields a lower surface roughness at low feeds, and finishing operations are mostly performed at low feed rates [26]. In this sense, the micro-groove tool is suitable for AISI 304 stainless steel parts requiring a higher surface finish.
As clearly revealed by Figure 16b, the feed rate has a notable effect on the surface roughness. A high feed rate generates helicoid marks on the machined surface, and increasing the feed rate reduces the dynamic stability of the cutting process and degrades the surface quality. The theoretical relationship between the average surface roughness and the feed rate also shows that, for a fixed tool nose radius, the average surface roughness increases with the square of the feed rate. These results conform to prior studies [7]. Of the three parameters, the feed rate exerts the most significant effect on the surface roughness. On the other hand, Hamdan [11] analyzed the impact of cutting parameters on the machining of 304 stainless steel by Pareto ANOVA and found that the cutting speed and feed have only an insignificant effect on the surface roughness. The variance analysis of the surface roughness of machined stainless steel by Zerti [45] showed that the contribution of the feed to the surface roughness is about 80.71%, while that of the cutting depth is only 2.99%, and the effect of the other cutting parameters is not significant. From Figure 16b,c, it can be seen that the surface roughness of the workpiece first declines gradually and then increases with the depth of cut, although the change is insignificant; the effect of the cutting speed on surface roughness is limited as well.

5. Conclusions

To construct a surface roughness prediction model for dry cutting of AISI 304 stainless steel with a self-developed micro-groove cutter and to optimize the cutting parameters, a surface roughness prediction method based on GA-GBRT is proposed. The cutting speed v, the feed rate f, and the cutting depth ap are taken as model variables to predict the surface roughness. The optimized GBRT model is compared against the optimized ANN (LM, BR, and SCG) and SVR on the test set. The following conclusions are drawn:
  • Compared with the GBRT model with default parameters, the combination of GA and GBRT improved the generalization performance on both the training and test data sets. The experimental results show that the GA-GBRT model achieves the minimum RMSE of 0.087, and the other three indicators also evidence the superior predictive performance of the model. It is thus concluded that GA-GBRT is a reasonable and promising surface roughness prediction method, and the model can also be applied to milled, drilled, and ground surfaces.
  • Based on the established GA-GBRT model, the Pareto optimal solutions with the highest cutting efficiency and the best machined surface quality were obtained and analyzed, and the solution set that balances the two objectives was identified (area B in Table 7). The optimum process parameters under the multi-objective conditions were determined for dry cutting of AISI 304 stainless steel with the micro-groove cutter. As indicated by the multi-objective optimization results, the cutting efficiency can be enhanced by increasing the cutting speed and depth of cut within a small range of surface quality variation.
  • The 3D surface maps demonstrate that the feed rate has the most significant effect on the surface roughness, followed by the depth of cut, while the cutting speed exerts the least significant effect. The analysis also showed that, compared with the traditional tool, the self-developed micro-groove tool achieves a superior surface finish with any combination of cutting speed and depth of cut in the low to moderate feed rate range.

Author Contributions

Conceptualization, T.Z.; formal analysis, L.H.; funding acquisition, L.H.; investigation, L.H. and J.W.; methodology, T.Z.; software, J.W.; visualization, F.D.; writing—original draft, F.D. and Z.Z.; writing—review and editing, Z.Z.

Funding

This research was funded by the National Natural Science Foundation of China, grant numbers 51765009 and 51665007, and the Science and Technology Planning Project of Guizhou, grant numbers 20175788 and 20172596.

Acknowledgments

The authors acknowledge the technical support provided by the Engineering Training Center of Guizhou University.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Pimenov, D.Y.; Hassui, A.; Wojciechowski, S.; Mia, M.; Magri, A.; Suyama, D.I.; Bustillo, A.; Krolczyk, G.; Gupta, M.K. Effect of the Relative Position of the Face Milling Tool towards the Workpiece on Machined Surface Roughness and Milling Dynamics. Appl. Sci. 2019, 9, 842. [Google Scholar] [CrossRef]
  2. Davim, J.P. Modern Mechanical Engineering, Materials Forming, Machining and Tribology; Springer: London, UK, 2014. [Google Scholar] [CrossRef]
  3. He, C.L.; Zong, W.J.; Zhang, J.J. Influencing factors and theoretical modeling methods of surface roughness in turning process: State-of-the-art. Int. J. Mach. Tools Manuf. 2018, 129, 15–26. [Google Scholar] [CrossRef]
  4. Agrawal, A.; Goel, S.; Rashid, W.B.; Price, M. Prediction of surface roughness during hard turning of AISI 4340 steel (69 HRC). Appl. Soft Comput. 2015, 30, 279–286. [Google Scholar] [CrossRef] [Green Version]
  5. Rao, K.V.; Murthy, P.B.G.S.N. Modeling and optimization of tool vibration and surface roughness in boring of steel using RSM, ANN and SVM. J. Intell. Manuf. 2016, 29, 1533–1543. [Google Scholar] [CrossRef]
  6. Lin, W.-J.; Lo, S.-H.; Young, H.-T.; Hung, C.-L. Evaluation of Deep Learning Neural Networks for Surface Roughness Prediction Using Vibration Signal Analysis. Appl. Sci. 2019, 9, 1462. [Google Scholar] [CrossRef]
  7. Mia, M.; Dhar, N.R. Prediction of surface roughness in hard turning under high pressure coolant using Artificial Neural Network. Measurement 2016, 92, 464–474. [Google Scholar] [CrossRef]
  8. Xie, N.; Zhou, J.; Zheng, B. An energy-based modeling and prediction approach for surface roughness in turning. Int. J. Adv. Manuf. Technol. 2018, 96, 2293–2306. [Google Scholar] [CrossRef]
  9. Maher, I.; Eltaib, M.E.H.; Sarhan, A.A.D.; El-Zahry, R.M. Investigation of the effect of machining parameters on the surface quality of machined brass (60/40) in CNC end milling—ANFIS modeling. Int. J. Adv. Manuf. Technol. 2014, 74, 531–537. [Google Scholar] [CrossRef]
  10. Adalarasan, R.; Santhanakumar, M.; Rajmohan, M. Application of Grey Taguchi-based response surface methodology (GT-RSM) for optimizing the plasma arc cutting parameters of 304L stainless steel. Int. J. Adv. Manuf. Technol. 2015, 78, 1161–1170. [Google Scholar] [CrossRef]
  11. Hamdan, A.; Sarhan, A.A.D.; Hamdi, M. An optimization method of the machining parameters in high-speed machining of stainless steel using coated carbide tool for best surface finish. Int. J. Adv. Manuf. Technol. 2012, 58, 81–91. [Google Scholar] [CrossRef]
  12. Kilickap, E.; Yardimeden, A.; Çelik, Y.H. Mathematical modelling and optimization of cutting force, tool wear and surface roughness by using artificial neural network and response surface methodology in milling of Ti-6242S. Appl. Sci. 2017, 7, 1064. [Google Scholar] [CrossRef]
  13. Abbas, A.T.; Hamza, K.; Aly, M.F.; Al-Bahkali, E.A. Multiobjective Optimization of Turning Cutting Parameters for J-Steel Material. Adv. Mater. Sci. Eng. 2016, 2016, 6429160. [Google Scholar] [CrossRef]
  14. Patole, P.; Kulkarni, V. Optimization of process parameters based on surface roughness and cutting force in MQL turning of AISI 4340 using nano fluid. Mater. Today. Proc. 2018, 5, 104–112. [Google Scholar] [CrossRef]
  15. Upadhyay, V.; Jain, P.; Mehta, N. In-process prediction of surface roughness in turning of Ti-6Al-4V alloy using cutting parameters and vibration signals. Measurement 2013, 46, 154–160. [Google Scholar] [CrossRef]
  16. Erdakov, I.N.; Tkachev, V.M.; Novokreshchenov, V.V. Increase of wear resistance of steel plates for crushing stations. J. Frict. Wear 2014, 35, 514–519. [Google Scholar] [CrossRef]
  17. Jurkovic, Z.; Cukor, G.; Brezocnik, M.; Brajkovic, T. A comparison of machine learning methods for cutting parameters prediction in high speed turning process. J. Intell. Manuf. 2018, 29, 1683–1693. [Google Scholar] [CrossRef]
  18. Kovac, P.; Rodic, D.; Pucovsky, V.; Savkovic, B.; Gostimirovic, M. Application of fuzzy logic and regression analysis for modeling surface roughness in face milliing. J. Intell. Manuf. 2012, 24, 755–762. [Google Scholar] [CrossRef]
  19. Abu-Mahfouz, I.; El Ariss, O.; Esfakur Rahman, A.H.M.; Banerjee, A. Surface roughness prediction as a classification problem using support vector machine. Int. J. Adv. Manuf. Technol. 2017, 92, 803–815. [Google Scholar] [CrossRef]
  20. Pimenov, D.Y.; Bustillo, A.; Mikolajczyk, T. Artificial intelligence for automatic prediction of required surface roughness by monitoring wear on face mill teeth. J. Intell. Manuf. 2017, 29, 1045–1061. [Google Scholar] [CrossRef] [Green Version]
  21. Ziegel, E.R. The elements of statistical learning. Technometrics 2003, 45, 267–268. [Google Scholar] [CrossRef]
  22. Dabiri, S.; Abbas, M. Evaluation of the Gradient Boosting of Regression Trees Method on Estimating Car-Following Behavior. Transp. Res. Rec. 2018, 2672, 136–146. [Google Scholar] [CrossRef] [Green Version]
  23. Wei, Y.; Zhang, X.; Hou, N.; Zhang, W.; Jia, K.; Yao, Y. Estimation of surface downward shortwave radiation over China from AVHRR data based on four machine learning methods. Sol. Energy 2019, 177, 32–46. [Google Scholar] [CrossRef]
  24. Nieto, P.J.G.; Garcia-Gonzalo, E.; Arbat, G.; Duran-Ros, M.; de Cartagena, F.R.; Puig-Bargues, J. Pressure drop modelling in sand filters in micro-irrigation using gradient boosted regression trees. Biosyst. Eng. 2018, 171, 41–51. [Google Scholar] [CrossRef]
  25. Juez-Gil, M.; Erdakov, I.N.; Bustillo, A.; Pimenov, D.Y. A regression-tree multilayer-perceptron hybrid strategy for the prediction of ore crushing-plate lifetimes. J. Adv. Res. 2019, 18, 173–184. [Google Scholar] [CrossRef]
  26. Chandrasekaran, M.; Muralidhar, M.; Krishna, C.M.; Dixit, U.S. Application of soft computing techniques in machining performance prediction and optimization: A literature review. Int. J. Adv. Manuf. Technol. 2009, 46, 445–464. [Google Scholar] [CrossRef]
  27. Zhang, H.; Liu, J.; Chen, S.; Wang, W. Novel roughness measurement for grinding surfaces using simulated data by transfer kernel learning. Appl. Soft Comput. 2018, 73, 508–519. [Google Scholar] [CrossRef]
  28. Mia, M.; Morshed, M.S.; Kharshiduzzaman, M.; Razi, M.H.; Mostafa, M.R.; Rahman, S.M.S.; Ahmad, I.; Hafiz, M.T.; Kamal, A.M. Prediction and optimization of surface roughness in minimum quantity coolant lubrication applied turning of high hardness steel. Measurement 2018, 118, 43–51. [Google Scholar] [CrossRef]
  29. Li, Z.; Zhang, Z.; Shi, J.; Wu, D. Prediction of surface roughness in extrusion-based additive manufacturing with machine learning. Robot. Comput. Integr. Manuf. 2019, 57, 488–495. [Google Scholar] [CrossRef]
  30. Gürbüz, H.; Şeker, U.; Kafkas, F. Investigation of effects of cutting insert rake face forms on surface integrity. Int. J. Adv. Manuf. Technol. 2017, 90, 3507–3522. [Google Scholar] [CrossRef]
  31. Zou, Z.; He, L.; Jiang, H.; Zhan, G.; Wu, J. Development and analysis of a low-wear micro-groove tool for turning Inconel 718. Wear 2019, 420, 163–175. [Google Scholar] [CrossRef]
  32. Jiang, H.; He, L.; Yang, X.; Zou, Z.; Zhan, G. Prediction and experimental research on cutting energy of a new cemented carbide coating micro groove turning tool. Int. J. Adv. Manuf. Technol. 2017, 89, 2335–2343. [Google Scholar] [CrossRef]
  33. Astakhov, V.; Xiao, X. The principle of minimum strain energy to fracture of the work material and its application in modern cutting technologies. In Metal Cutting Technologies—Progress and Current Trends; De Gruyter Publishers: Berlin, Germany, 2016; pp. 1–35. [Google Scholar]
  34. Ridgeway, G.J.U. Generalized Boosted Models: A guide to the gbm package. Update 2007, 1, 2007. [Google Scholar]
  35. He, H.-B.; Li, H.-Y.; Yang, J.; Zhang, X.-Y.; Yue, Q.-B.; Jiang, X.; Lyu, S.-K. A study on major factors influencing dry cutting temperature of AISI 304 stainless steel. Int. J. Precis. Eng. Manuf. 2017, 18, 1387–1392. [Google Scholar] [CrossRef]
  36. Astakhov, V.P. Mechanical properties of engineering materials: Relevance in design and manufacturing. In Introduction to Mechanical Engineering; Springer: London, UK, 2018; pp. 3–41. [Google Scholar]
  37. Liu, J.; Bai, Y.L.; Xu, C.Y. Evaluation of Ductile Fracture Models in Finite Element Simulation of Metal Cutting Processes. J. Manuf. Sci. Eng.-Trans. ASME 2014, 136, 14. [Google Scholar] [CrossRef]
  38. Abushawashi, Y.; Xiao, X.; Astakhov, V.P. FEM simulation of metal cutting using a new approach to model chip formation. Int. J. Adv. Mach. Form. Oper. 2011, 3, 71–92. [Google Scholar]
  39. Vaziri, M.R.; Salimi, M.; Mashayekhi, M. Evaluation of chip formation simulation models for material separation in the presence of damage models. Simul. Model. Pract. Theory 2011, 19, 718–733. [Google Scholar] [CrossRef]
  40. Nasr, M.N.; Ammar, M.M. An evaluation of different damage models when simulating the cutting process using FEM. Procedia CIRP 2017, 58, 134–139. [Google Scholar] [CrossRef]
  41. Chen, G.; Ren, C.; Yang, X.; Jin, X.; Guo, T. Finite element simulation of high-speed machining of titanium alloy (Ti-6Al-4V) based on ductile failure model. Int. J. Adv. Manuf. Technol. 2011, 56, 1027–1038. [Google Scholar] [CrossRef]
  42. Cockcroft, M.G.; Latham, D.J. A Simple Criterion of Fracture for Ductile Metals; National Engineering Laboratory: Shenzhen, China, 1966. [Google Scholar]
43. Basheer, A.C.; Dabade, U.A.; Joshi, S.S.; Bhanuprasad, V.; Gadre, V. Modeling of surface roughness in precision machining of metal matrix composites using ANN. J. Mater. Process. Technol. 2008, 197, 439–444.
44. Yallese, M.A.; Chaoui, K.; Zeghib, N.; Boulanouar, L.; Rigal, J.-F. Hard machining of hardened bearing steel using cubic boron nitride tool. J. Mater. Process. Technol. 2009, 209, 1092–1104.
45. Zerti, A.; Yallese, M.A.; Meddour, I.; Belhadi, S.; Haddad, A.; Mabrouki, T. Modeling and multi-objective optimization for minimizing surface roughness, cutting force, and power, and maximizing productivity for tempered stainless steel AISI 420 in turning operations. Int. J. Adv. Manuf. Technol. 2019, 102, 135–157.
Figure 1. Schematic of tree-based gradient boosting.
Figure 2. Neural network 3-n-1 structure.
Figure 3. (a) Design block diagram of the micro-groove tool; microscopic wear patterns of the flank face of (b) the conventional tool and (c) the micro-groove tool.
Figure 4. (a) Turning test on AISI 304 stainless steel; (b) surface roughness measurement.
Figure 5. GA-GBRT model development flow chart.
Figure 6. K-fold cross-validation flow chart.
Figure 7. Comparison of training and test performance of the gradient boosting regression tree (GBRT) algorithm (a) before and (b) after optimization.
Figure 8. Prediction results of genetic-gradient boosting regression tree (GA-GBRT).
Figure 9. BR prediction results.
Figure 10. Heat map of grid search parameters.
Figure 11. Support vector regression (SVR) prediction results.
Figure 12. Regression curve between predicted surface roughness and actual surface roughness.
Figure 13. Performance comparison of five surface roughness prediction models: (a) MAD; (b) MAPE; (c) RMSE; (d) CV.
Figure 14. Pareto frontier from the optimization of surface roughness and material removal rate.
Figure 15. Main effects plot for surface roughness.
Figure 16. Surface roughness as a function of the cutting parameters: (a) ap held constant at 1.5 mm; (b) v held constant at 120 m/min; (c) f held constant at 0.08 mm/rev.
Table 1. Geometric and working angles of the micro-groove tool.
Geometric Angle   Tool Angle   Rake Angle   Clearance Angle   Main Cutting Edge Angle   End Cutting Edge Angle   Inclination Angle
Value (°)         80           8            7                 95                        −5                       −5
Table 2. Tool and workpiece material performance parameters.
Material     Density (g/cm³)   Tensile Strength (MPa)   Bending Strength (MPa)   Hardness    Poisson's Ratio   Elastic Modulus (GPa)
Tool         14.6              784.5                    1.45                     89.5 HRA    0.23              630–640
Workpiece    7.93              520                      —                        29 HRC      0.247             206
Table 3. Chemical composition of AISI 304 stainless steel.
Element       Si     Mn     P       S      Ni     Cr      C      Fe
Content (%)   0.75   1.65   0.045   0.03   8.56   18.87   0.08   70.025
Table 4. Experimental data.
Trial No.   v (m/min)   f (mm/rev)   ap (mm)   Response Ra (µm)   Type
1           90          0.05         1         0.460              Training
2           120         0.05         1         0.374              Training
3           150         0.05         1         0.494              Testing
4           180         0.05         1         0.425              Training
5           210         0.05         1         0.510              Training
6           90          0.05         1.5       0.386              Testing
7           120         0.05         1.5       0.390              Training
8           150         0.05         1.5       0.382              Training
9           180         0.05         1.5       0.386              Training
10          210         0.05         1.5       0.396              Training
11          90          0.05         2         0.381              Training
12          120         0.05         2         0.359              Training
13          150         0.05         2         0.387              Training
14          180         0.05         2         0.404              Training
15          210         0.05         2         0.421              Testing
16          90          0.09         1         0.930              Training
17          120         0.09         1         0.880              Training
18          150         0.09         1         0.930              Training
19          180         0.09         1         0.940              Training
20          210         0.09         1         0.960              Testing
21          90          0.09         1.5       0.580              Training
22          120         0.09         1.5       0.579              Training
23          150         0.09         1.5       0.683              Training
24          180         0.09         1.5       0.697              Training
25          210         0.09         1.5       0.708              Testing
26          90          0.09         2         0.811              Training
27          120         0.09         2         0.816              Training
28          150         0.09         2         0.819              Training
29          180         0.09         2         0.838              Training
30          210         0.09         2         0.856              Training
31          90          0.13         1         1.326              Testing
32          120         0.13         1         1.326              Training
33          150         0.13         1         1.328              Training
34          180         0.13         1         1.576              Testing
35          210         0.13         1         1.710              Training
36          90          0.13         1.5       1.501              Training
37          120         0.13         1.5       1.423              Training
38          150         0.13         1.5       1.419              Training
39          180         0.13         1.5       1.413              Training
40          210         0.13         1.5       1.459              Training
41          90          0.13         2         1.303              Training
42          120         0.13         2         1.299              Training
43          150         0.13         2         1.349              Training
44          180         0.13         2         1.326              Training
45          210         0.13         2         1.331              Training
46          90          0.17         1         1.867              Training
47          120         0.17         1         1.810              Testing
48          150         0.17         1         1.896              Testing
49          180         0.17         1         1.760              Training
50          210         0.17         1         1.745              Training
51          90          0.17         1.5       1.850              Training
52          120         0.17         1.5       1.793              Training
53          150         0.17         1.5       1.890              Training
54          180         0.17         1.5       1.960              Testing
55          210         0.17         1.5       1.792              Training
56          90          0.17         2         1.764              Training
57          120         0.17         2         1.796              Training
58          150         0.17         2         1.854              Training
59          180         0.17         2         1.898              Training
60          210         0.17         2         1.910              Training
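As an illustrative note on how the training/testing split in Table 4 can be used, the sketch below fits a gradient boosting regression tree to a few of the rows and reports the goodness of fit. It assumes scikit-learn's GradientBoostingRegressor as a stand-in GBRT learner with placeholder hyperparameters; the paper's software stack and GA-optimized hyperparameter values are not specified here.

```python
# Illustrative sketch only: scikit-learn is assumed; hyperparameters are
# placeholders, not the GA-optimized values reported in the paper.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import r2_score

# A small excerpt of Table 4: columns are v (m/min), f (mm/rev), ap (mm), Ra (um).
train = np.array([
    [90,  0.05, 1.0, 0.460],
    [120, 0.05, 1.0, 0.374],
    [180, 0.05, 1.0, 0.425],
    [90,  0.09, 1.5, 0.580],
    [120, 0.13, 2.0, 1.299],
    [210, 0.17, 2.0, 1.910],
])
test = np.array([
    [150, 0.05, 1.0, 0.494],
    [90,  0.05, 1.5, 0.386],
])

X_train, y_train = train[:, :3], train[:, 3]
X_test,  y_test  = test[:, :3],  test[:, 3]

gbrt = GradientBoostingRegressor(n_estimators=100, learning_rate=0.1,
                                 max_depth=3, random_state=0)
gbrt.fit(X_train, y_train)

print("Train R^2:", r2_score(y_train, gbrt.predict(X_train)))
print("Test  R^2:", r2_score(y_test,  gbrt.predict(X_test)))
```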
Table 5. Performance comparison of training algorithms.
Training Algorithm   RMSE     Optimized Configuration
LM                   0.1226   3-4-1
BR                   0.1130   3-10-1
SCG                  0.1868   3-7-1
Table 6. Comparison of literature prediction indicators.
Indicators (data from Literature [9])   LM        BR        SCG       SVR       GA-GBRT
MAD                                     0.1143    0.0706    0.1664    0.1091    0.0670
RMSE                                    0.1439    0.0972    0.1997    0.1374    0.0889
MAPE                                    18.6409   10.5389   25.7060   17.8725   11.3358
CV                                      17.5928   12.0029   25.4856   16.1913   10.9188
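For readers unfamiliar with the indicators in Table 6 and Figure 13, the sketch below computes them from predicted and measured roughness values using common definitions. The paper's exact formulas, in particular the normalization used for CV, are assumptions here.

```python
# Illustrative sketch: common definitions of the Table 6 indicators; the
# paper's exact formulas (e.g., the CV normalization) may differ slightly.
import numpy as np

def indicators(y_true, y_pred):
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    err = y_true - y_pred
    mad  = np.mean(np.abs(err))                   # mean absolute deviation
    rmse = np.sqrt(np.mean(err ** 2))             # root mean square error
    mape = 100.0 * np.mean(np.abs(err) / y_true)  # mean absolute percentage error
    cv   = 100.0 * rmse / np.mean(y_true)         # coefficient of variation (assumed form)
    return {"MAD": mad, "RMSE": rmse, "MAPE": mape, "CV": cv}

# Hypothetical measured vs. predicted roughness values, for demonstration only.
print(indicators([0.46, 0.93, 1.33, 1.87], [0.43, 0.96, 1.30, 1.80]))
```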
Table 7. Values of partial solutions.
Region   Cutting Speed v (m/min)   Feed Rate f (mm/rev)   Cutting Depth ap (mm)   Pred-Ra (µm)   Exp-Ra (µm)   Material Removal Rate MRR
A        145                       0.05                   1.77                    0.32           0.34          12944
         207                       0.06                   1.93                    0.53           0.48          25688
         207                       0.08                   1.94                    0.66           0.70          30354
B        208                       0.10                   1.99                    0.92           1.04          39913
         208                       0.11                   1.99                    1.09           1.11          46178
         208                       0.12                   2.00                    1.21           1.25          50843
C        208                       0.14                   2.00                    1.47           1.50          58848
         208                       0.15                   2.00                    1.54           1.52          60813
         208                       0.18                   2.00                    1.86           1.75          73258
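The MRR column of Table 7 can be sanity-checked against the standard turning estimate MRR = 1000·v·f·ap (mm³/min, with v in m/min, f in mm/rev, and ap in mm). The reported values are close to, though not identical with, this estimate, so the paper's exact MRR computation is not assumed; the sketch below only illustrates the standard formula on two of the Pareto solutions.

```python
# Illustrative sketch of the standard turning material removal rate estimate,
# MRR = 1000 * v * f * ap; not necessarily the exact computation used in Table 7.
def mrr(v_m_per_min: float, f_mm_per_rev: float, ap_mm: float) -> float:
    """Material removal rate in mm^3/min for turning."""
    return 1000.0 * v_m_per_min * f_mm_per_rev * ap_mm

# First Pareto solution of region A in Table 7.
print(mrr(145, 0.05, 1.77))   # ~12832 mm^3/min vs. 12944 reported
# Last Pareto solution of region C.
print(mrr(208, 0.18, 2.00))   # ~74880 mm^3/min vs. 73258 reported
```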
