An Interpretable Recurrent Neural Network for Waterflooding Reservoir Flow Disequilibrium Analysis
Round 1
Reviewer 1 Report
This manuscript proposes an interpretable recurrent neural network (IRNN) based on the material balance equation, to characterize the flow disequilibrium and predict the production behaviors. The proposed model adopts the self-attention mechanism and gated recurrent unit (GRU) blocks to improve the model’s performance on both spatial and temporal scales. The authors compare the performance of IRNN and the traditional machine learning model on two reservoir experiments. The idea is interesting and the model development is well described and discussed. Thus, this manuscript could be accepted after the following minor issues are revised.
1. Abstract: It is suggested to use “self-attention” instead of “self attention”, and it needs to be revised in other parts of the manuscript.
2. Introduction: The authors should provide a brief description of the inflow module and drainage module in this section, which is helpful for readers to quickly understand this work.
3. Methodology: The authors should explain Figure 1 in more detail, and the motivation of the proposed model is expected to be introduced at length.
4. On page 5 lines 193-194, “The self attention block enables to assign different attention weights to the input data, E, thereby making IRNN focus on the important features that influence the productivity signals.” The grammatical errors in this sentence need to be corrected.
5. Results and Discussions: On page 11 lines 374-375, “Figure 7 shows the permeability and oil saturation distribution (on day 7,305) of Olympus model.” Please check this sentence. Figure 7 only demonstrates the permeability distribution.
Author Response
General Comments: This manuscript proposes an interpretable recurrent neural network (IRNN) based on the material balance equation, to characterize the flow disequilibrium and predict the production behaviors. The proposed model adopts the self-attention mechanism and gated recurrent unit (GRU) blocks to improve the model’s performance on both spatial and temporal scales. The authors compare the performance of IRNN and the traditional machine learning model on two reservoir experiments. The idea is interesting and the model development is well described and discussed. Thus, this manuscript could be accepted after the following minor issues are revised.
Response: Many thanks for your insightful comments; we have revised our manuscript based on your suggestions.
Comment 1: Abstract: It is suggested to use “self-attention” instead of “self attention”, and it needs to be revised in other parts of the manuscript.
Response 1: Thank you so much for your suggestion. We have changed “self attention” to “self-attention” in our manuscript.
Comment 2: Introduction: The authors should provide a brief description of the inflow module and drainage module in this section, which is helpful for readers to quickly understand this work.
Response 2: Thanks a lot for your valuable suggestion. We have added a brief introduction about the inflow module and the drainage module. “The inflow module aims to calculate the total inflow rate from every injector to the specific producer, and the drainage module is used to simulate the fluid change rate of the water drainage volume.” Please find this sentence on page 3 lines 130-132.
Comment 3: Methodology: The authors should explain Figure 1 in more detail, and the motivation of the proposed model is expected to be introduced at length.
Response 3: Thank you for your valuable suggestion. We have added a detailed description of Figure 1, and explained the motivation of IRNN.
“The structure of IRNN is shown in Figure 1, comprising the inflow module and the drainage module. To take the mutual interference of injector-producer groups into account, the self-attention mechanism is introduced in IRNN so that the model pays more attention to the vital injection-production information. In addition, a recurrent neural network block is used in IRNN to enhance its ability to handle time-series data by memorizing important historical dynamic signals. Thus, by coupling the inflow module and the drainage module, IRNN can simulate the production behaviors based on the material balance equation.” Please find these descriptions on page 4 lines 186-193.
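The two mechanisms named in the passage above can be illustrated with a minimal, self-contained sketch. This is not the authors' implementation; the weights, scores, and rates below are hypothetical stand-ins, and the GRU is reduced to a scalar cell for clarity. Softmax attention weights emphasize the more relevant injectors, and a gated update decides how much historical signal to keep.

```python
import math

def softmax(scores):
    # Attention weights: higher relevance scores get larger weights, summing to 1.
    exps = [math.exp(s - max(scores)) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attend(injector_rates, scores):
    # Weighted sum of injector rates -> attended inflow signal for one producer.
    weights = softmax(scores)
    return sum(w * r for w, r in zip(weights, injector_rates))

def gru_step(h, x, wz=0.5, wr=0.5, wh=1.0):
    # Scalar GRU cell: gates decide how much history h to keep vs. new input x.
    sigmoid = lambda v: 1.0 / (1.0 + math.exp(-v))
    z = sigmoid(wz * (h + x))               # update gate
    r = sigmoid(wr * (h + x))               # reset gate
    h_tilde = math.tanh(wh * (r * h + x))   # candidate state
    return (1 - z) * h + z * h_tilde

# One producer, three injectors, two timesteps (all values hypothetical)
h = 0.0
for rates, scores in [([120.0, 80.0, 40.0], [2.0, 1.0, 0.5]),
                      ([110.0, 85.0, 45.0], [2.1, 0.9, 0.6])]:
    inflow = attend(rates, scores)
    h = gru_step(h, inflow / 100.0)         # normalized inflow drives the state
print(round(h, 3))
```

The attended inflow is always a convex combination of the injector rates, and the gated state stays bounded, which is what lets the recurrent block accumulate history without blowing up.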
Comment 4: On page 5 lines 193-194, “The self attention block enables to assign different attention weights to the input data, E, thereby making IRNN focus on the important features that influence the productivity signals.” The grammatical errors in this sentence need to be corrected.
Response 4: Thanks a lot for pointing this out, and sorry for this mistake. We have revised this sentence. “The self-attention block enables to assign different attention weights to the input data, E, thereby making IRNN focus on the important features that influence the productivity signals.” has been changed to “The self-attention block can assign different attention weights to the input data, E, thereby making IRNN focus on the features that have important influences on the productivity signals.” Please find the sentence on page 5 lines 210-212.
Comment 5: Results and Discussions: On page 11 lines 374-375, “Figure 7 shows the permeability and oil saturation distribution (on day 7,305) of Olympus model.” Please check this sentence. Figure 7 only demonstrates the permeability distribution.
Response 5: Thank you for your comment, and sorry for this mistake. “Figure 7 shows the permeability and oil saturation distribution (on day 7,305) of Olympus model” has been changed to “Figure 7 shows the permeability distribution of Olympus model.” Please find this sentence on page 12 lines 418-419.
Reviewer 2 Report
The paper proposes an interpretable recurrent neural network (IRNN) to simulate the production behaviors and characterize the intra-well flow disequilibrium in a waterflooding reservoir. Training is based on water injection rates and bottom-hole pressure data, under the control of the material balance equation. The approach is interesting and sound.
I would add a discussion on the comparison in terms of computational time.
Moreover, some more details on model parameter optimization would be welcomed.
Concerning computational time, I also do not really agree with the sentence on lines 69-71. In my experience commercial numerical simulators hardly ever take more than some hours to perform a simulation. Maybe the sentence should be contextualized with more details.
Additionally, I have some minor comments:
1) It appears that symbols j and φ are used interchangeably, which is quite confusing for the reader.
2) Fig. 2, 5, 7, and 10: A spatial scale is missing; variables and units should be indicated on the color bar.
3) Line 449: The sentence starts with “In contrast”, but I do not see really a contrast. Both the sentence from 446 to 449 and the following one say that IRNN outperforms MLP. Please rephrase.
4) Lines 417-430: wrong font size.
5) Line 75: a comma has to be removed.
Author Response
General Comments: The paper proposes an interpretable recurrent neural network (IRNN) to simulate the production behaviors and characterize the intra-well flow disequilibrium in a waterflooding reservoir. Training is based on water injection rates and bottom-hole pressure data, under the control of the material balance equation. The approach is interesting and sound.
Response: Many thanks for your insightful comments, positive judgment of our work and great effort in the revision. We have revised our manuscript and replied to your valuable comments point by point.
Comment 1: I would add a discussion on the comparison in terms of computational time.
Response 1: Thank you so much for pointing this out. We have added and discussed the computational error and time of MLP and IRNN on the two models. Please find Table 3 and the corresponding descriptions on page 10 lines 351-366, and Table 4 and the descriptions on page 14 lines 477-489.
Comment 2: Moreover, some more details on model parameter optimization would be welcomed.
Response 2: Thanks a lot for your valuable comments. We have added more details on the candidate sets of model parameters used with the Hyperopt library. Please find these descriptions on page 7 lines 269-272: “The candidate sets for the hidden nodes of the GRU block, the hidden nodes of the queries, keys, and values of the self-attention block, and the hidden nodes of the fully connected layer are {10, 15, 20, 25, 30}. The candidates for the batch size are {5, 10, 15, 20}. The candidate sets for the initial learning rate and the coefficient of the regularization term are {0.01, 0.02, 0.03, 0.04, 0.05}.”
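The candidate-set search described above can be sketched with a minimal stdlib example. The manuscript uses the Hyperopt library (TPE-based optimization); the sketch below substitutes plain random search over the same quoted grids, and the objective function is a hypothetical stand-in for the validation loss of a trained IRNN.

```python
import random

# Candidate sets as quoted from the revised manuscript
search_space = {
    "gru_hidden": [10, 15, 20, 25, 30],
    "attention_hidden": [10, 15, 20, 25, 30],  # queries/keys/values
    "fc_hidden": [10, 15, 20, 25, 30],
    "batch_size": [5, 10, 15, 20],
    "learning_rate": [0.01, 0.02, 0.03, 0.04, 0.05],
    "l2_coeff": [0.01, 0.02, 0.03, 0.04, 0.05],
}

def objective(cfg):
    # Hypothetical stand-in for the validation loss of a trained IRNN.
    return abs(cfg["gru_hidden"] - 20) + 10 * abs(cfg["learning_rate"] - 0.02)

def random_search(space, n_trials=50, seed=0):
    # Sample configurations uniformly from the candidate sets, keep the best.
    rng = random.Random(seed)
    best_cfg, best_loss = None, float("inf")
    for _ in range(n_trials):
        cfg = {k: rng.choice(v) for k, v in space.items()}
        loss = objective(cfg)
        if loss < best_loss:
            best_cfg, best_loss = cfg, loss
    return best_cfg, best_loss

best, loss = random_search(search_space)
print(best, loss)
```

Hyperopt's TPE sampler improves on this by modeling which regions of the candidate sets have yielded low losses, but the interface is the same: a search space of discrete choices and an objective to minimize.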
Comment 3: Concerning computational time, I also do not really agree with the sentence on lines 69-71. In my experience commercial numerical simulators hardly ever take more than some hours to perform a simulation. Maybe the sentence should be contextualized with more details.
Response 3: Thank you very much for this valuable comment, and sorry for the mistake. We have checked the reference and changed the description to: “Additionally, with the increase of reservoir scale, the calculation cost of the numerical simulator also increases, and it takes tens of minutes or even hours to complete a simulation [6].” Please find this sentence on page 2 lines 69-71.
Comment 4: It appears that symbols j and φ are used interchangeably, which is quite confusing for the reader.
Response 4: Thank you very much for pointing this out, and sorry for the ambiguous symbols. We have unified the symbols; please find the revised contents on page 6 lines 227-232.
Comment 5: Fig. 2, 5, 7, and 10: A spatial scale is missing; variables and units should be indicated on the color bar.
Response 5: Thanks a lot for your valuable comments, and sorry for the mistake. We have added the spatial scale and the units.
Comment 6: Line 449: The sentence starts with “In contrast”, but I do not see really a contrast. Both the sentence from 446 to 449 and the following one say that IRNN outperforms MLP. Please rephrase.
Response 6: Thank you so much for pointing this out. We have changed “In contrast” to “In summary”, please find the revised sentence on page 15 line 496.
Comment 7: Lines 417-430: wrong font size.
Response 7: Thank you for pointing this out, and sorry for the mistake. We have changed the font size.
Comment 8: Line 75: a comma has to be removed.
Response 8: Thanks a lot for your comment, and sorry for this error. We have removed the comma.