Studying the capabilities of machine learning methods for the classification of the character of saturation of terrigenous reservoirs

Tyumen State University Herald. Physical and Mathematical Modeling. Oil, Gas, Energy


Release:

2019, Vol. 5, no 1

For citation: Muravev I. A., Zaharova I. G. 2019. “Studying the capabilities of machine learning methods for the classification of the character of saturation of terrigenous reservoirs”. Tyumen State University Herald. Physical and Mathematical Modeling. Oil, Gas, Energy, vol. 5, no 1, pp. 123-137. DOI: 10.21684/2411-7978-2019-5-1-123-137

About the authors:

Igor A. Muravev, Senior Lecturer, Department of Software, University of Tyumen; to.imuravev@gmail.com

Irina G. Zaharova, Cand. Sci. (Phys.-Math.), Professor, Department of Software, University of Tyumen; i.g.zakharova@utmn.ru

Abstract:

Identifying the properties of oil and gas reservoirs from well-logging data is one of the main areas of research in geological and hydrodynamic reservoir modeling. The limited effectiveness of exact mathematical models for analyzing well-log data, together with the large volume and noisiness of these data, makes machine learning methods relevant for identifying reservoir features.

This article investigates the possibility of classifying terrigenous reservoirs using various methods, including support vector machines, decision trees, gradient boosting, random forests, and multilayer neural networks. The data set was formed from well-logging curves for 24 wells of one reservoir. The classification models were trained on pre-normalized data from induction logging, lateral logging, neutron-neutron logging on thermal neutrons, borehole electrical measurements, resistivity logging, spontaneous potential logging, gamma-ray logging, and resistance logging with five different gradient sondes. To assess the accuracy of the classification models built with each method, cross-validation was performed and the mean accuracy and its standard deviation were estimated. For the support vector machine, the influence of the choice of kernel function (linear, polynomial, and sigmoid) was investigated. For the neural network, the architecture was varied, including the number of hidden layers and neurons, the activation functions on different layers, and the dropout probability. The quality of the resulting classification models was also evaluated using the elements of the confusion matrix.
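The model comparison described above can be sketched with scikit-learn (which the study's preprocessing references use). The data here is synthetic, standing in for the normalized well-log curves; the feature columns, data shapes, and labels are illustrative assumptions, not the study's actual data set.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Synthetic stand-in for normalized well-log features (the real study
# used logging curves from 24 wells); one column per logging method.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=200) > 0).astype(int)

models = {
    "SVM (linear)": SVC(kernel="linear"),
    "SVM (poly)": SVC(kernel="poly"),
    "SVM (sigmoid)": SVC(kernel="sigmoid"),
    "Decision tree": DecisionTreeClassifier(random_state=0),
    "Random forest": RandomForestClassifier(random_state=0),
    "Gradient boosting": GradientBoostingClassifier(random_state=0),
}

for name, model in models.items():
    # 5-fold cross-validation; report mean accuracy and its standard deviation
    scores = cross_val_score(make_pipeline(StandardScaler(), model), X, y, cv=5)
    print(f"{name}: {scores.mean():.3f} +/- {scores.std():.3f}")
```

Each model is wrapped in a pipeline with a scaler so that scaling is re-fit inside every fold, avoiding leakage from the held-out fold into the training folds.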

The results of the computational experiments have shown the effectiveness of machine learning methods, and of multilayer neural networks in particular, for identifying oil-saturated reservoirs with high accuracy (about 90%).
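The confusion-matrix evaluation mentioned in the abstract can be sketched as follows, again on synthetic stand-in data. Note that scikit-learn's `MLPClassifier` does not implement dropout, so this sketch covers only the two-hidden-layer architecture and the evaluation step, not the dropout experiments; all names and shapes are illustrative assumptions.

```python
import numpy as np
from sklearn.metrics import confusion_matrix, accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Synthetic stand-in for normalized well-log features; y = 1 marks
# an oil-saturated interval (labels here are illustrative only).
rng = np.random.default_rng(1)
X = rng.normal(size=(300, 8))
y = (X[:, 0] - 0.7 * X[:, 2] + rng.normal(scale=0.4, size=300) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=1, stratify=y)

# Two hidden layers with ReLU activations; architecture choices here
# are placeholders, not the configuration reported in the article.
clf = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(32, 16), activation="relu",
                  max_iter=1000, random_state=1))
clf.fit(X_train, y_train)

y_pred = clf.predict(X_test)
print(confusion_matrix(y_test, y_pred))   # rows: true class, cols: predicted
print(f"accuracy: {accuracy_score(y_test, y_pred):.3f}")
```

The off-diagonal entries of the confusion matrix show how often water-saturated intervals are mistaken for oil-saturated ones and vice versa, which accuracy alone does not reveal.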

References:

  1. Agaev Kh. B. 2013. “The use of cluster analysis for the dissection of a geological section according to well logging data”. Karotazhnik, no 5 (227), pp. 3-11. [In Russian]
  2. Gafurov D. O. 2006. “Geological interpretation of GIS data from the Talakan oil and gas condensate field with the use of trained neural networks in NeuroInformGeo”. Bulletin of the Tomsk Polytechnic University. Geo Assets Engineering, vol. 309, no 3, pp. 32-37. [In Russian]
  3. Koskov V. N., Koskov B. V. 2007. Geophysical Studies of Wells and Interpretation of GIS Data. Perm: Perm State Technical University. [In Russian]
  4. Paklin N. B., Muhamadiev R. S. 2005. “Using learning algorithms for interpreting GIS data”. Burenie i Neft, no 5, pp. 38-40. [In Russian]
  5. Chudinova D. Yu., Dulkarnaev M. R., Kotenev Yu. A., Sultanov Sh. Kh. 2017. “Differentiation of wells in areas with residual oil reserves using neural network modeling”. Exposition Oil and Gas, no 4 (57), pp. 46-50. [In Russian]
  6. Al-Mudhafar W. J. 2017. “Integrating well log interpretations for lithofacies classification and permeability modeling through advanced machine learning algorithms”. Journal of Petroleum Exploration and Production Technology, vol. 7, no 4, pp. 1023-1033. DOI: 10.1007/s13202-017-0360-0
  7. Altman N. S. 1992. “An introduction to kernel and nearest-neighbor nonparametric regression”. The American Statistician, vol. 46, no 3, pp. 175-185. DOI: 10.1080/00031305.1992.10475879
  8. Breiman L., Friedman J., Stone C. J., Olshen R. A. 2017. Classification and Regression Trees. New York: Routledge. DOI: 10.1201/9781315139470
  9. Breiman L. 2001. “Random forests”. Machine Learning, vol. 45, no 1, pp. 5-32. DOI: 10.1023/A:1010933404324
  10. Scikit-learn: Machine Learning in Python. “Compare the effect of different scalers on data with outliers”. Accessed 27 December 2018. https://scikit-learn.org/stable/auto_examples/preprocessing/plot_all_scaling.html 
  11. Fawcett T. 2006. “An introduction to ROC analysis”. Pattern Recognition Letters, vol. 27, no 8, pp. 861-874. DOI: 10.1016/j.patrec.2005.10.010
  12. Hagan M. T., Demuth H. B., Beale M. H. 1996. Neural Network Design. Boston: PWS Publishing Company.
  13. Hastie T., Tibshirani R., Friedman J. 2009. “Boosting and additive trees”. In: The Elements of Statistical Learning, ch. 10, pp. 337-384. 2nd edition. New York: Springer. DOI: 10.1007/978-0-387-84858-7_10
  14. Iglewicz B., Hoaglin D. C. 1993. How to Detect and Handle Outliers. American Society for Quality Control (ASQC), Statistics Division.
  15. Kohavi R. 1995. “A study of cross-validation and bootstrap for accuracy estimation and model selection”. Proceedings of the 14th International Joint Conference on Artificial Intelligence, vol. 2, pp. 1137-1145.
  16. Pearson K. 1901. “On lines and planes of closest fit to systems of points in space”. The London, Edinburgh, and Dublin Philosophical Magazine and Journal of Science, ser. 6, vol. 2, pp. 559-572. DOI: 10.1080/14786440109462720
  17. Scikit-learn: Machine Learning in Python. “Quantile transformer”. Accessed 27 December 2018. https://scikit-learn.org/stable/modules/generated/sklearn.preprocessing.QuantileTransformer.html 
  18. Srivastava N., Hinton G., Krizhevsky A., Sutskever I., Salakhutdinov R. 2014. “Dropout: a simple way to prevent neural networks from overfitting”. Journal of Machine Learning Research, vol. 15, pp. 1929-1958.
  19. Suykens J. A. K., Vandewalle J. 1999. “Least squares support vector machine classifiers”. Neural Processing Letters, vol. 9, no 3, pp. 293-300. DOI: 10.1023/A:1018628609742
  20. Tan F., Luo G., Wang D., Chen Y. 2017. “Evaluation of complex petroleum reservoirs based on data mining methods”. Computational Geosciences, vol. 21, no 1, pp. 151-165. DOI: 10.1007/s10596-016-9601-4
  21. Tukey J. W. 1977. Exploratory Data Analysis. Pearson.