Abstract
Porosity is a crucial aspect of reservoir characterization. It can be estimated by measuring rock samples in the laboratory
or through indirect methods based on wireline log data. However, both approaches are time-consuming and do not
cover large areas. Therefore, this study integrates seismic data with well-logging data for porosity estimation, enabling
the coverage of a larger area. Specifically, we estimate porosity in the F3 block of the Netherlands, where only limited
well-logging data are available as labelled data. To address this issue, we employ a machine learning approach, consisting of an
inversion generator and a forward generator, to estimate porosity from seismic data. The inversion generator converts seismic data into porosity, while the forward generator performs forward modelling, transforming porosity back into seismic data. The forward generator incorporates geophysical knowledge to compensate for the limited amount of labelled data during training. Both generators use a combined convolutional neural network and gated recurrent unit (CNN-GRU) architecture. Additionally, several training schemes, namely closed-loop and cycle-GAN (Cycle-Generative Adversarial Network), were employed in this research and compared with an open-loop approach. The
data from the F3 block consists of six post-stack seismic lines with an inline spacing of 40 meters and three wells. Data
from two wells (F03-4 and F02-1) were used for training, while one well (F06-1) was used for testing. The results indicate
that the cycle-GAN produces the most accurate porosity estimates, with a mean squared error (MSE) of 0.603 and a Pearson correlation coefficient (PCC) of 0.713. The cycle-GAN uses a cycle-consistency loss to strengthen parameter updates
and a discriminator that learns adversarially against the generator, thereby improving accuracy. This
demonstrates that, compared with the open-loop and closed-loop schemes, the cycle-GAN achieves more accurate porosity estimation in locations with insufficient labelled
well-logging data. However, the cycle-GAN requires more computational
time than the other methods because multiple networks must be trained. Nonetheless, this additional training time is
justified by the improvement in PCC.
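
For readers who want to see how such a scheme fits together, the following is a minimal, hypothetical PyTorch sketch of cycle-consistent training with an inversion generator (seismic to porosity), a forward generator (porosity to seismic), and a discriminator. The network sizes, layer choices, loss weights, and data shapes are illustrative assumptions only, not the implementation used in the paper.

```python
# Illustrative sketch (hypothetical shapes and hyperparameters): cycle-consistent
# training of an inversion generator (seismic -> porosity) and a forward generator
# (porosity -> seismic) with a discriminator, in the spirit of the scheme above.
import torch
import torch.nn as nn

class CNNGRUGenerator(nn.Module):
    """1-D CNN feature extractor followed by a bidirectional GRU, mapping one trace to another."""
    def __init__(self, hidden=32):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv1d(1, hidden, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(hidden, hidden, kernel_size=5, padding=2), nn.ReLU(),
        )
        self.gru = nn.GRU(hidden, hidden, batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, 1)

    def forward(self, x):                    # x: (batch, 1, samples)
        h = self.cnn(x).transpose(1, 2)      # (batch, samples, hidden)
        h, _ = self.gru(h)                   # (batch, samples, 2*hidden)
        return self.head(h).transpose(1, 2)  # (batch, 1, samples)

class Discriminator(nn.Module):
    """Scores whether a porosity trace looks like a real well log or a generated one."""
    def __init__(self, hidden=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(1, hidden, 5, stride=2, padding=2), nn.LeakyReLU(0.2),
            nn.Conv1d(hidden, hidden, 5, stride=2, padding=2), nn.LeakyReLU(0.2),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(), nn.Linear(hidden, 1),
        )

    def forward(self, x):
        return self.net(x)

inv_gen, fwd_gen, disc = CNNGRUGenerator(), CNNGRUGenerator(), Discriminator()
g_opt = torch.optim.Adam(list(inv_gen.parameters()) + list(fwd_gen.parameters()), lr=1e-3)
d_opt = torch.optim.Adam(disc.parameters(), lr=1e-3)
mse, bce = nn.MSELoss(), nn.BCEWithLogitsLoss()

# Toy stand-in data: a few labelled traces at the wells, many unlabelled seismic-only traces.
seis_lab, poro_lab = torch.randn(8, 1, 128), torch.rand(8, 1, 128)
seis_unlab = torch.randn(32, 1, 128)

for _ in range(100):
    # Discriminator step: real well-log porosity vs. porosity generated from seismic.
    fake_poro = inv_gen(seis_unlab).detach()
    d_loss = bce(disc(poro_lab), torch.ones(poro_lab.size(0), 1)) + \
             bce(disc(fake_poro), torch.zeros(fake_poro.size(0), 1))
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # Generator step: supervised term at the wells, cycle-consistency on unlabelled
    # seismic (seismic -> porosity -> seismic), and an adversarial term.
    sup_loss = mse(inv_gen(seis_lab), poro_lab)
    cycle_loss = mse(fwd_gen(inv_gen(seis_unlab)), seis_unlab)
    adv_loss = bce(disc(inv_gen(seis_unlab)), torch.ones(seis_unlab.size(0), 1))
    g_loss = sup_loss + cycle_loss + 0.1 * adv_loss
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()
```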