INTRODUCTION
Seismic interpretation techniques have developed rapidly over the last twenty years,
driven by the growing volume of seismic data and by advances in hardware, acquisition,
and processing. Seismic interpretation algorithms offer many techniques that help the
interpreter extract valuable information from seismic data (Chopra and Marfurt, 2012),
so the interpreter needs to understand the seismic data and define the target in order
to choose the techniques best suited to the case study. Fig. 1 is a timeline that shows
at a glance the developments that took place in seismic interpretation from 1956 to 2008
(Liner, 2008).
When using traditional methods, it is often difficult to get a clear and unbiased view
of faults and stratigraphic features hidden in 3-D data. Faults are often readily seen
on individual vertical cross-sections, but many of these must be examined to determine
the lateral extent of faulting. Stratigraphic changes are difficult to detect on vertical
seismic lines because of the limited profile they present in this view. Time slices are
more suitable for detecting and following faults and stratigraphy laterally (Bahorich
and Farmer, 1995).
After 43 years of attribute development, it should not be surprising that many of these
attributes are redundant, and some are even useless (Barnes, 2007). Seismic attributes
enable the interpreter to understand seismic data better and to generate new views of
the model, but there are hundreds of seismic attributes divided into many classes, which
makes interpreters hesitant to try new ones. The development of instantaneous attributes
by Taner et al. (1979) generated initial excitement, but seismic attributes did not come
into common usage until the advent of 3D interpretation workstations, when Bahorich and
van Bemmel (1994) showed that one could make maps of these attributes along
interpreter-generated surfaces.
Marfurt (2014) divides the future of attribute development into five categories: feature
recognition, prestack attribute development, multiattribute cluster analysis, enhanced
interpreter-computer interaction, and the statistical correlation of attributes to
completion techniques and reservoir production.
Fig. 1: Timeline showing developments in seismic interpretation. Modified from Liner (2008).
It is important to identify the geobodies of interest from seismic data, but conventional
seismic interpretation alone cannot extract all of the information contained in the data;
modern seismic interpretation needs to extract all available geological information in
less time and with higher accuracy. The only way to think in a new way is to look at the
data with new eyes and to use all the geophysical information that can be extracted from
seismic data through unconventional seismic interpretation.
In this paper, we design a simple workflow that we hope will help seismic interpreters
define petroleum geophysical prospects and extract geological features from seismic data
by both conventional and unconventional means, as shown in Fig. 2.
The seismic interpretation workflow has been developed to include data preconditioning
and quality-control tests on the seismic data: spectral analysis, a band-pass filter, and
a mean smoothing filter are used within the unconventional interpretation to enhance the
post-stack seismic data, and the similarity results are compared before and after applying
the data preconditioning for geobody extraction, as sketched below.
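To make these preconditioning steps concrete, the sketch below implements spectral
analysis, a band-pass filter, and a mean smoothing filter in Python with NumPy/SciPy.
It is a minimal illustration, not the implementation used in this study: it assumes a
post-stack section stored as a 2-D array of traces by time samples with a 4 ms sample
interval, and the corner frequencies, filter order, and window sizes are illustrative
assumptions.

```python
# Minimal sketch (not the authors' implementation) of the preconditioning
# steps named above: spectral analysis, band-pass filtering, and mean
# smoothing. Assumes a post-stack section held as a 2-D NumPy array of
# shape (n_traces, n_samples) with a 4 ms sample interval; the corner
# frequencies and window sizes below are illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt
from scipy.ndimage import uniform_filter


def amplitude_spectrum(section, dt=0.004):
    """Average amplitude spectrum across traces (quality-control step)."""
    spectra = np.abs(np.fft.rfft(section, axis=1))
    freqs = np.fft.rfftfreq(section.shape[1], d=dt)
    return freqs, spectra.mean(axis=0)


def bandpass(section, low_hz=8.0, high_hz=60.0, dt=0.004, order=4):
    """Zero-phase Butterworth band-pass to reject low- and high-frequency noise."""
    nyquist = 0.5 / dt
    b, a = butter(order, [low_hz / nyquist, high_hz / nyquist], btype="band")
    return filtfilt(b, a, section, axis=1)


def mean_smooth(section, size=(3, 5)):
    """Running-mean (boxcar) filter over traces and samples to reduce random noise."""
    return uniform_filter(section, size=size)


# Example usage on a random stand-in for a real post-stack section.
section = np.random.randn(200, 500)            # 200 traces x 500 samples
freqs, spec = amplitude_spectrum(section)      # inspect bandwidth before filtering
conditioned = mean_smooth(bandpass(section))   # band-pass, then mean smoothing
```

Recomputing the spectrum after filtering provides a simple check that only the target
bandwidth remains in the conditioned section.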
Fig. 2: Workflow for seismic interpretation.
METHODOLOGY
It is hard to determine the best workflow for a geophysical study, so we need to merge
experience with the latest technology to solve complex geological problems and enhance
the results of seismic interpretation. This paper is an attempt to combine conventional
and unconventional seismic interpretation to extract all available geological information
from seismic data, as shown in Fig. 2, which presents a seismic interpretation workflow
that enables the interpreter to reduce risk and obtain results quickly by combining
interpretation experience with the latest technology.
It is important to reduce the effect of noise and increase the signal-to-noise ratio.
Enhancement of the seismic data therefore starts with quality control to study how noise
affects the data, followed by spectral analysis to estimate the noise level and the usable
seismic bandwidth. A band-pass filter is then applied to remove high- and low-frequency
noise, and a mean smoothing filter is applied to suppress random noise within the frequency
band of interest. After the data preconditioning is applied, similarity attributes are used
to extract geobodies with the available autopicking algorithms.
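As an illustration of the similarity comparison described above, the sketch below computes
a simple semblance-type similarity attribute in a sliding window of neighbouring traces.
This is a generic textbook semblance estimate, assumed here for illustration only; it is
not the specific algorithm of any commercial autopicking package, and the window sizes are
arbitrary choices.

```python
# Minimal sketch of a semblance-type similarity attribute, assuming a 2-D
# post-stack section of shape (n_traces, n_samples). Values near 1 mean the
# neighbouring traces are nearly identical; low values flag discontinuities
# (faults, channel edges) that autopicking algorithms can track as geobodies.
import numpy as np


def similarity(section, trace_win=3, sample_win=9):
    """Sliding-window semblance over trace_win traces and sample_win samples."""
    n_traces, n_samples = section.shape
    half_t, half_s = trace_win // 2, sample_win // 2
    out = np.zeros_like(section, dtype=float)
    for i in range(n_traces):
        t0, t1 = max(0, i - half_t), min(n_traces, i + half_t + 1)
        for j in range(n_samples):
            s0, s1 = max(0, j - half_s), min(n_samples, j + half_s + 1)
            win = section[t0:t1, s0:s1]
            num = np.sum(np.sum(win, axis=0) ** 2)   # energy of the stacked trace
            den = win.shape[0] * np.sum(win ** 2)    # total energy in the window
            out[i, j] = num / den if den > 0 else 0.0
    return out


# Stand-in input; in practice the attribute would be computed on both the raw
# and the preconditioned sections to compare discontinuities before and after
# data preconditioning.
raw_sim = similarity(np.random.randn(50, 200))
```

Running the same function on the raw and the preconditioned sections gives the
before-and-after similarity comparison used to judge the effect of the preconditioning on
geobody extraction.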