If you cannot find anything more, look for something else (Brigitte Fontaine)
Update: 2016/07/16

Addendum to: Seismic data compression: an expansive over-review

This page is part of an ongoing effort to collect and review (almost) all the works performed in the area of geophysical trace and seismic data compression. The story more or less starts with Pierre Bois (IFP) and Lawrence Wood, with simple subsampling and Hadamard transforms. The very first paper I know of is "Représentation condensée des traces sismiques", by Bois and Grau, 1969, Revue du CÉTHEDEC (Centre d'études théoriques de la détection et des communications). L. Wood's "Seismic data compression methods", 1974, Geophysics, is much more famous. There was a nice period around 1996, with the SDCI consortium (Seismic Data Compression Initiative), led by Chevron folks, which I took part in. Wavelets became quite popular, maybe because Paul Donoho (Chevron) had David Donoho for a son. We will explore both lossless packing and lossy compression. We are in a "big science data" period where seismic exploration produces petabytes of seismic traces, and where seismic data management, storage, processing and visualization represent a HUGE problem. Seismic data processing ought to become data-centric. The main issues: to develop sound quality metrics, and to convince geophysicists that careful compression can IMPROVE seismic trace quality.
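Since sound quality metrics are one of the main issues, here is a minimal sketch (the function name is mine) of the SNR measure in dB that most of the papers below use to report compression quality:

```python
import numpy as np

def snr_db(original, reconstructed):
    """Signal-to-noise ratio, in dB, of a reconstructed trace against the original."""
    original = np.asarray(original, dtype=float)
    error = original - np.asarray(reconstructed, dtype=float)
    return 10.0 * np.log10(np.sum(original ** 2) / np.sum(error ** 2))

# Example: a synthetic trace with mild additive noise.
rng = np.random.default_rng(0)
trace = np.sin(np.linspace(0.0, 20.0 * np.pi, 1000))
degraded = trace + 0.01 * rng.normal(size=trace.size)
print(f"{snr_db(trace, degraded):.1f} dB")
```

Higher is better; a lossy coder is then judged by the SNR it reaches at a given compression ratio.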
For a fair comparison of seismic data reduction using wavelet transforms, wavelet packets and subband multirate filter banks as lossy compression schemes for seismic data, please read the paper Optimization of sub-band coding method for seismic data compression (or try here), by T. Røsten, T. A. Ramstad and L. Amundsen (Geophysical Prospecting, 2004). The seismic data compression codes, either by Tage or myself, performed better than the usual wavelets.
Abstract: Seismic data volumes, which require huge transmission capacities and massive storage media, continue to increase rapidly due to acquisition of 3D and 4D multiple streamer surveys, multicomponent data sets, reprocessing of prestack seismic data, calculation of post-stack seismic data attributes, etc. We consider lossy compression as an important tool for efficient handling of large seismic data sets. We present a 2D lossy seismic data compression algorithm, based on sub-band coding, and we focus on adaptation and optimization of the method for common-offset gathers. The sub-band coding algorithm consists of five stages: first, a preprocessing phase using an automatic gain control to decrease the non-stationary behaviour of seismic data; second, a decorrelation stage using a uniform analysis filter bank to concentrate the energy of seismic data into a minimum number of sub-bands; third, an iterative classification algorithm, based on an estimation of variances of blocks of sub-band samples, to classify the sub-band samples into a fixed number of classes with approximately the same statistics; fourth, a quantization step using a uniform scalar quantizer, which gives an approximation of the sub-band samples to allow for high compression ratios; and fifth, an entropy coding stage using a fixed number of arithmetic encoders matched to the corresponding statistics of the classified and quantized sub-band samples to achieve compression. Decompression basically performs the opposite operations in reverse order. We compare the proposed algorithm with three other seismic data compression algorithms. The high performance of our optimized sub-band coding method is supported by objective and subjective results.
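The five-stage pipeline described in this abstract can be sketched as follows, with a one-level Haar bank standing in for the paper's uniform analysis filter bank, and the classification and arithmetic coding stages omitted; every name and parameter here is illustrative, not taken from the paper:

```python
import numpy as np

def agc(trace, win=64, eps=1e-12):
    """Stage 1: automatic gain control -- normalize by a sliding RMS envelope
    to tame the non-stationary amplitude behaviour of seismic traces."""
    power = np.convolve(trace ** 2, np.ones(win) / win, mode="same")
    return trace / np.sqrt(power + eps)

def haar_analysis(x):
    """Stage 2 (stand-in): a one-level, two-channel orthonormal filter bank
    splitting the trace into low- and high-frequency sub-bands."""
    x = x[: 2 * (len(x) // 2)]
    low = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    high = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return low, high

def quantize(band, step):
    """Stage 4: uniform scalar quantization (stage 3, sub-band classification,
    and stage 5, arithmetic coding, are omitted in this sketch)."""
    return np.round(band / step).astype(int)

rng = np.random.default_rng(1)
trace = np.sin(0.05 * np.arange(1024)) + 0.05 * rng.normal(size=1024)
low, high = haar_analysis(agc(trace))
q_low, q_high = quantize(low, 0.5), quantize(high, 0.5)
# After decorrelation, most high-band samples quantize to zero: those zeros
# are what the entropy coder later turns into actual compression.
print(np.mean(q_high == 0))
```

Decompression would run the inverse operations in reverse order, exactly as the abstract states.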

Other people (once) involved in the compression of seismic data

Seismic data compression links and pages

Seismic data compression software

Seismic data compression data set

Personal paper reference list on seismic data compression

    Duval, Laurent Compression de données sismiques : bancs de filtres et transformées étendues, synthèse et adaptation (Seismic data compression: filter banks and extended transforms, synthesis and adaptation) 2000 School: Université Paris-Sud Orsay  phdthesis PDF-1  PDF-2  URL1  URL2 
    Abstract: The algorithms most often employed for seismic data compression use wavelet or wavelet-packet transforms. The transform coefficients are generally quantized and then coded with classical entropy coding techniques. We first propose a compression algorithm based on the wavelet transform, combined with a zerotree coding technique whose use is original in the seismic context. The use of classical wavelet transforms, however, remains a relatively rigid approach, and it is often desirable to adapt the transforms to the properties of each type of signal. We therefore propose a second algorithm that replaces wavelets with a family of so-called "extended transforms". These transforms, derived from filter bank theory, are parametrized; H. Malvar's LOTs (lapped orthogonal transforms) and the GenLOTs (generalized lapped orthogonal transforms) of de Queiroz et al. are well-known examples. We propose several optimization criteria for these parameters, allowing the construction of extended transforms adapted to the properties of seismic signals. We show that these transforms can be used with a coefficient coding analogous to the tree-based coding employed for wavelets. The two proposed compression algorithms have the advantages of allowing a precise choice of the data compression ratio, of compressing the data by blocks (in the case of the extended transforms), and of allowing partial decompression of the data, for quality control or visualization. The performance of the two proposed algorithms is tested on a set of real seismic data and evaluated for different quality measures. We also compare their performance with other compression algorithms.
    BibTeX:
    @phdthesis{Duval_L_2000_phd_com_dsbftesa,
      author = {Duval, Laurent },
      title = {Compression de données sismiques : bancs de filtres et transformées étendues, synthèse et adaptation},
      school = {Université Paris-Sud Orsay},
      year = {2000},
      abstract = {Les algorithmes les plus souvent employés pour la compression de données
    	sismiques utilisent des transformées en ondelettes ou en paquets
    	d'ondelettes. Les coefficients issus de la transformation sont généralement
    	quantifiés puis codés par les techniques classiques de codage entropique.
    	Nous proposons ici un premier algorithme de compression basé sur
    	la transformée en ondelettes. Cette transformation est assortie d'une
    	technique de codage de type zerotree coding, d'emploi original en
    	sismique. Cependant, l'emploi des transformées en ondelettes classiques
    	reste une approche relativement rigide. Or, il est souvent souhaitable
    	de pouvoir adapter les transformées aux propriétés de chaque type
    	de signaux. Nous proposons donc un deuxième algorithme employant,
    	à la place des ondelettes, un ensemble de transformées dites «transformées
    	étendues». Ces transformées, issues de la théorie des bancs de filtres,
    	sont paramétrées. Les LOT (lapped orthogonal transforms) de H. Malvar
    	ou les GenLOT (generalized lapped orthogonal transforms) de de Queiroz
    	et al. en sont des exemples connus. Nous proposons plusieurs critères
    	d'optimisation de ces paramètres, permettant de construire des «transformées
    	étendues» adaptées aux propriétés des signaux sismiques. Nous montrons
    	que ces transformées peuvent être utilisées avec codage des coefficients
    	analogue à un codage arborescent employé pour les ondelettes. Les
    	deux algorithmes de compression proposés possèdent pour avantages
    	la possibilité de choisir précisément le taux de compression des
    	données, de comprimer les données par blocs (dans le cas des transformées
    	étendues) et de pouvoir décomprimer partiellement les données, pour
    	le contrôle-qualité ou la visualisation. Les performances des deux
    	algorithmes proposées sont testées sur un ensemble de données sismiques
    	réelles. Ces performances sont évaluées pour différentes mesures
    	de qualité. Nous comparons également leurs performances à d'autres
    	algorithmes de compression.},
      note = {(Seismic data compression)}
    }
    
    Duval, Laurent & Røsten, T. Filter bank decomposition of seismic data with application to compression and denoising 2000 SEG Annual International Meeting, pp. 2055-2058  inproceedings DOI  URL  PDF 
    Abstract: The use of discrete wavelet based analysis, feature extraction, denoising, and compression methods has led to extremely interesting developments in the field of seismic data processing. Notwithstanding, discrete wavelets belong to a wider class of filter banks. The use of more general filter banks allows the design of filter coefficients matching the signal's properties. Consequently, general filter banks improve on the performance of discrete wavelet based seismic data processing techniques. In this paper, we discuss the basics of general filter bank theory and its applications to seismic data compression and denoising. We show that properly designed filter banks are able to outperform discrete wavelets in both instances.
    BibTeX:
    @inproceedings{Duval_L_2000_p-seg_fil_bdsdacd,
      author = {Duval, Laurent  and Røsten, T.},
      title = {Filter bank decomposition of seismic data with application to compression and denoising},
      booktitle = {SEG Annual International Meeting},
      publisher = {Soc. Expl. Geophysicists},
      year = {2000},
      pages = {2055--2058},
      url = {http://dx.doi.org/10.1190/1.1815847},
      doi = {http://dx.doi.org/10.1190/1.1815847}
    }
    
    Duval, Laurent Simultaneous seismic compression and denoising using a lapped transform coder 2002 Proc. International Conference on Acoustic Speech and Signal Processing (ICASSP), pp. 1269-1272  inproceedings DOI  PDF 
    Abstract: Compression and denoising are two of the most successful applications of wavelets to signals and natural images. Both techniques have also been successfully applied to seismic signals, but compression is not widely accepted yet, since it is often believed to harm seismic information. Trying to look at compression and denoising in another direction, this work stresses the idea that they could be viewed as two sides of the same coin. As a result, in the case of naturally noisy seismic data, compression could be seen as a denoising tool, instead of a mere noise source. We substantiate this statement on a noise-free seismic data model and actual seismic field data. We show that, depending on the amount of initial ambient noise in the data, a lapped transform coder with embedded zerotree coding may be able to effectively denoise seismic data, over a wide range of compression ratios.
    BibTeX:
    @inproceedings{Duval_L_2002_p-icassp_sim_scdltc,
      author = {Duval, Laurent  C.},
      title = {Simultaneous seismic compression and denoising using a lapped transform coder},
      booktitle = {Proc. International Conference on Acoustic Speech and Signal Processing (ICASSP)},
      year = {2002},
      volume = {2},
      pages = {1269--1272},
      doi = {http://dx.doi.org/10.1109/ICASSP.2002.5744033}
    }
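The compression-as-denoising idea in the abstract above can be illustrated with a toy transform coder (a plain FFT standing in for the paper's lapped transform with zerotree coding; all names and parameters are mine): keeping only the few largest transform coefficients discards most of the incoherent noise along with the bits.

```python
import numpy as np

def compress_denoise(trace, keep=0.05):
    """Keep only the largest-magnitude transform coefficients and zero the
    rest, as a coarse coder would; on noisy data this also acts as a denoiser."""
    coeffs = np.fft.rfft(trace)
    threshold = np.quantile(np.abs(coeffs), 1.0 - keep)
    coeffs[np.abs(coeffs) < threshold] = 0.0
    return np.fft.irfft(coeffs, n=len(trace))

rng = np.random.default_rng(2)
clean = np.sin(np.linspace(0.0, 16.0 * np.pi, 2048))  # noise-free trace model
noisy = clean + 0.3 * rng.normal(size=clean.size)     # ambient noise
restored = compress_denoise(noisy)
mse_before = np.mean((noisy - clean) ** 2)
mse_after = np.mean((restored - clean) ** 2)
print(mse_before, mse_after)
```

The error against the clean model drops after "compression", which is the paper's point: for naturally noisy data, a coarse coder can improve rather than harm the signal.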
    
    Duval, Laurent & Bui-Tran, V. Compression denoising: using seismic compression for uncoherent noise removal 2001 Proc. EAGE Conf. Tech. Exhib.  inproceedings URL  PDF 
    Abstract: Wavelet related techniques have proved successful in many seismic processing applications, such as filtering or compression. While seismic data compression is not yet widely accepted, we propose a compression based on filter banks as a means to remove uncoherent noise from seismic data, and thus improve the SNR. Results are demonstrated on synthetic data.
    BibTeX:
    @inproceedings{Duval_L_2001_p-eage_com_dscunr,
      author = {Duval, Laurent  C. and Bui-Tran, V.},
      title = {Compression denoising: using seismic compression for uncoherent noise removal},
      booktitle = {Proc. EAGE Conf. Tech. Exhib.},
      publisher = {European Assoc. Geoscientists Eng.},
      year = {2001},
      url = {http://earthdoc.eage.org/detail.php?pubid=4506}
    }
    
    Duval, Laurent & Bui-Tran, V. Compression de données sismiques par ondelettes et GenLOT (Generalized Lapped Orthogonal Transforms) 1999 Réunion théoriciens circuits langue française, pp. 23-24  inproceedings  
    BibTeX:
    @inproceedings{Duval_L_1999_p-rtclf_com_dsog,
      author = {Duval, L. C. and Bui-Tran, V.},
      title = {Compression de données sismiques par ondelettes et GenLOT (Generalized Lapped Orthogonal Transforms)},
      booktitle = {Réunion théoriciens circuits langue française},
      year = {1999},
      pages = {23--24}
    }
    
    Duval, Laurent, Bui-Tran, V., Nguyen, Truong Q. & Tran, T.D. Seismic data compression using GenLOT (Generalized Lapped Orthogonal Transforms): towards "optimality" 2000 Proc. Data Compression Conference (DCC), pp. 552  inproceedings DOI  PDF 
    Abstract: Seismic data compression is desirable in geophysics for both storage and transmission stages. Wavelet coding methods have generated interesting developments, including a real-time field test trial in the North Sea in 1995. Previous work showed that GenLOT with basic optimization also outperforms state-of-the-art biorthogonal wavelet coders for seismic data. In this paper, we focus on the problem of filter bank optimization using various properties of seismic data. It is often desirable to evaluate the compression performance of a transform on a set of data using a priori objective measures, to reduce extensive testing by selecting only good a priori transforms, and to tailor transforms to the statistical properties of the data set. In the scope of this work, we use symmetric AR models up to order 4 to obtain an average model of the horizontal and vertical signals of a seismic stack section. Rosten et al. (1999) have already shown that order 1 or 2 models give good results in filter bank optimization for non-unitary filter banks, using coding gain optimization. Several other criteria may be used for transform optimization. Following the theory in Tran and Nguyen (1999), we use a weighted combination $C_O = k_C C_C + k_S C_S + k_D C_D$ of coding gain, stopband attenuation and DC leakage minimization functions.
    BibTeX:
    @inproceedings{Duval_L_2000_p-dcc_sei_dcgto,
      author = {Duval, L. C. and Bui-Tran, V. and Nguyen, T. Q. and Tran, T. D.},
      title = {Seismic data compression using GenLOT: towards "optimality"},
      booktitle = {Proc. Data Compression Conference (DCC)},
      year = {2000},
      pages = {552},
      doi = {http://dx.doi.org/10.1109/DCC.2000.838199}
    }
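The a priori objective measures in the abstract above can be sketched for the coding-gain term $C_C$ of the weighted criterion: take an AR(1) covariance model of a trace and compute the arithmetic-to-geometric mean ratio of the transform-domain variances. Here an 8-point DCT serves as the orthonormal transform; the AR order, $\rho$ value and function names are illustrative, not the paper's:

```python
import numpy as np

def ar1_covariance(n, rho):
    """Covariance matrix R[i, j] = rho**|i - j| of an AR(1) process,
    a simple a priori model for a seismic trace."""
    idx = np.arange(n)
    return rho ** np.abs(idx[:, None] - idx[None, :])

def coding_gain(T, R):
    """Coding gain of an orthonormal transform T under covariance R:
    arithmetic over geometric mean of the transform-domain variances."""
    variances = np.diag(T @ R @ T.T)
    return np.mean(variances) / np.exp(np.mean(np.log(variances)))

n, rho = 8, 0.95
R = ar1_covariance(n, rho)

# 8-point orthonormal DCT-II, known to approximate the optimal KLT
# for AR(1) processes with rho close to 1.
k = np.arange(n)
T = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
T[0] /= np.sqrt(2.0)

print(coding_gain(np.eye(n), R))  # identity transform: gain 1, no decorrelation
print(coding_gain(T, R))          # DCT: substantially above 1
```

Ranking candidate transforms by such closed-form gains is what lets one "reduce extensive testing by selecting only good a priori transforms".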
    
    Duval, Laurent, Bui-Tran, V., Nguyen, Truong Q. & Tran, T.D. GenLOT optimization techniques for seismic data compression 2000 Proc. International Conference on Acoustic Speech and Signal Processing (ICASSP), pp. 2111-2114  inproceedings DOI  PDF 
    Abstract: GenLOT coding has been shown to be an effective technique for seismic data compression, especially when compared to block-based algorithms (such as JPEG), or to wavelets. The transforms remove statistical redundancy and permit efficient compression, when used with advanced encoding techniques, such as the embedded zerotree coding framework. We derive a model for seismic data based on auto-regressive processes. This model is used to design GenLOT filter banks optimized for seismic data, using objective optimization criteria.
    BibTeX:
    @inproceedings{Duval_L_2000_p-icassp_gen_otsdc,
      author = {Duval, L. C. and Bui-Tran, V. and Nguyen, T. Q. and Tran, T. D.},
      title = {GenLOT optimization techniques for seismic data compression},
      booktitle = {Proc. International Conference on Acoustic Speech and Signal Processing (ICASSP)},
      year = {2000},
      volume = {4},
      pages = {2111--2114},
      doi = {http://dx.doi.org/10.1109/ICASSP.2000.859252}
    }
    
    Duval, Laurent & Galibert, P.-Y. Efficient coherent noise filtering: an application of shift-invariant wavelet denoising 2002 Proc. EAGE Conf. Tech. Exhib.  inproceedings URL  PDF  SLIDES 
    Abstract: Coherent noise or surface wave filtering represents one of the most complex issues in land seismic data processing. Wavelet based filtering has recently begun to challenge the popular and robust frequency-wavenumber ($f$-$k_x$-$k_y$) filter. Wavelet filters provide fine time-scale representations and non-linear filtering capabilities that yield in some instances better results on dispersive coherent noise. We propose in this work an improvement over classical discrete wavelet filtering via the use of shift-invariant wavelets. Though relatively computationally expensive, their theoretical framework enables a closer approximation to continuous wavelets, which results in finer filtering, less subject to aliasing and to wavelet ringing artifacts. Results are demonstrated on real seismic data sets.
    BibTeX:
    @inproceedings{Duval_L_2002_p-eage_eff_cnfasiwd,
      author = {Duval, L. C. and Galibert, P.-Y.},
      title = {Efficient coherent noise filtering: an application of shift-invariant wavelet denoising},
      booktitle = {Proc. EAGE Conf. Tech. Exhib.},
      year = {2002},
      url = {http://earthdoc.eage.org/detail.php?pubid=5867}
    }
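The shift-invariance mentioned in the abstract above is commonly obtained by cycle spinning (Coifman and Donoho): denoise every circular shift of the signal, unshift, and average. A minimal sketch with a one-level Haar stage; the threshold, signal and names are illustrative, not the paper's implementation:

```python
import numpy as np

def haar_denoise(x, thresh):
    """One-level Haar analysis, soft-threshold the detail band, then invert."""
    low = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    high = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    high = np.sign(high) * np.maximum(np.abs(high) - thresh, 0.0)
    y = np.empty_like(x)
    y[0::2] = (low + high) / np.sqrt(2.0)
    y[1::2] = (low - high) / np.sqrt(2.0)
    return y

def cycle_spin_denoise(x, thresh, shifts=8):
    """Shift-invariant denoising: average the shift / denoise / unshift results."""
    acc = np.zeros_like(x)
    for s in range(shifts):
        acc += np.roll(haar_denoise(np.roll(x, s), thresh), -s)
    return acc / shifts

rng = np.random.default_rng(3)
clean = np.repeat([0.0, 1.0, -0.5, 0.5], 256)   # piecewise-constant "events"
noisy = clean + 0.2 * rng.normal(size=clean.size)
plain = haar_denoise(noisy, 0.3)
spun = cycle_spin_denoise(noisy, 0.3)
print(np.mean((plain - clean) ** 2), np.mean((spun - clean) ** 2))
```

Averaging over shifts removes the dependence on where block boundaries fall, which is what reduces the aliasing and ringing artifacts the abstract mentions, at the cost of the extra computation it also acknowledges.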
    
    Duval, Laurent & Nagai, T. Seismic data compression using GULLOTS 2001 Proc. International Conference on Acoustic Speech and Signal Processing (ICASSP), pp. 1765-1768  inproceedings DOI   PDF 
    Abstract: Previous work has shown that GenLOT coding is a very effective technique for compressing seismic data. The role of a transform in a coder is to concentrate information and reduce statistical redundancy. When used with embedded zerotree coding, GenLOTs often provide superior performance to traditional block oriented algorithms or to wavelets. In this work we investigate the use of generalized unequal length lapped orthogonal transforms (GULLOT). Their shorter bases for high-frequency components are suitable for reducing ringing artifacts in images. While GULLOTs yield comparable performance to GenLOTs on smooth seismic signals like stacked sections, they achieve improved performance on less smooth signals such as shot gathers.
    BibTeX:
    @inproceedings{Duval_L_2001_p-icassp_sei_dcgullot,
      author = {Duval, L. C. and Nagai, T.},
      title = {Seismic data compression using GULLOTS},
      booktitle = {Proc. International Conference on Acoustic Speech and Signal Processing (ICASSP)},
      year = {2001},
      volume = {3},
      pages = {1765--1768},
      doi = {http://dx.doi.org/10.1109/ICASSP.2001.941282}
    }
    
    Duval, L.C. & Nguyen, Truong Q. Seismic data compression: a comparative study between GenLOT and wavelet compression 1999 Vol. 3813, Proc. SPIE, Wavelets: Appl. Signal Image Process., pp. 802-810  inproceedings DOI URL  PDF 
    Abstract: A Generalized Lapped Orthogonal Transform (GenLOT) based image coder is used to compress 2-D seismic data sets. Its performance is compared to the results of a wavelet-based image coder. Both algorithms use the same state-of-the-art zerotree coding for consistency and fair comparison. Several parameters such as filter length and objective cost function are varied to find the best suited filter banks. It is found that for raw data, a filter bank with long overlapping filters should be used for processing signals along the time direction, whereas a filter bank with short filters should be used for processing signals along the distance direction. This combination yields the best results.
    BibTeX:
    @inproceedings{Duval_L_1999_p-spie-wasip_sei_dccsbgwc,
      author = {Duval, Laurent and Nguyen, T. Q.},
      title = {Seismic data compression: a comparative study between GenLOT and wavelet compression},
      booktitle = {Proc. SPIE, Wavelets: Appl. Signal Image Process.},
      publisher = {SPIE},
      year = {1999},
      volume = {3813},
      pages = {802--810},
      url = {http://spie.org/x648.html?product_id=366837},
      doi = {http://dx.doi.org/10.1117/12.366837}
    }
    
    Duval, L.C., Nguyen, Truong Q. & Tran, T.D. On Progressive Seismic Data Compression using GenLOT 1999 Proc. Conf. Inform. Sciences Syst. (CISS), pp. 956-959  inproceedings URL  PDF 
    Abstract: Wavelet and subband coding have been shown to be effective techniques for seismic data compression, especially when compared to DCT-based algorithms (such as JPEG), which suffer from blocking artifacts at low bit-rates. The transforms remove statistical redundancy and permit efficient compression. This paper presents a novel use of the Generalized Lapped Orthogonal Transforms (GenLOTs) for compression of 24-bit seismic data. The proposed implementation provides a better frequency partition than that of the wavelet scheme. It is used within the Embedded Zerotree Wavelet (EZW) framework, an embedded quantization scheme, allowing exact bit rate compression and incorporating quality control features. The proposed coder has better performance in SNR compared to the state-of-the-art Set Partitioning in Hierarchical Trees (SPIHT) algorithm.

    BibTeX:
    @inproceedings{Duval_L_1999_p-ciss_pro_sdcg,
      author = {Duval, L. C. and Nguyen, T. Q. and Tran, T. D.},
      title = {On Progressive Seismic Data Compression using GenLOT},
      booktitle = {Proc. Conf. Inform. Sciences Syst. (CISS)},
      year = {1999},
      pages = {956--959},
      url = {http://thanglong.ece.jhu.edu/CISS/fa6.html}
    }
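The "exact bit rate" property of embedded coding noted in the abstract above comes from successive refinement: coefficients are sent sign first, then magnitude bitplanes from most to least significant, so the stream can be cut after any whole plane and still decode to a coarser approximation. A toy sketch of this mechanism (not the EZW/GenLOT coder itself; all names are mine):

```python
import numpy as np

PLANES = 8  # magnitude resolution in bits

def bitplane_encode(coeffs):
    """Emit one sign bit per coefficient, then magnitude bitplanes, MSB first."""
    scale = np.max(np.abs(coeffs))
    mags = np.round(np.abs(coeffs) / scale * (2 ** PLANES - 1)).astype(int)
    stream = list(np.signbit(coeffs).astype(int))
    for p in range(PLANES - 1, -1, -1):
        stream.extend((mags >> p) & 1)
    return stream, scale

def bitplane_decode(stream, n, scale):
    """Decode however many whole bitplanes arrived; missing planes read as zeros."""
    signs = np.where(np.array(stream[:n]) == 1, -1.0, 1.0)
    bits = stream[n:]
    received = len(bits) // n
    mags = np.zeros(n, dtype=int)
    for p in range(received):
        mags = (mags << 1) | np.array(bits[p * n:(p + 1) * n])
    mags <<= PLANES - received
    return signs * mags / (2 ** PLANES - 1) * scale

rng = np.random.default_rng(4)
coeffs = rng.normal(size=16)
stream, scale = bitplane_encode(coeffs)
full = bitplane_decode(stream, 16, scale)                # all 8 planes received
half = bitplane_decode(stream[:16 + 4 * 16], 16, scale)  # truncated after 4 planes
print(np.max(np.abs(full - coeffs)), np.max(np.abs(half - coeffs)))
```

Cutting the stream at exactly the bit budget gives the exact bit rate, and decoding a prefix gives the progressively refined reconstruction used for quality control.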
    
    Duval, Laurent, Nguyen, Truong Q. & Tran, T.D. Seismic data compression and QC using GenLOT 1999 Proc. EAGE Conf. Tech. Exhib., pp. P103  inproceedings URL  PDF 
    Abstract: Modern seismic surveys with higher-precision digitization (24-bit A/D converters) have led to ever increasing amounts of seismic data. Management of these large datasets becomes critical, not only for transmission, but also for storage, processing and interpretation.

    Compression algorithms have been proposed in the geophysical community over the past few years as a way to effectively manage seismic data. For instance, wavelet based compression algorithms can represent seismic data using only a fraction of the original data size. Recently, several works on the generalized lapped transform (GenLOT) have demonstrated its advantage over wavelets for conventional image compression. Its features are well suited to seismic data properties and have led to better results than wavelets, in terms of signal-to-noise ratio.

    In this study we compare GenLOT with wavelet compression results. We have implemented the algorithms in a new embedded scheme allowing exact bit rate compression and providing straightforward quality control (QC) checks. Simulations are performed on several types of datasets, including shot gathers, CDP gathers and stack sections. The paper is organized as follows: Section 2 briefly reviews seismic data compression, and discusses transform-based coding and the idea of embedded zerotree coding, into which quality control can easily be incorporated. Section 3 reviews GenLOT theory and motivates why the GenLOT transform should give better performance compared to the wavelet transform. Section 4 discusses the simulation results, comparing the performance of the GenLOT-based and wavelet-based algorithms. Section 5 concludes the paper and discusses future work.

    BibTeX:
    @inproceedings{Duval_L_1999_p-eage_sei_dcqcg,
      author = {Duval, L. C. and Nguyen, T. Q. and Tran, T. D.},
      title = {Seismic data compression and QC using GenLOT},
      booktitle = {Proc. EAGE Conf. Tech. Exhib.},
      publisher = {European Assoc. Geoscientists Eng.},
      year = {1999},
      pages = {P103},
      url = {http://earthdoc.eage.org/detail.php?pubid=32003}
    }
    
    Duval, Laurent, Oksman, J. & Nguyen, Truong Q. A new class of filter banks for seismic data compression 1999 SEG Annual International Meeting, pp. 1907-1910  inproceedings DOI URL  PDF
    Abstract: Reducing the volume of seismic data would substantially improve system management for both transmission and storage purposes. We propose in this paper a new class of filter banks (GenLOTs) for seismic data compression. GenLOT is a generalization of local transforms with overlapping windows. The transforms are used in an embedded coding scheme, incorporating quality control features and allowing exact bit rate compression. Comparing GenLOTs with wavelets in seismic compression, the simulations verify that GenLOTs offer better performance than wavelets at a constant distortion rate, achieving much higher compression ratios. Furthermore, coherent noise is reduced significantly in the GenLOT-based coder, which allowed compression of stack sections at a compression ratio of $150:1$ without visible loss.
    BibTeX:
    @inproceedings{Duval_L_1999_p-seg_new_cfbsdc,
      author = {L. C. Duval and J. Oksman and T. Q. Nguyen},
      title = {A new class of filter banks for seismic data compression},
      booktitle = {Annual International Meeting},
      publisher = {SEG, Soc. Expl. Geophysicists},
      year = {1999},
      volume = {18},
      number = {1},
      pages = {1907--1910},
      url = {http://dx.doi.org/10.1190/1.1820920},
      doi = {http://dx.doi.org/10.1190/1.1820920}
    }
    

    Created by JabRef on 30/04/2011.