If you cannot find anything more, look for something else (Bridget Fountain)
Update 2024/01/10: let us target the 50th anniversary of Lawrence C. Wood's 1974 paper "Seismic data compression methods", published in Geophysics. We are still working on a topic related to reservoir modelling, with 3D mesh multiscale compression (properties and geometry): HexaShrink, an exact scalable framework for hexahedral meshes with attributes and discontinuities: multiresolution rendering and storage of geoscience models, published in Computational Geosciences
This page documents an ongoing effort to collect and review (almost) all the works performed in the area of geophysical trace and seismic data compression (aka seismic data coding or trace condensation). The very first paper I know of on the compression of seismic traces is "Représentation condensée des traces sismiques" (condensed representation of seismic traces), by Pierre Bois (then at IFPEN) and Gérard Grau, 1969, revue du CÉTHEDEC (Centre d'études théoriques de la détection et des communications).
Before that, folks like Norman Ricker talked about (seismic) wavelet representations, but that was more a model for linear processing than genuine compression for storage or transfer. Indeed, the Ricker wavelet (or Mexican hat) does not convey the same scale-space structure as the normalized second derivative of a Gaussian function, the "Marr wavelet". Later, Lawrence C. Wood, with "Seismic data compression methods" (Geophysics, 1974), became much better known; he focused, along with Pierre Bois (again), on Hadamard transforms (aka Walsh, Paley, or Waleymard functions) and adapted quantization. There was a nice period around 1996, with the SDCI consortium (Seismic Data Compression Initiative), which I took part in, led by Chevron folks; standard discrete wavelets became quite popular then. Maybe because Paul Donoho (Chevron) had David Donoho for a son. We will explore both lossless packing and lossy compression. We are now deep into a "big data/data science" period where seismic exploration produces petabytes of seismic traces, and where seismic data management, storage, processing and visualization represent a HUGE problem. Seismic data processing ought to become data-centric. Main issues: to develop sound quality metrics, and to convince geophysicists that careful compression can IMPROVE seismic trace quality, even via processing.
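The Hadamard (Walsh) transform that Wood and Bois relied on is attractive because it requires only additions and subtractions. A minimal sketch of the fast butterfly, in Python with numpy (the function name is mine, not from any of the papers above):

```python
import numpy as np

def fwht(x):
    """Fast Walsh-Hadamard transform (natural/Hadamard ordering).

    The length of x must be a power of two. Uses only adds/subtracts,
    which is what made the transform appealing on 1970s hardware.
    """
    x = np.asarray(x, dtype=float).copy()
    n = len(x)
    h = 1
    while h < n:
        for i in range(0, n, 2 * h):
            a = x[i:i + h].copy()
            b = x[i + h:i + 2 * h].copy()
            x[i:i + h] = a + b        # sum butterfly
            x[i + h:i + 2 * h] = a - b  # difference butterfly
        h *= 2
    return x
```

Applying the transform twice and dividing by the length recovers the input, so the same routine serves as analysis and (up to scaling) synthesis, e.g. `fwht(fwht(trace)) / len(trace)` gives back `trace`.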
Abstract: Seismic data volumes, which require huge transmission capacities and massive storage media, continue to increase rapidly due to acquisition of 3D and 4D multiple streamer surveys, multicomponent data sets, reprocessing of prestack seismic data, calculation of post-stack seismic data attributes, etc. We consider lossy compression as an important tool for efficient handling of large seismic data sets. We present a 2D lossy seismic data compression algorithm, based on sub-band coding, and we focus on adaptation and optimization of the method for common-offset gathers. The sub-band coding algorithm consists of five stages: first, a preprocessing phase using an automatic gain control to decrease the non-stationary behaviour of seismic data; second, a decorrelation stage using a uniform analysis filter bank to concentrate the energy of seismic data into a minimum number of sub-bands; third, an iterative classification algorithm, based on an estimation of variances of blocks of sub-band samples, to classify the sub-band samples into a fixed number of classes with approximately the same statistics; fourth, a quantization step using a uniform scalar quantizer, which gives an approximation of the sub-band samples to allow for high compression ratios; and fifth, an entropy coding stage using a fixed number of arithmetic encoders matched to the corresponding statistics of the classified and quantized sub-band samples to achieve compression. Decompression basically performs the opposite operations in reverse order. We compare the proposed algorithm with three other seismic data compression algorithms. The high performance of our optimized sub-band coding method is supported by objective and subjective results.
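To make the first and fourth of those stages concrete, here is a minimal sketch, not the authors' code, of an AGC preprocessor and a uniform scalar quantizer in Python; the window length and quantization step are arbitrary choices of mine:

```python
import numpy as np

def agc(trace, win=64):
    """Automatic gain control: normalize by a sliding RMS envelope.

    Returns the equalized trace and the envelope, which must be kept
    to undo the gain at decompression time.
    """
    power = np.convolve(trace ** 2, np.ones(win) / win, mode="same")
    envelope = np.sqrt(power) + 1e-12   # guard against silent zones
    return trace / envelope, envelope

def quantize(x, step):
    """Uniform scalar (mid-tread) quantizer: real values -> integer indices."""
    return np.round(x / step).astype(int)

def dequantize(indices, step):
    """Reconstruct approximate values; error is at most step/2 per sample."""
    return indices * step
```

Decompression then runs `dequantize`, the synthesis filter bank, and a multiplication by the stored envelope, i.e. "the opposite operations in reverse order", as the abstract says.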
We have a wavelet-based compression package that runs in a DOS window on your PC or in an X window on your Unix workstation (Sun Solaris 2.8). This software package is available for a free trial. It can compress industry-standard SEG-Y IBM floating-point format datasets by a factor of 10 without visible loss of signal definition. It is ideal for remote-location stack and limited prestack data transfers. The input data should be reasonably well modulated.
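SEG-Y historically stores samples as IBM System/360 32-bit floats (a sign bit, a base-16 exponent biased by 64, and a 24-bit fraction with no hidden bit), which generally have to be converted to IEEE values before any transform coding. A decoder sketch for a single word:

```python
def ibm32_to_float(word):
    """Decode one IBM System/360 32-bit float, given as an unsigned int.

    Layout: 1 sign bit | 7-bit base-16 exponent (bias 64) | 24-bit fraction.
    """
    sign = -1.0 if (word >> 31) & 1 else 1.0
    exponent = (word >> 24) & 0x7F        # base-16 exponent, bias 64
    fraction = word & 0x00FFFFFF          # 24-bit fraction, no hidden bit
    return sign * (fraction / 2.0 ** 24) * 16.0 ** (exponent - 64)
```

For instance, the word `0x42640000` decodes to `100.0`. Because the radix is 16 rather than 2, IBM floats can lose up to three leading bits of precision relative to IEEE single precision, one reason exact lossless round-tripping of SEG-Y data needs care.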
SeedCodec is a collection of compression and decompression routines for standard seismic data formats in Java. The goal is to support all the formats available within SEED, but submissions from the broader community are needed to accomplish this.
WTCOMP compresses data by Wavelet Transform.
WPC1COMP2 compresses a 2D seismic section trace-by-trace using (1D) wavelet packets.
WPCCOMPRESS compresses a 2D section using wavelet packets.
DCTCOMP compresses data by Discrete Cosine Transform.
SUPACK1 packs SEG-Y trace data into chars.
SUPACK2 packs SEG-Y trace data into 2 byte shorts.
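Packing of the SUPACK1/SUPACK2 kind amounts to a per-trace scale-and-round into a narrow integer type. A hedged sketch (the one-scale-factor-per-trace convention below is my assumption for illustration, not necessarily what the SU codes do internally):

```python
import numpy as np

def pack_trace(trace, dtype=np.int8):
    """Pack a float trace into chars (int8) or shorts (int16) plus a scale."""
    limit = np.iinfo(dtype).max           # 127 for int8, 32767 for int16
    peak = np.max(np.abs(trace))
    scale = peak / limit if peak > 0 else 1.0
    return np.round(trace / scale).astype(dtype), scale

def unpack_trace(packed, scale):
    """Approximate inverse: worst-case error is half a quantization step."""
    return packed.astype(np.float64) * scale
```

With int8 this is a fixed 4:1 lossy reduction of 32-bit samples before any entropy coding; int16 only halves the data but at much lower distortion.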
SeisPact is a specialty compression package designed specifically for two-dimensional gathers of seismic data. It is currently used to compress seismic data on board seismic acquisition vessels for transmission over expensive satellite links to a shore-based processing center.
This page contains some seismic data. The intention is to make example data available to everybody who wants to use real seismic data to test their programs or functions. It should make it possible for those of us working with seismic data, and compression in particular, to use some common data.
Calgary Corpus is a collection of text and binary data files, commonly used for comparing data compression algorithms. It was created by Ian Witten, Tim Bell and John Cleary at the University of Calgary in 1987 and was widely used in the 1990s. In 1997 it was superseded by the Canterbury Corpus, but the Calgary Corpus still exists for comparison and remains useful for its originally intended purpose. Its GEO file (102,400 bytes of 32-bit numbers in IBM floating-point format) contains seismic data.
Author | Title | Year | Journal/Proceedings | Reftype | DOI/URL |
---|---|---|---|---|---|
Duval, Laurent | Compression de données sismiques : bancs de filtres et transformées étendues, synthèse et adaptation (Seismic data compression: filter banks and extended transforms, synthesis and adaptation) | 2000 | School: Université Paris-Sud Orsay | phdthesis | PDF-1 PDF-2 URL1 URL2 |
Abstract: The algorithms most commonly used for seismic data compression rely on wavelet or wavelet-packet transforms. The transform coefficients are generally quantized and then coded with classical entropy-coding techniques. We first propose a compression algorithm based on the wavelet transform, combined with a zerotree coding technique, a novel use in the seismic context. The use of classical wavelet transforms, however, remains a relatively rigid approach, whereas it is often desirable to adapt the transforms to the properties of each type of signal. We therefore propose a second algorithm that replaces wavelets with a family of so-called "extended transforms". These transforms, derived from filter bank theory, are parameterized; well-known examples are the LOTs (lapped orthogonal transforms) of H. Malvar and the GenLOTs (generalized lapped orthogonal transforms) of de Queiroz et al. We propose several criteria for optimizing these parameters, allowing the construction of extended transforms adapted to the properties of seismic signals. We show that these transforms can be used with a coefficient coding scheme analogous to the tree-based coding used for wavelets. The two proposed compression algorithms offer the ability to choose the compression ratio precisely, to compress the data block by block (in the case of extended transforms), and to decompress the data partially, for quality control or visualization. The performance of both algorithms is tested on a set of real seismic data and evaluated under different quality measures. We also compare their performance to other compression algorithms. | |||||
BibTeX:
@phdthesis{Duval_L_2000_phd_com_dsbftesa, author = {Duval, Laurent}, title = {Compression de données sismiques : bancs de filtres et transformées étendues, synthèse et adaptation}, school = {Université Paris-Sud Orsay}, year = {2000}, note = {(Seismic data compression)} } |
|||||
Duval, Laurent & Røsten, T. | Filter bank decomposition of seismic data with application to compression and denoising | 2000 | SEG Annual International Meeting, pp. 2055-2058 | inproceedings | DOI URL PDF |
Abstract: The use of discrete wavelet based analysis, feature extraction, denoising, and compression methods have led to extremely interesting developments in the field of seismic data processing. Notwithstanding, discrete wavelets belong to a wider class of filter banks. The use of more general filter banks allows the design of filter coefficients matching the signal's properties. Consequently, general filter banks bring forth the performance of discrete wavelet based seismic data processing techniques. In this paper, we discuss basics of general filter bank theory, and its applications to seismic data compression and denoising. We show that properly designed filter banks are able to outperform discrete wavelets in both instances. |
|||||
BibTeX:
@inproceedings{Duval_L_2000_p-seg_fil_bdsdacd, author = {Duval, Laurent and Røsten, T.}, title = {Filter bank decomposition of seismic data with application to compression and denoising}, booktitle = {SEG Annual International Meeting}, publisher = {Soc. Expl. Geophysicists}, year = {2000}, pages = {2055--2058}, url = {http://dx.doi.org/10.1190/1.1815847}, doi = {http://dx.doi.org/10.1190/1.1815847} } |
|||||
Duval, Laurent | Simultaneous seismic compression and denoising using a lapped transform coder | 2002 | Proc. International Conference on Acoustic Speech and Signal Processing (ICASSP), pp. 1269-1272 | inproceedings | DOI PDF |
Abstract: Compression and denoising are two of the most successful applications of wavelets to signals and natural images. Both techniques have also been successfully applied to seismic signals, but compression is not widely accepted yet, since it is often believed to harm seismic information. Trying to look at compression and denoising in another direction, this work stresses the idea that they could be viewed as two sides of the same coin. As a result, in the case of naturally noisy seismic data, compression could be seen as a denoising tool, instead of a mere noise source. We substantiate this statement on a noise-free seismic data model and actual seismic field data. We show that, depending on the amount of initial ambient noise in the data, a lapped transform coder with embedded zerotree coding may be able to effectively denoise seismic data, over a wide range of compression ratios. | |||||
BibTeX:
@inproceedings{Duval_L_2002_p-icassp_sim_scdltc, author = {Duval, Laurent C.}, title = {Simultaneous seismic compression and denoising using a lapped transform coder}, booktitle = {Proc. International Conference on Acoustic Speech and Signal Processing (ICASSP)}, year = {2002}, volume = {2}, pages = {1269--1272}, doi = {http://dx.doi.org/10.1109/ICASSP.2002.5744033} } |
|||||
Duval, Laurent & Bui-Tran, V. | Compression denoising: using seismic compression for uncoherent noise removal | 2001 | Proc. EAGE Conf. Tech. Exhib. | inproceedings | URL PDF |
Abstract: Wavelet related techniques have been proved successful in many seismic processing applications, such as filtering or compression. While seismic data compression is not yet widely accepted, we propose a compression based on filter banks as a means to remove uncoherent noise from seismic data, and thus improve the SNR. Results are demonstrated on synthetic data. |
|||||
BibTeX:
@inproceedings{Duval_L_2001_p-eage_com_dscunr, author = {Duval, Laurent C. and Bui-Tran, V.}, title = {Compression denoising: using seismic compression for uncoherent noise removal}, booktitle = {Proc. EAGE Conf. Tech. Exhib.}, publisher = {European Assoc. Geoscientists Eng.}, year = {2001}, url = {http://earthdoc.eage.org/detail.php?pubid=4506} } |
|||||
Duval, Laurent & Bui-Tran, V. | Compression de données sismiques par ondelettes et GenLOT (Seismic data compression using wavelets and Generalized Lapped Orthogonal Transforms) | 1999 | Réunion théoriciens circuits langue française, pp. 23-24 | inproceedings | |
BibTeX:
@inproceedings{Duval_L_1999_p-rtclf_com_dsog, author = {Duval, L. C. and Bui-Tran, V.}, title = {Compression de données sismiques par ondelettes et GenLOT (Generalized Lapped Orthogonal Transforms)}, booktitle = {Réunion théoriciens circuits langue française}, year = {1999}, pages = {23--24} } |
|||||
Duval, Laurent, Bui-Tran, V., Nguyen, Truong Q. & Tran, T.D. | Seismic data compression using GenLOT (Generalized Lapped Orthogonal Transforms): towards "optimality" | 2000 | Proc. Data Compression Conference (DCC), pp. 552 | inproceedings | DOI PDF |
Abstract: Seismic data compression is desirable in geophysics for both storage and transmission stages. Wavelet coding methods have generated interesting developments, including a real-time field test trial in the North Sea in 1995. Previous work showed that GenLOT with basic optimization also outperforms state-of-the-art biorthogonal wavelet coders for seismic data. In this paper, we focus on the problem of filter bank optimization using various properties of seismic data. It is often desirable to evaluate the compression performance of a transform on a set of data using a priori objective measures, to reduce extensive testing by selecting only good a priori transforms, and to tailor transforms to the statistical properties of the data set. In the scope of this work, we use symmetric AR models up to order 4 to obtain an average model of the horizontal and vertical signals of a seismic stack section. Rosten et al. (1999) have already shown that order 1 or 2 models give good results in filter bank optimization for non-unitary filter banks, using coding gain optimization. Several other criteria may be used for transform optimization. Following the theory in Tran and Nguyen (1999), we use a weighted combination $C_O = k_C C_C + k_S C_S + k_D C_D$ of coding gain, stopband attenuation and DC leakage minimization functions. | |||||
BibTeX:
@inproceedings{Duval_L_2000_p-dcc_sei_dcgto, author = {Duval, L. C. and Bui-Tran, V. and Nguyen, T. Q. and Tran, T. D.}, title = {Seismic data compression using GenLOT: towards "optimality"}, booktitle = {Proc. Data Compression Conference (DCC)}, year = {2000}, pages = {552}, doi = {http://dx.doi.org/10.1109/DCC.2000.838199} } |
|||||
Duval, Laurent, Bui-Tran, V., Nguyen, Truong Q. & Tran, T.D. | GenLOT optimization techniques for seismic data compression | 2000 | Proc. International Conference on Acoustic Speech and Signal Processing (ICASSP), pp. 2111-2114 | inproceedings | DOI PDF |
Abstract: GenLOT coding has been shown an effective technique for seismic data compression, especially when compared to block-based algorithms (such as JPEG), or to wavelets. The transforms remove statistical redundancy and permit efficient compression, when used with advanced encoding techniques, such as the embedded zerotree coding framework. We derive a model for seismic data based on auto-regressive processes. This model is used to design GenLOT filter banks optimized for seismic data, using objective optimization criteria. | |||||
BibTeX:
@inproceedings{Duval_L_2000_p-icassp_gen_otsdc, author = {Duval, L. C. and Bui-Tran, V. and Nguyen, T. Q. and Tran, T. D.}, title = {GenLOT optimization techniques for seismic data compression}, booktitle = {Proc. International Conference on Acoustic Speech and Signal Processing (ICASSP)}, year = {2000}, volume = {4}, pages = {2111--2114}, doi = {http://dx.doi.org/10.1109/ICASSP.2000.859252} } |
|||||
Duval, Laurent & Galibert, P.-Y. | Efficient coherent noise filtering: an application of shift-invariant wavelet denoising | 2002 | Proc. EAGE Conf. Tech. Exhib. | inproceedings | URL PDF SLIDES |
Abstract: Coherent noise or surface wave filtering represents one of the most complex issues in land seismic data processing. Wavelet-based filtering has recently begun to challenge the popular and robust frequency-wavenumber ($f-k_x-k_y$) filter. Wavelet filters provide fine time-scale representations and non-linear filtering capabilities that yield in some instances better results on dispersive coherent noise. We propose in this work an improvement over classical discrete wavelet filtering via the use of shift-invariant wavelets. Though relatively computationally expensive, their theoretical framework enables a closer approximation to the continuous wavelets, which results in finer filtering, less subject to aliasing and to wavelet ringing artifacts. Results are demonstrated on real seismic data sets. |
|||||
BibTeX:
@inproceedings{Duval_L_2002_p-eage_eff_cnfasiwd, author = {Duval, L. C. and Galibert, P.-Y.}, title = {Efficient coherent noise filtering: an application of shift-invariant wavelet denoising}, booktitle = {Proc. EAGE Conf. Tech. Exhib.}, year = {2002}, url = {http://earthdoc.eage.org/detail.php?pubid=5867} } |
|||||
Duval, Laurent & Nagai, T. | Seismic data compression using GULLOTS | 2001 | Proc. International Conference on Acoustic Speech and Signal Processing (ICASSP), pp. 1765-1768 | inproceedings | DOI PDF |
Abstract: Previous work has shown that GenLOT coding is a very effective technique for compressing seismic data. The role of a transform in a coder is to concentrate information and reduce statistical redundancy. When used with embedded zerotree coding, GenLOTs often provide superior performance to traditional block oriented algorithms or to wavelets. In this work we investigate the use of generalized unequal length lapped orthogonal transforms (GULLOT). Their shorter bases for high-frequency components are suitable for reducing ringing artifacts in images. While GULLOTs yield comparable performance to GenLOTs on smooth seismic signals like stacked sections, they achieve improved performance on less smooth signals such as shot gathers | |||||
BibTeX:
@inproceedings{Duval_L_2001_p-icassp_sei_dcgullot, author = {Duval, L. C. and Nagai, T.}, title = {Seismic data compression using GULLOTS}, booktitle = {Proc. International Conference on Acoustic Speech and Signal Processing (ICASSP)}, year = {2001}, volume = {3}, pages = {1765--1768}, doi = {http://dx.doi.org/10.1109/ICASSP.2001.941282} } |
|||||
Duval, L.C. & Nguyen, Truong Q. | Seismic data compression: a comparative study between GenLOT and wavelet compression | 1999 | Vol. 3813, Proc. SPIE, Wavelets: Appl. Signal Image Process., pp. 802-810 | inproceedings | DOI URL PDF |
Abstract: Generalized Lapped Orthogonal Transform (GenLOT) based image coder is used to compress 2-D seismic data sets. Its performance is compared to the results using wavelet-based image coder. Both algorithms use the same state-of-the-art zerotree coding for consistency and fair comparison. Several parameters such as filter length and objective cost function are varied to find the best suited filter banks. It is found that for raw data, filter bank with long overlapping filters should be used for processing signals along the time direction whereas filter bank with short filters should be used for processing signal along the distance direction. This combination yields the best results. | |||||
BibTeX:
@inproceedings{Duval_L_1999_p-spie-wasip_sei_dccsbgwc, author = {Duval, Laurent and Nguyen, T. Q.}, title = {Seismic data compression: a comparative study between GenLOT and wavelet compression}, booktitle = {Proc. SPIE, Wavelets: Appl. Signal Image Process.}, publisher = {SPIE}, year = {1999}, volume = {3813}, pages = {802--810}, url = {http://spie.org/x648.html?product_id=366837}, doi = {http://dx.doi.org/10.1117/12.366837} } |
|||||
Duval, L.C., Nguyen, Truong Q. & Tran, T.D. | On Progressive Seismic Data Compression using GenLOT | 1999 | Proc. Conf. Inform. Sciences Syst. (CISS), pp. 956-959 | inproceedings | URL PDF |
Abstract: Wavelet and subband coding have been shown effective techniques for seismic data compression, especially when compared to DCT-based algorithms (such as JPEG), which suffer from blocking artifacts at low bit-rates. The transforms remove statistical redundancy and permit efficient compression. This paper presents a novel use of the Generalized Lapped Orthogonal Transform (GenLOT). |
|||||
BibTeX:
@inproceedings{Duval_L_1999_p-ciss_pro_sdcg, author = {Duval, L. C. and Nguyen, T. Q. and Tran, T. D.}, title = {On Progressive Seismic Data Compression using GenLOT}, booktitle = {Proc. Conf. Inform. Sciences Syst. (CISS)}, year = {1999}, pages = {956--959}, url = {http://thanglong.ece.jhu.edu/CISS/fa6.html} } |
|||||
Duval, Laurent, Nguyen, Truong Q. & Tran, T.D. | Seismic data compression and QC using GenLOT | 1999 | Proc. EAGE Conf. Tech. Exhib., pp. P103 | inproceedings | URL PDF |
Abstract: Modern seismic surveys with higher-precision digitization (24-bit A/D converters) have led to ever increasing amounts of seismic data. Management of these large datasets becomes critical, not only for transmission, but also for storage, processing and interpretation. Compression algorithms have been proposed to address this issue. In this study, we compare GenLOT with wavelet compression results on shot gathers, CDP gathers and stack sections. |
|||||
BibTeX:
@inproceedings{Duval_L_1999_p-eage_sei_dcqcg, author = {Duval, L. C. and Nguyen, T. Q. and Tran, T. D.}, title = {Seismic data compression and QC using GenLOT}, booktitle = {Proc. EAGE Conf. Tech. Exhib.}, publisher = {European Assoc. Geoscientists Eng.}, year = {1999}, pages = {P103}, url = {http://earthdoc.eage.org/detail.php?pubid=32003} } |
|||||
Duval, Laurent, Oksman, J. & Nguyen, Truong Q. | A new class of filter banks for seismic data compression | 1999 | SEG Annual International Meeting, pp. 1907-1910 | inproceedings | DOI URL PDF |
Abstract: Reducing the volume of seismic data would substantially improve the system management for both transmission and storage purposes. We propose in this paper a new class of filter banks (GenLOTs) for seismic data compression. GenLOT is a generalization of local transforms with overlapping windows. The transforms are used in an embedded coding scheme, incorporating quality-control features and allowing exact bit-rate compression. Comparing GenLOTs with wavelets in seismic compression, the simulations verify that GenLOTs offer better performance than wavelets at a constant distortion rate, achieving much higher compression ratios. Furthermore, coherent noise is reduced significantly in the GenLOT-based coder, which allowed compression of stack sections at a compression ratio of $150:1$ without visible loss. |
|||||
BibTeX:
@inproceedings{Duval_L_1999_p-seg_new_cfbsdc, author = {L. C. Duval and J. Oksman and T. Q. Nguyen}, title = {A new class of filter banks for seismic data compression}, booktitle = {Annual International Meeting}, publisher = {SEG, Soc. Expl. Geophysicists}, year = {1999}, volume = {18}, number = {1}, pages = {1907--1910}, url = {http://dx.doi.org/10.1190/1.1820920}, doi = {http://dx.doi.org/10.1190/1.1820920} } |
Created by JabRef on 30/04/2011.