
Double Vision: Case histories showing the value of dual processing for 3D surveys. (The Leading Edge, Nov. 98)


Jack Bouska, Exploration and Production Technology Group, Amoco Canada Petroleum Company

It appears that there are as many unique ways to process a particular 3-D seismic survey correctly as there are processing companies to perform the task. Geophysicists experienced with reprocessing seismic data, or with selecting processing contractors through simultaneous-processing ("turkey shoot") procedures, are familiar with the idiom that every time someone new touches the data, it comes out looking different. During testing or reprocessing, an interpreter's choice of the best product is influenced by a mix of objective and subjective criteria. Naturally we desire wide temporal and spatial bandwidth, low noise, strong continuity, and a good match to wellbore information (or to some preconceived notion of the subsurface geology). Geophysicists are occasionally left with the nagging doubt that there may be more information embedded in the raw field data than can be coaxed onto the final migrated image, particularly after running countless tests in a fruitless search for the elusive optimum set of processing parameters. This belief is reinforced whenever we have a dataset reprocessed to take advantage of some new breakthrough in seismic processing, and are subsequently rewarded with a much higher fidelity image from the legacy field data. In light of this, it may be more prudent to rephrase the first sentence of this paragraph to read: no matter who we get to process our data, there is probably someone else out there who can improve on it ... to a degree.

Drawing on my own experience, I believe there is considerable uncertainty about the final quality one can realistically expect for a particular 3-D dataset when it is processed by one individual on a given set of software. Processing skill, experience, and the appropriateness of a given software package to seismic data from a specific geographic area are not uniform from one processing company to another. Even if we were to assume processing companies to be static organizations (which they are not), past performance on one type of data does not guarantee similar quality in the future. This is especially true if the geographic area, or the 3-D acquisition design, is changed. Add to this the internal dynamics of a processing company, with possible staff changes and continuous software modifications, and it is easy to see why interpreters are wise to maintain close contact with their seismic processors. This means that every time we initiate a processing project, we must accept some level of risk associated with the image quality of the final product, and suppress any decision regret arising from doubts about our selection of a single vendor for the processing contract.

Dual processing for risk abatement

Rather than admitting defeat before we start, it would be wise to turn this inherent uncertainty regarding processing performance into an advantage. By simply lifting the artificial restriction that only one processing company is permitted to perform the work, we can alleviate some of the inevitable concern about choosing the best contractor for a particular processing job. Over the last four years, Amoco Canada has routinely employed the uncommon practice of dual processing for about one third of our 3-D surveys. Dual processing of land 3-D surveys is defined as having the same 3-D dataset processed by two companies concurrently. This unusual procedure has evolved from a status of wasteful extravagance into a valuable, and often needed, form of insurance. Dual processing is not a new idea. Many geophysicists have sent data out to a group of processing companies to evaluate capability using the turkey-shoot procedure. To my knowledge, one of the most notable of these was performed by Paul Favret while working in Amoco's Denver office during 1994, when he managed to send a single 3-D dataset to eight different processing contractors. He was preparing to select a preferred partner for a processing alliance in preparation for a massive 3-D acquisition effort in the Wyoming basin. Not surprisingly, processing results from the small 9 sq. mi. 3-D showed tremendous variation in final image quality among the eight vendors. Other examples of dual processing were recently presented by J. Allen during the Geophysical Society of Houston Spring Symposium & 13th Annual SEG Gulf Coast Meeting. The presentation introduced a skeptical audience to the unusual idea of using two different processors for each 3-D seismic project. I intend to amplify those ideas by illustrating how Amoco Canada has adopted the systematic use of dual processing on a routine basis.
Results over the last four years have provided numerous 3-D seismic survey processing examples which illustrate the strong differences between final processed 3-D volumes, and highlight the risk of relying on a single vendor. When short turnaround time and high image quality are simultaneously required, the initial expense of the extra processing is often rewarded with the dividends of fast, high quality, results.

Why use dual processing?

Dual processing appears to be an expensive (double the processing cost) duplication of effort, and might be viewed as obsessive. It may also put a strain on manpower and resources, requiring more Q.C. effort, more interpretation work, more data handling, and additional computer resources (for example, double the disk space for the final 3-D data volumes). It is often difficult to rationalize the extra cost, especially considering that capital spent on data acquisition buys an asset (the data), while duplication of processing work for added insurance is not an investment. Bearing these concerns in mind, there remain some compelling reasons justifying the expense of dual processing.

Ultimately, the quality of the final 3-D image is limited by four important variables:

1) Geographic location of the 3-D dataset determines signal to noise ratio via influence of near surface conditions, topography, and subsurface structure (i.e., some areas are renowned for bad data quality).

2) 3-D survey design is particularly important for land surveys. The 3-D survey design process is easily an order of magnitude more complicated than 2-D survey design, primarily due to equipment, surface occupation, and budget constraints.

3) Processing systems: algorithm differences (and bugs) in various software packages either limit or enhance ultimate interpretability, while hardware limitations may go unnoticed until an unusually large 3-D survey is sent to a particular contractor.

4) Human factors such as processor skill, geophysicist Q.C., and cycle-time constraints all affect the quality of the finished product.

Dual processing benefits

Dual processing cannot mitigate difficulties associated with 3-D survey location (or design), but it can partially compensate for problems, or differences, in processing systems and processor skill. Often millions of dollars are spent during acquisition, with almost a hundred people deployed in the field, while the entire 3-D dataset funnels through the hands of a single processing geophysicist. Dual processing can also be used to evaluate new processing contractors, personnel, and software algorithms, to aid in future contract decisions. Also healthy competition can sometimes bring out the best in a processor.

Dual processing of 3-D data is one method of helping to ensure that the best possible migrated 3-D volume is available for interpretation. Dual processing of a 3-D survey may be initiated simultaneously, as a safeguard against turnaround delays or to shorten the overall cycle time. It may also be initiated in a sequential, overlapped fashion, after performance of the initial contractor falls below an acceptable standard, or as an aid in determining whether problems are intrinsic to a specific dataset or to a particular vendor. (See the various starting dates in Table 1.)

Table 1. Cost, quality, and turnaround are all important issues, as summarized below (*cost in Canadian dollars):

Survey (area)      Contractor   Start - End         Weeks   Cost*    Quality   Comments
A (90 sq. km)      A            19Jan93 - 17Jun93   21      $26.5k   Medium    S/W bug
                   B            08Apr93 - 19May93    6      $21.3k   High
B (45 sq. km)      B            10Sep93 - 25Nov93   11      $14.7k   High
                   C            10Sep93 - 14Dec93   14      $16.3k   Medium
C (300 sq. km)     B            16Sep93 - 30Dec93   15      $35.2k   Medium
                   D            28Oct93 - 14Jan94   11      $61.1k   Medium
D (515 sq. km)     B            25Oct93 - 30Dec93    9      $37.3k   Medium
                   E            25Oct93 - N/C       N/A     N/A                Work unfinished
E (370 sq. km)     B            28Oct93 - 15Dec93    7      $36.0k   High
                   C            28Oct93 - 04Feb94   14      $40.0k   Medium
F (230 sq. km)     B            15Oct94 - 01Dec94    6      $40.0k   Low
                   F            15Nov94 - 01Jun95   30      $40.0k   Medium
G (168 sq. km)     B            01Nov94 - 20Jan95   10      N/A      Low
                   D            05Jan95 - 26Jun95   24      $40.0k   Medium
H (106 sq. km)     B            15Oct94 - 05Oct95   50      N/A      Medium
                   C            15Oct94 - 16Jul96   92      N/A      High      Modified mig.
I (171 sq. km)     F            03Sep95 - 08Feb96   21      N/A      Medium
                   B            01Oct95 - 23Nov95    8      N/A      High
J (614 sq. km)     B            11Nov96 - 06Jan97    7      $27.7k   Medium
                   C            27Nov96 - 30Dec96    5      $39.3k   High

Dual processing is particularly well suited to land 3-D datasets, which have a relatively small trace count compared to a standard marine 3-D survey. Amoco Canada's processing costs on land typically run between 1% and 2% of total acquisition cost, while processing of marine 3-D datasets is closer to 10% of acquisition cost. This makes dual processing considerably more cost-effective for land 3-D datasets than for marine 3-Ds.
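As a rough sanity check on those percentages, the overhead of a second land processing contract remains small relative to the acquisition budget. The dollar figures below are illustrative only, not taken from the article:

```python
# Hypothetical land 3-D: $2M acquisition, processing at 1.5% of that
# (mid-range of the 1-2% land figure quoted above).  Integer arithmetic
# in dollars; all numbers illustrative.
acq_land = 2_000_000                  # acquisition cost, CAD
single = acq_land * 15 // 1000        # one processing contract: $30k
dual = 2 * single                     # two contracts: $60k, still only 3%
# A marine survey, with processing nearer 10% of acquisition, would see
# dual processing consume ~20% of the acquisition budget.
print(dual, 100 * dual // acq_land)   # 60000 3
```

The asymmetry is the whole argument: doubling a 1.5% line item is cheap insurance, while doubling a 10% line item is not.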

Amoco Canada has leveraged additional cost savings, and safety gains, by intentionally relaxing field effort during acquisition and compensating with increased processing effort through dual processing. We have found that our sparse 3-D acquisition technique, with wider-than-normal line and station spacing, provides adequate imaging performance while minimizing surface acquisition effort, which saves money and is inherently safer for field personnel. However, when the survey spans isolated areas of poor data quality, dual processing can usually provide some compensation, because at least one of the two processing vendors is often able to create a decent image through the noisy data zone. With a single processing vendor there is more risk of failure in patches of bad data, which is why traditional land 3-D surveys are often designed with narrow line spacing to provide excess fold over the whole survey, creating an (unnecessary) margin of safety for any bad data areas that may exist. When processing is much less expensive than acquisition, using excess fold as a safety net (instead of dual processing) is very expensive.
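The fold tradeoff behind that design choice can be sketched with the standard back-of-envelope estimate (fold = shot density x live channels x bin area). The geometries and shot densities below are hypothetical, not the article's actual layouts:

```python
def nominal_fold(shots_per_km2, channels, bin_dx_m, bin_dy_m):
    """Nominal CMP fold: every live channel of every shot contributes one
    trace, so traces per bin = shot density * channels * bin area."""
    bin_area_km2 = (bin_dx_m * bin_dy_m) / 1e6
    return shots_per_km2 * channels * bin_area_km2

# A conventional dense layout vs. a sparse layout that uses the large
# 35 x 150 m bins mentioned later for survey E (densities hypothetical):
dense = nominal_fold(40, 720, 25, 25)     # ~18-fold
sparse = nominal_fold(10, 720, 35, 150)   # ~38-fold with 4x fewer shots
```

The sparse design trades bin size (spatial resolution) against field effort: with large bins, fold per bin stays adequate even with far fewer source points, and the money saved in the field is what makes paying for two processing contracts affordable.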

Comparisons from dual processed surveys

A wide variety of case histories, comprising over 2600 sq. km of 3-D data from ten Amoco Canada 3-D surveys, illustrates the types of differences observable between dual-processed 3-D volumes. The following range of examples also helps show how differences in processing can mask or accentuate the subsurface geologic features of interest.

Survey A: Contractors A and B, Unplanned Serial Initiation (Figure 1. HERE)

The first survey on which Amoco Canada employed dual processing was prompted by disappointing results from our primary processing contractor. The stacked 3-D volume for survey A from contractor A appeared adequate; however, the final migration left uncollapsed diffraction events in the crossline direction. The glacial pace and unusable image quality from contractor A tempted us to concurrently send the data to processing contractor B, who had recently provided very good results on our first sparse 3-D. This approach was rapidly vindicated when contractor B, who started later, surprised us by finishing earlier while providing superior results!

Some five months later, contractor A forwarded the interpreter a mysterious 8 mm tape containing a volume of re-migrated data, with no explanatory cover letter. Subsequent investigative phone conversations revealed that other oil companies had suggested that contractor A's migration code was incorrect; the bug (which applied an incorrect migration operator in the crossline direction) was then identified and repaired. The image from this improved migration can be seen in the left side of Figure 1 (the original migration being not much better than a stack).

After the dust settled, contractor A is credited with a fairly good image on survey A, yet contractor B provided a much better image, in one quarter the time and at lower cost! The differences are evidenced by better continuity and event standout, extending from the Cretaceous coals (at 1.8 s) to the lower Devonian events (below 2 s). Also note that S/N, wavelet phase, and mute zones differ between these contractors, creating areas of markedly differing quality, particularly on the shallow Cretaceous horizons (above 1 s), which exhibit low-relief thrust faults.

Survey B: Contractors B and C, Planned Simultaneous Initiation (Figure 2. HERE)

The experiences with 3-D survey A made us wary of the non-uniform state of processing capability among contractors at that time. Amoco Canada initiated a series of planned, simultaneous, dual processing comparisons to help evaluate the various contractor capabilities. We also began an unprecedented strategy of acquiring regionally large SPARSE 3-D seismic surveys over open acreage, to evaluate new plays before capturing land positions. The processing contractors were made acutely aware that rapid evaluation of acreage, to stay ahead of the competition, would require both short turnaround times and high quality results. To reduce the risk of unexpected processing delays, or unacceptable quality, 3-D survey B was simultaneously sent to two seismic processing contractors, B and C.

Survey B was shot in the Alberta Foothills and contains horizons with significant structure. The work progressed smoothly at both shops right up to the stack stage, when the interpreter began questioning the velocity field used for poststack migration. Neither of the images matched the structural shape expected from the 2-D prestack depth migration. A compromise was chosen for the velocities, and the final images are shown in Figure 2. Contractor B was able to complete the processing a few weeks ahead of contractor C, and also produced a better-imaged zone of interest, at lower cost. The image of the Cretaceous coals (at 1.4 s) for contractor B, on the right side of Figure 2, is better defined and has better signal-to-noise ratio and continuity on both inline and crossline images, compared to contractor C on the left. The deeper data from contractor C is superior to the same events on the sections from contractor B; however, these were not the primary target. As a final note, the best images were obtained using in-house 3-D prestack V(z) depth migration, with inline section and horizontal slice comparisons shown in the bottom right of Figure 2. The depth-migrated volume positions the shallow structure quite accurately (confirmed by well information).

Survey C: Contractors B and D, Unplanned Serial Initiation (Figure 3. HERE)

3-D survey C was acquired to image a very deep Devonian target, with the complication of a one kilometer thick Paleozoic carbonate thrust sheet outcropping at the surface. The carbonate outcrop is particularly destructive of data quality, creating very weak signal, poor signal penetration, strong noise events and considerable time pull-up, all of which create an intense challenge for processing. We initially sent the data to processor B, but soon realized that the data was more difficult than we anticipated.

One of the advantages of dual processing is the ability to initiate the second processing contract at the first sign of trouble. Five weeks into the processing, we elected to send the dataset to contractor D, who had produced exceptional work on some 2-D Foothills lines, although, with contractor D asking twice the price, we hesitated at the cost differential. Intensive testing and careful processing on the part of contractor B eventually produced a usable image, as seen on the right side of Figure 3. The work at contractor D progressed quite quickly, with no unexpected problems, which raised our hopes for a better image. Unfortunately the dip image from contractor D was not an improvement over the image from contractor B, and the strike image was actually markedly worse, making contractor B's work look like a bargain.

We learned that attempting to undershoot surface carbonate with 3-D data does not solve the imaging problem. We suspect that even if the noise problem were solved, severe ray kinking from the thick carbonate wedge would require an extensive prestack depth migration effort.

Survey D: Contractors B and E, Planned Simultaneous Initiation

3-D survey D was also acquired to image the deep Devonian, and had the complication of outcropping Paleozoic similar to survey C. In light of the problems encountered with survey C we opted to send survey D simultaneously to both contractor B and contractor E. We were confident that processor B could produce a usable image, but we wanted to continue to evaluate capability at other processing contractors, as well as investing in cycle time insurance.

We were not prepared for the result, or rather the lack of result, from contractor E. They were never able to match the initial brute stack from contractor B, and eventually requested to be released from their contractual obligations. In hindsight, if we had not stumbled onto one particularly skillful processor at contractor B, we might have prematurely abandoned the sparse 3-D technique.

Survey E: Contractors B and C, Planned Simultaneous Initiation (Figure 4. HERE)

Survey E was also acquired to image a deep Devonian carbonate reef. The quality that contractor C was able to obtain on the Triassic and Permian horizons of survey B prompted us to give them another try. The field data for survey E was sent to contractors B and C simultaneously, and the final images are shown in Figure 4. Contractor B once again provided better image quality, turnaround time, and value.

We determined that contractor C had difficulty with the large (300 m) shot-station spacing on this sparse 3-D. To avoid migration operator aliasing on prior sparse 3-Ds with large shot-station spacing (surveys B, C, and D), contractor B typically processed into bins of half the natural size before migration, then summed to natural-size bins after migration. Contractor C elected to process survey E directly into natural-size bins (35 x 150 m), and the image seen on the left side of Figure 4 contains significant migration operator alias noise as a result.
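The benefit of halving the bin size can be gauged from the common rule-of-thumb limit on migration operator aliasing, f <= v / (4 * dx * sin(dip)). This is a standard textbook criterion, not a formula given in the article, and the velocity and dip below are illustrative:

```python
import math

def max_unaliased_freq(v_ms, dx_m, dip_deg):
    """Rule-of-thumb highest frequency (Hz) a post-stack migration
    operator can carry at a given dip without spatial aliasing."""
    return v_ms / (4.0 * dx_m * math.sin(math.radians(dip_deg)))

# 150 m natural crossline bin vs. a 75 m half-size bin, 30-degree dip:
print(round(max_unaliased_freq(3500, 150, 30)))  # 12 Hz
print(round(max_unaliased_freq(3500, 75, 30)))   # 23 Hz - limit doubles
```

Halving the trace spacing doubles the unaliased frequency at every dip, which is why migrating into half-size bins and summing afterwards suppresses operator alias noise.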

Survey F: Contractors B and F, Unplanned Serial Initiation (Figure 5. HERE)

The Foothills 3-D survey F was shot for Triassic and Permian structural traps. Contractor B received the data with instructions to provide rapid-turnaround processing, so that the interpretation could be timed to meet a well-commitment deadline. Lack of adequate initial quality control resulted in a final stack suffering from over-emphasized high frequencies and attendant noise. A serious Q.C. effort was instigated only after the interpreter had approved the extremely weak stack for final migration, as other options were precluded by tight time constraints.

While some data-improvement options were being attempted at contractor B, we initiated dual processing and sent the data to contractor F. The time constraints for contractor F were relaxed, allowing ample resources to focus on quality. The processing project lasted six months, but careful attention to decon parameters, retention of low-frequency reflector power, and meticulous selection of stacking and migration velocities yielded superior results, which can be seen in the bottom half of Figure 5. Here the frequency content is lower overall compared to contractor B (top of Figure 5), but contractor F was able to image the Triassic (below 1.5 s) and Permian (below 2 s) structures over the difficult data area on the left-hand side of the sections.

Survey G: Contractors B and D, Unplanned Serial Initiation (Figure 6. HERE)

Another Foothills survey, shot over rugged mountainous terrain with surface carbonate outcrop, presented problems of weak reflections similar to survey F. In this case, the signal-to-noise problems were anticipated from 2-D data, and 3-D survey G was designed with high fold over the noisy areas to help compensate. This approach initially proved futile, as the brute stack from contractor B exhibited almost no reflection energy.

Faced with a short time frame to the first well-positioning decision gate, we chose to use an unorthodox processing stream which built reflections using a narrow, non-surface-consistent CDP trim-static window. The window was centered on the zone of interest with constant start and end times, and allowed unusually large shift limits for the cross-correlations. This essentially bypassed the traditional statics and velocity loop, and flattened the dominant Triassic reflection sequence (at 1.5 s) independently in each prestack CDP gather. Following this unusual approach with migration actually built a usable structural image (Figure 6, left), which was adequate for our purpose at the time.
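The trim-static step described above can be sketched as follows. This is a minimal illustration of windowed cross-correlation alignment with hypothetical window and shift-limit values; a production implementation would taper the window edges and interpolate sub-sample shifts:

```python
import numpy as np

def cdp_trim_statics(gather, dt, window_s, max_shift_s):
    """Align each prestack trace in a CDP gather to the gather's pilot
    (mean) trace by cross-correlating inside a fixed time window and
    applying the best whole-sample shift to the entire trace."""
    i0 = int(round(window_s[0] / dt))
    i1 = int(round(window_s[1] / dt))
    max_lag = int(round(max_shift_s / dt))
    pilot = gather.mean(axis=0)[i0:i1]
    lags = range(-max_lag, max_lag + 1)
    out = np.empty_like(gather)
    for k, trace in enumerate(gather):
        seg = trace[i0:i1]
        # pick the lag that best matches the pilot inside the window
        cc = [np.dot(np.roll(seg, lag), pilot) for lag in lags]
        best = list(lags)[int(np.argmax(cc))]
        out[k] = np.roll(trace, best)  # crude whole-trace shift
    return out
```

Because the shift is estimated only inside a narrow zone-of-interest window and applied with no surface-consistent constraints, it will happily "flatten" noise as readily as signal, which is exactly why the article calls the approach unorthodox.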

Following the more prudent tactic of dual processing, we forwarded the survey to contractor D. There we utilized a more legitimate processing sequence, emphasizing extra care with velocities and statics, and also using 3-D DMO from topography, which yielded a much higher quality image, seen in Figure 6, right. Although the cube from contractor B was used for the initial well decision, the much-improved image from contractor D did confirm the interpretation of structural size and position gleaned from contractor B's unorthodox cube.

Survey H: Contractors B and C, Planned Simultaneous Initiation (Figure 7. HERE)

The next Foothills 3-D example, survey H, remained with both processing contractors B and C for an inordinately long period, not so much because of difficulties with the data, but rather for political reasons. The play type was dropped from our funding portfolio shortly after the data was acquired and sent for processing. The interpreter was re-assigned, and with a major internal re-organization, survey H was orphaned for several months.

Processing contractors B and C continued in the absence of interpreter involvement, both finishing the bulk of the task in a reasonable period, but then waiting for authorization to migrate the final stack. When the data from both contractors was eventually reviewed, we were reasonably pleased with the image from contractor B (Figure 7, left), but noticed some severe problems with contractor C's migrated image in the crossline direction.

The hiatus from interpreter pressure actually worked in contractor C's favor, as they continued to investigate improvements on survey H as an internal research project. The manager of the research department felt that the crossline quality loss was somehow linked to the excessively large bin aspect ratio (4:1), and consequently pursued the idea of azimuthally variant prestack migration operator anti-aliasing. This new code worked wonders for the image quality, as seen in Figure 7, right. Despite the almost two years between processing initiation and delivery of the final product, we continue to be impressed by (and benefit from) the perseverance that contractor C applies to difficult data problems.

This is also a clear example of how useful dual processing can be. Given two processing products on the same dataset, the ability to blame the data or acquisition design for poor image quality is eliminated. Knowing that the data could be processed for better quality helped spur contractor C on to improve their software capabilities.

Survey I: Contractors B and F, Unplanned Serial Initiation (Figure 8. HERE)

The next example brings us back to the deep Devonian targets, once again in the Foothills with structural carbonate involvement. This time the survey was designed to avoid all Paleozoic outcrop, and without the fast-velocity rock at the surface, data quality was much better than in either survey C or D. Survey I images a deep Devonian reef target, along with a shallow, pseudo-autochthonous, massive Paleozoic thrust sheet. The simplicity of this extra carbonate stratigraphic section causes minimal ray bending, allowing standard CDP-based time-domain procedures to work well. The lesson learned from surveys C, D, and E is to avoid shooting 3-D over areas where carbonate rock pinches out, either at the surface or at depth.

Survey I was originally sent exclusively to processing contractor F (who had previously performed well on survey F), and who also acted as the acquisition company. Contractor F made (unfounded) claims that overall cycle time would be reduced by performing front-end processing in the field during acquisition, avoiding inter-company data transfers. The first brute stacks from contractor F were significantly delayed, and of such low quality, that we immediately initiated dual processing using our familiar and trusted contractor B. Despite the one-month head start for contractor F, contractor B was able to complete the processing two and a half months quicker, as well as delivering a cleaner, much higher frequency, and more coherent final processed volume, as seen in Figure 8. The image from contractor F is much lower frequency, and weak near the survey edges.

The refraction static solutions are also very different between the two volumes, with the solution from contractor F showing a higher degree of false structure. The time slices at the bottom of Figure 8 illustrate features that are not obvious from inline and crossline sections alone. The refraction solution from contractor F, while containing larger errors, is somewhat independent of the shooting geometry, while the static solution from contractor B bears a marked resemblance to the shooting geometry on these unflattened time slices.

This leads to another general point about data-processing quality control. Despite my extensive use of standard wiggle-trace displays for the examples in this article, they have severe limitations as quality-control tools. The best way to test a processing flow for improvement is to create a migrated cube, load the data on a workstation, and compare the final interpretation products. For most of our stratigraphic plays, a flattened time slice usually plays this role. The additional expense and effort of getting migrated brute and intermediate stacks from the contractors, loaded on workstations and interpreted, to allow comparison of processing flows, will ultimately prove its own value. Although flattened time-slice comparisons would have been a useful addition to the examples in this article, for confidentiality reasons I have excluded all but one set of these, which appears in the final dual-processing example.

Survey J: Contractors B and C, Unplanned Serial Initiation (Figure 9. HERE)

Survey J is interesting in its own right because of its size. At just over six hundred square kilometers it ranks as one of the largest contiguously acquired heliportable 3-D surveys in the world. With significant acquisition expense, and some known data quality issues on the west edge of the survey, we planned to use a combination of contractor B and in-house imaging as our processing effort. Removal of our Cray supercomputer forced a change in plans, so we opted for dual processing using contractors B and C.

Although contractor C started a couple of weeks later, they finished the standard processing earlier, and also discovered a serious error in the geometry, where a snaked receiver-line segment was mislabeled. This information was relayed to contractor B, who regrettably needed to restart processing from scratch to correct the field acquisition problem. Contractor C also ran the dataset through 3-D prestack time migration, and the image is shown on the right side of Figure 9. The image quality is very good, from the deepest horizons right up to very shallow Cretaceous markers. This shows that under certain conditions, prestack migration can extend the usefulness of the sparse 3-D technique to shallower targets than originally expected.

The accuracy of prestack migration is also well illustrated by the flattened time-slice excerpts shown at the bottom of Figure 9. Here prestack time migration from contractor C is shown on the bottom right, and poststack migration from contractor B on the bottom left. The prestack migration gives a higher resolution (crisper, better S/N) view of the transition from shale basin (to the north) onto carbonate bank edge (to the south), easily justifying the extra expense.

Summary of statistics from case histories

Amoco Canada has employed dual processing on approximately 35% of its 3-D surveys over the last four years. These were essentially all exploration plays in difficult data quality areas. We initiated simultaneous dual processing about 40% of the time, and fell back on serial (sequentially initiated) dual processing the other 60% of the time, to avoid disaster. Approximately half of the surveys were sparse 3-Ds and the other half conventional 3-Ds. In almost all cases of dual processing, we discovered that one of the companies was able to produce a significantly better product than the other, and the outcome was often not predicted by our prior experience. (See Table 1.).

Superlative processing in the past is unfortunately not a guarantee of continuing high-quality work in the future. Conversely, one should not discount a processing company after one or more failed attempts (although human nature is so predisposed). For instance, while processing company B was employed in all ten dual-processing examples, their early superiority was not maintained over time. On the other hand, the early weakness of processor C did not deter them from improving their software and skills to the current state of the art, much to our benefit.

No amount of expert quality control (on the part of the client) can overcome software bugs or human error. Assuming that careful quality control during processing is sufficient to assure success usually results in frustration, ill will between parties, and significantly lengthened project turnaround time because of delays in data processing.

It is surprising, and perhaps more than a little unnerving, that two versions of the final processed 3-D cube are often very different (with respect to signal to noise ratio, frequency content and continuity of reflectors), considering the two processing contractors started with the exact same dataset. Despite all the mathematics embodied in the software, production processing is simply not an exact science.

Shuki Ronen put it most succinctly when he cast the value of processing as the following equation:

      Value of Seismic Processing
     ------------------------------   >  one
      Cost of Seismic Processing

If this were not true, we could save the whole cost of processing and interpret raw field data! If we believed it to be strictly true, we could increase value simply by spending more money on processing. From experience, however, we all anticipate a crossover point of diminishing returns, beyond which simply spending more money during processing does not improve quality. Put another way, we cannot increase our knowledge, or buy more information, by continuing to spend money past a certain point at one particular processing company. It is, however, possible to extend the boundary of cost effectiveness by spreading additional funds across more than one processing company. We have found that for certain 3-D surveys, dual processing bought us additional knowledge and information that we could not obtain any other way.


Dual processing should be considered whenever there is significant uncertainty associated with a 3-D program, such as:

1. Entry into a new geographic location with no existing 3-D surveys and very little 2-D coverage, particularly when the acquisition budget has imposed survey-design constraints.

2. 3-D surveys shot in geographic locations which are known to yield poor data quality.

3. New processing contractors (or personnel), with no track record, are employed.

4. Whenever cycle time delays cannot be tolerated.

Dual contractor processing of an entire marine 3-D is not recommended due to prohibitive cost; however, on-board Q.C. processing is recommended, as it can provide significant cycle-time reduction to the first investment decision gate.

Dual processing is best suited to land sparse 3-D datasets, not only for the reasons stated above, but also because of the low processing cost of land datasets with relatively few shot records.


Dual processing can be treated as buying an insurance policy on the 3-D program investment, with the added benefit that it can be purchased just after disaster strikes.

Dual processing of 3-D data is one method of helping to ensure that the best possible migrated cube is available for interpretation. It can be used to safeguard against turnaround delays, to shorten overall cycle time, and to improve overall data quality.

For these reasons, Amoco Canada continues to employ the technique of dual processing on a regular basis, as we have found that spending an extra few thousand dollars in processing can save literally millions of dollars in the field.


I would particularly like to acknowledge the insightfulness and prior work of a respected colleague, David D'Amico, currently with Range Petroleum. David is credited with the early championing of dual processing for Amoco Canada 3-D surveys, and originally prepared the data comparisons seen in Figures 1 through 4. I would also like to acknowledge the (anonymous) host of processing geophysicists who worked on the data at the various contracting companies, as well as some current and former colleagues at Amoco: C. Martin, T. Zwicker, D. Cottle, G. Fraser, D. Au, E. Keyser, T. Joubert, H.K. Michel, H. Lilles, N. Kohlhammer, S. Gray, and W. May.

Reference: James L. Allen, Christopher E. Betz, Ron A. Krenzke, and Keith I. Haun, "Why you need two different processors for your 3-D seismic data," 1997 Geophysical Society of Houston Spring Symposium & 13th Annual SEG Gulf Coast Meeting.
