New operating methods: Data handling to downhole completions

Here are five reports on proven techniques and new technologies that improve field and office operations. These range from suggestions for management on how growing companies should proceed in updating computer networks; to two reports on downhole equipment used in well completions; to a lower-cost, heavy-oil stimulation tailored for a Southern California field. And the Petroleum Technology Transfer Council describes a new website for its Eastern Gulf Region that offers comprehensive data on statewide operations in Mississippi.

Specifically, these five summaries include: 1) suggestions from Cierra Solutions, Houston, for sorting out the data network dilemma; 2) an Expro Group review of an SPE report from North Sea case histories on the concept of completions without packers and downhole safety valves; 3) SmarTract’s review of unique applications of a new downhole tractor; 4) a summary by Baker Oil Tools of an improved, lower-cost water-frac treatment that boosted a heavy-oil well’s productivity over previous stimulations; and 5) a report from the Mississippi Office of Geology on design, installation and applications of its new website and how it can help independent operators get critical data.

Sorting out the data-network dilemma

Franklin M. Cantrell III, President/CEO, Cierra Solutions, Houston, Texas

Building a data network is no different from building a house–a blueprint is needed to avoid arbitrary decisions. Oil and gas companies should deal with networks from a business perspective and take a common-sense approach. That is, examine the current technology setup, evaluate the issues, then develop and execute a game plan that delivers a clear-cut capital and operating cost justification. This presentation explains how an independent party can be effective in creating the right network solution.

Growing company, outdated computer network. A small, publicly traded, independent E&P company was much smaller just a few years ago. Its corporate offices in Oklahoma were staffed by four people, and its Houston production office was staffed by five. Then, capitalizing on 3-D seismic, horizontal drilling technology to recover bypassed pay and a few key acquisitions, the company grew quickly.

Meanwhile, in today’s computer-driven business environment, the company’s systems were on the verge of becoming a business liability. Essentially, a network had been cobbled together internally, and an outside service was called periodically for equipment repairs (such as servers). In the process of consolidating its corporate and production offices into substantially larger Houston office space, management seized the opportunity to upgrade its computer networking.

They wanted to move up to an entirely new plane of services, including increased Internet service, integration of AS400 systems for the main accounts, and remote access for roving executives. But that created a dilemma: not only did the wiring plan have to be changed, literally everything plugged into it also had to be changed and expanded. Additionally, management debated whether to bring an information technology (IT) specialist on staff to implement and maintain the new network.

What this company experienced is as old as the rapid growth of small companies, but the connectivity aspect of computers makes it a whole new ballgame. How did it solve the dilemma and move seamlessly ahead? Instead of rushing out and purchasing much more than actually needed in equipment and associated services, management took a business approach to a technology problem. An independent technology consulting firm developed a plan, with the end result essentially balancing capital costs and operating costs to fit the company specifically–not an off-the-shelf solution.

Business approach, independent evaluation. Like a seesaw on a fulcrum, a company can minimize operating costs by strategically incurring capital costs. The opposite is potentially a greater burden–going light on capital costs may drive the company to distraction on operating costs. Therefore, the only practical way to look at data-network problems and solutions is from a financial perspective (capital and operating costs) while staying in the technological mainstream.

In the technological spectrum (with the mainstream occupying the wide middle) outer edges include the highest end (Bleeding Edge systems) and the lowest end (outdated systems). Each technology must be weighed on its performance, cost and benefits in the same way a company would approach purchasing a new telephone system, Fig. 1.


The key consideration is which system is most appropriate to drive the company’s core operational business, not which brand names of computers, servers or routers to buy. Unless business needs demand otherwise, companies should stay in the technological mainstream for their data network that–at a basic level–lets their PCs communicate and share files.

Realistically, there are unlimited data network solutions. As with any segment of any business, a company coping with data networks needs a fundamental design that serves the core business. Preferably, this design phase should be handled by an independent party that develops a conceptual plan based on a review of the company’s existing business and technology.

The goal is to effectively remove technology from the equation and reduce decision making to identifiable business points. For example:

* On the technology side, are the data-network products under consideration faster, more reliable and in the technology mainstream, or are they problematic, and in which ways?

* On the business side, how do the products fit within the spectrum of capital and operating costs?

The key to effective evaluation of a company’s network requirements is the third party’s investigative process–i.e., interviewing executives and department heads, as well as the IT group. From these one-on-one interviews emerge the wants and needs of the company as a whole. Just as important, the process provides insight into what are actual needs vs. a “wish list.” The consultant’s end result is presented as a conceptual plan; its value is validated and–if it makes sense–it becomes the recommendation.

From the independent party’s conceptual plan, management should develop a set of recommendations based on its own goals and budget–just as the company’s departments typically prepare annual goals and budgets. In this instance, the focus should be on anticipated needs for technology systems and projected costs, including whether execution will be handled by the internal IT department or through outsourcing, Fig. 2.


Getting to core business issues. An example of what the independent interview process can uncover involves another E&P company that maintained all of its pipeline maintenance (FERC) reports on spreadsheets. This practice, performed on a single PC that was never backed up, had also been the only method used for gas flow calculations for years. One staff member handled the task all day, every day. A custom-designed, small software application reduced the task to about 15 minutes a day.

Initially, capital costs increased for about two months to cover software development and implementation. However, at that point, operating costs went virtually to zero. Additionally, the report is now uniform and more accurate.

But without the investigative process, these manual costs may have continued unchecked for years. By taking the fiduciary step of obtaining independent recommendations based on a hard evaluation of existing technology and business requirements, the company avoided the mistake of accepting the wrong, overall technology solution. Following workflows and paperflows is much more productive than saying “We need a network” and paying the piper.

Possibly the biggest impact of evaluations, interviews, plans and recommendations is that business/technology sides of the company can become more allied in what their mutual goal should be–serving core business requirements. While most executives would readily admit that their business changes with the times and the marketplace, many still want technology to stop changing. Realistically, they would probably concede that technology will not even slow down.

Meanwhile, many companies remain in the “fire-fighting” mode. By doing as little as possible to implement new technology, install or upgrade networks, or just remain current on software programs, companies position themselves for regular computer system problems. That approach persists, despite the fact that a company’s technology infrastructure not only supports the staff, it performs myriad ancillary functions.

Again, the situation comes back to the importance of understanding only technology basics and focusing more on how a data-network, life-cycle evaluation–preferably once a year–can best respond to a company’s core business needs. That goes a long way toward preventing the following scenario:

* IT department: “We just lost xyz on the network.”

* Executive: “I don’t know what that is, but it sounds bad. What will it take to fix it?”

* IT: “It costs X-thousand.”

* Executive: “What are my options?”

* IT: “There aren’t any. It has to be fixed.”

* Executive: “OK, just make it go away for as little money as possible.”

In fact, there are numerous options when technology is essentially removed from the business/technology equation. By following the gameplan from which recommendations are presented, a new scenario surfaces. One by one, data-network problems are addressed on a priority level. Each is explained in terms of its urgency, which business issue it addresses and why, how it fits in the technology mainstream, and whether it has a payback timeframe. Then a consensus is reached and recommendations are implemented on a cost-effective basis.

In turn, a program is established whereby an annual or semi-annual review tracks the technology that was deployed and provides opportunities to make minor adjustments in the previously recommended plan.

What should a small company do? An old adage is that computers only exist to run software that performs a task. Software, or an application, runs on an operating system (OS) which runs on a platform. If, for example, a 20-person exploration company with nothing more than 20 PCs concludes it needs a network, which road should it take?

In broad terms, the type of network should be Ethernet, because it is so flexible, inexpensive and satisfies wide-ranging criteria for a company’s core business. It delivers speed, scalability, performance and reliability. Then, the systems should be set up to be as close to the middle of the technology spectrum as possible. From these basic decisions, everything else can be determined by the Needs Analysis, covering the gamut from Internet requirements to WAN design.

Completions without packers and downhole safety valves can be safe

Hydrocarbon-release frequency, per well, per year

            Prod. release freq.,        Combined release freq.,
            without DHSV or workover    with DHSV and workover
Case I      1.53 × 10⁻⁴                 2.59 × 10⁻⁴
Case II     10.60 × 10⁻⁴                6.99 × 10⁻⁴
Case III    1.46 × 10⁻³                 1.28 × 10⁻⁴
Case IV     2.10 × 10⁻⁴                 2.10 × 10⁻⁴

Paper SPE 56934, “Radical solutions required: Completions without packers and downhole safety valves can be safe,” authored by Craig J. Durham and Craig A. Paveley, Expro Group Integrated Services Ltd., was presented at the 1999 Offshore Europe Conference, Aberdeen, Sept. 7-9, 1999. That paper contained 10 tables of data and four figures relating to studies on the subject. The following is an abstract of that presentation.

In most countries, legislation regarding design/operation of oil and gas producing wells does not specifically prescribe packers or downhole safety valves. When interpreting the legislation, operators generally require double barriers in wells that can sustain natural flow. Production packers and downhole safety valves are commonly used in completions to satisfy this requirement. The resulting policies provide a standard for well barriers and define an operational envelope that ensures pressure integrity and environmental protection consistent with safe operating practices.

Case histories studied. In four case histories discussed in the paper, Quantitative Risk Assessment was used to assess the requirement for double-barrier isolation using the following approach: 1) Fault Tree Analysis (FTA) to quantify well reliability and the probability of hydrocarbon release from the reservoir to the environment, 2) Failure Mode, Effect and Criticality Analysis (FMECA) to determine critical activities impacting well operations during workover and production phases, and 3) Quantitative analysis of the risked annual blowout rate, where a blowout is defined as any uncontrolled flow of hydrocarbons to the environment.

Results of the FTA are valid for normal production operations only, the authors caution. The probability of a blowout due to workover activities required to repair failed safety valves must also be considered, to obtain a combined hydrocarbon-release frequency.
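The combination the authors describe–production-phase release frequency plus the release risk introduced by the workovers themselves–can be sketched as a sum of rare, independent annual event frequencies. The figures below are illustrative placeholders, not inputs from the paper:

```python
# Illustrative sketch: combining production and workover hydrocarbon-release
# frequencies (per well, per year). For rare, independent events, annual
# frequencies add approximately.

def combined_release_freq(prod_freq, workover_freq, workovers_per_year):
    """Production release frequency plus the release risk contributed
    by workovers needed to repair failed safety valves."""
    return prod_freq + workover_freq * workovers_per_year

# Hypothetical inputs (not values from SPE 56934):
prod = 1.5e-4           # release frequency during normal production
per_workover = 2.0e-4   # release frequency per workover operation
rate = 0.5              # expected workovers per well per year

print(combined_release_freq(prod, per_workover, rate))  # ≈ 2.5e-4
```

The point the authors make falls directly out of this structure: installing a safety valve lowers `prod`, but if doing so requires a workover, the second term can more than offset the gain.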

The four case histories described in the paper and summarized here show how risk of an uncontrolled hydrocarbon release can be quantified, both with and without a functioning downhole safety valve.

Case I–Compares the combined hydrocarbon release frequency for an offshore, naturally flowing, gas-lifted oil well that would require a workover to retrofit an annular safety valve (ASV).

Case II–Considers the increase in hydrocarbon release frequency when an offshore oil well incapable of natural flow is put on gas lift where both the surface-controlled, subsea safety valve (SCSSV) and ASV are inoperable and require a workover to repair.

Case III–Compares the combined hydrocarbon-release frequency for an onshore, naturally flowing oil well, with and without an SCSSV installed.

Case IV–Compares the hydrocarbon-release and workover frequency for an onshore, naturally flowing oil well, with and without a production packer installed.

The hydrocarbon-release frequency for each case history is summarized in the accompanying table.

Fault Tree Analysis, results. As explained in the paper and the references cited therein, FTA can be used to quantify the probability of completion success. By adapting this technique to completion failure, it is possible to establish the incremental change in completion reliability with respect to hydrocarbon-release frequency between a well producing with or without a particular completion component.

As such, the total system reliability has to be considered. In this context, the authors note, the system is defined as the well, including all equipment required for pressure and/or flow containment of reservoir hydrocarbons. In this analysis, the point of demarcation is taken as the connection flanges immediately downstream of the tree valves.

Each mechanical barrier within the system must provide either a constant flow barrier or closure on demand, plus internal and external sealing. The relevant failure modes included as part of the FTA are, therefore: external leakage of non-moving components, failure to close and leakage in the closed position. The analytical procedure is further detailed in the paper, including discussions of Combined Hydrocarbon-Release Risk and FMECA.
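The top-event logic described above can be sketched as a single OR gate: a hydrocarbon release occurs if any one of the listed failure modes occurs. The probabilities below are hypothetical placeholders, not the paper’s data:

```python
# Minimal Fault Tree Analysis sketch: the top event ("hydrocarbon release")
# occurs if ANY independent barrier failure mode occurs -- an OR gate.

def or_gate(probabilities):
    """P(at least one of several independent failure modes occurs)."""
    p_none = 1.0
    for p in probabilities:
        p_none *= (1.0 - p)
    return 1.0 - p_none

# The failure modes listed in the paper, with hypothetical annual probabilities:
modes = {
    "external leakage of non-moving components": 1e-4,
    "failure to close on demand": 5e-4,
    "leakage in the closed position": 2e-4,
}

print(or_gate(modes.values()))  # ≈ 8.0e-4 for small, independent probabilities
```

For small probabilities the OR gate is nearly additive, which is why adding or removing one completion component (one branch of the tree) shifts the release frequency by a quantifiable increment.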

The results, as described by the authors, demonstrate the following:

* FTA can quantify the risk of an undesirable event occurring for various completion configurations.

* A downhole safety valve will reduce the probability of an uncontrolled hydrocarbon release occurring during normal production operations. However, the probability of such an event occurring may actually increase if a workover is required to install or repair a downhole safety valve.

* The use of a liner-top PBR instead of a production packer has negligible effect on the hydrocarbon release or workover frequency.

* In new completions, a downhole safety valve reduces risk of a blowout by an order of magnitude and should normally be included in a naturally flowing well.

* Due to the very low probability of catastrophic events occurring, a Failure Mode, Effect and Criticality Analysis is necessary to ensure that the risk of critical events occurring is reduced to as low as reasonably practical by ensuring that adequate controls are in place.

Fault Tree Analysis, the authors say, should be more widely applied to quantify the overall benefit to risk reduction of remedial workovers in wells with limited production life, or which are incapable of natural flow.

New downhole tractor put to work

In March and April of this year, a new kind of downhole wireline tractor from SmarTract, Inc., Houston, entered the market and successfully completed two jobs that can be classified as “firsts.” The initial, high-angle, well-logging application in Canada turned out to be something more. And a recent application in the U.S. Gulf of Mexico was unique for its deepwater location.

Making tractor history in Canada. Calgary-based Wascana Energy (a division of Canadian Occidental) ran the new, bi-directional “crawler” type, downhole tractor on a number of high-angle wells in Saskatchewan and Alberta and, in doing so, made history with a first in fishing with a tractor. The operator found, as expected, that the speed of the tractoring operations saved time and costs compared to the alternative method of conveyance (coiled tubing), and that the tractor helped it complete its logging program before frost laws temporarily shut down further operations.

The company had originally planned on using the new tool for four wells only but decided to extend its use. Two tractors were used in Canada, and both showed strong performance–tractoring totaled close to 40,000 ft in high-angle wellbores without any servicing. Both units were fully operational at the end of the project, with one doing over 75% of the work by itself.

However, the unique application occurred on the last well. While operating in a high-angle section on the eighth well, the tractor and logging tools (that contained a nuclear source) became unexpectedly stuck. The wireline logging company pulled as much as it could safely pull to try to break free without risk of pulling the cable free at its weak point, the cable head. However, it became clear that more force was required. Faced with a fishing job that could have lasted up to a week using coiled tubing, the alternative was to use the bi-directional capability of the tractor to push up the hole from its position just below the cable head. Commands sent to the tractor from surface had previously confirmed that the tractor itself was not stuck, and that it was free to move.

The tractor, acting at this point more as a robot, was told to reverse direction. In a well-coordinated and timed effort between logging and tractor personnel, more than 800 lb of force was applied by the tool at the same time a 3,000-lb pull was applied by the logging company’s winch unit. The weak point was rated at 3,500 lb. On the first attempt, a few inches of upward movement was seen. On the second attempt, the logging tool was freed completely.
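The arithmetic behind this operation is worth making explicit: because the tractor pushes from below the cable head, its contribution never loads the cable, so the total force on the fish can exceed the weak point while cable tension stays safe. A sketch using the figures from the account above:

```python
# Force check behind the freeing operation. The tractor's push is applied
# below the cable-head weak point, so the cable itself never sees it.

WEAK_POINT_LB = 3500   # rated cable-head weak point
winch_pull = 3000      # surface winch pull, kept below the weak point
tractor_push = 800     # tractor push applied from below the cable head

total_on_fish = winch_pull + tractor_push

print(total_on_fish)                  # 3800 lb acting on the stuck tools
print(winch_pull < WEAK_POINT_LB)     # True: cable tension stays safe
print(total_on_fish > WEAK_POINT_LB)  # True: more force than wireline alone could apply
```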

This entire fishing operation using the tractor took only a few minutes. It was the first time a tractor had ever been used to push upward to free a stuck logging tool. It was also the first time a logging tool had been freed by a combined wireline-pull and tractor-push application of force that, in total, exceeded the weak-point limit of the cable head.

Since working under Canada’s sub-zero surface conditions, the SmarTract tractor has now also completed its offshore debut with a major deepwater operator. It was successfully run on a deepwater, horizontal well in the Gulf of Mexico, where it is believed to be the first time a downhole tractor has been used in this part of the Gulf.

How it works. The tractor used for both operations was a 2 1/8-in.-OD unit. Just over 30 ft long, it represents one of the latest types of tractor to enter the market. It is lightweight, modular and easy to transport. Power requirements are less than some previous designs, and this provides relatively more flexibility in choice of wirelines on which it can be used. Surface equipment is compact, and the power supply is small in size.

Driven by an electric motor, a pump provides hydraulic power to two “stroke” sections of the tractor, each containing a piston that slides on a central rod. Each outer piston sleeve has a set of anchors attached, also powered hydraulically. The stroke sections can slide both ways on the central rod and extend or retract their respective anchor sections in sequence. While one anchor/stroke section grips the borehole (casing ID) in a static position, the entire tractor body moves in the intended direction (forward or backward) by stroking past the anchored section. Meanwhile, the other piston section moves back into position with retracted anchors, ready for the next stroke.
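The alternating grip-and-stroke cycle can be sketched as a simple state machine: one section anchors and strokes the body forward while the other resets, then the roles swap. The stroke length here is a hypothetical figure for illustration:

```python
# Sketch of the crawler's alternating anchor/stroke cycle. One section
# ("A" or "B") grips the casing and strokes the tool body; the sections
# then swap roles. Stroke length is hypothetical, not a SmarTract spec.

STROKE_IN = 12.0  # assumed stroke length per half-cycle, inches

def tractor_cycle(position, gripping, stroke=STROKE_IN, direction=+1):
    """One half-cycle: the gripping section strokes the body, then the
    other section grips for the next stroke. direction=-1 reverses."""
    position += direction * stroke              # body advances past the anchored section
    gripping = "B" if gripping == "A" else "A"  # roles swap for the next stroke
    return position, gripping

pos, grip = 0.0, "A"
for _ in range(4):          # four half-cycles
    pos, grip = tractor_cycle(pos, grip)
print(pos)                  # 48.0 inches of travel after four strokes
```

Reversing direction requires only flipping the sign of the stroke, which is the bi-directional capability the wheeled designs lack.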

Compared to some earlier tractor designs, this type of system is also “smart.” Rather than being only uni-directional and unable to relay cable-head tension information back to surface, this tractor constantly relays a great deal of information back to surface as it is working. Cable-head tension, hydraulic pressure, temperature, anchor-arm pressure and other parameters can be constantly measured. Speed, pulling force, anchor-arm pressure and other settings can also be made from surface and changed during the tractor run, as required.

It can also be made to lock onto the side of the casing and follow other commands. This latter capability also makes it a useful downhole-robot-type device and separates it from being just a method of conveyance. Ways to add logging data to what is collected while tractoring are also being developed.

One of the tool’s innovative features is that it moves using anchor arms, rather than the wheels found in most older tractor designs. Moving backward at full power is very difficult–if not impossible–for any wheeled tractor, but the hydraulic action of this system makes it simple to reverse direction. It is the only tractor known to be designed to crawl in both directions with full power–a capability used in Canada to significantly reduce fishing risk and cost.

The developer has received a positive reaction to the Canadian operation from oil companies in the North Sea that are experienced in tractor use. They have welcomed this bi-directional capability as a way to reduce risk when planning tractor operations and believe it will open additional opportunities.

Another advantage of the newer crawler design is that contact with casing is minimized. The anchor pads of the tool become statically attached to the casing in only small areas at the end of every stroke while the tractor is moving, whereas wheels apply continuous side force and are in constant contact.

As for the future, SmarTract has been asked by a number of oil companies about using the tractor in open hole. Although originally designed for cased hole, its crawling motion and movement are regarded as attractive for this application. Other sizes of the tractor can also be created. Operators have also become interested in other applications that make use of its potential robot-type capabilities. Its flexible design allows for a number of interesting future applications and options, as well as some likely other firsts.

Current SmarTract product

Cable: Single or multiconductor
Outside diameter: 2 1/8 in.
Casing ID: 7.125 in. or larger
Max. pulling force: 1,000 lb
Max. speed: 1,800+ ft/hr
Length: 32 ft (modular)
Max. pressure: 15,000 psi
Max. temperature: 150°C
Tensile strength: 24,000 lb
H₂S service: Optional
Wiring through tool: Seven wires
Real-time monitoring: CHT, CHV, tool parameters (pressures, temperatures, motor rpm)
Pulling direction: Bi-directional, equal force

Small-scale water frac boosts productivity in San Joaquin well

In late 1998, Baker Oil Tools used its H₂O-FRAQ process to perform a small-scale water-frac treatment in a producing well in Tejon oil field in California’s San Joaquin Valley. The treatment dramatically improved both initial and sustained productivity, demonstrating the effectiveness of water fracturing as a cost-effective, well-stimulation option in the heavy oil regions of Southern California. Productivity increases of more than 500% initially, and 400% sustained, resulted from the stimulation of Stockdale Oil & Gas Co.’s Tejon Well 104. Previous gel fracs in the field had proven ineffective and uneconomical.

Background, development of water fracturing. Tejon field, located near Bakersfield in the southern San Joaquin Valley, is a mature field characterized by low production rates of relatively heavy crude. The reservoir comprises multiple sands, produced independently. Average permeability is in the 100-to-150-mD range. Bottomhole temperature is 140°F, and current reservoir pressure is about 1,500 psi. Reservoir fluid comprises 18.9°API oil with a produced GOR of about 100 scf/bbl. The field is currently produced from three of the original cased and perforated wells, with five remaining idle.

When the field still had virgin reservoir pressure, the original completions came online at rates of about 100 bopd. However, all the wells “sanded up” within the first 30 to 60 days. Later wells typically produced 25 to 30 bopd initially, with productivity declining to about 5 bopd after a few months.

Previous attempts to improve productivity through hydraulic fracturing yielded poor results, e.g., 6 bopd after treatment. Additionally, the method had been deemed uneconomical because of the high treatment costs associated with increased surface-pumping equipment requirements.

Stockdale decided to attempt a water frac after learning the results of a previous treatment on a nearby injector well. The water injector, completed in what has been described as a granite wash, experienced injectivity increases of 300 to 500% following its water frac. This improvement was achieved with minimal surface-pumping equipment.

During the past five years, applications for water fracturing techniques have become recognized as a cost-effective, above-fracturing-rate pressure treatment for both hard/low-permeability and soft/high-permeability formations. In low-permeability formations, water fracing tends to create fractures that are narrower than those created with viscous fluids. In addition, since fracture faces are not smooth, they tend to be “self-propping,” especially in hard rock. This factor, coupled with low reservoir permeabilities, leads to high dimensionless conductivity, even with low proppant volumes. Additionally, water fracing can deposit highly-conductive proppant packs.

In high-permeability, unconsolidated formations, the objective is to generate a tip screenout that is key to creating sufficient near-wellbore fracture width to bypass damage. Tip screenouts are induced by bridging proppant near the extremity of the fracture. Once the proppant has bridged the tip of the fracture, additional proppant injection into the fracture must increase fracture volume by increasing fracture width. Field data indicate that the fracture created in a water-frac operation is more than sufficient to bypass near-wellbore formation damage. The main drawback to the slightly-reduced, near-wellbore proppant concentration is the increased likelihood of having higher non-Darcy skin in the high-rate wells. However, in moderate- to low-rate completions, such as those in Tejon, this should not be an issue.

Tejon 104 treatment, equipment requirements. The zone selected for recompletion was about 200 ft uphole from the previous gel-frac treatment. General completion procedure was as follows:

* Reperforate lower zone from 4,916 to 4,924 ft at 8 spf and 4,890 to 4,904 ft at 12 spf

* Isolate at about 4,170 ft with wireline-set, retrievable bridge plug

* Perforate upper zone from 4,672 to 4,682 ft and 4,636 to 4,646 ft using 8-spf, big-hole, 3/4-in.-entry-hole guns

* Perform water frac on upper zone using 2 7/8-in. frac string and service packer

* Clean out to bridge plug and retrieve. Clean out to TD

* Run gravel pack assembly, with 4.5-in., 12-gauge, wire-wrapped screen across perforations and 12-gauge, semi-slotted pipe between zones

* Pump circulating gravel pack. Return well to production.

This treatment was initiated by pumping a step-rate test to ensure that the formation could be fractured with brine. Results of the test verified that a fracture had been initiated at about 3 bpm, at a BHP of 4,519 psi, corresponding to about 3,200-psi surface treating pressure.

With frac pressures and rates established, the main water-frac treatment was pumped using a 34-bbl pad of 3% KCl, followed by 127 bbl of a 2.5-ppa (lb of proppant added per gal) slurry of 20-40 gravel-pack sand in 3% KCl into the fracture. The entire treatment was pumped at 8 bpm and resulted in 11,900 lb of gravel being placed behind pipe (about 300 lb/ft).
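The reported volumes can be checked with back-of-the-envelope arithmetic (ours, not Baker Oil Tools’): slurry volume times proppant concentration gives the total sand pumped, which should be somewhat more than the 11,900 lb placed behind pipe, since some sand remains in the workstring:

```python
# Back-of-the-envelope check of the reported sand volumes (our arithmetic,
# not from the original treatment report). "ppa" is pounds of proppant
# added per gallon of clean fluid.

GAL_PER_BBL = 42

slurry_bbl = 127   # slurry volume pumped into the fracture
ppa = 2.5          # proppant concentration, lb added per gallon

sand_pumped_lb = slurry_bbl * GAL_PER_BBL * ppa
print(sand_pumped_lb)  # 13335.0 lb pumped -- consistent with ~11,900 lb
                       # placed behind pipe once sand left in the
                       # workstring is accounted for
```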

The increased fluid efficiency associated with fracturing a formation containing a relatively-high-viscosity crude led to a fracture length somewhat longer than normal for water-frac treatments in high-permeability formations. Plots of final net-pressure match, corresponding fracture cross-section and in situ proppant concentration indicated a fracture length of about 30 ft and an average proppant concentration of 3 lb/sq ft. Resulting productivity is charted in the accompanying figure.

The productivity increases obtained with the Tejon 104 water-frac treatment were achieved with substantially less equipment than that required for a large-scale gel frac. The gel frac on the lower interval of the well was to be executed by pumping a 2,000-gal prepad, followed by an 8,000-gal frac treatment with a cross-linked gel. This was then displaced by 1,300 gal of 25 lb/1,000 gal linear gel. The job entailed pumping 30,000 lb of sand at concentrations up to 10 ppa and rates of 12 bpm, requiring two pumps, a frac blender and a bulk-sand handling system.

The water-frac treatment required a total of 14,100 gal of 3% KCl for the pre-treatment testing and the pumping treatment. Only 14,000 lb of sand, at concentrations up to 2 ppa, was pumped. This required five 3,000-lb “SuperSacks” of sand and a high-rate Gravel Infuser. Additionally, because the treatment was pumped at 8 bpm, only about 700 hhp was required, supplied by two standard gravel-pack pumps.
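The modest horsepower figure follows from the standard oilfield hydraulic-horsepower relation, hhp = pressure (psi) × rate (bpm) / 40.8. Using the step-rate test’s surface treating pressure (our check, not Baker’s):

```python
# Hydraulic-horsepower check using the standard oilfield conversion:
# hhp = pressure [psi] * rate [bpm] / 40.8

def hydraulic_hp(pressure_psi, rate_bpm):
    return pressure_psi * rate_bpm / 40.8

# ~3,200-psi surface treating pressure at 8 bpm:
print(round(hydraulic_hp(3200, 8)))  # 627 hhp, in line with the ~700 hhp quoted
```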


PTTC launches online system in Mississippi

Jack Moody, Director of Energy and Coastal Geology, Mississippi Office of Geology

The Mississippi oil and gas industry now has a visually oriented database of geological data available free to anyone with Internet access. In May, the Mississippi Office of Geology (MOG) launched the new system as part of its website, through the Department of Environmental Quality. One of the main catalysts for this important project has been the Petroleum Technology Transfer Council (PTTC), through its Eastern Gulf Region.

The database, available at: http://, now provides an electronic source for information about many of the state’s: 1) scout tickets, 2) sample well descriptions, 3) county production maps, 4) production summaries, and 5) other basic well information, including identification of wells that have been cored.

“Mississippi’s new online system is a perfect example of how PTTC works with other groups to put information directly in the hands of smaller oil and gas companies,” explains Deborah Rowell, PTTC executive director. “During a 1996 meeting of the region’s producer advisors, I asked what PTTC could do to meet the highest-priority technological needs of local industry. Harry Spooner (a widely-respected independent in the region) voiced a common concern–the need for electronic access to geological data. In particular, he said that he needed to be able to work Mississippi geology over the Internet while he was at his summer fishing lodge in Colorado. That vision is now a reality.”

Collaborative effort led to results. This was really a three-way collaboration. Cragin Knox, director of MOG, committed the agency’s funding and staff to the project. In addition, the MS Oil and Gas Board expressed its intent to provide production data and well file information online. PTTC supported the project by funding the computer system developer, Peter Hutchins, who was also a staff geologist at MOG.

“There were several key factors causing this project to happen,” according to Dr. Ernie Mancini, director of PTTC’s Eastern Gulf region. “The dedicated efforts of innovative geological and computer professionals were helped by the rapid evolution in computer and Internet technologies. Also, the donations of several important data collections from private industry were a big boost. Of course, none of this would have been possible without the financial support of several state and national groups working together for a common goal.”

The scout-ticket collections have been a major part of getting this project off the ground. MOG was able to find five private collections to make available on the Web. One of the first datasets to be scanned was the recognized, one-of-a-kind Robert Steffey collection of sample descriptions of wells drilled in the 1930s and 1940s. Steffey ran an oil and gas scout service out of Jackson, Mississippi, and his style of reporting was colorful–anyone reading the weekly reports will be entertained, as well as informed. Steffey also described sample cuttings of wells; the scanned images are the only information available on some of these old wells.

Overcoming challenges. Speaking for the developing organization, the author notes that the first time MOG scanned the information, the digital images were huge. Learning as it went, MOG achieved much better results on its second scanning effort. However, with so many part-time students involved in scanning, it became critical to develop a program to quality-check the work in progress, looking for duplicates, bad files and any data that appeared suspicious. Barbara Yassin of the MOG staff was responsible for all scanning efforts.

Very early in the project, MOG was concerned with file size, storage and retrieval time for such a massive system. There needed to be enough resolution for users to read and print data clearly, but that meant larger file sizes. Ultimately, it was decided that file size should take a back seat to higher resolution. Fortunately, the computer industry was constantly coming up with new technologies to help solve the problems being faced. For instance, the project was started with HTML, but quickly embraced the advantages offered by XML and VML when beta versions became available.
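The article gives no specific file sizes, but simple arithmetic shows why resolution drove the storage problem. The sketch below is purely illustrative, assuming letter-size documents scanned at 8-bit grayscale with no compression; the actual page sizes, bit depths and compression used by MOG are not reported.

```python
# Illustrative only: raw (uncompressed) size of a scanned page at various
# resolutions, assuming an 8.5 x 11-in. page at 8-bit grayscale.
# These assumptions are not from the article.

def uncompressed_bytes(width_in, height_in, dpi, bits_per_pixel=8):
    """Approximate raw image size in bytes for a scanned page."""
    pixels = (width_in * dpi) * (height_in * dpi)
    return pixels * bits_per_pixel / 8

for dpi in (150, 300, 600):
    mb = uncompressed_bytes(8.5, 11, dpi) / 1e6
    print(f"{dpi} dpi: ~{mb:.1f} MB per page")
```

Doubling resolution quadruples raw file size, which is why a collection of thousands of scout tickets forced the trade-off the agency describes.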

A key enabling technology for interactive map presentations was AutoCAD’s Whip tool. The agency realized immediately that it would allow the development of maps on its website at no expense to the user. This was a major breakthrough. Steve Champlin, a geologist at MOG, went to work developing county production maps that show field locations and color codes for the producing formations. Through PTTC’s earlier efforts, field production data had already been converted from hard copy to digital format, which allowed the creation of objects on the maps. After downloading the free Whip tool software, the user can click on any field and browse through hot-linked information about that particular field–such as production by formation.

Future directions. MOG is creating various image collections in GIF format, which preserves the legibility of most handwriting–even faded pencil marks and other traits of the original images. The agency also plans to make its data available on CD-ROMs. After getting the scout tickets online, it will push to add digital versions of mudlogs, one-inch logs and core analyses. After that, the next priority is to develop a digital base map with wells spotted, so that the same approach used for field production can be applied to well information.

The ultimate vision is for the user to have open access, via the Internet, to digital data from both MOG and the MS Oil and Gas Board. However, the agency realizes that this important effort is likely to be an ongoing process. More data will be added to the library, which means that system capabilities will need to expand. As the computer industry continues making data transfer faster over the Internet, users with phone/modem lines will see improved access.

In the future, MOG hopes to supply a lot of good subsurface studies, so that people everywhere–even in fishing cabins in Colorado–can explore for oil and gas in Mississippi. For further information, contact the author at: Tel: 601 961 5522; Fax: 601 961 5521; Email:


In addition to the individuals and groups already mentioned above, it is important to recognize: 1) the Mississippi Department of Economic and Community Development, which made funds from its oil overcharge trust available to pay students to scan documents, 2) the U.S. Department of Energy’s Office of Fossil Energy, which primarily funds PTTC, and 3) the University of Alabama, host of PTTC’s Eastern Gulf Resource Center, whose region encompasses Mississippi, Alabama and Florida.

COPYRIGHT 2000 Gulf Publishing Co.

COPYRIGHT 2000 Gale Group