
Wireline QA/QC

From Wikipedia, the free encyclopedia

Wireline quality assurance and quality control (wireline QA/QC) is a set of requirements and operating procedures applied before, during, and after a wireline logging job. The main concerns of wireline QA/QC are the accuracy and precision of the recorded data and information. Accuracy is a measure of the correctness of the result and generally depends on how well systematic errors (reproducible inaccuracies introduced by faulty design, inadequate calibration, or a change in the borehole) are controlled and compensated for. Precision depends on how well random errors (errors that cannot be reproduced and are mostly related to the physics of a measurement) are analysed and overcome.[citation needed]
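
The difference can be illustrated numerically. The following minimal sketch (Python, standard library only; the reference value and the repeated readings are hypothetical) treats the mean offset from a known calibration value as a measure of accuracy and the scatter of the readings as a measure of precision:

```python
import statistics

# Hypothetical repeated readings of a calibration reference whose true value is 100 API units.
reference_value = 100.0
readings = [97.8, 98.1, 97.6, 98.3, 97.9]

mean_reading = statistics.mean(readings)

# Accuracy: how close the average result is to the true value.
# A persistent offset like this points to a systematic error (e.g. calibration drift).
bias = mean_reading - reference_value

# Precision: how tightly the readings cluster around their own mean.
# The scatter reflects random errors inherent in the measurement physics.
spread = statistics.stdev(readings)

print(f"mean = {mean_reading:.2f}, bias (accuracy) = {bias:+.2f}, spread (precision) = {spread:.2f}")
```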

Introduction


Wireline logging is a part of exploration geophysics and is mainly used to detect the presence of economically useful hydrocarbons in the Earth's subsurface. The products of wireline logging are wireline logs or well logs.[1]


In the oil industry the logs are “a recording against depth of any of the characteristics of the rock formations traversed by a measuring apparatus in the well-bore”.[1] The well logs are obtained by lowering the measuring tools on a wireline (cable) into the well (borehole).[1] The integral parts of wireline logging operations are quality assurance and quality control procedures. Quality control is the “process that defines how well the solution for a specific problem is known”.[2] Well log quality control is a “set of methods that identifies and analyses data deviations from established standards and allows the design of a remedy”.[3]

Unlike measurements made under well-known and controlled laboratory conditions, logging is performed in situ, can be affected by many possible failure sources, and is susceptible to systematic and random errors. The objective of a wireline logging job is to obtain a permanent, continuous record of the properties of the rock penetrated by the wellbore, in the form of wireline logs and fluid and rock samples. Of all the well data sets recorded and collected, well logs are the most valuable, as they are vital for reservoir and formation evaluation. Wireline (well) logs are then combined with drilling data, mud logs, measurements while drilling (MWD), and coring information in order to choose the correct testing and completion intervals and to properly evaluate the production potential of the well.

There are two categories of well log data: original data (e.g. a gamma ray log resulting from measurements of gamma rays in the borehole with a sodium iodide crystal scintillation detector, calibrated as per normal field operational procedure) and derived data (data resulting from the processing of the original data, e.g. a calculated volume of shale).[4] The main components of the well log data record are the logs themselves (main curves for the main and repeat sections with relevant information such as calibration information and parameter tables, and additional curves of the main and repeat passes, including down-logs and images of quality control logs)[1] and contextual information, such as data acquisition plans, other job reports, witness reports, and tool specifications. Contextual data is of great value for exploration but can be lost or difficult to access.[4]

With the best quality wireline logging data acquired, the subsequent steps in well production and completion are more precise and effective. High quality data and conscientious data management enable the prognosis of potential problems and failures, so the whole process (over the whole rig or oilfield lifetime) can be adapted and configured to prevent possible consequences for human safety, environmental preservation, and infrastructural integrity. Acquisition of geotechnical and petrophysical data is costly, but necessary.[4] Data of dubious quality cannot yield reliably good decisions: the poorer the quality of the data, the higher the risk associated with decisions based upon it, and the less that is known about the quality of the recorded and collected data, the higher the uncertainty of those decisions, which can lead to false interpretations and evaluations. A good evaluation is only possible with good quality data, so it is essential to acquire the best quality data possible.[4]
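
As an illustration of derived data, the volume of shale mentioned above is commonly estimated from the gamma ray log. The following is a minimal sketch of the linear gamma ray index (Python; the clean-sand and shale baselines and the readings are hypothetical, and real workflows use more elaborate, field-calibrated relations):

```python
def shale_volume_linear(gr, gr_clean, gr_shale):
    """Linear gamma ray index: V_sh = (GR - GR_clean) / (GR_shale - GR_clean),
    clipped to the physically meaningful range 0..1."""
    v_sh = (gr - gr_clean) / (gr_shale - gr_clean)
    return max(0.0, min(1.0, v_sh))

# Hypothetical baselines picked from the log: clean sand reads 25 API, pure shale 150 API.
gr_clean, gr_shale = 25.0, 150.0

for gr_reading in (30.0, 80.0, 140.0):
    v_sh = shale_volume_linear(gr_reading, gr_clean, gr_shale)
    print(f"GR = {gr_reading:5.1f} API -> V_sh = {v_sh:.2f}")
```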

Factors affecting the data quality


The data quality may be compromised to a greater or lesser extent by bad hole conditions, wireline equipment failures, human errors, or even extreme weather conditions. Within the bounds of rig time costs and the preservation of the safety of personnel, equipment, and the well, the logging engineer, operational geologists, and wireline log witnesses have to ensure that the best quality data is acquired. The objective of wireline logging services is to provide the best possible quality data in the minimum possible rig time; data quality is directly dependent on the available rig time. Factors affecting data quality and time efficiency are operational procedures and environmental conditions, and chiefly the problems caused by them. During wireline job planning, some factors determine the specific types of equipment used for the given borehole geometry and conditions (temperature, pressure, and mud type), while others are the wireline crew's responsibility on site during the logging job. Data quality and time efficiency factors:[1]

  • Environmental conditions:

Drilling mud – hole diameter affects log readings; mud type and density affect the type of tools used and measurements such as conductivity and resistivity; invasion of drilling and mud fluids into reservoir rock; formation of mud-cake; wash-outs in soft sediments; formation of stress-related breakouts.

Borehole geometry – affects, e.g., the type of logging and tools used (wireline, pipe-conveyed, or LWD).

Borehole environment – temperature, pressure, and possible hostile (H2S) environments affect the tools used and their operating limits.

  • Operational procedures:

Equipment calibrations and verifications – tools are calibrated by adjusting their response to read a predetermined value in a situation for which the response is known; the only time it is known for certain that a tool is working properly is during calibration and verification. They should be checked as often as possible to ensure that correct data are recorded.

Tool positioning and configuration (geometry) – tools have to be centralised or decentralised depending on the type of measurement recorded; they have different depths of recording intervals depending on their position in the tool string and different vertical resolutions (e.g. the difference in vertical resolution between short- and long-spacing resistivity tools), and the spacing of sensors on the tools influences the volume of rock recorded. Different tools have different depths of penetration into the formation, which makes accurate positioning very important.

Depth mismatch/accuracy – a fundamental requirement for the consistency of data is to know at what depth each measurement is taken; otherwise it is difficult to compare the different logs needed for further data processing (interpretation). A minimal sketch of one way to quantify a depth mismatch between two passes is given after this list.

Logging speed – logging speed limits directly affect rig time efficiency, as they are not the same for all types of logs (and associated tools); service companies provide maximum speed values for each operating tool.
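
One simple way to quantify a depth mismatch between two logging passes is to slide one curve against the other and keep the shift that maximises their agreement. The following minimal sketch (Python with NumPy; the synthetic curves, sample interval, and shift range are hypothetical, and depth matching in commercial logging software is considerably more involved) illustrates the idea:

```python
import numpy as np

def best_depth_shift(reference, candidate, max_shift_samples):
    """Return the integer sample shift of `candidate` that best aligns it with
    `reference`, judged by the Pearson correlation of the overlapping samples."""
    best_shift, best_corr = 0, -np.inf
    for shift in range(-max_shift_samples, max_shift_samples + 1):
        if shift >= 0:
            a = reference[shift:]
            b = candidate[:len(candidate) - shift]
        else:
            a = reference[:shift]
            b = candidate[-shift:]
        if len(a) < 2:
            continue
        corr = np.corrcoef(a, b)[0, 1]
        if corr > best_corr:
            best_shift, best_corr = shift, corr
    return best_shift, best_corr

# Hypothetical example: a synthetic "repeat" pass that is 3 samples off depth
# relative to the main pass.
rng = np.random.default_rng(0)
depth_step_m = 0.1524  # half-foot sample interval
main_pass = np.sin(np.linspace(0.0, 20.0, 400)) + 0.05 * rng.normal(size=400)
repeat_pass = np.roll(main_pass, -3)  # same signal, shifted by 3 samples

shift, corr = best_depth_shift(main_pass, repeat_pass, max_shift_samples=10)
print(f"best shift = {shift} samples ({shift * depth_step_m:.2f} m), correlation = {corr:.3f}")
```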

Types of log data quality control


Log data quality control is a method to assure a certain level of well log data quality and accuracy in the oil company's system. The most effective way to be certain of the quality is to make checks at the time of data acquisition, while the wireline operation is ongoing or has just finished. The best practice is for all three types of log quality checks to be done on site during a wireline job. According to Storey (2016),[4] there are three types of log data quality control (LQC):

  • Type 1 - Acquisition LQC

Quality control applied to original data as they are recorded by the wireline service company, performed by the wireline service company's engineers and/or wireline log witnesses. The main objectives are to monitor operational procedures, ensure that the logging program is followed, communicate progress and problems, record any notable events, verify the precision, accuracy, and completeness of the acquired data (logs and contextual information), and check equipment verifications and calibrations.

  • Type 2 - Acceptance LQC

Quality control during the acceptance of the data by the oil/gas company (log data recorded and collected by the wireline service company and/or a third party). The objective is to verify the data and the record for any deviations from the original job plans and to register them without delay. The purpose is to assure completeness and accuracy, to remedy all unacceptable deviations from the original plan and record all acceptable deviations, to clarify any significant questions or problems, to communicate progress, problems, and concerns to subject matter experts, to record the log quality control summary in writing, and to accept the data formally. The main concerns are the completeness and accuracy of the data and the consistency of the different components of the data and information set.

  • Type 3 – Pre-exploitation LQC

Verifying that the Type 1 and Type 2 LQCs have been done before data exploitation. This control can happen during operations/job to help decide on the next operation, or after the operation/job to construct, constrain, and refine formation and reservoir models, appraise uncertainties and decide on follow-up action.

Wireline QA/QC service providers


The well site is the front line of well log quality control. Only while the equipment is on site and the well bore is open is it possible to investigate the in-situ formations and, if necessary, influence decisions that can result in higher data quality. If a problem is detected, it can be readily addressed by the personnel on site.[4] A well-executed wireline logging job, from start to finish, makes the subsequent steps in oil and gas extraction (the main goal of well site operations) more straightforward and thus ultimately less expensive. It also makes it possible to better predict and adapt to unwanted surprises and problems, which usually cause delays (down-time), additional interventions, and higher costs.

Service companies


Wireline service companies are responsible for delivering complete, accurate, and consistently recorded data and information, acquired with correctly calibrated and verified instruments used correctly during the logging runs (the standard divisions of a logging job), and for providing the relevant documentation in hard and soft copy. Wireline service companies provide the personnel and equipment needed and agreed upon, in coordination with the oil company, for a specific logging job on a specific well site (FORASERV, Schlumberger, Baker Hughes, Halliburton, and Archer, to name a few).

Wireline logging QA/QC witness


To ensure that the highest quality data is recorded, oil companies engage an experienced expert to supervise the wireline logging job on site, who coordinates the entire operation (with the assistance of the logging engineer and the operational geologist) from start to finish and knows every step of the operation in intimate detail. These experts are called wireline log witnesses and are either individual consultants or part of wireline QA/QC consulting firms. Their objective is to keep track of the performance of the wireline logging provider, its efficiency, and its procedures. Some of the notable wireline QA/QC consulting firms are QO Inc., Gaia Earth Sciences Limited, OGEC, one & zero, and STAG. The complexity of modern wireline and LWD logs, with several pages of logging job parameters and calibrations, makes this duty very demanding, and specialized training courses for log witnesses are available (for example from Petroknowledge and Opus Kinetic).

Roles and responsibilities of wireline log witness


The role of wireline log witness is all-encompassing and includes coordination, participation, and supervision of all pre-job, during job (on site), and post-job logging activities:

  • They are responsible for communication with on-site personnel, for reporting all logging objectives and operations to the contractor (usually the oil company) during the job, and for providing a final report after the job.
  • Their responsibilities include following up on the predetermined logging program and objectives (agreed between the oil/gas company and the wireline service company) and approving and recording all subsequent modifications within the limits of the service company's abilities and the oil company's requirements.
  • They need to know what tools are required and what tools are available.
  • They need to make sure the logging crew has checked all the equipment (to ensure all the logging tools are in working condition), that no equipment required for the job is missing, and all tool calibrations and verifications are valid.
  • They are responsible for repeating the tool calibrations and verifications several times during the job to check them.
  • They are obliged to review and discuss the job risk assessment and potential safety issues.
  • They are in charge of implementing "lessons learned" from previous jobs and taking notes of "lessons learned" on the present job.

Well log QA/QC procedures


The foundations of good well log data quality are data consistency, tool calibration, tool reliability, and wireline service company performance. To collect correct, good-quality data, proper and valid calibrations and verifications of all the equipment, consistent checks of complementary readings (e.g. density and neutron logs or porosity logs; a minimal sketch of such a check is given below), and repeatability of the recorded logs are all essential.
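
A routine check of complementary readings is to convert the bulk density log into a density porosity and compare it with the neutron porosity over the same interval; a large, unexplained separation may point to a bad hole or a tool problem rather than to geology. The following is a minimal sketch of such a check (Python; the matrix and fluid densities, the tolerance, and the sample values are hypothetical assumptions):

```python
def density_porosity(rho_bulk, rho_matrix=2.65, rho_fluid=1.0):
    """Density porosity: phi_D = (rho_matrix - rho_bulk) / (rho_matrix - rho_fluid),
    here with a hypothetical sandstone matrix and fresh-water filtrate."""
    return (rho_matrix - rho_bulk) / (rho_matrix - rho_fluid)

# Hypothetical paired samples: (depth in m, bulk density in g/cm3, neutron porosity in v/v).
samples = [
    (2000.0, 2.45, 0.14),
    (2000.5, 2.40, 0.16),
    (2001.0, 2.00, 0.10),  # density reads far too low for the neutron response
]

TOLERANCE = 0.10  # flag separations larger than ten porosity units

for depth, rho_b, phi_n in samples:
    phi_d = density_porosity(rho_b)
    separation = abs(phi_d - phi_n)
    flag = "CHECK" if separation > TOLERANCE else "ok"
    print(f"{depth:8.1f} m  phi_D = {phi_d:.2f}  phi_N = {phi_n:.2f}  diff = {separation:.2f}  {flag}")
```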

Basic steps in log quality control are:[5]

  1. Having a well-planned and detailed logging program which takes into account site and safety conditions, information about the equipment, acquisition parameters, borehole environmental conditions, and planned operational procedures. The program should be coordinated by the oil/gas company, wireline service company and wireline log witness.
  2. Paramount to the on-site job is documentation of all the fieldwork – field logbooks, data collection sheets, and field notes, as well as completely recorded instrument digital data.
  3. The highest priority before and during the job is equipment calibrations and verifications. Routine checks of the equipment should be made on a periodic basis and after each problem and repair. An operational check of the equipment, along with test measurements, should be carried out before the start of each job and before starting each run.
  4. Standard corrections and changes in the logging program made by field engineers or data managers for operational procedures, borehole conditions, previously unknown site conditions, and main depth shifts should all be documented. The rationale for each change, and the compromises and consequences it may represent, should also be documented.
  5. Recording the conditions which affect the survey and measurements. When recognized they should be documented to provide guidance for later projects ("Lessons learned").
  6. All equipment problems, and the steps taken to correct them, should be documented, with insights into how the corrections may affect the data.
  7. It is important to review the electronically recorded (digital) data (well logs) to ensure that the recorded data and their values are consistent with the setting. The logs should be depth matched (final depth control) and compared, and any borehole geometry effects taken into account. Overall data quality and accuracy should be documented. A minimal sketch of an automated screening of digital curve data is given after this list.
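
An automated screening of digital curve data, as mentioned in step 7, can be as simple as counting missing samples and values outside a plausible range for each curve. The following is a minimal sketch (Python; the curve names, null value, ranges, and sample values are hypothetical and would in practice be taken from the logging program and tool specifications):

```python
# Hypothetical plausible ranges and null value; real limits come from the logging
# program and the tool specifications.
NULL_VALUE = -999.25
PLAUSIBLE_RANGES = {
    "GR":   (0.0, 300.0),   # gamma ray, API units
    "RHOB": (1.0, 3.2),     # bulk density, g/cm3
}

def screen_curve(name, values):
    """Count missing samples and values outside the plausible range for one curve."""
    low, high = PLAUSIBLE_RANGES[name]
    missing = sum(1 for v in values if v == NULL_VALUE)
    out_of_range = sum(1 for v in values if v != NULL_VALUE and not low <= v <= high)
    return {"curve": name, "samples": len(values), "missing": missing, "out_of_range": out_of_range}

# Hypothetical short curve extracts: one null sample in each curve,
# and one clearly implausible gamma ray value.
curves = {
    "GR":   [45.2, 60.1, -999.25, 85.0, 410.0],
    "RHOB": [2.45, 2.40, 2.38, -999.25, 2.41],
}

for name, values in curves.items():
    print(screen_curve(name, values))
```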

Log Quality Control (LQC) Digitalization


Data management is increasingly done using a variety of software packages. Most equipment manufacturers provide data transfer (real-time data transmission) or download software for their instruments, which permits data editing and limited data manipulation.
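
Well log data are commonly exchanged in the LAS format, and such files can be inspected programmatically. The following is a minimal sketch using the open-source Python reader lasio (assuming lasio and pandas are installed; the file name and curve contents are hypothetical):

```python
import lasio  # open-source LAS well-log reader; assumes `pip install lasio`

# Hypothetical file name; any LAS 1.2 / 2.0 well log file would do.
las = lasio.read("example_well.las")

# Curve mnemonics present in the file.
print(las.keys())

# A pandas DataFrame indexed by depth, convenient for quick completeness
# and range screening of each recorded curve.
df = las.df()
print(df.describe())
```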

Digitalization and digital solutions are an important part of the oil and gas industry.[6][7][8][9] Harnessing new technologies, as Devold said, "is a critical business need".[7] Digitalization and finding new software solutions is a mandate in logging, because the largest errors come from inadequate knowledge of the position and orientation of the measurement sensors and the biggest problems come from human error.[2] These can be partly eliminated by employing smart software solutions. No matter how careful people are, mistakes are made, equipment fails, and accidents happen. The effectiveness of a quality wireline logging management system lies in the ability of the wireline service companies, their engineers, and the logging witnesses to predict, eliminate, overcome, and compensate for possible problems and limitations.

Major steps have been taken in the field of log data interpretation and visualisation (e.g. Interactive Petrophysics by Lloyd's Register, Log Studio by Logtek, Petra® by IHS, GeoSoftware by CGG, JewelSuite by Baker Hughes, Delphi by Schlumberger, Petrosys, Geolog®, and GEOSuite7, to name a few), with some software development in the area of well operational performance management and optimisation (RIGIQ® by Trigpoint Solutions, EnergySys), but log data and contextual information management (including wireline program optimisation) remains underrepresented. Some wireline QA/QC service companies have developed their own internal software solutions to this problem (e.g. the one & zero consulting company), but almost none of this software is open-source or commercially available. At present, one commercial software package can be found – RIGPRO, by the same company – that offers solutions for digitally supervising all logging data, including the contextual information, in a format that enables quality QA/QC.

References

  1. ^ a b c d e Serra, O. (1984): Fundamentals of well-log interpretation, Elsevier, 435 p.
  2. ^ a b Olhoeft, G.R. (2005): Quality Control in Geophysics, www.geophysics.mines.edu
  3. ^ Theys, P. (1999): Log data acquisition and quality control, Editions Technip, Paris, France, 330 p.
  4. ^ a b c d e f Storey, M.C. (2016): Demystifying Log Quality Control, SPE Asia Pacific Oil & Gas Conference and Exhibition, Perth, 25–27 October 2016, Paper SPE-182313-MS, 24 p.
  5. ^ Hoover, R. (2006): QA/QC and Geophysical Projects, Highway Geophysics-NDE Conference, www.umr.edu/2006geophysics/, 10 p.
  6. ^ Condom, S. (2017): Digitization holds key to unlocking oil and gas industry's potential, Offshore, 36-38.
  7. ^ a b Devold, H. (2017): Digitalization evolves from buzzword to critical business need, Offshore, 39-41.
  8. ^ Jayakody, P. (2017): Data-driven analytics technology improves asset management, Offshore, 42-43.
  9. ^ Sharma, P. (2017): 'Digital twin' concept underpins successful digitization strategy, Offshore, 34-35.