July 2020

Special Focus: The Digital Plant

Calculating petroleum quantities in the 21st century

Tanner, L. S., Quorum Software

Over the past 100 yr, a number of technological advances and process improvements have dramatically changed how the oil and gas industry finds, produces and refines petroleum. The industry has made vast improvements in physical property databases, calculation procedures, and the determination of density and petroleum quantities by incorporating advances in modern processing technologies and updating measurement standards.

This article discusses how the oil and gas industry is reducing measurement uncertainty through the development of new physical property standards. It also examines the history and differences between various 1980 American Petroleum Institute (API) standards and their 21st-century counterparts. Additionally, it discusses computer hardware technology and software improvements and how the ongoing evolution of standards can help the industry bridge the gap between operations, modern technology and an ever-changing hydrocarbon supply chain.

A long history of standards

Industry organizations including the API, the American Society for Testing and Materials (ASTM), the Energy Institute (EI) and the Gas Processors Association Midstream (GPA Midstream) have created, and routinely update, standards that put in place methodologies for the accurate measurement of hydrocarbon fluids.1

For the purpose of commodity trading of petroleum oils, hydrocarbon distillates, petrochemicals and natural gas products, the densities, volumes and energy units are corrected to a contractual reference base temperature and pressure condition. In the U.S. and producing countries working with the U.S., the customary units are 60°F and 0 psig (14.696 psia) or saturation pressure. Most U.S. international trading partners use SI units for determining base conditions. The international community typically trades commodities at 15°C or 20°C and 101.325 kPa as base conditions.

Typically, the fluids properties are not measured at these base conditions, but instead at process temperatures and pressures. The density and volume of hydrocarbon fluids measured at process conditions can be significantly different from their values at base conditions. As a result, volume correction factors are used during metering to correct the observed densities and gross volumes to equivalent densities and volumes at a base pressure and temperature.
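
The mechanics of applying these corrections can be sketched as follows. The CTL and CPL values in the example are illustrative placeholders, not values computed from an API procedure:

```python
# Correcting an observed (gross) volume to base conditions.
# GSV = IV * CTL * CPL, the general form used in liquid metering:
#   IV  - indicated volume at process temperature and pressure
#   CTL - correction for the effect of temperature on the liquid
#   CPL - correction for the effect of pressure on the liquid
# The CTL and CPL values below are illustrative, not computed
# from an API standard procedure.

def gross_standard_volume(indicated_volume: float, ctl: float, cpl: float) -> float:
    """Correct a metered volume to its equivalent at base conditions."""
    return indicated_volume * ctl * cpl

# A liquid metered warmer than 60°F occupies more space, so CTL < 1
# shrinks the observed volume back to its base-condition equivalent.
gsv = gross_standard_volume(10_000.0, ctl=0.98765, cpl=1.00081)
print(round(gsv, 2))  # 9884.5
```

The same multiplication-by-factors structure applies whether the quantity being corrected is a volume or a density.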

The first correction factors to account for the thermal expansion of liquid hydrocarbons were developed and tabulated in 1916 by the National Bureau of Standards in the U.S. The work of developing coefficients of thermal expansion of crude oil and crude products involved taking careful density measurements of several commercially available oil samples. From these measurements, detailed calculations were developed to correct for the actual temperature and pressure of the fluid being measured.

Standards were updated, and many new ones were introduced, over the past century. In 1952, the British and American temperature correction factors were combined into the Petroleum Measurement Tables, which contained many sets of correction and conversion factor tables commonly used in the measurement of various hydrocarbon liquids. These tables were developed largely from the volumetric data for crudes and crude fractions published in 1916.

In 1965, the API adopted these tables. Oil producers, carriers, refiners and marketers all rely on the Petroleum Measurement Tables (also widely known as the 1952 Tables or Blue Book Tables) to correct their products’ densities and volumes to the standard temperature. The corrections provided by these tables afford more equitable and consistent fiscal transactions between parties. They also give government agencies a means to ensure that any applicable taxes and tariffs are fairly assessed. With billions of dollars of hydrocarbons being traded every day, a variation of just 0.1% due to temperature or pressure effects can result in a significant gain or loss of revenue for a single transaction. With so much money at stake, the industry is constantly revising standards to minimize measurement uncertainties.

Standards evolve with industry needs

The advent of more sophisticated computers from the 1960s through the 1980s promised a means for minimizing variation and tightening up liquids measurements. As a result, the 1952 lookup tables were replaced in 1980 with new temperature correction tables for density and volume that could be incorporated into computer subroutines. Specifically, the printed tables were replaced with mathematical equations that served as the standard. The equations could then be implemented into the computer subroutines through consistent procedures.

The 1980 implementation procedures were the first attempt to provide the petroleum industry with a means to produce identical numbers on a variety of computer hardware and software configurations. This was not such a simple task in the late 1970s and early 1980s, when computers varied widely in processing capacity—4-bit, 8-bit and 16-bit processors were commonly used. In addition, the first Institute of Electrical and Electronics Engineers (IEEE) standard for floating-point arithmetic was not introduced until 1985.

Due to these computer hardware and software dissimilarities and relatively low capabilities, users would frequently obtain different answers from the same subroutine run on different systems. This issue was addressed by modifying the published 1980 implementation procedures to include a series of intermediate rounding and truncation rules that helped ensure consistent answers between different computer configurations. This made the procedure very complex, which increased the risk of programming errors by users.

Between the initial publication of the 1980 tables and the mid-1990s, evolving industry needs and advances in computer technology prompted a number of changes to the standards.2 For example:

  • The initial 1980 tables were developed from data obtained using the International Practical Temperature Scale 1968 (IPTS-68), which has since been replaced with the International Temperature Scale 1990 (ITS-90). The standard accounted for this change by correcting the input temperatures to an IPTS-68 basis prior to performing any other calculations.
  • The accepted value of standard density of water at 60°F was slightly changed from the 1980 standard value, which affected values of relative density and API gravity.
  • In 1988, the Institute of Petroleum (IP) extended the procedures used for the 15°C tables to include newly developed procedures for 20°C, which met the needs of those countries that use this higher value as their standard temperature.
  • Tables for lubricating oils were developed and approved, but they were not immediately documented as a part of the standard. Initially, implementation procedures for the lubricating oil table appeared only for the 20°C tables.
  • The tables were extended to factor in lower temperatures and lower API densities (i.e., higher densities) to account for changing business dynamics and the need to perform custody transfer on a wider range of products.
  • Rounding and truncation were eliminated for initial and intermediate values, and now only apply to the final volume correction factor values. These final values are consistently rounded to five decimal places.
  • Changes in computer technology allowed for updates to implementation procedures. The advent of IEEE standards and widespread adoption of 32-bit and higher-level machines allowed the integer (scaled values) arithmetic used in the 1980 tables’ IPs to be replaced with a double-precision, floating-point math procedure that produces 14 significant digits (double the number of significant digits produced with the previous method).
  • Increasing the number of decimal places and using the floating-point math format highlighted discrepancies between the 60°F, 15°C and 20°C tables that were concealed in the 1980 tables. These discrepancies produced a slightly different volume correction factor value for the same output temperature. A new procedure was developed for calculating the correction for the effect of temperature on liquid (CTL) and the correction for the effect of pressure on liquid (CPL) to ensure that results remain consistent, regardless of the base temperature used.
  • As flow computers in the field became more commonplace for real-time measurement of petroleum-based fluids, improved convergence methods were developed to correct the observed density to a base density.
  • Earlier editions of the printed tables assumed that glass hydrometers were used to make density measurements; therefore, a hydrometer correction was included on the observed density. Such corrections are no longer applied, as any density measurements made with a glass hydrometer are now corrected prior to applying the calculations.
  • The use of density meters to make real-time density measurements has become more prevalent. Since these measurements are often made at pressures greater than atmospheric, the pressure effect must be accounted for at the same time as any temperature effect when determining the density at standard conditions. As a result, pressure and temperature corrections are now combined into one procedure.
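
The temperature correction at the heart of these procedures takes a compact exponential form in the modern standard. The sketch below is simplified (it omits the small temperature-scale shift the published procedure applies), and the K-constants are the commonly published generalized-crude-oil values, shown for illustration only; the values and rules in API MPMS Chapter 11.1 govern any real implementation:

```python
import math

# Sketch of the generalized CTL form used by API MPMS Ch. 11.1.
# The thermal-expansion coefficient at 60°F (alpha60) is derived from
# the base density via commodity-specific K-constants. K0/K1 below are
# the commonly published generalized-crude-oil values; illustrative only.
K0, K1 = 341.0957, 0.0  # generalized crude oils

def alpha60(rho60: float) -> float:
    """Coefficient of thermal expansion at 60°F; rho60 in kg/m3."""
    return K0 / rho60**2 + K1 / rho60

def ctl(rho60: float, temp_f: float) -> float:
    """Correction for temperature on liquid. Per the current standard,
    rounding (to five decimal places) is applied only to the final value,
    never to intermediates."""
    dt = temp_f - 60.0
    a = alpha60(rho60)
    return round(math.exp(-a * dt * (1.0 + 0.8 * a * dt)), 5)

# At exactly 60°F no correction is needed; above 60°F the liquid has
# expanded, so CTL < 1 corrects the observed volume downward.
print(ctl(850.0, 60.0))         # 1.0
print(ctl(850.0, 100.0) < 1.0)  # True
```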

As the oil and gas industry moved into the 21st century, standards organizations worked diligently to keep pace with the complex energy requirements of an ever-growing global population. As a result, the industry has adopted numerous new standards to meet changing needs in sustainable ways. Updates and additions to the standards catalog include:

  • A newly adopted water standard (Chapter 11.4) in 2003
  • An updated implementation procedure in MPMS Chapter 11.1 in 2004 (the update focused on new computer technology, expanding the operating density and temperature ranges, and addressing shortcomings with the previous standard)
  • A light hydrocarbon standard (Chapter 11.2.4) in 2007
  • IUPAC ethylene density calculations (Chapter 11.3.2.1), adopted by API in 2013
  • A new ethanol standard (Chapter 11.3.3) in 2015
  • An ethanol-blended gasoline standard (Chapter 11.3.4) in 2019.

These updates were designed to make the standard more consistent and meet industry needs. Yet, with the exception of the new ethanol and ethanol-blended gasoline standards, no new hydrocarbon samples or data were taken. The basic equation forms and the associated constants used to define the temperature and pressure correction factors were not changed, and the ranges of density and temperature over which certain parameters apply were only slightly adjusted.

Addressing ongoing industry changes

While revisions and updates to measurement standards are welcome, these standards have not yet fully accounted for major shifts in the industry’s business operations and new technology trends.

Physical properties keep changing. Today’s crude oils and refined products do not have the same physical properties as in the 1970s. The tetraethyl lead (TEL) that was used to boost octane 45 yr ago has since been replaced with ethanol. Feedstocks that were largely composed of lighter, sweeter, conventional crudes a generation ago have been replaced with greater quantities of heavy oils from Canada and crude from a number of shale plays across North America and elsewhere.

Further complicating matters, standards like API MPMS 11.1 have a limited density range. For example, many types of liquefied petroleum gas (LPG) and natural gas liquids (NGL), as well as liquefied natural gas (LNG), are less dense than the liquids covered by the API MPMS 11.1 standard. As a result, measurement subject matter experts must have extensive knowledge of physical properties and standards.

Given these dramatic changes to petroleum feedstocks and products, and despite the industry's best efforts to keep up with them, measurement standards must continue to advance. The API standard still uses constants developed from empirical data collected in the 1970s. FIG. 1 shows that empirical data set and how the industry has used mathematical models to extend it to accommodate changes in production sources.

FIG. 1. The industry has used mathematical models to expand 1970s empirical data to accommodate change in the production source.

Demographics keep evolving. Back in the 1980s and 1990s, a number of oil and gas companies developed their own measurement systems, written in software languages like FORTRAN and COBOL. The popularity and usage of these programming languages has waned over the past 30 yr, and the number of qualified personnel who can code in these languages has dwindled.

Even companies that do not have their own measurement systems have relied on the knowledge and expertise of baby boomers in interpreting and using the measurement tables. The departure of such knowledgeable employees has many companies wondering how they will be able to effectively implement new measurement standards as they are developed.

Technologies keep advancing. As stated earlier, the limits of the electronic technology available in the early 1980s forced the industry to implement a series of intermediate rounding and truncation steps in its measurement calculations. However, the processing capabilities of modern computers mean that intermediate discrimination is no longer necessary: given the same inputs, different computers produce the same answers.

Nevertheless, intermediate rounding and truncation inside calculations remain standard practice today. Although the industry uses the most advanced computing systems, many of the underlying calculations running in today's electronic flow measurement systems still use algorithms that were written for 4-bit processors.
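
A small illustration of why intermediate discrimination mattered, and why it now only adds complexity: rounding an intermediate coefficient shifts the final factor, so every 1980s implementation had to round at exactly the same points to agree. (This is an illustration only, not a standard procedure.)

```python
import math

# Illustration only (not a standard procedure): the same CTL-style
# calculation run once in full double precision and once with a
# legacy-style rounding of the intermediate expansion coefficient.
def ctl_from_alpha(a: float, dt: float) -> float:
    return math.exp(-a * dt * (1.0 + 0.8 * a * dt))

rho60, dt = 850.0, 40.0
alpha_full = 341.0957 / rho60**2     # keep full double precision
alpha_legacy = round(alpha_full, 6)  # legacy-style intermediate rounding

full = ctl_from_alpha(alpha_full, dt)
legacy = ctl_from_alpha(alpha_legacy, dt)

# The two runs disagree purely because of the intermediate rounding,
# which is why the 1980 procedures had to prescribe it so precisely.
print(full != legacy)  # True
```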

When it comes to dynamic measurement tickets, the industry is still applying 1970s computer technology. Meanwhile, advancements in metering and measurement technology are allowing upstream and midstream operators to track the inventory and movement of multiple fluid streams with far greater precision. The use of rudimentary measurement devices, like the gauging stick, has been replaced with more sophisticated methods and meters, from traditional orifice, positive displacement and turbine meters to Coriolis and ultrasonic meters. Strapping tape is still widely used in custody transfer operations, as is radar- and infrared-measurement equipment.

These newer technologies can improve inventory measurement and management, but only if the data obtained is properly handled and interpreted. This typically requires an operator to consolidate datasets from different metering sources, verify the accuracy of the data, apply any necessary fluid quality samples and then recalculate as required.1 Only then can the operator perform a system-wide inventory balance to check for any operational issues or missing data.
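
The final balance step reduces to a simple residual calculation, sketched below with illustrative numbers and field names:

```python
# Sketch of a system-wide inventory balance: receipts in, deliveries
# out, change in tank inventory, and the residual loss/gain that flags
# operational issues or missing data. All numbers are illustrative.
def loss_gain(receipts: float, deliveries: float,
              opening_inventory: float, closing_inventory: float) -> float:
    """Residual of the balance; near zero when the system is healthy."""
    return receipts - deliveries - (closing_inventory - opening_inventory)

# 50 bbl unaccounted for out of 12,000 received -> worth investigating.
residual = loss_gain(receipts=12_000.0, deliveries=11_700.0,
                     opening_inventory=3_000.0, closing_inventory=3_250.0)
print(residual)  # 50.0
```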

Upgrading standards to today’s needs

The industry is aware of the challenges of making accurate measurements with antiquated or outdated systems, and is working hard to update its standards to reflect present needs and move fluid quantification into the 21st century. A number of API standards are being rewritten to reflect changes to empirical data, to the type and composition of petroleum feedstocks, and to the computing and metering technologies available.

New technologies have enabled the introduction of new standards that implement safer and more efficient ways to transport crude oil from production leases to processing facilities (FIG. 2). API Chapter 18.2, for example, was introduced in 2018 to allow for the use of radar-guided measurement technologies. A number of working groups, with measurement experts from service companies and operators across the industry, are rewriting several standards. API Chapter 12 (covering measurement tickets) and Chapter 20 (allocation measurement) are being updated, as is Chapter 21, which covers electronic flow measurement for gas and liquids.

FIG. 2. New standards implement safer and more efficient ways to transport crude oil to processing facilities.

One of the major upgrades to Chapter 21.2 will change how liquid inventories are recorded. Rather than recording liquid inventories on a batch basis, which might cover a duration of up to one month, the standard is being rewritten to put liquids on an hourly quantity transaction record (QTR), similar to comparable gas inventories. This change will allow users to reconcile data over smaller time intervals, which will improve measurement tracking and more accurately pinpoint the time when a measurement error (due to a transmitter failure or a metering problem, for example) occurred.
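
The benefit of hourly records can be sketched like this; the data, meter names and tolerance are illustrative:

```python
# Sketch: with hourly quantity transaction records, an imbalance between
# two meters on the same line can be pinned to the hour it began,
# instead of being smeared across a month-long batch ticket.
# Data and tolerance are illustrative.
def first_bad_hour(upstream, downstream, tolerance=5.0):
    """Return the index of the first hour whose meter-to-meter
    difference exceeds tolerance, or None if all hours agree."""
    for hour, (u, d) in enumerate(zip(upstream, downstream)):
        if abs(u - d) > tolerance:
            return hour
    return None

upstream_bbl   = [410.2, 408.9, 411.5, 409.8, 410.1]
downstream_bbl = [409.7, 409.3, 411.0, 395.2, 396.0]  # drift begins at hour 3

print(first_bad_hour(upstream_bbl, downstream_bbl))  # 3
```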

Modernizing measurements

The industry is doing its best to keep up with the many changes to regulations, supply, logistics and consumer needs; however, with limited time, money, dedicated labor and other resources, keeping up is an ongoing challenge.

Companies can take measures to minimize uncertainty in the measurements of today’s crude oils and processed products, even while using the standards available today. For example, the empirical data used to generate the correction tables contain an inherent temperature bias. Companies using these tables can correct for this bias by quantifying their fluids—running physical tests to gather their own empirical data of temperature and density. These data are then regressed to determine a coefficient of thermal expansion (or alpha) for each fluid. Table C in Chapter 11.1 requires that users quantify their fluids in this way to determine an alpha.

This alpha is then plugged directly into Table C, which uses a binary physical-property model to determine densities at flowing temperatures. To complete the process, the effects of pressure on density must be included in the calculations at saturation temperatures.
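
The alpha determination described above amounts to a least-squares fit of the lab data. The sketch below uses synthetic, perfectly linear data for illustration; the actual Table C procedure in Chapter 11.1 governs the real calculation:

```python
# Sketch of deriving a fluid-specific coefficient of thermal expansion
# (alpha) from lab temperature/density pairs, in the spirit of the
# Table C approach of quantifying one's own fluid. Synthetic data.
def fit_alpha(temps_f, densities):
    """Least-squares line rho = a + b*T, then alpha = -b / rho60."""
    n = len(temps_f)
    mt = sum(temps_f) / n
    md = sum(densities) / n
    b = sum((t - mt) * (d - md) for t, d in zip(temps_f, densities)) \
        / sum((t - mt) ** 2 for t in temps_f)
    a = md - b * mt
    rho60 = a + b * 60.0  # density extrapolated to the 60°F base
    return -b / rho60, rho60

# Synthetic measurements constructed with alpha = 0.0005 per °F:
temps = [40.0, 60.0, 80.0, 100.0]
dens = [858.5, 850.0, 841.5, 833.0]  # kg/m3, linear in T by construction
alpha, rho60 = fit_alpha(temps, dens)
print(round(rho60, 1))   # 850.0
print(round(alpha, 6))   # 0.0005
```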

Giving users the ability to plug their own alphas into Table C helps decrease the uncertainty of their overall measurements compared to using generalized tables. However, without access to systems that can automate the process, this is a complex operation that requires significant time and in-house knowledge to execute correctly.

One automated solution is an advanced software application that can be run on any modern-day flow computer. The software uses a three-dimensional (density, temperature and pressure) model that incorporates a simplified equilibrium vapor pressure equation to predict the Table E densities for light hydrocarbons. This model combines five different standards into a single iterative process, thereby decreasing measurement uncertainty.
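
Generically, this kind of density convergence can be sketched as a fixed-point iteration. The example below is temperature-only and deliberately simplified; it is not the vendor's model, and real procedures also fold in pressure and vapor-pressure effects:

```python
import math

# Sketch of the convergence problem: a density meter reads rho at line
# temperature, but CTL itself depends on the unknown base density rho60,
# so rho60 must be found iteratively. Simplified, temperature-only.
K0 = 341.0957  # generalized-crude K-constant, illustrative

def ctl(rho60: float, temp_f: float) -> float:
    a = K0 / rho60**2
    dt = temp_f - 60.0
    return math.exp(-a * dt * (1.0 + 0.8 * a * dt))

def base_density(rho_obs: float, temp_f: float, tol: float = 1e-9) -> float:
    """Fixed-point iteration: rho60 = rho_obs / CTL(rho60, T)."""
    rho60 = rho_obs  # initial guess: the observed density itself
    while True:
        nxt = rho_obs / ctl(rho60, temp_f)
        if abs(nxt - rho60) < tol:
            return nxt
        rho60 = nxt

# Round-trip check: start from a known rho60, project it to 100°F,
# then recover it from the "observed" line density.
rho60_true = 850.0
rho_line = rho60_true * ctl(rho60_true, 100.0)
print(round(base_density(rho_line, 100.0), 6))  # 850.0
```

The iteration converges quickly because CTL depends only weakly on the base density, which is why this style of convergence method suits real-time flow computers.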

This measurement system application was designed with the input of multiple companies that understand and implement best practices in fluid measurement within the industry. As a result, the application brings greater accuracy and accountability to fluid transfers and inventory management in a number of additional ways.

Users can record and analyze their data with much higher resolution than is possible from other systems, or is even required from industry standards. On the gas side, for example, the application allows companies to scrutinize their data in 15-min intervals, as opposed to the traditional 60 min. Liquid measurements can also be taken in much tighter time intervals. Furthermore, data can be imported from every type of measurement point—both static (tanks, railcars, ships and trucks) and dynamic (any kind of metering device).

Users can audit their measurements and identify any data quality issues or errors. The time and location of an error or data anomaly is recorded and easily retrieved, which allows the user to quickly identify the source of the problem—such as a failed transmitter or sampler—and make the necessary repairs or changes to address the issue.

Measurement errors are corrected by using the average of a measurement from the past several hours as a basis. Once the error is corrected, the application restates the data—including pressure, temperature, volume and mass anomalies—and creates a complete, SOX- and API-compliant audit trail. The numbers are closed out at the end of the month (or other appropriate reporting period) and sent to downstream accounting systems.

The application allows for a complete, system-wide fluid balance to accurately determine inventories by location and any type of product loss. Most other measurement applications cannot track fluid quantities as closely, deliver detailed fluid inventories or identify the location of potential problem areas, such as leaks, with the same level of scrutiny.

Takeaway

To ensure greater measurement accuracy for a wider range of today’s hydrocarbon fluids, the industry continues the important work of updating its measurement standards to align with the advanced processing technologies available. At the same time, software providers have continued to update their measurement application software.

One such advanced application combines all static and dynamic measurements, physical properties, flow calculations, data processing, auditing and verification into a single platform, thereby minimizing uncertainty in fluid measurements. With tools like this, users can confidently implement the standards to ensure unprecedented accuracy and accountability in every fluid transfer and inventory transaction. HP

LITERATURE CITED

  1. Squyres, M., “Navigating the petroleum measurement perfect storm,” Pipeline & Gas Journal, December 2014.
  2. ASTM, “Chapter 11.1: Standard guide for the use of the joint API and ASTM adjunct for temperature and pressure volume correction factors for generalized crude oils, refined products, and lubricating oils,” ASTM D1250–19, online: https://www.astm.org/Standards/D1250.htm
