
The Future of Data Management in the Oil Industry

Over the past five years, global economic fluctuations, the industrialization of the BRIC economies and the globalization of oil trade have been the key drivers of the oil industry's performance. Going forward, while the influence of these factors will remain strong, we believe that technology will play an increasingly indispensable role in the oil industry. This article focuses on how data management, a pivotal part of oil technology (and of technology in general), is transforming the oil industry.

From now until 2020, the oil industry is expected to grow to $5,300 billion in revenue, with an average annual growth rate of 4.4%.

Meanwhile, the oil data management market is forecast to grow from $6.08 billion in 2015 to $21.22 billion in 2020, an annual growth rate of 28.4%. In a simplistic world, it would be convenient to conclude that every 6% increase in investment in oil data management leads to a 1% increase in industry revenue, since the data management market's 28.4% growth rate is roughly 6.5 times the industry's 4.4%.

Here are four key forces behind data management that impact the performance of the oil industry:

Machine Learning

Oil mining preparation follows a sequence: geological exploration, reservoir estimation, mining design and secondary recovery. In the conventional model, mining design accounts for the majority of the cost. Traditionally in the design phase, every characterization data point of the reservoir (such as porosity and permeability) is acquired and analyzed. Based on this analysis, oil engineers build a numerical simulation of the reservoir on computers to forecast oil production.

After the simulation process, engineers use historical production data to revise the simulation model mathematically and to enhance the accuracy of the simulation. This process is known as history matching.
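At its core, history matching means tuning the simulation's parameters until its output reproduces observed production. As a rough illustration only, here is a minimal Python sketch that fits the decline rate of a simple exponential decline model, q(t) = q0·e^(−dt), to a handful of made-up monthly production figures (the model choice, data and grid-search approach are all illustrative assumptions, not an actual reservoir simulator):

```python
import math

# Illustrative historical production data: (month, barrels per day).
history = [(0, 1000.0), (6, 820.0), (12, 670.0), (18, 550.0)]

def simulate(q0, d, t):
    """Forecast the rate from an exponential decline model q(t) = q0 * e^(-d*t)."""
    return q0 * math.exp(-d * t)

def mismatch(q0, d):
    """Sum of squared errors between the simulated and observed production."""
    return sum((simulate(q0, d, t) - q) ** 2 for t, q in history)

def history_match(q0, d_candidates):
    """Pick the decline rate that best reproduces the observed history."""
    return min(d_candidates, key=lambda d: mismatch(q0, d))

d_grid = [i / 1000 for i in range(1, 101)]  # candidate decline rates 0.001..0.100
best_d = history_match(1000.0, d_grid)      # the history-matched parameter
```

Real history matching adjusts dozens of parameters in a full numerical model, but the principle is the same: minimize the mismatch between simulation and observation.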

Parallel to building the simulation model, well testing is conducted to acquire production-like test data. The reservoir simulation model is then compared and integrated with the test data to reach a finalized production model. The biggest problem with this design process is that both reservoir data analysis and oil well tests are enormous cost centers, burning millions of dollars each time a design plan is developed.

The goal of machine learning is to eliminate the unnecessary (yet costly) data analysis process and to establish a mining plan in a shorter amount of time with a much lower budget.

Machine learning enables cost savings by concentrating on historical reservoir characterization data and production data, both of which are widely available to oil companies. Instead of acquiring new test data, today's data management uses existing characterization data from thousands of wells and relevant historical production data to train a production model. The more historical data that is fed into the model, the more accurate the oil production forecast becomes. The accuracy of the model is then validated against additional historical data.

Finally, the characterization data of the targeted reservoir is processed into the model in order to obtain a production prediction.
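To make the idea concrete, the sketch below predicts a target reservoir's production from historical wells using nearest-neighbour regression, one of the simplest machine-learning techniques: find the historical wells whose characterization data most resembles the target's, and average their production. All well data, feature names and the scaling factor are invented for illustration:

```python
# Illustrative training set: (porosity, permeability_mD) -> production (bbl/day).
wells = [
    ((0.12, 50.0), 300.0),
    ((0.18, 120.0), 640.0),
    ((0.22, 200.0), 900.0),
    ((0.15, 80.0), 450.0),
    ((0.25, 260.0), 1100.0),
]

def distance(a, b):
    """Euclidean distance in characterization space.

    Permeability spans a much wider numeric range than porosity,
    so it is scaled down before the features are combined.
    """
    return ((a[0] - b[0]) ** 2 + ((a[1] - b[1]) / 1000.0) ** 2) ** 0.5

def predict_production(target, k=3):
    """Average the production of the k most similar historical wells."""
    nearest = sorted(wells, key=lambda w: distance(w[0], target))[:k]
    return sum(prod for _, prod in nearest) / k

forecast = predict_production((0.20, 150.0))  # target reservoir's characterization
```

In practice companies train far richer models on thousands of wells and dozens of characterization variables, but the workflow is the one described above: train on history, validate on held-out history, then feed in the target reservoir's data.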

The use of machine learning significantly reduces costs from production preparation while maintaining planning accuracy. As machine learning is gaining popularity in the oil industry, more mathematicians and programmers are entering the industry to fill the growing demand.

Big Data Analytics

Because of the vast amount of unstructured data throughout the oil exploration process, the big players in the oil industry are pouring money into Data Analytics.  Oil companies have a high volume of data in various forms – geologic data, reservoir characterization data, test data and production data. Today’s oil industry focuses on mining this existing data to discover meaningful patterns in production.

For example, two reservoirs in South Dakota share similar geologic properties and characteristics. The drilling methods of the existing wells on the two fields are also the same. In theory, the production volumes of the two reservoirs should be similar, but the actual production outcomes are often remarkably different.

When such a difference occurs, oil engineers compare the two sets of data closely to identify variables that cause the production differences. Advanced data mining techniques allow engineers to discover correlations between oil production and variables that have never been previously considered. These new discoveries may lead to better production planning and greater profits.
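A basic version of this variable hunt is a correlation screen: compute, for each candidate variable, how strongly it tracks production across wells. The sketch below does this with a plain Pearson correlation; the two candidate variables and all the numbers are hypothetical:

```python
def pearson(xs, ys):
    """Pearson correlation coefficient between two equally sized samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Illustrative per-well records: production alongside two candidate variables.
production  = [300, 640, 900, 450, 1100]   # bbl/day
frac_stages = [8, 14, 20, 10, 24]          # fracturing stages (hypothetical)
crew_size   = [12, 9, 11, 13, 10]          # drilling crew size (hypothetical)

corr_stages = pearson(frac_stages, production)  # strong positive correlation
corr_crew   = pearson(crew_size, production)    # weak correlation
```

Advanced data mining goes well beyond pairwise correlation, but a screen like this is often the first pass that surfaces a variable nobody had considered.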

But data mining is only a part of the Big Data plan: the future lies in oil field automation. With fiber optic sensors, key oil field data can be sent to computers for real-time analytics.

A sophisticated data model that has been continuously trained with new data mining discoveries would process the real-time data feeds and turn them into adjustments of production plans in a matter of seconds. The entire production preparation process would be streamlined and automated.
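One small building block of such a real-time pipeline is deviation detection on the incoming sensor feed: compare each new reading against a rolling baseline and flag anything that drifts too far. The sketch below shows that piece only, with an invented pressure feed and threshold; in a production system the flagged events would drive automated plan adjustments rather than a simple list:

```python
from collections import deque

def monitor(readings, window=5, threshold=0.15):
    """Flag readings that deviate from the rolling average by more than threshold."""
    recent = deque(maxlen=window)
    alerts = []
    for i, value in enumerate(readings):
        if len(recent) == window:
            baseline = sum(recent) / window
            if abs(value - baseline) / baseline > threshold:
                # In a real system this would trigger a plan adjustment.
                alerts.append((i, value))
        recent.append(value)
    return alerts

# Illustrative downhole-pressure feed (psi) with one sudden drop at index 7.
feed = [2000, 2010, 1995, 2005, 2000, 1998, 2002, 1600, 2001, 1999]
alerts = monitor(feed)
```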

In the past, oil engineers could only monitor one well at a time on site. With oil field automation technologies, they can now manage thousands of wells from their corporate offices.

Chevron’s “i-field” (which stands for intelligent field) is the pioneer in production automation. Chevron has implemented thousands of sensors across its oil fields to monitor operations, respond to production changes and enable real-time analytics. The i-field “is already saving several million dollars in [Chevron’s] operating costs,” according to U.S. News.

Mobility

In a recent survey by Accenture and Microsoft, 90% of oil industry respondents believe that “more mobile technologies in the field would increase value.”

In the past, turning freshly collected data into decisions cost a fortune. A great deal of time and money was wasted as engineers traveled between the oil field and their corporate offices, because data was shared only among networked computers on premises. Mobility bridges the gap between data and decision making.

Mobile applications allow supervisors to use production data on demand. With mobile applications, data is now presented on remote devices and shared with key oil field personnel, usually in the form of dashboards. With just a few glances at their smart devices, supervisors can be aware of the current production status and any changes. As a result, key personnel can stay where they are most needed and still make nimble decisions.

Production engineers can also use mobile devices to monitor assets. A key advantage of mobility is location-awareness. Using mobile applications helps production engineers understand and quickly locate problem areas on assets. This advantage cuts down reaction time to urgent asset issues while improving effectiveness of asset management.
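On the backend, a mobile dashboard like this typically receives a condensed, location-aware summary of the field rather than raw data. The sketch below assembles such a payload from hypothetical well-status records (all ids, coordinates and field names are invented; real systems would add authentication, units metadata and much more):

```python
import json

# Hypothetical well-status records as a field data service might assemble them.
wells = [
    {"id": "SD-101", "lat": 44.37, "lon": -103.73, "rate_bpd": 640, "status": "ok"},
    {"id": "SD-102", "lat": 44.41, "lon": -103.69, "rate_bpd": 0, "status": "down"},
]

def dashboard_summary(wells):
    """Condense field status into the figures a mobile dashboard shows first."""
    return {
        "total_rate_bpd": sum(w["rate_bpd"] for w in wells),
        # Location-aware mobile clients can map these ids to pins on a map,
        # helping engineers quickly locate problem areas on assets.
        "wells_down": [w["id"] for w in wells if w["status"] != "ok"],
        "wells": wells,
    }

summary = dashboard_summary(wells)
payload = json.dumps(summary)  # shipped to the mobile app over the network
```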

Cognitive Computing

Cognitive computing is an untapped goldmine for the oil industry, with very promising potential. It uses machine learning and cognitive recognition algorithms to train computers to process information the way human brains do. Cognitive computers have the power to learn, understand and visualize human communications.

Natural language recognition and image recognition are two vital areas in cognitive computing. As historical information is key to the success of oil exploration, cognitive computing provides on-demand analysis of past geologic reports, drilling logs and seismic images, allowing decisions to be made quickly and with reference to past experience. Meanwhile, freshly published oil reports and recent geographic changes that are available online can also be capitalized on through cognitive recognition.
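The simplest ancestor of that on-demand analysis is keyword retrieval over a text archive: rank past reports by how often they mention the terms in an engineer's query. The sketch below shows that baseline with invented log snippets; genuine cognitive systems use far richer language models, but the retrieval idea is the same:

```python
# Illustrative archive of drilling-log snippets (all text is made up).
reports = {
    "well_A_log": "high porosity sandstone, lost circulation at 2300 ft",
    "well_B_log": "tight shale, minor gas shows, steady penetration rate",
    "well_C_log": "lost circulation at 900 ft and again lost circulation at 1400 ft",
}

def search(archive, query):
    """Rank archived reports by how often they mention the query terms."""
    terms = query.lower().split()
    scores = {
        name: sum(text.lower().count(t) for t in terms)
        for name, text in archive.items()
    }
    return sorted(scores, key=scores.get, reverse=True)

ranking = search(reports, "lost circulation")  # most relevant report first
```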

Another use of cognitive computing comes from its potential to facilitate cross-domain, cross-functional collaborations. Oil exploration is a complex science that involves expertise in geology, geophysics, engineering, finance and many other areas. Cognitive computing can be used to replace the inefficient meetings among experts of different areas and decrease the time that is wasted as experts try to understand each other.

Cognitive computing is still a fresh concept to the oil industry, but decision makers have already placed it at the top of their wish lists. Repsol, the largest Spanish oil operator, recently launched a research collaboration with IBM to study the application of cognitive technologies in oil. Leveraging IBM's Cognitive Environments Laboratory, Repsol plans to build its competitive edge on language processing and a next-generation cross-domain interaction platform.

Article written by Mingbo Gong and Tongxin Zhang