As the oil and gas industry recovers from the downturn, the surge of new cloud technologies, analytics, 'Internet of Things' and machine learning tools entering the business promises to digitally transform old business processes and drive big improvements in efficiency. This article explores how data management services are evolving to play a vital role in this transformation of our working lives and the way we interact with complex information. How will traditional data silos evolve to support new analytical toolkits? Will data management departments progress to become more agile and efficient? How can we empower users to access relevant, high quality data more effectively? And can we deliver lower-cost, open, integrated data-centric working environments, fit for the new generation of geoscientists, engineers, data scientists and data managers?
A Brief History of Geoscience Data Management
Data has always been at the heart of the oil and gas business. We conduct geophysical and geological surveys that allow us to construct pictures of the subsurface when we explore for reserves, we create and capture data when we plan, engineer and develop fields, and we continuously acquire data as we produce from these assets. The timeline of technology and services that data managers use to manage this hugely diverse collection of data charts the evolution of the tools used by geoscientists and engineers through our industry’s historical journey.
In the analogue era, data management involved managing large physical libraries and warehouses of prints, films and analogue tapes. With the advent of digital technologies, we began to extend these libraries to include digitally formatted tapes for processing on mainframes and minicomputers. In the workstation era, we started to work with large volumes of online data and developed disk-based formats for efficient interactive data access. We built highly structured databases, and we used advanced visualisation, modelling and simulation systems to accelerate interpretation and planning.
In the desktop application era that has dominated data management for the last 10 years, we saw a move away from proprietary workstation technology towards desktop platforms that took advantage of huge leaps in compute, memory and video capacity of consumer PCs, to deliver a wide range of functionality from seismic to economics. In parallel with the rise of the desktop application suite, our industry also started to build new, large-scale, centralised computing services for data management and processing. In the data management domain, ‘big iron’ systems were built to manage regional-scale and corporate datasets that held all the data for a particular discipline.
But unlike the transition between geological epochs, the evolution from one data management era to another does not typically involve a convenient mass-extinction event. In the world of data management, old species of data live on rather than joining the fossil record. It is only the data managers that become petrified.
The Digital Transformation Deluge
In the ’80s and ’90s, we might have reasonably claimed that oil and gas was pushing technical computing boundaries and was at the leading edge of visualisation, big data and large-scale analytical processing. However, the developments in computing over the last 20 years have reduced oil and gas data processing innovation to a speck beside the innovators that now dominate our highly connected, globalised digital society. New businesses are now creating a wave of ‘Digital Transformation’ that presents a powerful challenge to orthodox business models and stagnant enterprise IT platforms.
In the new digitally transformed economy, conventional wisdom does not provide much of a guide to what tomorrow’s organisation will look like. The world’s largest retailer owns no inventory, the world’s largest taxi company owns no fleet, the largest accommodation provider owns no real-estate and the most popular media owner creates no content. Some might argue that these are transactional businesses, and are therefore poor analogies for the ‘special’ world of upstream data management, but digital transformation is less about what transformed businesses have in common, and more about simplifying processes and cutting out intermediaries as a way of improving productivity.
The traditional oil and gas technologist is steeped in the culture of enterprise computing, and this new wave of unfamiliar technology has been resisted for some time. But the levee holding back the surge has finally broken, and we are faced with a deluge of innovation entering our industry, the origins of which lie in social media, online advertising and the gaming industries. Rather than being overwhelmed, can we embrace this change, rally together and deliver better data management products and digital services to our consumers?
Accelerating Cloud Technology Adoption
The notion of using shared, public infrastructure has sat uncomfortably with organisations that are used to playing their cards very close to their chests, but in the current low-cost operating environment, the economic benefits of paying for IT infrastructure, data storage and applications ‘as a service’ are overwhelming. Companies have been steadily transitioning through internal and hybrid clouds, and more are now committing to a ‘Cloud First’ infrastructure policy for all of their IT needs.
The traditional arguments against using cloud services for petrotechnical data have included hidden and long-term costs, security concerns, the volumes of data involved and loss of control of data and systems. These arguments have become weaker as more critical, non-technical enterprise IT systems are migrated to the cloud. After all, if our most sensitive commercial data can be entrusted to the cloud (e.g. email, ERP, documents etc.), then maybe it is safe enough for bulk seismic trace storage. In general, the exploration and production departments in oil companies are starting to lose their ‘special’ status in IT terms, and are increasingly expected to adopt cost-effective systems.
There are still hard barriers to cloud adoption in certain business scenarios. Many countries have laws that prevent data from leaving their geographic borders. On the other hand, we also see data regulations evolving to become more aligned with modern digital infrastructure as regulators increasingly recognise that their primary challenge is not to protect their data, but to get as much of it in front of as many eyeballs as possible to stimulate inward investment.
Improving Data Services Delivery
Digitally transformed data management services delivery can both improve service quality and reduce costs. Shared online workspaces enable teams to collaborate remotely, across geographic, organisational and time-zone boundaries. Online ‘data factories’ can be created to extend the capabilities of a small core team. Online workflows and tools can dramatically improve the way teams control data quality, data loading and the governance of high-value datasets. Raw data placed on the cloud can be accessed by offshore or outsourced workers, with processed results distributed, reviewed and approved, interactively online.
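The automated quality gates described above can be sketched very simply. The record fields and validation rules below are hypothetical illustrations; a production ‘data factory’ would apply far richer, discipline-specific rules.

```python
# Minimal sketch of an automated data-quality gate for incoming well records.
# Field names and rules are hypothetical, for illustration only.

def validate_well_record(record):
    """Return a list of quality issues found in a single well-header record."""
    issues = []
    for field in ("well_name", "latitude", "longitude", "spud_date"):
        if not record.get(field):
            issues.append(f"missing {field}")
    lat = record.get("latitude")
    lon = record.get("longitude")
    if lat is not None and not -90 <= lat <= 90:
        issues.append("latitude out of range")
    if lon is not None and not -180 <= lon <= 180:
        issues.append("longitude out of range")
    return issues

def triage(records):
    """Split a batch into records ready to load and records needing review."""
    ready, review = [], []
    for rec in records:
        (review if validate_well_record(rec) else ready).append(rec)
    return ready, review
```

Running every incoming batch through a gate like this, with the review queue handled interactively online, is one way a small core team can govern a much larger flow of data.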
This globally-distributed, collaborative working will become increasingly important given the demographic changes our industry has experienced in recent years – the ‘big crew change’ is well underway. Experienced staff permanently lost to the industry need to be replaced with less experienced staff in lower-cost locations, working with easy-to-use, automated data processing tools. The same environment can be used to re-engage experienced staff on a part-time or consultative basis.
The concept of a collaborative online workspace can be extended to full geoscience and engineering projects, from simple processing and interpretation through to field development planning. Using modern cloud infrastructure from major providers, it is now possible to deliver full interactive working environments as a cloud service, with legacy desktop applications hosted alongside cloud-native services. Creating virtual professional service teams in the cloud will improve the utilisation of global geoscience and engineering teams in the same way that a cloud-hosted data factory improves the utilisation of data management resources.
Rise of the Data Scientist
Traditional petrotechnical analysis revolves around the creation of physics-based predictive models. These are developed and executed by geoscience professionals with a deep understanding of geology, geophysics, engineering, reservoir dynamics, production technology and economics, among other things. They are supported by data managers who ensure the availability of high quality data that powers the decision making process.
In regions where there is a high density of data from large-scale drilling and completion operations, significant money – and faith – is now being invested in advanced analytical techniques such as machine learning, to augment or replace decisions traditionally made exclusively by geoscientists and engineers. Some companies now use algorithms to define optimal drilling locations, using automated or semi-automated systems that deliver results on much shorter cycle times compared to traditional methods.
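As a toy illustration of the pattern-based prediction described above, the sketch below estimates a proposed well’s initial production from its nearest existing neighbours. The well data and the single distance feature are hypothetical; real systems combine many more features (geology, completion design, pressure data) and far more sophisticated models.

```python
# Toy sketch of pattern-based prediction in densely drilled acreage:
# estimate a candidate location's initial production as the average of
# the k nearest existing wells. Purely illustrative; not a real workflow.
import math

def knn_predict_ip(candidate_xy, wells, k=3):
    """Average the initial production of the k wells closest to candidate_xy.

    wells: list of (x, y, initial_production) tuples.
    """
    by_distance = sorted(
        wells, key=lambda w: math.dist(candidate_xy, (w[0], w[1]))
    )
    nearest = by_distance[:k]
    return sum(w[2] for w in nearest) / len(nearest)
```

Even this crude nearest-neighbour estimate hints at why data density matters: the more validated wells in the neighbourhood, the more stable the prediction becomes.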
To be successful and meaningful, these algorithms require vast volumes of data – much more than traditional human-driven workflows require. These new methodologies introduce a new role to the traditional decision-making process: the Data Scientist. The data scientist is a decision maker, in the same way that a geoscientist is a decision maker. However, he or she may not be a domain specialist. The data scientist who is helping you discover a pattern in your data to optimally place your next well could be equally qualified to work for a client who wants to know the ideal number of ridges to place on the sole of their new running shoe.
That might sound far-fetched, but there can be no doubt that many companies believe that they can generate fresh insight, reduce decision cycle times and steal a march on their competition by automating the search for patterns and relationships in their data. In order for the data scientist to build effective models, they still need to collaborate with domain specialists to understand the features of the domain, as well as work closely with data managers to get access to high-quality, certified data.
The data manager has a critical role to play as a supplier of data for analytics projects. Just as they are a crucial part of the governance process for the traditional quality-controlled delivery of project and corporate data to applications, they must now also track and manage petrotechnical data as it moves into and out of analytical projects. If analytics projects are not connected with evergreen, validated data sources, the results of those analyses cannot be trusted by the business, and we will once again be burdened with the high cost and increased risk of managing and working with duplicated, untraceable and untrusted data.
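The lineage tracking this implies can be sketched as a simple hand-off log: fingerprint each dataset as it enters an analytics project, so any result can later be traced back to the exact, validated source version. All names here are illustrative assumptions, not a real system.

```python
# Minimal sketch of lineage tracking for data entering analytics projects.
# Dataset and project names are hypothetical illustrations.
import hashlib
from datetime import datetime, timezone

def fingerprint(data: bytes) -> str:
    """Content hash identifying this exact version of a dataset."""
    return hashlib.sha256(data).hexdigest()

class LineageLog:
    """Records which dataset versions were handed to which project."""

    def __init__(self):
        self.entries = []

    def record_handoff(self, dataset_name, data, project):
        entry = {
            "dataset": dataset_name,
            "sha256": fingerprint(data),
            "project": project,
            "timestamp": datetime.now(timezone.utc).isoformat(),
        }
        self.entries.append(entry)
        return entry

    def provenance_of(self, sha256):
        """Trace a result back to the hand-offs that used this data version."""
        return [e for e in self.entries if e["sha256"] == sha256]
```

If an analytical result cannot be matched back to a logged fingerprint, it was built on untracked data – exactly the duplicated, untraceable situation the business needs to avoid.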
Data Remains the Key
These technologies are digitally transforming the way we work and interact with our world at every level. Oil companies that ride this wave will significantly increase the current productivity of their knowledge workers, optimise business processes and reduce operational costs in a way that is not possible through incremental change. But whilst technology, people and processes are changing rapidly, we can take some comfort in the fact that data will still remain at the heart of everything we do.