Effective data management is the key to correct and timely decision making. It provides the right information, to the right people, in the right place, at the right time, delivering the best outcomes based upon true insight. If the mistakes of the past, such as poor drilling decisions or incorrect reserves estimates, can be avoided, the potential value of future opportunities can be assessed more accurately.
However, implementing consistent data management practices in an organisation is complex and challenging due to the sheer volume of data, the mixed vendor environment, and the multi-disciplinary nature and dispersed locations of data users and keepers. A structured but flexible, vendor-neutral solution must be developed, based upon three fundamental processes: data discovery, organisation, and preservation.
The vastness of the data and the tendency of some users and applications to store data in diverse places can make it difficult to obtain a complete picture of an asset. Data can exist on user workstations, external hard drives and dedicated file stores. Discovering and understanding exactly what is available can be achieved by profiling file systems with a fine level of detail and application awareness. Utilising a combination of data points from applications, projects and file contents will enable inter-related data to be captured and correlated. This may also identify previously unknown data that can contribute to a better valuation of the asset. It is important that factors such as the frequency of data access, duplication and the relationship between dispersed data sets are considered within the discovery process.
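The profiling step described above can be sketched in a few lines. This is a minimal illustration, not a product implementation: it assumes plain file stores reachable from one machine, and uses content hashing to group byte-identical duplicates while recording the size, extension and last-access time that feed the frequency and duplication analysis.

```python
import hashlib
import os
from collections import defaultdict


def profile_tree(root):
    """Walk a file store and collect per-file metadata for discovery.

    Returns a dict keyed by content hash, so byte-identical duplicates
    scattered across directories end up grouped together.
    """
    by_hash = defaultdict(list)
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            stat = os.stat(path)
            # Whole-file read keeps the sketch short; chunked hashing
            # would be needed for seismic-scale files.
            with open(path, "rb") as f:
                digest = hashlib.sha256(f.read()).hexdigest()
            by_hash[digest].append({
                "path": path,
                "size": stat.st_size,
                "last_access": stat.st_atime,
                "extension": os.path.splitext(name)[1].lower(),
            })
    return by_hash


def duplicates(profile):
    """Content hashes that appear at more than one path."""
    return {h: entries for h, entries in profile.items() if len(entries) > 1}
```

In practice the "application awareness" the text calls for would extend this with format-specific readers that pull project and well identifiers out of file contents, so inter-related data sets can be correlated rather than merely listed.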
Efficiently structured data sets improve the management of assets by increasing the speed of access, improving accessibility and ensuring that complete projects can be relocated, perhaps for a divestment or portfolio re-organisation. When the full spectrum of data is identified and understood, it can then be organised appropriately. This should follow a logical structure that mirrors the taxonomy and processes of the business.
When performing any restructuring it is essential that the referential integrity of the data sets is maintained, capturing references to external data sources where possible. A highly effective approach is intelligent integration with the very applications that manage the data.
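As a minimal sketch of relocation that preserves referential integrity, the example below assumes a simplified case where a project's external references are stored as absolute paths in plain-text configuration files; it copies the project tree and rewrites those paths to the new location. Vendor binary formats would instead need the managing application's own interfaces, as the text notes.

```python
import shutil
from pathlib import Path


def relocate_project(src_root, dst_root, reference_suffixes=(".txt", ".cfg")):
    """Copy a project tree to a new location and rewrite absolute path
    references inside plain-text project files so they still resolve.

    Hypothetical sketch: real projects managed by applications require
    that application's API rather than naive text substitution.
    """
    src_root, dst_root = Path(src_root), Path(dst_root)
    shutil.copytree(src_root, dst_root)
    for path in dst_root.rglob("*"):
        if path.is_file() and path.suffix in reference_suffixes:
            text = path.read_text()
            if str(src_root) in text:
                # Repoint internal references at the new root so the
                # relocated project remains self-consistent.
                path.write_text(text.replace(str(src_root), str(dst_root)))
```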
Preserve and Retrieve
Asset data, such as that captured during a seismic survey, collected from the well site or interpreted, doesn’t have a set expiry date beyond which it is no longer useful. Indeed, data can evolve, changing hands and value many times, with factors such as fluctuations in oil price, regulation, compliance, acquisition or divestment all affecting its currency and relevance. Exactly what to preserve and for how long is a key business decision. Keeping everything is costly, but not keeping enough may be much more costly. The challenge is to define and keep the valuable data and enable decision makers to retrieve it quickly, efficiently and effectively.
Most asset data needs to be kept for a long time. Regulatory requirements can mandate retention of data for life of field plus seven years. Reserves estimation data should be reproducible on the demand of auditors, often at alarmingly short notice. However, it is not always feasible to store all data online. Free space is at a premium, yet typically 80% of network-attached storage has not been accessed for over six months. A structured archiving methodology can be adopted to move infrequently used but still valuable data to more cost-effective disk or tape storage. Much of the data will be inter-related, yet it can securely reside across multiple file stores. Where data is managed by applications, a vendor-neutral approach will ensure that consistent, detailed information about the data, its ‘metadata’, is captured and complete referential integrity maintained. To allow everyone, not just data owners, to locate and retrieve quality data with good provenance quickly, it is crucial to catalogue the data logically and to capture high quality metadata during this process. Metadata can be captured manually, including the wisdom that exists only in people’s heads, and automatically from the managing applications, complemented by a process of content indexing.
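A simple illustration of such an archiving pass follows. The idle threshold, directory layout and JSON catalogue are illustrative assumptions standing in for a real policy engine and metadata store: files untouched for longer than the threshold are moved to a cheaper tier, and each move is recorded so the data can be located and retrieved later.

```python
import json
import os
import shutil
import time

SIX_MONTHS = 182 * 24 * 3600  # illustrative idle threshold, in seconds


def archive_stale_files(store, archive, catalogue_path,
                        max_idle=SIX_MONTHS, now=None):
    """Move files idle for longer than max_idle from `store` to `archive`,
    recording each move in a JSON catalogue so it remains retrievable.
    """
    now = time.time() if now is None else now
    catalogue = []
    for dirpath, _dirnames, filenames in os.walk(store):
        for name in filenames:
            src = os.path.join(dirpath, name)
            last_access = os.stat(src).st_atime
            if now - last_access > max_idle:
                # Preserve the relative layout on the archive tier so
                # inter-related files keep their structure.
                rel = os.path.relpath(src, store)
                dst = os.path.join(archive, rel)
                os.makedirs(os.path.dirname(dst), exist_ok=True)
                shutil.move(src, dst)
                catalogue.append({
                    "original": src,
                    "archived": dst,
                    "last_access": last_access,
                })
    with open(catalogue_path, "w") as f:
        json.dump(catalogue, f, indent=2)
    return catalogue
```

A production system would enrich each catalogue entry with application-sourced metadata and content-index terms, so that retrieval does not depend on remembering the original path.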
These sound data management practices, enabled by specific software and process solutions, drive long-term oil and gas asset values. They deliver clarity and control of information along with knowledge of the past and present while being ready for the future. Critical decision making is enhanced, and the enterprise can react more quickly to change and compete more effectively in the marketplace.