The fundamental idea in this article is that oil and gas technology has progressed to the point where we can – or perhaps should – stop using the word unconventional and simply recognize that oil and gas can be found in many places and in various qualities, and that more or less all of them can in principle be developed and produced. The question is then simply “at what cost?”; or more precisely, “what is the margin per barrel or per million cubic feet, and how does it compare with alternative investment opportunities?”
Reflecting first on technology: Barclays Capital publishes an annual review of the oil and gas industry, including a summary of its survey of the ‘Most Important Technologies’, based on the percentage responses received for each of 12 candidate technologies. As in 2009 and 2010, the top three for 2011 were Fracturing/Stimulation, Horizontal Drilling and 3D/4D Seismic, accumulating between them more than 70% of the responses. What is more, the same technologies have dominated this survey for all 12 years for which data has been published, invariably accumulating more than 60% of the responses between them. Over this period, the only other technology to win more than 10% of the responses has been Directional Drilling, garnering 11% four times.
One way to look at these results is to say that responses may be dominated by professionals working in North America and that the favoured technologies will therefore simply reflect what is happening there, in particular the pursuit of shale gas, shale oil, ‘tight’ gas, and coal bed methane, as domestic sources of conventional hydrocarbons begin to diminish.
But there is another way to look at it. The survey-leading technologies are those that offer the means to identify the presence of hydrocarbons in ‘tougher’ reservoirs and then extract them. Put another way, wherever there is a prolific source rock, our industry has developed the technical capacity to move away from conventional reservoirs with good porosity/permeability characteristics, and extract petroleum wherever it is reservoired – whether still in the source rock, in ‘tight’ sands, in fractured basement and so on. The North American industry is leading this charge.
Shale Gas and Shale Oil
Much has been made of the shale gas revolution in the USA, with production from such reservoirs rising from under 1 Bcfpd in 2003, when fracturing/stimulation, horizontal drilling and 3D/4D seismic first began to be applied in combination, to nearly 20 Bcfpd by mid-2011.
However, despite this success, much shale gas production is not commercial at current gas prices. Specifically, nearly all shale gas plays require gas prices in excess of $4 per million BTU (MMBtu), mostly well in excess, towards $8/MMBtu. The benchmark ‘Henry Hub’ US gas price dropped below $4/MMBtu in mid-2011 and is projected to remain below $3/MMBtu this year.
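The squeeze can be put in simple arithmetic: margin per MMBtu is realized price minus full-cycle breakeven cost. A minimal sketch using the article's round numbers (the figures are illustrative, not data for any particular play):

```python
# Margin per MMBtu = realized gas price - full-cycle breakeven cost.
# All figures are the article's round numbers, used purely for illustration.

breakevens = {"best shale gas plays": 4.0, "typical shale gas plays": 8.0}  # $/MMBtu
prices = {"Henry Hub, mid-2011": 4.0, "Henry Hub, projected": 3.0}          # $/MMBtu

for play, cost in breakevens.items():
    for period, price in prices.items():
        margin = price - cost
        print(f"{play} vs {period}: margin ${margin:+.2f}/MMBtu")
```

Every combination yields a zero or negative margin, which is the article's point: at these prices the production is uncommercial before any comparison with alternative investments is even attempted.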
A second difficulty is that shale gas resources cannot be converted to reserves by the methodology used for conventional fields, where it is acceptable to carry out reasonable appraisal to define the static reservoir and estimate the in-place volume, undertake a moderate amount of testing and flow assurance work, and then apply an often conservative recovery factor based on analogue fields. In shale gas reservoirs, recoverable gas has to be estimated instead from the performance of existing producing wells, often over a relatively short production history, resulting in a wide range of possible ultimate production volumes and therefore of asset reserves.
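The width of that range can be illustrated with the Arps hyperbolic decline equation, a standard form for extrapolating well performance. In the sketch below the initial rate, initial decline and candidate b-factors are assumptions chosen for illustration, not data from any real well; the point is only that decline models which fit a short early history almost equally well diverge badly over a full well life:

```python
# Arps hyperbolic decline: q(t) = qi / (1 + b*Di*t)**(1/b)
# qi, Di and the candidate b-factors below are illustrative assumptions,
# not data from any particular shale gas well.

def arps_rate(t, qi, di, b):
    """Daily rate (MMcf/d) after t years of hyperbolic decline."""
    return qi / (1.0 + b * di * t) ** (1.0 / b)

def cumulative(qi, di, b, years, steps_per_year=365):
    """Trapezoidal integration of the rate curve; returns MMcf."""
    dt = 1.0 / steps_per_year
    total, t = 0.0, 0.0
    for _ in range(int(years * steps_per_year)):
        # rate is per day, so scale each year-fraction step by 365 days
        total += 0.5 * (arps_rate(t, qi, di, b) + arps_rate(t + dt, qi, di, b)) * dt * 365
        t += dt
    return total

qi, di = 5.0, 0.9         # 5 MMcf/d initial rate, 90%/yr nominal initial decline
for b in (0.5, 1.0, 1.5): # b-factors that early production data may not discriminate
    eur = cumulative(qi, di, b, years=30) / 1000.0  # Bcf over 30 years
    print(f"b = {b}: 30-year cumulative ~ {eur:.1f} Bcf")
```

For small t the rate is nearly independent of the b-factor, so a year or two of history constrains it poorly; yet the 30-year cumulatives for these three fits differ by roughly a factor of three. That is the reserves problem in miniature.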
One could take the view that the best cash flow from shale gas has, in fact, come to oil field service companies that supply fracturing/stimulation, horizontal drilling and 3D/4D seismic, and to the US ‘resource play’ companies that have sold or farmed-down their shale gas assets. For more detailed reviews of the issues surrounding shale gas, I refer you to the work of the respected petroleum geologist Arthur Berman(1)(2).
It is not surprising, therefore, that having learned how to apply the key technologies, these US ‘resource play’ companies are switching their attention to shale oil, in most cases in basins which have a long history of conventional oil exploration and are now in decline. The economics of shale oil are generally better than those for shale gas because of the linkage to global oil prices. The same challenge with the estimation of reserves remains.
Other countries and companies have not been slow to board the shale gas ‘train’. For example, Repsol YPF has raised its estimate of shale oil and gas resources in Argentina’s Vaca Muerta formation to 22.81 Bboe, quoting an external audit that shows that this formation holds gross prospective oil, condensate and gas resources of 21.17 Bboe in an area covering 8,071 km².
Turning briefly to western Europe, the pursuit of shale gas is encouraged by European gas prices but challenged by the limited availability of fracturing/stimulation equipment. Where exploitation has been attempted, for example in Poland, the rocks have disappointed and success is still awaited. Perhaps shale oil, related to one of the region’s major ‘oily’ source rocks, will be more fruitful.
Waxy or Heavy Oil
‘Waxy’ crude, which is derived from lacustrine source rocks, is in production throughout South East Asia and in Rajasthan in India, and development planning is under way in the Albertine Rift in Uganda. Flow assurance is the critical issue, with the risk of the crude oil solidifying in flow equipment, for example when exposed to low seabed temperatures. The technology to solve these problems – special chemical additives, down-hole pumps, heated pipelines – is all tried and tested, and these projects are or will be economic.
However, recent discoveries in the pre-salt offshore Brazil and in the North Falklands Basin are all sourced from lacustrine source rocks, which invariably yield crudes with a high wax content. Whilst Petrobras, operator of the reportedly huge pre-salt discoveries, has plans to deal with this issue, the Falklands discovery is much smaller and lies in a relatively hostile environment. The operator has mentioned that the crude oil is ‘waxy’ but has yet to publish any analytical data.
‘Heavy’ oil is the result of a poor seal, which allows the light components to escape or be consumed by bacteria, leaving a poorly-flowing viscous residue. In the North Sea it is in production at Grane in Norway and at Captain and Alba, both on the UK Continental Shelf (UKCS); elsewhere on the UKCS, development planning is under way at Mariner, and appraisal at Kraken and Bentley, the last of which may prove difficult owing to the viscosity of the oil.
The new challenge on the UKCS is to develop a relatively modest number of remaining discoveries where the viscosity is greater than 5 centipoise (cP) and the API gravity below 22°. This has already begun, with for example the development of Captain (88 cP), Gannet E (20 cP) and Clair (up to 20 cP), and development planning is well under way for Mariner (up to 540 cP). Bressay (up to 1,000 cP) has been studied extensively but no development plan has been forthcoming. Excluding extensions and prospects, in 2006 there were 19 UK-sector North Sea heavy oil ‘fields’, ranging in size up to around a billion barrels of oil-in-place, although all but three of these are below 500 MMbo in place. If the extensions and prospects are included, there are in total around 10 billion barrels of heavy oil in place on the UKCS.
The fields are located in water depths of around 100 m, with the reservoirs themselves at depths of 600–1,800 m subsea. Although suitable infrastructure is expensive, the technology exists: assisted recovery is likely to be required to obtain realistic rates (1,000–10,000 bpd) and recovery factors, even with long horizontal wells. This differs from most UK-sector experience to date, and technology must be adapted to meet the challenge. In addition, the oil may have to be sold at a discount to Brent.
There are a variety of definitions of heavy oil, and in a world context the UKCS viscosities are relatively modest. One can nevertheless perceive a sensible ‘queue’, with the lower-viscosity fields being developed first. The maximum viscosity in the UK database is around 2,000 cP, small compared with the Orinoco extra-heavy reserves in Venezuela, with API gravity of 7–10° and viscosities up to 5,000 cP, while Canadian extra-heavy crude has viscosities in the range 5,000–10,000 cP.
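Why the ‘queue’ orders itself by viscosity follows directly from Darcy’s law: with rock quality, well geometry and drawdown held equal, well rate is inversely proportional to viscosity. A minimal sketch using the viscosity figures quoted above (the 1 cP light-oil baseline is an assumption for comparison, not a field datum):

```python
# Radial-inflow form of Darcy's law: q is proportional to k*h*dp / (mu*ln(re/rw)).
# Holding everything but viscosity (mu) fixed, rate scales as 1/mu.
# Field viscosities are those quoted in the text; the 1 cP baseline is assumed.

fields = {          # viscosity in centipoise (cP)
    "light oil (baseline)": 1.0,
    "Gannet E": 20.0,
    "Captain": 88.0,
    "Mariner": 540.0,
    "Bressay": 1000.0,
}

base = fields["light oil (baseline)"]
for name, mu in fields.items():
    print(f"{name:22s} {mu:7.0f} cP  relative rate x{base / mu:.4f}")
```

A 1,000 cP crude flows through the same rock at roughly a thousandth of the rate of a light oil, which is why long horizontal wells and assisted recovery are needed to reach even the 1,000–10,000 bpd rates mentioned above, and why the higher-viscosity fields sit at the back of the queue.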
Fractured Basement Reservoirs
Basement reservoirs owe their petroleum storage capability and productivity to the presence of naturally permeable fractures providing a lattice of void space within rocks which are typically igneous or metamorphic such as granite, basalt and gneiss. Such plays occur around the world, most notably in onshore China where they are referred to as ‘buried hill’ plays.
They present particular challenges to the explorer: sophisticated ‘plumbing’ may have to be invoked for hydrocarbons to migrate from source to trap, while conventional seismic will not easily distinguish between a basement feature that is fractured and one that is not.
These plays provide development challenges too in that it is important to find ‘sweet spots’ where enhanced fracturing taps into granular porosity. It is then necessary to drill horizontal wells that access as many fractures as possible, for example by drilling normal to any preferred fracture orientation.
Overall, these plays provide some tricky problems for geoscientists and reservoir engineers, all of them soluble but with extra risk factors thereby introduced into economic calculations. A key technology will be multi-component 3D seismic which will allow fracture density and orientation to be mapped.
What Does This Imply?
It is a fact that our industry has developed the technology to escape the resource limitations of the ‘conventional’: light oil or ‘clean’ gas in sandstones or carbonates with good poroperm characteristics. Will this precipitate a dramatic change in the way many explorers think? The starting point needs to be plate tectonics, paleo-drainage systems and paleo-climatology, so that we can arrive at a view of where prolific source rocks exist. Following this, we need to understand petroleum systems in an integrated fashion, so that we can model a source rock’s maturation history and predict where expelled hydrocarbons might have migrated to – if indeed they have left the source rock! And then we need to understand the dynamic properties of these ‘unconventional’ reservoirs. At this point I can hear a large group of both ex and current colleagues saying, “That’s what we always do!” And that of course is true – in some cases.
However, for the first step – understanding regional geology – it is clear that extraordinary amounts of very different types of data are now available in the public domain to supplement the proprietary data a company might itself hold: rock samples, geochemical analyses of seeps, well logs, seismic and so on. Integrating this mountain of data and making sure everybody is looking at the same thing is both difficult and time-consuming, and demands innovative technologies.
A team of subsurface specialists – whether working on a basin, a prospect, a discovery or a field – can in principle access very large amounts of different types of data, each requiring its own conditioning and analysis, before attempting to integrate these many strands into a coherent interpretation, certainly in three dimensions and possibly four. I say ‘in principle’ and ‘attempting’ because in reality the amount and diversity of data available to sub-surface specialists has outpaced the ability of their systems and workflow processes to manage, integrate and interpret it.