The challenge of linking seismicity with human activity is not a new one. Surface seismic networks, for example, were established to monitor induced earthquakes related to coal mining as early as the 1900s. Since then, the science and technology of real-time seismic monitoring have evolved considerably. Since the turn of this century, more sensitive downhole three-component sensors, coupled with advanced low-noise downhole digital electronics and telemetry systems, have delivered a paradigm shift in how precisely we can locate increasingly small earthquake foci in 3D, or even 4D, space. The real-time monitoring and location of such ‘microseismic’ activity is now common practice during unconventional hydrocarbon reservoir injection and depletion, where triggered events can be correlated with the spatial distribution of hydraulically fractured pathways (Fig. 1), indicating and helping to optimise the total stimulated reservoir volume and its integrity.
It is, however, only in the last few years that legal and regulatory bodies have begun to formalise the process for instrumenting, monitoring and, most importantly, characterising all significant seismic activity related to reservoir operations. The economic incentive for operators and service companies to focus primarily on very small microseismic (<~M0) events has yet to be extended to the practice of precisely monitoring and characterising larger-scale (>~M2) activity, especially after unconventional operations are completed.
Induced or Triggered?
Larger-scale ‘human-induced’ earthquakes are now a common topic of discussion, several of which have been identified as greater-than-M6 events located close to hydrocarbon extraction zones. Recent unconventional operations associated with any ‘felt’ seismicity, especially in NW Europe (e.g. Groningen, Ekofisk, Preese Hall), have resulted in lengthy disputes between the industry, seismologists and an increasingly sceptical public. Such impasses are only complicated further by the lack of any commonly accepted rules for discriminating induced seismicity from triggered events.
McGarr and Simpson (1997), amongst others, have attempted to refine these terms: ‘induced’ seismicity is said to occur only when a volume change takes place in the Earth, most commonly through human action during reservoir depletion. In such cases, vertical shear stress is expected to be at a maximum at the edge of the expanding reservoir, leading to dip-slip events. ‘Triggered’ events, on the other hand, are associated with the release of natural tectonic stress. It should be noted that ‘microseismicity’ associated with the injection of proppant along a fracture plane during fluid injection can also be described as ‘triggered’, since changes in effective stress trigger shear movement along and across the fracture network (Gupta, 2002; Ellsworth, 2013).
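The effective-stress argument behind triggered slip can be illustrated with the standard Mohr-Coulomb failure criterion: raising pore pressure reduces the clamping normal stress on a fault, so it can be pushed to failure without any change in the tectonic shear load. The sketch below uses this textbook relation with purely illustrative numbers; it is not a representation of any specific field case discussed here.

```python
def coulomb_failure_margin(shear_stress, normal_stress, pore_pressure,
                           friction=0.6, cohesion=0.0):
    """Shear strength minus applied shear stress (MPa).

    Coulomb criterion: tau_f = C + mu * (sigma_n - p).
    A negative margin means the fault is at or beyond failure.
    Values in MPa; friction coefficient 0.6 is a typical assumption.
    """
    strength = cohesion + friction * (normal_stress - pore_pressure)
    return strength - shear_stress

# A fault clamped at 50 MPa normal stress, loaded to 25 MPa shear:
print(coulomb_failure_margin(25.0, 50.0, 0.0))   # +5.0 MPa: stable
# Injected fluid raises pore pressure by 10 MPa; no tectonic change:
print(coulomb_failure_margin(25.0, 50.0, 10.0))  # -1.0 MPa: slip triggered
```

The second call shows why ‘triggered’ is the apt word: the stored tectonic stress does the work, and the injection merely unclamps the fault.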
To differentiate seismic events recorded during reservoir operations by these terms, the spatial and temporal distributions of the foci need to be resolved precisely and reliably. The question of how close, in time or distance, a seismic event has to be to the reservoir operation in order to be assumed related remains a subjective one. Such a differentiation becomes weaker still in tectonically active regions where natural earthquakes occur frequently. Seismologists argue that a more Bayesian approach would be to integrate the recorded 4D foci distribution with the physical failure mechanisms, while also incorporating the modelled local and regional geomechanical regime (Dahm et al., 2015). One example is the effort to characterise frictional stability parameters as a function of fluid pressure and their effect on the seismic potential of faults, which could then be used to assign a triggered/induced probability value to an event (Scuderi and Collettini, 2016).
The categorisation of event foci could be further constrained by comparing the frequency-magnitude distribution (b-value). McGarr and Simpson (1997) highlight that triggered and induced events have distinct distribution patterns. Comparing the b-value with the source mechanism can also tell us something about the dimensionality of a seismic dataset, for example whether the foci lie along a planar distribution or fill a 3D volume (Fig. 2). This potentially provides the regulator with a further quantitative parameter in the process of qualifying induced earthquake occurrence.
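For a catalogue complete above some magnitude threshold Mc, the b-value of the Gutenberg-Richter frequency-magnitude relation can be estimated with Aki's (1965) maximum-likelihood formula, sketched below. The tiny example catalogue is invented for illustration.

```python
import math

def b_value(magnitudes, mc):
    """Aki's (1965) maximum-likelihood b-value estimate.

    b = log10(e) / (mean(M) - Mc), using only events at or above the
    completeness magnitude Mc. Large b => small events dominate.
    """
    mags = [m for m in magnitudes if m >= mc]
    mean_m = sum(mags) / len(mags)
    return math.log10(math.e) / (mean_m - mc)

# Example: a small synthetic catalogue, assumed complete above Mc = 0.5
catalogue = [0.6, 0.7, 0.9, 1.2, 0.8, 1.5, 0.6, 2.1, 0.7, 1.0]
print(round(b_value(catalogue, 0.5), 2))  # → 0.85
```

In practice this is computed on catalogues of thousands of events, with Mc estimated from the data rather than assumed, but the estimator itself is this simple.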
The methodology for acquiring such data is becoming firmly rooted in the practice of permanent or semi-permanent passive monitoring. Continuous recording of events prior to, during and after injection (potentially for years) would provide an ever more stable background seismicity model, especially if combined with a known fault map. However, this can add significant cost to the process, particularly where monitoring equipment deployed in hostile well environments is vulnerable to long-term exposure to high temperatures, high pressures and chemically aggressive well fluids. Contractors and manufacturers need to adapt and ruggedise existing technology to meet this demand for continuously recorded data.
Appropriate Regulation - Collaboration is Key
Standardising such integrated ‘Big Data’ into a legal framework that both regulatory and industry bodies can adhere to presents some very broad challenges. In Europe, monitoring sits mainly within a risk-management framework rather than a production one. Demands from insurers and regulators are especially fragmented, and so require applied research and specialised niche solutions. A UK example is the ‘traffic light system’ (Fig. 3), which applies operational constraints when a certain recorded moment magnitude is reached. This has been criticised as too crude a regulatory threshold (EAGE Passive Seismic Workshop, 2016), with suggestions that peak ground velocity (PGV) or maximum acceleration would be a more appropriate measurable variable, since it is proportional to ‘felt’ seismicity. Probabilistic ground-motion monitoring could be an option to give confidence in PGV measurement while also delivering more positive engagement with the public, especially if shown to be adapted from a more favourably perceived energy extraction methodology, e.g. geothermal.
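The logic of a magnitude-based traffic light system reduces to a simple threshold mapping, sketched below. The amber and red thresholds shown (ML 0.0 and 0.5) are of the order used for UK onshore shale gas operations, but thresholds vary by jurisdiction and are included here purely for illustration.

```python
def traffic_light(local_magnitude, amber=0.0, red=0.5):
    """Map a recorded local magnitude to an operational response.

    Illustrative thresholds: amber at ML 0.0, red at ML 0.5.
    Critics argue PGV would better track 'felt' seismicity than
    magnitude, but the decision structure would be the same.
    """
    if local_magnitude >= red:
        return "red: suspend injection, flow back, continue monitoring"
    if local_magnitude >= amber:
        return "amber: proceed with caution, reduce injection rate"
    return "green: continue operations as planned"

print(traffic_light(-0.8))  # green
print(traffic_light(0.2))   # amber
print(traffic_light(0.6))   # red
```

A PGV-based variant would simply swap the measured variable and thresholds; the criticism reported above is about what is measured, not about the traffic-light structure itself.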
Unconventional extraction in Europe can no longer be justified with US analogues. Recent events in the USA (from 2015 onwards, mostly seismicity linked to waste-water disposal) have brought much unwanted publicity and pressure to hydraulic fracturing and its management. US and European regulatory authorities are each watching the other to see whose actions will have the first positively perceived impact.
A network is required in Europe to defragment and standardise what is required of operators, regulators and academia. Such criteria, if agreed, will need to be appropriate for the background seismicity of a region. Permanent monitoring will help constrain uncertainty and protect operators, while also bolstering the seismic recording market. Seismicity applications in other industries, such as geothermal and mining, already have much more robust and established regulatory practice. By encouraging bodies to take a multi-disciplinary approach across these industries, we will see the beginnings of quantitative, testable rules for the probabilistic discrimination of human-induced, human-triggered and natural earthquakes.