Big Data and Microseismic Imaging Will Accelerate the Smart Drilling Oil and Gas Revolution
Looking for good news in American innovation? Over the past five years, technology has improved the productivity of the typical oil or gas rig on America’s shale fields by 200 to 300 percent, according to industry experts at Bentek. This stunning gain in energy yielded per dollar of capital explains why U.S. oil production has grown faster than anywhere else in the world.
Technology has improved wind turbines and solar cells too. But for both of those, a 200 percent gain in productivity took 20 years. Consequently, in just two years American oil output has risen five times as much as wind production has in two decades, and 200 times as much as solar has (in energy-equivalent terms).
Consider another important benchmark: the average cost to find and produce a new barrel. Globally, that cost has risen from $6 a barrel in 1998 to $27 a barrel today. But in America’s shale fields it has now dropped into the $7 to $15 range. This reality explains why $150 billion of foreign direct investment flowed into U.S. shale fields over the past four years.
America’s onshore oil output has grown more in the past five years than output from the offshore Gulf of Mexico has in twenty. The International Energy Agency forecasts that U.S. output will soon surpass Saudi Arabia’s.
These stunning changes in the domestic oil and gas markets have taken nearly everyone by surprise. But can this kind of productivity boom continue? You bet. And it will happen because of yet more technology — information technology to be specific.
Parked in the loading bay of a small tech company on the outskirts of Los Angeles — yes, L.A., not Houston or Silicon Valley — we see the future of big data in the emerging digital oil fields. We’re not talking about the racks of computers, optical converters and interrogators on hand, but the hundreds of feet of dirty cable, as thick as a forearm, lying on a flatbed trailer. Buried within the cables are the same hair-thin optical fibers that underpin the global Internet. But here U.S. Seismic [NASDAQ:ACFN] is using the exquisite properties of optical fiber to sense and generate massive data streams at the critical working end of oil and gas wells in the deep, hot subsurface. This imaging technology does for seismic what the move from 20th-century X-rays to 21st-century MRI did for medicine.
It’s technology like this that prompted IDC, the global information-tech consultancy, to assert that "unconventional resources (shale gas, tight oil) will drive innovation in the expanded use of Big Data." Or, as Bill Gates recently said: "The one thing that is different today [in energy] is software, which changes the game."
Information has been the sine qua non of the oil industry from its inception. It has always been about knowing where to look and where exactly to drill. But seeing through rock is hard. The earth is opaque to everything in the optical spectrum and to essentially all radio waves. Only vibration — a.k.a. sound — propagates through rock. (Well, for the physics purists, gravity waves, neutrinos and other exotic things do pass through the earth as if it were nearly non-existent.)
Although horizontal drilling and hydraulic fracturing — fracking — have been widely reported as the reasons for the recent American oil & gas boom, neither fully explains it. In fact, both techniques are decades old. The boom emerged from smart drilling.
At the core of the hydrocarbon revolution we find 3D seismic maps that tell you where to drill, and down-hole sensors that tell you where to steer the drill underground. Analogously, a car and a highway are obviously essential for taking a road trip — just as fracking and horizontal drilling are for shale production — but without a map and headlights (it’s always dark underground), the drive would be fruitless.
And to stretch the analogy, what comes next in the oil fields is the equivalent of your car’s GPS: high-resolution microseismic imaging, the domain of tech players amongst which we count U.S. Seismic. Microseismic brings radically enhanced navigation and, critically, the opportunity for a new era of dynamically controlled subsurface operations. This will both sustain the current hydrocarbon revolution and unleash another round of productivity growth. But we’re getting ahead of the story.
To understand what the future holds requires a bit of a deep dive to see what subsurface information technology has already done. It begins with seismic maps.
Seismic technology uses acoustic vibrations that propagate through the earth and reflect back to the surface. Buried in the resulting interference patterns detected by geological microphones — geophones — one can divine the subsurface features. Bats use the same acoustic principles with great precision through air, as does the Navy through water, and doctors with ultrasound.
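To make the echo principle concrete, here is a minimal back-of-the-envelope sketch (our illustration, using an assumed rock velocity rather than figures from any real survey): the two-way travel time of a reflected pulse, together with the speed of sound in the rock, gives the depth of the reflecting layer.

```python
# Minimal illustration of the reflection principle behind seismic surveying:
# a pulse travels down, bounces off a rock layer, and returns to a geophone.
# Two-way travel time plus an assumed propagation velocity give the depth.

def reflector_depth_ft(two_way_time_s: float, velocity_ft_per_s: float) -> float:
    """Depth of a reflecting layer from two-way travel time: d = v * t / 2."""
    return velocity_ft_per_s * two_way_time_s / 2.0

# Illustrative numbers: ~10,000 ft/s is a plausible acoustic velocity
# for sedimentary rock (an assumption, not a measured value).
t = 1.6        # seconds between the surface "thump" and the returning echo
v = 10_000.0   # assumed velocity, ft/s
print(f"Reflector at roughly {reflector_depth_ft(t, v):,.0f} ft")  # ~8,000 ft
```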
At the dawn of the oil industry in the 19th century, geologists tried to divine what lay below by looking for surface oil seeps or landscape features that signaled the possibility of subterranean oil fields. Petroleum’s information age began early, in 1924 in Brazoria County, Texas, with the first two-dimensional (2D) seismic map to yield a successful oil well. Then in 1972, Exxon, along with six other oil majors, jointly funded a defining field test of a 3D seismic survey. Even with the mainframe-class computing capability of that time, a month’s worth of 3D data took two years to process. But the test successfully identified new places to drill in a mature field.
With 3D seismic, the subsurface resolution jumped orders of magnitude by going from the mile-level spacing of the 2D sensors to 100-foot spacing. The quantity of data gushing out of 3D surveys would have overwhelmed petroleum engineers but for the happy coincidence of the dawn of modern computing.
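How big a jump was that? A rough, illustrative calculation with round numbers (our own arithmetic, not a survey specification): tightening the sensor grid from roughly one mile to 100 feet in both horizontal directions multiplies the number of sample points, and thus the raw data, by nearly a factor of three thousand.

```python
# Rough arithmetic behind the 2D-to-3D data explosion: shrinking the sensor
# spacing from ~1 mile to ~100 feet in both horizontal directions multiplies
# the sample points per square mile by (5280 / 100) squared.

old_spacing_ft = 5280.0   # ~1-mile spacing typical of 2D surveys (illustrative)
new_spacing_ft = 100.0    # ~100-foot spacing of 3D surveys

density_gain = (old_spacing_ft / new_spacing_ft) ** 2
print(f"Roughly {density_gain:,.0f}x more sample points per square mile")  # ~2,788x
```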
It didn’t take long for the now-familiar combination — more data from better geophone sensors, better algorithms, and more computing power — to change the oil world forever. One perhaps unsurprising discovery: 3D showed that the 2D maps were not just low-resolution but often wrong. U.S. drilling success rates rose from an average of barely 50 percent in 1972 — a coin flip, hence "wildcat" drilling — to over 85 percent today.
Even with good 3D seismic maps, drilling still requires information about precisely what the head of the drill string is chewing through. While crude electrical and chemical sampling and measurements date back a century, it was the steerable drill (invented in the mid-1980s), packed with modern electronics and sensors, that enabled today’s measure-while-drilling capability. We are now firmly in the era of smart drilling. But it’s not quite smart enough.
In the shale fields in particular, the challenge and the opportunity are clear from a single fact. Fracking is done in sections called stages, and only one stage out of four produces a payday. Such low yields would be anathema in modern manufacturing. Oil bears see this as a warning sign. Tech companies see it as an opportunity.
The potential for huge gains in output is arithmetically obvious: lifting the yield from one productive stage in four to two would roughly double a well’s output. But getting all the stages to be productive can only come from far more precision and far higher resolution of the underground domain. In particular, operators need to move from today’s static and episodic geophysical maps to dynamic models of how rocks and fluids change. The next revolution comes when, as 3D pioneer Roice Nelson puts it, "you can listen to the fluids in the reservoir." Nelson should know. In 1984 he co-founded an early 3D survey company, Landmark, which Halliburton acquired in 1996.
Geophysicists like Nelson talk about "listening" to how and where fluids are flowing, and how and where rocks are changing. The goal is to hear how, and which parts of, the miles of 6- to 12-inch pipe thousands of feet down are producing. That requires far more sensitive detection, with far more bandwidth, than is possible with conventional seismic.
You need to see the micro-fractures in real time, as the fracking itself creates them. Hearing millimeter-scale cracks forming complex arrays in shale a mile away, then separating those signals from confounding environmental noise, is like hearing a whisper from the other side of a noisy auditorium. None of these "micro" events are detectable with standard seismic. Enter the era of microseismic technology.
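Pulling those whispers out of the noise is, at bottom, a signal-detection problem. One textbook approach in seismology is an STA/LTA trigger, which flags moments when the short-term average signal energy jumps relative to the long-term background. The sketch below, run on synthetic data, is a generic illustration of the idea, not U.S. Seismic’s or any vendor’s production algorithm.

```python
import numpy as np

def sta_lta(signal: np.ndarray, short_win: int, long_win: int) -> np.ndarray:
    """Classic STA/LTA ratio: short-term vs. long-term average signal energy.
    A spike in the ratio marks a candidate microseismic event."""
    energy = signal ** 2
    sta = np.convolve(energy, np.ones(short_win) / short_win, mode="same")
    lta = np.convolve(energy, np.ones(long_win) / long_win, mode="same")
    return sta / (lta + 1e-12)  # guard against division by zero in quiet stretches

# Synthetic demo: background noise with one tiny "crack" buried in it.
rng = np.random.default_rng(0)
trace = rng.normal(0.0, 1.0, 20_000)                                  # noise
trace[12_000:12_050] += 4.0 * np.sin(np.linspace(0, 25 * np.pi, 50))  # buried event

ratio = sta_lta(trace, short_win=50, long_win=2_000)
picks = np.flatnonzero(ratio > 4.0)  # trigger threshold, tunable per site
print(f"Candidate event near sample {picks[0]}" if picks.size else "No event detected")
```

Real microseismic processing layers array geometry, arrival-time picking and event location on top of a detection step like this, but the core task of lifting faint events above the noise floor is the same.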
By oil industry standards microseismic is new, becoming commercial barely a decade ago through a familiar combination — better electronics yielding more sensitive sensors, higher-speed communications down-hole, and the fruits of the on-going computer revolution.
The giants of the oil services industry all have microseismic offerings: Schlumberger [NYSE:SLB] with StimMAP, Halliburton with FracTrac, Weatherford with SeismicSpear, and Baker Hughes [NYSE:BHI] with IntelliFrac. Without microseismic fracture mapping, drillers are essentially wildcatting underground, the equivalent of what they used to do on the surface. And just as the advent of 3D both expanded on 2D and exposed its errors, so too are microseismic surveys quickly revealing the holes, and even the errors, in 3D.
But today’s best-in-class geophones cannot fully unlock the potential of microseismic. The sensors need to see (hear) signals at 100 times the bandwidth, and be at least 100 times more sensitive (quieter). And, in an ideal world — which eventually becomes standard operating procedure — imaging has to move from snapshots to continuous, real-time monitoring. Geophysics leader CGG-Veritas calls this SeisMovie. That will require a new class of sensors cheap and robust enough to survive for long periods, even permanently, in the hot underground. Conventional electronics cannot deliver that combination. This is where U.S. Seismic’s fiber optic technology changes the game.
Physicists have long known that fiber optics can perform virtual magic with photons. In communications systems, information is encoded in the light carried by fibers. Used as a sensor, even a subtle environmental change like vibration encodes itself into a laser beam inside the fiber. The tantalizing feature of a fiber-optic sensor is not only its exquisite sensitivity to vibration, but that the fiber itself needs no power or electronics. And, despite being made of glass, optical fiber is remarkably robust and temperature-tolerant. While fiber optics are already used for some tasks, including temperature and crude acoustic sensing, the industry has lacked (until now) a viable fiber geophone.
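That "exquisite sensitivity" comes from interferometry: the optical phase accumulated along a fiber depends on the fiber’s length, so even a nanometer-scale stretch from a passing vibration produces a measurable phase shift. A back-of-the-envelope sketch with textbook numbers (our illustration, not a description of U.S. Seismic’s design):

```python
import math

# Back-of-the-envelope sensitivity of a fiber-optic interferometric sensor:
# the optical phase over a stretch of fiber is phi = 2 * pi * n * L / wavelength,
# so stretching the fiber by dL shifts the phase by dphi = 2 * pi * n * dL / wavelength.

wavelength_m = 1550e-9   # telecom-band laser wavelength
n_fiber = 1.468          # approximate refractive index of a silica fiber core

def phase_shift_rad(delta_length_m: float) -> float:
    """Optical phase shift caused by stretching the fiber by delta_length_m."""
    return 2 * math.pi * n_fiber * delta_length_m / wavelength_m

# A vibration that stretches the fiber by just one nanometer:
print(f"{phase_shift_rad(1e-9):.4f} radians of phase shift per nanometer of stretch")
# Interferometric readouts can resolve far smaller phase changes still.
```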
Making practical fiber-optic geophones has been devilishly difficult. The issue isn’t the physics — though the physics-based algorithms are critical. The challenge has been to come up with a design that is simple, easy to manufacture, and capable of yielding a highly reliable, reproducible product. This is where U.S. Seismic’s CEO Jim Andersen brought unique experience.
A Naval Academy grad, Andersen served as the engineering officer on a submarine before starting his post-military career at Litton (acquired by Northrop in 2001). The daunting realities of a submarine operating self-sufficiently in a terribly hostile environment forge a vital instinct — a near-monomaniacal focus on ensuring that everything is reliable, practical and simple. At Litton, Andersen took over a team that had been struggling to develop a new sonar based on fiber optics. His make-it-simple drive culminated in a practical design and a $400 million contract to deploy the fiber sonars on submarines.
Fortunately for the oil & gas industry, Andersen brought that core team and intellectual property out of Northrop and into U.S. Seismic, where the technology was adapted for the geological subsurface and the design further refined for low-cost, high-volume manufacturability. In the same practical spirit, the output data stream (from electronics located entirely at the surface) is directly compatible with existing seismic imaging software and analytics. So U.S. Seismic is now running the usual gauntlet of "show me" field trials, a standard procedure in every industrial domain.
In a hotly competitive market it is doubtless frustrating to U.S. Seismic that many early customers, in particular the bigger players, require confidentiality. Fortunately, some of the boutique players are less reticent. One customer and partner is Gary Tubridy, CEO of Avalon Sciences, who founded his oil & gas tech company to supply specialty, high-performance tools to big service companies like Baker Hughes and Weatherford. Tubridy, who began his career shooting high-resolution borehole 3D seismic in the North Sea, is bullish on the "game changer" that will come from fiber geophones. Once U.S. Seismic’s field trials are complete, along with the normal subsequent tweaks, he is ready to begin the commercial roll-out, with the fiber sensor incorporated into Avalon’s new BOSS (Borehole Optical Seismic System). Tubridy says his company is starting to get a lot of inquiries about permanent subsurface monitoring capabilities.
And Avalon has developed its own companion product: a down-hole acoustic source. In one mode, microseismic is passive, listening to flows and to the "snap, crackle, pop" of fracking, as one industry veteran phrases it. In active mode, especially for exploration, the subsurface needs to be illuminated. With 3D seismic, both the sensors and the noise-makers sit on the surface. But if you want a clearer underground picture using a down-hole microseismic "camera," you need a "light" underground too; hence Avalon’s borehole noise-maker.
Cheaper, faster and better is a familiar trajectory. But the revolution really takes hold as the full power of U.S. Seismic’s richer data streams enables better algorithms and better dynamic geophysical models, ones that accommodate the complex mix of plastic and brittle behavior of rocks in the subsurface. This plays directly into big data analytics, where correlations now become possible across the entire suite of hydrocarbon information: surface survey maps, drill-bit sensors, flow rates, pressures, temperatures, chemical analyses and more. There is no chicken-and-egg uncertainty about which comes first; better and richer data flows will drive the emergence of new big data analytics.
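As a toy illustration of the kind of cross-stream correlation this enables (entirely synthetic numbers and made-up column names, not any vendor’s data model), one might join per-stage microseismic event counts against early flow rates and ask how strongly they track each other:

```python
import pandas as pd

# Toy cross-correlation of two hydrocarbon data streams, per frack stage:
# microseismic event counts (a rough proxy for created fracture area) vs.
# early flow rates. All numbers and column names are synthetic.

microseismic = pd.DataFrame({
    "stage": [1, 2, 3, 4, 5, 6, 7, 8],
    "event_count": [420, 35, 60, 510, 48, 390, 22, 470],
})
production = pd.DataFrame({
    "stage": [1, 2, 3, 4, 5, 6, 7, 8],
    "flow_bbl_per_day": [310, 20, 45, 360, 30, 280, 15, 330],
})

merged = microseismic.merge(production, on="stage")
corr = merged["event_count"].corr(merged["flow_bbl_per_day"])
print(f"Correlation between microseismic activity and flow: {corr:.2f}")
```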
Here again the oil majors and service companies are either secretive or coy about their software pursuits. But some of the big software companies are less so, seeing the potential in the oil & gas domain, including Oracle, IBM, SAP and Teradata [NYSE:TDC]. If history is any guide, the smaller, near-start-up class of IT companies may rapidly innovate around the bigger players as the hydrocarbon data tsunami grows.
Amongst the smaller companies we find the likes of Neos (where Bill Gates is an investor) and Sigma3, which acquired Fusion Geophysical shortly after we wrote about them in 2011. (We liked their clever software, and their moniker: From The Geophone To The Drill Bit.) Then there are companies whose DNA is anchored in both the IT and geophysical domains, like SR2020, also in the L.A. area, whose CEO Bill Bartling is another U.S. Seismic customer and partner. Bartling’s pedigree, with stops at Silicon Graphics, the software company SciFrame, and Occidental Oil & Gas, where he was manager of technical computing, telegraphs the IT-centric reality of hydrocarbons.
For Bartling there is no ambiguity about the future of oil and gas tech, especially in the productive shale fields. According to Bartling, microseismic capabilities are not only economically exciting; they promise the collateral opportunity of greater safety and environmental monitoring at de minimis cost, or even as a "free rider."
But it is the disruptive potential for microseismic to change the operating paradigm of exploration itself that animates Bartling. He predicts it will enable a migration from massive, expensive, long-duration 3D surface seismic surveys to modular, incremental and fast ones. This is another familiar trajectory, like the shift from central to distributed computing, from mainframe to PC. SR2020 intends to be on the front lines of the revolution and has already placed an order for one of the first big commercial fiber arrays from U.S. Seismic.
As Bartling points out, while the popular media has gushed (appropriately) about the billions of barrels in shale deposits, microseismic mapping and real-time monitoring provide the key to unlocking more of what gets left behind in today’s horizontal wells (the roughly three-quarters of stages that are unproductive). And microseismic will play a critical role in figuring out how to optimize processes to unlock the geologically different shales, both in America and around the world.
Ernie Majer, a geophysicist at Lawrence Berkeley National Laboratory, is bullish about microseismic (and about the performance of U.S. Seismic’s geophones, having tested them). He foresees the same kind of revolution that has altered the field in decades past. In particular, Majer expects that another emerging technology, micro-drilling, will in due course be combined with microseismic. Micro-drilling is precisely what the name implies: much smaller drill rigs that can drill smaller holes far more rapidly (10-fold) and thus far more cheaply, making them ideal for exploration and for placing subsurface sensors.
Big changes from big data are now coming fast to the oil & gas industry. First, of course, big data will wring more efficiencies out of all the existing data. This always comes first. But the combination of the exaflood of new sensor data with increasingly ubiquitous access to Cloud-based supercomputing, available at low cost and by the drink, is bullish for everyone, especially for smaller companies. Better sensors and the Cloud level the playing field between the oil majors and the "minors." It was, after all, the 18,000 small and mid-sized oil & gas companies that were almost entirely responsible for pioneering the production boom on America’s shale fields.
The incendiary combination of America’s entrepreneurs and the inexorable force of technology was, and is, an anticipatable revolution, one my colleague and I predicted eight years ago in our book The Bottomless Well, when we forecast oil abundance.
When we wrote that, it was a peak time for the peak-oil theory, with its ostensibly imminent and inevitable demise of the age of hydrocarbons. You know the world has changed when the New York Times notes that America could become an oil exporter.
And it has only just begun. With the convergence of microprocessors and microseismic imaging, the revolution expands. As the trenchant Ed Morse, global head of commodities research at Citi, has said: "Peak oil is dead." And Big Data will put a stake through its heart.
This piece originally appeared in Forbes