Wednesday, December 21, 2011
Jim Crompton, Senior IT Advisor for Chevron, addressed the SPE Digital Energy Study Group in Houston on 16 November on the topic of the “Digital Oil Field IT Stack”. He announced that he wanted to be a bit provocative about two recognized barriers that emerged from the panel discussions at the SPE ATCE in Denver a couple of weeks before. His experience comes from implementing what he described as “gifts” from the Chevron central organization in diverse business units for real-world application. The two unaddressed barriers, in his view, were change management and the need for a standard infrastructure and architecture, which he proposed to describe.

His presentation started with some standard but, according to him, neglected trends in the expanding scope and role of IT: increased digitization, a move into plants and fields, and the need to address the latest generation of IT consumers, whom he described as the first generation of oilfield workers to have better IT infrastructure in their homes than at work. He acknowledged that in many cases the “first kilometer” is still a problem: an entire offshore field may be instrumented with fibre optics while the link to the onshore office still runs over low-bandwidth microwave (he only half-jokingly suggested a lack of telecom coverage as a positively correlated indicator of oil occurrence).

So how do we leverage the hundreds of thousands of sensors on a new greenfield platform and move from a “run to failure” mode to one of proactive failure detection and avoidance? Jim cited some examples: predictive analytics, Statoil’s experiments with injected nano sensors that report back on reservoir conditions, distributed sensors for real-time optimization, and new mobility platforms for field workers.
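No code was shown in the talk, but the jump from “run to failure” to proactive detection can be made concrete with a minimal sketch: watch each sensor against its own rolling baseline and flag readings that drift too far from it. The function name, window size, and threshold below are hypothetical choices, not anything Crompton or Chevron described.

```python
# Illustrative sketch only: flag sensor readings that deviate from a
# rolling baseline, the simplest form of proactive failure detection.
# Window size and threshold are hypothetical tuning parameters.
from collections import deque
from statistics import mean, stdev

def detect_anomalies(readings, window=20, threshold=3.0):
    """Return the indices of readings that deviate from the rolling
    baseline by more than `threshold` standard deviations."""
    history = deque(maxlen=window)  # last `window` readings
    flagged = []
    for i, value in enumerate(readings):
        if len(history) == window:
            mu, sigma = mean(history), stdev(history)
            # Guard against a flat baseline (sigma == 0).
            if sigma > 0 and abs(value - mu) > threshold * sigma:
                flagged.append(i)
        history.append(value)
    return flagged
```

In practice a real system would run per-sensor models with far more sophistication, but even this kind of threshold rule is a step beyond waiting for equipment to fail.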
But the most interesting new idea was that of borrowing sensor mesh architectures from agricultural and military applications to go beyond current de-bottlenecking workflows and take up the advanced analytics that electrical engineers already apply to their instrumentation. He suggested such a robust and cheap architecture “pattern” might be one of perhaps half a dozen that an IT group like Chevron’s could use to provide semi-customizable solutions.

Part of the frustration he acknowledged was that, at least at Chevron, his best Visual Basic programmers are petroleum engineers using Excel; they are more in touch with Microsoft’s development plans than his IT group, and upset that the next version of Excel will remove Visual Basic and move it to the SharePoint platform. With Chevron now having over 20 million gigabytes (roughly 20 petabytes) of digital data under management, he suggested treating the information pipeline the same way we manage hydrocarbon pipelines: by trying to prevent “leaks” into unmanaged environments like Excel.

He showed some digital dashboards that could balance real-time surveillance against advanced modeling, combine the needs of mapping and reporting services, and move organizations up the Business Intelligence maturity model. He finished with a quick nod to Hadoop solutions and the need to move away from “creative solutions that only solve when the creator is present”.