This is Molten's view on data management in Oil & Gas, which Rory Colfer, Managing Partner, recently presented at the ECIM E&P Data and IM Conference in Europe. For more information, or to read "How oil and gas executives could use ‘Big Data’ as a powerful source of competitive advantage", please visit our website http://www.molten-group.com/the-data-crunch/
Self-explanatory. There are many more of these anecdotes. Typically people don’t like to share them – they can be seen to kill careers, and they can drastically affect a company's share price. However, in some appropriate and ethical way we should encourage and reward people who do share such anecdotes (at least within the company they work for) to help prevent them from happening again.
This is a question Rory can use to open his presentation and engage the audience. The answer will lead smoothly into slide 6.
We have shown how the nature of data varies across the hydrocarbon value chain; however, the method of putting a value on data is consistent throughout, and should be derived from the following four assessments for any given data type, as shown in the graphic on the right:

1. The business value of ‘good’ data – the value of all key business decisions which are dependent on fit-for-purpose data.

2. The cost of data acquisition and how it is represented on a balance sheet as an asset. One of the challenges in understanding the value of the data asset is that the costs of acquisition are not fully understood across the whole organization. A study undertaken for an IOC in 2009 showed that around 70% of their total data holding is technical data (compared with financial or general business data), and the majority of that is in the subsurface and wells area. A 2010 high-level analysis of annual spending on petrotechnical data acquisition by another supermajor showed the following spend profile:

• Seismic – ~$1,000mm
• Well logs – ~$600mm
• Reservoir – ~$250mm
• Measurement / Logging While Drilling (MWD/LWD) – ~$90mm
• Engineering – ~$80mm
• Plant Equipment – ~$120mm

Total acquisition costs of all data were estimated to be well in excess of $2 billion – a sum of not insignificant value, even for the oil and gas sector, and one that warrants the due level of care afforded to any asset. Furthermore, the cost of acquisition was seen to differ substantially by Function, with seismic data heading the list and Drilling, Engineering and Plant Equipment data each coming in at an order of magnitude less. Of course, the relative costs will vary from company to company depending on where their strategic focus lies. Often the value of these investments is entered on the balance sheet as an asset and depreciated over time. Many times it is not, making it more difficult to identify a true aggregate of spend on a particular data type.
This requires a change in how such data acquisition spend is categorized and recorded – at least for the purposes of management reporting, if not accounting. With these insights, the value of the asset can start to be understood and the data can start to be treated like an asset. Until these insights are obtained and shared with executive management, we will continue to see that data are not managed like assets in oil and gas companies.

3. The cost of data management – i.e. the operational management activity required to ensure that data is available, fit for purpose and accessible, including stewardship (setting and ensuring compliance with appropriate standards), data maintenance and access, archiving and records retention.

4. The potential business impact of ‘bad’ data on decision making in human, environmental and business terms (e.g. loss of life; loss of a valuable asset or licence to operate; loss of credible reputation amongst the investment and sovereign government communities; and the list goes on).
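The four assessments can be combined into a simple net-value figure for any given data type. The sketch below is purely illustrative – the function name and all figures are assumptions, not values from the studies cited in these notes:

```python
# Illustrative sketch: combine the four assessments above into a single
# net-value figure for one data type. All names and numbers are assumed.
def data_net_value(business_value, acquisition_cost, mgmt_cost,
                   bad_data_impact, prob_bad_data):
    """Value of 'good' data, minus acquisition and management costs,
    minus the risk-weighted potential impact of 'bad' data."""
    return (business_value - acquisition_cost - mgmt_cost
            - prob_bad_data * bad_data_impact)

# Example: seismic data for one region ($mm, illustrative figures).
print(data_net_value(business_value=1500, acquisition_cost=1000,
                     mgmt_cost=50, bad_data_impact=2000,
                     prob_bad_data=0.05))  # → 350.0
```

A risk-weighted term for 'bad' data is one simple way to fold assessment 4 into the same units as the other three; in practice the impact side would be scenario-based rather than a single probability.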
Oil and gas companies invest significantly in data. According to a study recently undertaken for a supermajor, the amount their upstream division spent annually on the acquisition of data (excluding management and technology costs) exceeded $2 billion. All of these acquisition costs ended up somewhere on the balance sheet of the company in question, and applying a straight-line depreciation over 10 years to this data equates it to an asset worth more than $10 billion for that company. With $10 billion an oil and gas company could acquire several production facilities, and we cannot conceive of any of those assets not having ongoing investment in their care and maintenance. The same should apply to data: as with other assets of this value, a due level of diligence and care is required to manage and maintain it. However, this ‘asset’-oriented mindset for data typically does not take hold in the oil and gas sector, because business executives, through no fault of their own, do not, or more accurately cannot, see data in that way.

The linkage and dependencies of successful delivery of BP Upstream’s business strategy are illustrated in this ‘value stack’ diagram, which applies to all four of BP Upstream’s key disciplines. Working from the top down, strategic objectives are set and appropriately qualified and experienced people make critical business decisions. Standard critical decision processes are defined for some key areas, such as OMS, and elements of some of these are automated, accelerated or controlled through the use of standard technology and tools, such as those defined in the Bill of IT, which in turn use data as their raw material and generate more as their output. A significant level of maturity has already been achieved across most of this value framework, including the way in which strategy is developed, people are appointed and managed, processes for making critical decisions are used, and tools for assisting critical decisions are applied.
However, the data layer requires more attention to improve its maturity in terms of its common definition and methods of management, to bring to it a level of due diligence commensurate with its value and importance.
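The back-of-envelope valuation above ($2 billion per year of acquisition, straight-line depreciation over 10 years) can be checked with a short script. The steady-state arithmetic is all that is shown here; the inputs are the figures from the study cited in these notes, not new data:

```python
# Steady-state book value of data acquired at ~$2bn/year and
# straight-line depreciated over 10 years.
annual_spend = 2.0  # $bn per year (figure from the study cited above)
life = 10           # depreciation period in years

# After `life` years of steady spending, cohorts aged 0..life-1 remain on
# the books, each carrying (life - age) / life of its original cost.
book_value = sum(annual_spend * (life - age) / life for age in range(life))
print(book_value)  # → 11.0, i.e. ~$11bn, consistent with "more than $10 billion"
```

The point of the sketch is simply that the "more than $10 billion" asset figure follows directly from the spend rate and depreciation policy, with no further assumptions.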
To maximize the value of our data assets we need to manage them well. Traditionally, what that means has not been completely or holistically defined, in our view. Our model includes elements that many others omit, such as performance management and organizational capability. Here is the complete model and the key components of each element. (Explain each key theme.)

This kind of model can be used to drive a diagnostic process against what good looks like, and that is something we have a great deal of experience in, helping companies define their DM strategies and priorities. Getting this wrong leads to the kind of mistakes and tragedies I have already illustrated (this was TNK-BP). Getting it right allows significant efficiencies to flow – for example, as we have already demonstrated with one of our clients, enabling reservoir managers to become 15% more efficient in their work by making it easier to get data they are confident in.
A mechanism for evaluating the ROI of data in a supermajor has been applied in the following way. We consider a typical oil and gas business to be structured by Function (such as Exploration, Wells, Developments, Subsurface and Operations) and then by Region. By analysing a) the acquisition costs, b) the management costs and c) an agreed formula for the business value of data, we can represent the role and value of data quite powerfully in the form of a ‘forensic appraisal’ of data, as shown on this slide. It is referred to as a forensic analysis to reflect the reality of oil and gas businesses today: data is not managed as well as it could be, and comparing costs and value across functions and regions is not a straightforward exercise. The data has to be found, gathered and then analysed. The results of the forensic analysis will show where best practices within a company are taking place and will allow those practices to be promulgated. It will also provide the basis for challenge in areas where evidence of suboptimal performance appears. The ultimate long-term objective will of course be to move the company to a sustainable basis of such reporting, so that it becomes a ‘business as usual’ reporting activity rather than a forensic one.
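The Function-by-Region appraisal described above can be sketched as a simple cost/value roll-up. Everything here is a hypothetical illustration – the functions, regions and figures are invented for the example, not data from any client:

```python
# Hypothetical sketch of the 'forensic appraisal': compare data costs and
# value by Function and Region. All figures are illustrative, not real data.
records = [
    # (function, region, acquisition_cost, mgmt_cost, business_value) in $mm
    ("Exploration", "North Sea", 120, 15, 400),
    ("Exploration", "Gulf of Mexico", 200, 25, 520),
    ("Wells",       "North Sea",  80, 10, 180),
]

appraisal = {}
for func, region, acq, mgmt, value in records:
    total_cost = acq + mgmt
    # A simple ROI ratio: business value per dollar of data cost.
    appraisal[(func, region)] = round(value / total_cost, 2)

for key, roi in sorted(appraisal.items()):
    print(key, roi)
```

Even this crude ratio makes the appraisal's purpose visible: cells with unusually high ROI point at candidate best practices to promulgate, while low cells invite challenge.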
With all the hope and hype around big data, you would be surprised not to hear me talk about it at least a little. First of all, let’s acknowledge that the term means different things to different people, so I would like to start this topic by offering Molten’s definition: ‘Big data’ is a collection of complex data sets so large that they are often difficult to process using traditional database management tools or data processing applications.

As an industry we’re actually pretty good at managing Big Data – we have been doing it for over 40 years. We routinely use IT to slice and dice large, complex and diverse data sets such as seismic, well logs and reservoir models, to deliver key insights that drive critical decisions such as depletion strategies and enhanced oil recovery plans. However, as an industry we are leaving a lot on the table by not widening our gaze and looking at the significant incremental opportunities. Two such examples are:

• Analyzing big data around safety incidents and near misses will enable future incidents to be avoided.

• We are still learning about unconventional plays. The nature of unconventional reserves is such that they are ‘unbounded’ and therefore difficult to estimate. We are therefore relying on the gradual build-up of data and insights about unconventional reserves to tell us more about how to exploit them. By applying Big Data principles to unconventional plays, we will be able to compare the production data profiles of newly completed unconventional wells with huge global databases, quickly identifying analog wells wherever they are in the world. That will enable us to extrapolate and thus better predict unconventional reserves and production forecasts, and thereby optimize well plans.

Big data is not easy, however.
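The analog-well idea can be sketched very simply: treat each well's early production profile as a vector and find the nearest stored profile. The well names and figures below are invented assumptions for illustration, and real matching would normalize profiles and use far richer features:

```python
# Hypothetical sketch of analog-well matching: find the stored well whose
# early production profile is closest to a newly completed well's.
# All well names and figures are illustrative assumptions, not real data.

def distance(profile_a, profile_b):
    """Sum of squared differences between two monthly production profiles."""
    return sum((a - b) ** 2 for a, b in zip(profile_a, profile_b))

# First six months of production (boe/day) for wells in a global database.
database = {
    "well_eagle_ford_17": [950, 720, 610, 540, 495, 460],
    "well_bakken_04":     [1200, 880, 700, 600, 530, 480],
    "well_vaca_muerta_2": [800, 640, 560, 500, 470, 450],
}

new_well = [960, 715, 605, 545, 490, 455]

# Pick the closest analog; its longer production history can then be used
# to extrapolate the new well's reserves and production forecast.
analog = min(database, key=lambda name: distance(database[name], new_well))
print(analog)  # → well_eagle_ford_17
```

The value of the global database is scale: the more wells it holds, the more likely a genuinely comparable analog exists, wherever in the world it was drilled.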
There are challenges, and these include:

• Capturing and keeping the data for long enough
• Providing access to huge data sets for analysis in an efficient and cost-effective way
• Deleting data so holdings don’t grow too large (deciding which data to delete and when – for example, there is no need to keep invoices and transaction data once regulatory records-retention periods have expired)
• Presenting key insights so they can be easily explained and justified

However, these challenges are being met, and big data is gradually becoming a recognized additional source of competitive advantage for oil companies.
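The deletion challenge in the list above amounts to a retention-policy check. A minimal sketch, in which the retention periods and record types are illustrative assumptions rather than actual regulatory values:

```python
# Minimal sketch of a records-retention check: flag records past their
# retention period as candidates for deletion. The retention periods and
# record types below are illustrative assumptions, not regulatory values.
from datetime import date

RETENTION_YEARS = {"invoice": 7, "transaction": 7, "well_log": 100}

def deletable(record_type, created, today=date(2013, 1, 1)):
    """True if the record's assumed retention period has expired."""
    years_kept = (today - created).days / 365.25
    # Unknown record types are kept forever, erring on the side of caution.
    return years_kept > RETENTION_YEARS.get(record_type, float("inf"))

print(deletable("invoice", date(2004, 6, 1)))   # → True (older than 7 years)
print(deletable("well_log", date(2004, 6, 1)))  # → False (kept long term)
```

Note the asymmetry built into the sketch: transactional records age out quickly, while subsurface data, whose value persists, is kept for decades.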
Data costs huge sums of money. Yet, while careful about investing billions of dollars to acquire data every year, some oil and gas operators are borderline irresponsible about how they maintain it. In addition to profit and market impact, it is no longer enough for executives to consider ‘people’, ‘process’ and ‘technology’ when assessing their business initiatives; now we must consider data too.

People who manage data in Exploration and Production (E&P) businesses are generally given very limited career development possibilities, whereas people who manage other assets are more highly regarded. Data management is generally a career cul-de-sac. This point is emblematic of the whole data management issue.

Get it wrong and it’s wrong on a macro and a micro scale. On a macro scale the impact includes tragedies such as loss of life, environmental disasters and the potential to bring down the biggest companies in the world. At a local level, wide-scale inefficiencies and significant lost opportunities result. Get it right and significant efficiencies flow. Good data management drives business performance and mitigates risk.
What is the cost of Big Data? It depends on your ambitions, but expect to spend several million dollars (typically single digits, though it can be more) on a best-in-class solution on a global basis.

How long does it take to do a data management assessment? Typically around 6 weeks for regions, 3 months for global functions and 4-6 months for the entirety of a global supermajor.

What is your experience in getting people to share anecdotes? It’s tough. We’re all human and don’t like admitting to mistakes, particularly if tens of millions of dollars were lost as a consequence, or, even worse, lives. However, there are ways of incentivising it. The first step is to collect them internally in an ‘under the radar’, light-touch way, and share the results while protecting those who submitted them (e.g. by anonymizing them, or not releasing data that makes it obvious who was culpable). In our experience the impact can be stunning, to the point where you really can wake up a sleeping giant to pay more attention to data.

Why do you think the industry can achieve 8% accuracy in RRR, when we have been getting 10% on average over the years? We know this is achievable because, according to our research, several companies are already achieving it on a consistent basis – so why not all?

Can you reveal which companies you assessed in your DM maturity assessment? No, sorry, that is confidential.

Who would you say are the leading performers? And the worst? I would consider BP to be quite mature now, and leading in much of its thinking around professional discipline and how they manage change in DM. Shell are also strong in different areas – for example, governance. ExxonMobil benefit from their strong control culture, which translates well to data management. Many other supermajors are now making good progress and catching up.
As you saw in the main presentation, we see a trend showing that, in general, supermajors are more mature than large independents, who are in turn more mature than NOCs, who are more mature than smaller independents.

How did you assess their DM maturity – what framework did you use? Some of the companies have been subjects of a benchmarking exercise we conducted for one of our clients. We also gathered insights from our interactions with several companies at meetings and conferences.

How did you get data on reserves replacement? All data used is publicly available through annual reports or investor reports.

What percentage of the total value of a data asset do you think a company should be investing in its duty of care? We feel a central budget should be set at somewhere between 1 and 3 percent on a global basis. So for a $10 billion asset that would be between $100 million and $300 million. That sounds a lot, but it isn’t compared to the value of the asset.

How did you manage to get the financial data to drive your forensic analysis on data management costs? It was hard, because companies are not used to recording data management costs consistently across regions and functions. In many cases data management spend is not recorded as such, but may be put into other existing categories such as subsurface or IT, without its DM component being easily revealed.

How do you go about getting the attention of senior business executives when they have many very important issues to deal with? We always start with the cost of bad data management, i.e. the risk side of the equation, which is illustrated by the anecdotes we have included in this presentation. That gives them something to worry about. Then we talk about how to make that worry go away, which is about diagnostics, strategy, appropriate investment, integrated planning and a formal performance management framework. In fact it’s about all the elements in the structure I have described to you.