By Evan Schuman
Energy executives know they need to embrace efficiency and sustainability goals such as Net Zero by 2050. But the most promising paths forward require consumers around the globe to make major changes, such as transitioning from gasoline-powered vehicles to electric. Not only will such change take a very long time, but energy executives have limited ability to materially move that needle.
What energy executives can do, however, is start transforming their current operations to make them more effective, productive and efficient. The sustainability improvements available today through more aggressive use of technology far outweigh the often glacial pace at which people adjust their behaviors and preferences.
Artificial intelligence (AI) and machine learning (ML) can give energy companies valuable insights into where to drill, when to drill and how to most effectively utilize the assets they find. Measured as sustainability ROI, those insights will deliver far more energy dividends than any other single effort.
Consider one of the biggest elements in reducing greenhouse gas emissions: carbon dioxide (CO2) capture and sequestration (CCS). That complex process involves underground water and salinity levels, water flow, and determining the best method for injecting CO2 into the aquifer.
This process involves an overwhelming number of considerations spanning physics, geology and mathematics, making it an ideal environment for AI algorithms. With AI, and especially ML, engineers are no longer constrained by what they can measure in a physical space; they are free to construct any number of features that are meaningful to them. From raw data, ML models can learn properties that behave like the physics, capturing, for example, how injected CO2 displaces water and how much usable pore space remains.
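The feature-creation idea can be sketched in a few lines: from raw well measurements, derive physics-inspired quantities the model can use even though they were never directly measured. The names and formulas below are illustrative assumptions for this article, not an actual CCS model.

```python
# Minimal sketch of physics-inspired feature engineering from raw
# measurements. All names and coefficients are hypothetical.

def derived_features(pressure_mpa, flow_m3_per_day, salinity_ppm):
    """Build candidate features from raw well measurements."""
    return {
        # crude injectivity proxy: flow achieved per unit of injection pressure
        "injectivity": flow_m3_per_day / pressure_mpa,
        # brine density rises with salinity; a rough linear proxy
        "brine_density_proxy": 1000 + 0.73 * (salinity_ppm / 1000),
    }

feats = derived_features(pressure_mpa=12.0,
                         flow_m3_per_day=2400.0,
                         salinity_ppm=35000)
```

An ML model trained on such derived features can reason about displacement and storage capacity without a full physics simulation of each scenario.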
How much of an improvement in efficiency is at issue here? Several orders of magnitude: an analysis that takes a year with a pure physics-based approach can be completed in minutes with AI. What, if any, are AI's time constraints? Little beyond the time needed to upload and analyze the relevant data, which comes, logically enough, on top of however long it takes to collect and prepare that data.
What sets Beyond Limits apart from other AI software companies is subject-matter expertise. Beyond Limits enlists domain experts with PhDs in how energy companies find and extract energy sources, working alongside a team that understands what customers need. That means the software, and Beyond Limits' support teams, already account for the engineering issues necessary for an effective solution, as opposed to firms that deliver bare AI algorithms requiring programming effort from the customer's own teams.
How can AI help with Net Zero by 2050 goals? Given that it's far easier, and more effective, to improve power plants' ability to capture CO2 than to wait for humans to transition to more sustainable alternatives, some of the most impactful gains will come from optimizing electrical generation and improving demand/supply forecasting.
This is ideal territory for AI and ML because improved accuracy comes from continuous analysis of constantly changing variables, such as renewables production and weather forecasts. The more mathematically complex a problem, the more AI can help improve operations. ML in particular was designed to project trends and detect pattern deviations in massively complex datasets. Beyond Limits, for example, often works with customers who see a significant improvement in forecast accuracy after deploying AI. Beyond improving energy extraction operations, these better predictions minimize downtime and increase accuracy and efficiency.
Naturally, improving operations and efficiency must start from the current level of system efficiency. And one of the challenges in the energy space is integrating legacy data from a wide range of platforms. Depending on the original format, AI applies various capabilities, including natural language processing and advanced image recognition, but the limiting factor is context: the algorithms need historical context to know how to interpret and absorb the older data.
Those efforts are critical, though, because older datasets provide history and a rich environment that drive increasingly accurate projections and guidance from AI. The challenge is not dissimilar to today's IT struggles with structured and unstructured data: AI must devise proper tagging schemes to classify, say, a YouTube video or a live Zoom discussion, analyzing the sentiment of the speech.
That said, there are several important questions to ask when working with legacy data. How confident is the team in the accuracy of the data retrieved? What percentage of the targeted older data has actually been retrieved? Finally, and often overlooked, what data could not be retrieved, and what might it include?
For example, suppose the team is highly confident that 85 percent of the targeted legacy data has been retrieved, with an accuracy of 97 percent. That's very helpful, but without knowing what the other 15 percent includes, it's hard to make meaningful recommendations. What if that 15 percent happened to include all the negative outcomes? The retrieved data would then paint an overly optimistic picture.
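A quick worked example makes the danger concrete. Suppose 1,000 historical well records exist, 850 (85 percent) were digitized, and, in the worst case, every one of the 150 missing records describes a failed outcome. The counts below are invented purely for illustration.

```python
# Hypothetical illustration of how unretrieved legacy records can bias
# an estimate when the missing data is not missing at random.

total = 1000
retrieved = 850
failures_in_retrieved = 50   # hypothetical count among digitized records
failures_missing = 150       # worst case: every missing record is a failure

observed_failure_rate = failures_in_retrieved / retrieved
true_failure_rate = (failures_in_retrieved + failures_missing) / total
```

Here the retrieved data suggests a failure rate under 6 percent, while the true rate is 20 percent, more than three times higher. That gap is exactly why knowing what the missing 15 percent contains matters as much as the accuracy of what was retrieved.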
Do most energy firms have the in-house expertise to know what is missing and to establish accuracy and confidence numbers for the retrieved data? That's an area where Beyond Limits has focused, with teams specializing in digitizing and interpreting data, even when some datasets are missing or incomplete, and with proven techniques for cleaning and preparing that data with the right domain expertise.
One of the most practical applications for energy companies is grid-failure protection, which draws on weather predictions, historical usage trends, population shifts and other factors. Knowing these historical patterns with high confidence is critical to that analysis. If companies can't translate their legacy data, they can't leverage it. Even worse, if they don't know what they don't know, they can't even extrapolate from the data to make good guesses about what is missing.
This is why Beyond Limits makes sure to include an explainable AI layer. In short, the solution explains to end users, via audit trails, the rationale and calculations behind each recommendation so that humans in the loop can make a well-informed decision about whether to accept it. Confidence refers not merely to data accuracy but to how comfortable an end user or decision-maker is with accepting an AI recommendation. That explanation is what makes decision-makers far more comfortable trusting the output; stakeholders can't benefit from a recommendation they don't trust. Higher trust means more accepted recommendations, which in turn delivers the desired boost in productivity and efficiency.
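One simple way to picture an explainable recommendation is as a record that carries its rationale and audit trail alongside the advice itself. The field names and acceptance rule below are invented for illustration; this is not Beyond Limits' actual schema.

```python
# Hypothetical sketch of an "explainable recommendation": advice paired
# with its rationale and an audit trail, so a human in the loop can
# judge whether to accept it. All names and thresholds are invented.

from dataclasses import dataclass, field

@dataclass
class Recommendation:
    action: str
    confidence: float                                # model confidence, 0..1
    rationale: list = field(default_factory=list)    # human-readable reasons
    audit_trail: list = field(default_factory=list)  # inputs and calculations

    def well_supported(self, threshold=0.8):
        """A reviewer might fast-track only confident, explained advice."""
        return self.confidence >= threshold and bool(self.rationale)

rec = Recommendation(
    action="Reduce injection rate in well A-7 by 5%",
    confidence=0.92,
    rationale=["Aquifer pressure trending above safe limit",
               "Similar conditions preceded an earlier shutdown"],
    audit_trail=["pressure_sensor_A7: 3.2 MPa", "forecast model run #1042"],
)
```

The point of the structure is that an unexplained recommendation (empty rationale) never qualifies as well supported, no matter how confident the model is.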
As an example, Beyond Limits has taken this analysis to the next level with its new LUMINAI Refinery Advisor, a cloud-based decision-support solution that guides refinery operators toward production economic planning targets and improves reliability in operations and start-ups. The application can work with different, often changing, constraints. Designed to think like an engineer and provide expert guidance across the entire refinery, the advisor works with, and goes beyond, conventional automation-focused approaches to ensure process objectives are always met.
LUMINAI Refinery Advisor removes communication barriers between planning, engineering and operations teams. Operators adopt it quickly because it checks process conditions every 15 minutes and gives specific, easy-to-follow instructions on how to get back on plan. This functionality is less expensive than homegrown approaches and a fraction of the typical cost of other refinery optimization applications, helping energy companies expand heavy crude selection options and enabling operations teams to optimize across highly varying campaign feedstocks.
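The "check conditions, advise corrections" loop can be sketched as comparing current readings against plan targets and flagging whatever drifts out of tolerance. The variable names, units and 5 percent tolerance below are assumptions for illustration; LUMINAI Refinery Advisor's internals are not public.

```python
# Hypothetical sketch of a periodic back-on-plan check: flag readings
# that drift more than `tolerance` from their plan targets.

def off_plan(readings, targets, tolerance=0.05):
    """Return advice strings for readings outside tolerance of target."""
    advice = []
    for name, value in readings.items():
        target = targets[name]
        drift = (value - target) / target
        if abs(drift) > tolerance:
            direction = "reduce" if drift > 0 else "increase"
            advice.append(f"{direction} {name} toward {target} "
                          f"(currently {value}, {drift:+.1%})")
    return advice

# One 15-minute check cycle with invented readings and targets.
advice = off_plan(
    readings={"reactor_temp_c": 372.0, "feed_rate_tph": 98.0},
    targets={"reactor_temp_c": 350.0, "feed_rate_tph": 100.0},
)
```

In this cycle only the reactor temperature (about 6 percent over target) triggers advice; the feed rate, 2 percent under target, stays within tolerance.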