AI in Energy: Knocking on the Door, But Are We Ready to Let It In?
AI is steadily making its way into the energy sector, not quite kicking down the door, but certainly knocking loudly and peeking through the window. It promises smarter decision-making, better grid stability, and more personalised customer engagement. But, as with any new technology, just because we can use AI doesn’t mean we’re ready to use it well. That’s where Ofgem’s recent AI guidance consultation comes in – a much-needed conversation about how we bring AI into the sector responsibly.
At Flexitricity, we see AI as an exciting tool, and one we’re already using in our optimisation toolkit, but it needs careful handling. Ofgem’s framework lays a solid foundation, but the real challenge lies in bridging the gap between ambition and reality. AI adoption isn’t just about plugging in a clever algorithm; it requires readiness, risk management, and a strong ethical backbone.
Readiness: AI is Here, But Can We Keep Up?
Ofgem rightly points out that existing regulations should be enough to govern AI, and, in theory, that makes sense. AI doesn’t change the fundamental principles of energy markets; it just gives us a new way to optimise them. But the big assumption here is that organisations are actually ready to do this. Spoiler: many aren’t.
AI is only as good as the data it learns from, and let’s be honest, some organisations are still wrestling with the basics. If your data is scattered across spreadsheets, inconsistent, or locked away in inaccessible silos, throwing AI at it is like asking a toddler to solve a Rubik’s Cube. Before we talk about AI-driven optimisation, many businesses need to focus on building solid data foundations, getting their information structured, clean, and usable.
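To make “solid data foundations” a little more concrete, here is a minimal illustrative sketch of the kind of basic data-quality check that work involves. It assumes a hypothetical CSV of half-hourly meter readings with timestamp and kWh columns; the file name, column names, and checks are ours for illustration, not any particular pipeline.

```python
# Illustrative only: a minimal data-quality check of the kind that belongs in
# "data foundations" work. Assumes a hypothetical CSV of half-hourly meter
# readings with 'timestamp' and 'kwh' columns.
import pandas as pd

readings = pd.read_csv("meter_readings.csv", parse_dates=["timestamp"])

# Basic hygiene: duplicates, missing values, and implausible readings.
duplicates = readings.duplicated(subset=["timestamp"]).sum()
missing = readings["kwh"].isna().sum()
negative = (readings["kwh"] < 0).sum()

# Completeness: are any half-hourly periods absent from the series?
expected = pd.date_range(readings["timestamp"].min(),
                         readings["timestamp"].max(),
                         freq="30min")
gaps = expected.difference(readings["timestamp"])

print(f"duplicate timestamps: {duplicates}")
print(f"missing kWh values:   {missing}")
print(f"negative readings:    {negative}")
print(f"missing periods:      {len(gaps)}")
```

Nothing clever is happening here, and that is the point: until checks like these pass routinely, an optimisation algorithm has nothing trustworthy to learn from.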
On top of that, Ofgem’s expectations around governance and explainability assume a level of AI literacy across businesses that just doesn’t exist yet. Understanding how AI makes decisions isn’t just a job for the data science team; it’s something that needs to be embedded across risk, compliance, procurement, and leadership. And that takes time.
At Flexitricity, when we first put our Data Science, Risk, and Compliance teams in the same room, it was like tuning a radio to three different stations – lots of noise and not much harmony. But we stuck with it, and over time, a common language started to emerge. It turns out, AI adoption isn’t just about getting the non-technical parts of the business up to speed; it’s also about helping the technical teams see the bigger picture. The data scientists patiently explained what their algorithms were doing, while Risk – equally patiently – kept asking those awkward but necessary questions. The result? Better understanding on both sides. AI works best when it’s a team sport, and collaboration is what keeps it on track.
Risk: AI is Powerful, But Who’s Actually in Control?
Ofgem’s focus on a risk-based approach is absolutely the right call, but one of the biggest risks in the industry isn’t AI itself; it’s the knowledge gap between AI vendors and the companies trying to use their products.
AI providers come armed with the latest technology and impressive promises, but energy companies often don’t have the expertise to push back and ask the right questions. It’s a bit like being sold a top-of-the-line espresso machine when you don’t even know how to make instant coffee. Without a deep understanding of how AI makes decisions, there’s a real danger of over-reliance on black-box models that no one fully understands.
The solution? Companies need to get comfortable with AI. Not necessarily by becoming experts, but by developing enough literacy to engage critically. That means asking vendors for transparency, setting realistic expectations, and using AI cautiously: starting small, monitoring closely, and scaling only when we know it works. AI isn’t magic, and it certainly isn’t a plug-and-play solution.
Ethics: AI Needs to Work for Everyone
Ofgem also puts a welcome focus on ethics, ensuring AI benefits all consumers, including those who are vulnerable or digitally excluded. This is essential, but there are risks that need careful management.
Take AI-driven pricing and load-shifting recommendations. If not handled carefully, they could disproportionately benefit tech-savvy consumers while leaving others behind. And let’s not forget behaviour-shaping: if an AI system suggests a change in energy usage, is it a helpful nudge or an unfair pressure? Similar issues are already playing out in other industries.
The key here isn’t just designing fair AI systems; it’s ongoing monitoring. AI models drift over time, consumer behaviours shift, and unintended biases can creep in. To keep things fair, companies will need to continuously check and adapt their AI-driven decisions. The last thing we want is a system that subtly disadvantages certain groups without anyone noticing until it’s too late.
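As an illustration of what “ongoing monitoring” can look like in practice, here is a minimal sketch of one widely used drift check, the population stability index (PSI), which compares the data a model was trained on against what it sees today. The feature, the simulated numbers, and the 0.2 alert threshold are illustrative assumptions, not figures from Ofgem’s guidance or from our own toolkit.

```python
# Illustrative sketch of a simple drift check (population stability index).
# The 0.2 alert threshold is a common rule of thumb, not a regulatory figure.
import numpy as np

def population_stability_index(baseline, current, bins=10):
    """Compare two samples of the same feature; a larger PSI means more drift."""
    edges = np.histogram_bin_edges(baseline, bins=bins)
    base_pct = np.histogram(baseline, bins=edges)[0] / len(baseline)
    curr_pct = np.histogram(current, bins=edges)[0] / len(current)
    # Avoid log(0) and division by zero in sparsely populated bins.
    base_pct = np.clip(base_pct, 1e-6, None)
    curr_pct = np.clip(curr_pct, 1e-6, None)
    return float(np.sum((curr_pct - base_pct) * np.log(curr_pct / base_pct)))

# Hypothetical example: a demand feature at training time vs. this month.
rng = np.random.default_rng(42)
training_demand = rng.normal(loc=50, scale=10, size=5_000)
recent_demand = rng.normal(loc=55, scale=12, size=5_000)  # behaviour has shifted

psi = population_stability_index(training_demand, recent_demand)
if psi > 0.2:  # rule-of-thumb threshold for "significant" drift
    print(f"PSI = {psi:.2f}: distribution has drifted, review the model")
else:
    print(f"PSI = {psi:.2f}: no significant drift detected")
```

A check like this, run on a schedule, won’t make a system fair by itself, but it does surface the quiet distribution shifts that otherwise go unnoticed until it’s too late.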
The Road Ahead: AI, But Make It Thoughtful
Ofgem’s AI guidance is a great starting point – it’s practical, measured, and focuses on the right things. But making it work in reality will take effort from the whole industry.
For AI to succeed in energy, we need:
✅ Stronger data foundations – AI can’t work well if it’s fed inconsistent, incomplete, or inaccessible data.
✅ A culture of AI literacy – Businesses don’t need everyone to be a data scientist, but they do need teams that can ask the right questions.
✅ Cautious, well-monitored adoption – Start small, test thoroughly, and scale only when you’re confident.
✅ Ethical and inclusive design – AI should work for everyone, not just those with the best digital access.
At Flexitricity, our data engineers laid the data foundations first, so the AI we’ve deployed in our optimisation toolkit has been introduced systematically and in collaboration with risk and compliance. This generates buy-in and helps the team move forward together.
We’re optimistic. AI has the potential to make energy smarter, more efficient, and more inclusive. But it’s not a race to be first; it’s about getting it right. AI isn’t kicking down the door just yet, but as it starts making its way in, we need to be ready to handle it responsibly.
We look forward to continuing the conversation with Ofgem and industry peers to ensure AI is adopted in a way that is practical, transparent, and built on solid foundations.