Eirik Larsen · 13 May 2026 · 6 min read

10 Years Of AI Innovation And Building The New Paradigm

In the space of a decade, AI has gone from a nascent and specialist technology to the subject of worldwide debate. But even in the earliest days of Earth Science Analytics, it was clear to the founders that AI and Machine Learning would revolutionise the data problems facing the energy industry. How did that belief come about, and how did ESA grow out of it?

It will soon be ten years since Earth Science Analytics was formed: on May 17th, 2026, we will mark our tenth anniversary. This feels like a good moment to look back at what first brought the three of us together - myself, Eirik Larsen, along with Behzad Alaei and Dimitrios Oikonomou - and how the challenges facing the exploration industry set us on the course we have taken.

In 2016, all three of us had already worked in positions at other companies where huge decisions - worth 100 million dollars or more - were being made based on data that no one was really certain about. Behzad had been a Chief Geophysicist. I had been an Exploration Manager. Our shared experience was that there was so much complex data, in so many different forms, each handled by an expert in a specific field, and with a long list of disciplines that didn't always overlap.

At the time it was like one huge relay race - and in some ways it still is. One person did a piece of work that served as an input for the next person, who integrated that output with new data, then passed it on to the next, and the next, and so on. We realised that a lot of insight and detail was being lost at these handovers, as these specialists rarely got involved with their colleagues from other disciplines.

Then each piece of software they used would have its own data format and storage, so the software from stage three of this relay race would be unable to read the data used in stage one.

And finally, everyone was spending so long just searching for the data they needed. How do you identify and find the files you’re looking for, and where are they? We were convinced there had to be a way to solve all these problems together; to help all these specialists work with the data behind these major decisions, and convert it into something everyone can understand.

Without knowing of each other's efforts, Behzad and I had both been experimenting with Machine Learning because we felt it could be something important for geoscience. Behzad was also working with Dimitrios, who was a phenomenal programmer, and in the space of a Skype call, we realised we all had the same idea, and the same long list of geoscience problems that could be solved with this technology.

In the very early days, before we had any office space, the three of us worked out of an attic. It was a simple space, but it became the place where the first version of the idea really took shape, long before it had a name, a structure, or a company around it.  

 


 Working on the EarthNET structure, Bergen, September 2016

 

 

Recognising the importance of automation to a geoscientist's workload

It seems strange now, but back then Google DeepMind’s ‘showcase’ was how they solved the Atari games. Just learning to play Pong seemed impressive!

So the technology was still in its infancy. But fortunately, a lot of the organisations developing Machine Learning were making this technology open source, which meant if we adopted it, we would never be far behind the research front.

This enabled us to really accelerate what we were doing. Just as importantly, we began to see that this technology could address another problem – that so much of the work individual experts in geoscience carry out is highly manual and repetitive. It takes up a lot of time and doesn’t actually require a lot of brain power.

With Machine Learning you could automate this, saving hours of work. Not only is the automation many times faster, but we also believed then - and still do - that Machine Learning does it with better quality and more consistency. When you pipeline that into this relay race, and integrate the single-discipline steps with each other, you also gain a means of quality control. You can feed the output of one step as input to the next, and build a QC layer on top that will spot anything that doesn't look right.

The data available in our industry was and remains messy. There are many tools, working slightly differently from each other, and inevitably those tools make measurement mistakes. With that QC layer in place, you can trace that back to the root cause, fix it, and then just run the pipeline again. The mistake is gone, and any similar mistakes are gone too. It’s a huge advantage.
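To make the idea concrete, here is a minimal, purely illustrative sketch of pipelined processing steps with a QC layer at each handover. This is not EarthNET's actual implementation; every function name, threshold, and the toy well-log data are hypothetical, chosen only to show the pattern of checking each stage's output before passing it on.

```python
# Illustrative sketch only: a toy two-stage pipeline with a QC layer.
# All names, thresholds, and data here are hypothetical.

def despike(log):
    """Stage 1: clamp physically implausible log readings."""
    return [min(max(v, 40.0), 240.0) for v in log]

def smooth(log, window=3):
    """Stage 2: simple moving average over the despiked log."""
    half = window // 2
    out = []
    for i in range(len(log)):
        chunk = log[max(0, i - half):i + half + 1]
        out.append(sum(chunk) / len(chunk))
    return out

def qc_check(stage_name, before, after, max_shift=50.0):
    """QC layer: flag any sample a stage moved suspiciously far."""
    flags = [i for i, (b, a) in enumerate(zip(before, after))
             if abs(b - a) > max_shift]
    if flags:
        print(f"QC warning in {stage_name}: samples {flags} look wrong")
    return flags

def run_pipeline(raw_log):
    """Chain the stages, running the QC check at every handover."""
    stages = [("despike", despike), ("smooth", smooth)]
    data, all_flags = raw_log, []
    for name, stage in stages:
        out = stage(data)
        all_flags += qc_check(name, data, out)
        data = out
    return data, all_flags

# A raw log with one obviously bad measurement (999.0):
cleaned, flags = run_pipeline([100.0, 110.0, 999.0, 105.0, 95.0])
```

Because every stage's input and output pass through the same QC check, a bad measurement is flagged at the handover where it first appears, which is what makes tracing a problem back to its root cause - and rerunning the whole pipeline after the fix - straightforward.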

 

 

Convincing a sceptical industry 

Of course, in the early days of a technology like this, people are always sceptical. After the company was formed in 2016, we started approaching companies in the industry, describing these benefits, making pitches and bringing along a slide deck.

 


 Pitching to potential investors, February 2017 

 

And nobody believed it. It quickly became clear that a PowerPoint presentation alone would not be enough to convince anyone that the technology could work. So Dimitrios built a prototype, hard-coding everything. It was primitive, but it worked, and it already contained many of the components we still use today. 

That was a crucial step in the company’s development, as we were now able to bring the software with us, demonstrate it live, and deliver real-time outputs during those meetings. We could do in an hour what would otherwise have taken everyone around the table weeks or even months to complete. 

That was the moment people started to believe in what we were building, and we secured our first contract in January 2017. It was quite something to finally start getting paid for it.

 


 The first office of 9 sqm, BTO Bergen, 2017 

 

 

Scaling the company and building the new paradigm 

Originally, we didn't imagine this would become a large enterprise; it was simply a very interesting space to work in. But gradually, we realised it could become something much bigger: a new paradigm.

As a company, we became more ambitious, growing commercially and attracting investors. By 2020, Saudi Aramco Energy Ventures had come on board as our first investor, followed by Equinor, Wintershall Dea and Sumitomo, as well as soft funding from research institutes. This marked the beginning of a more rapid scaling phase, allowing us to expand the team.

We knew we were first into this game, which is always an advantage. But being early is only part of it; you also have to be the best. Put those two together, and you have a strong foundation for success.

 

 

Staging a revolution takes longer than you think 

There is a revolution already underway, but it has not yet fully arrived. Challenges such as the data silo problem still persist as the dominant pattern. The idea of open data platforms is gaining momentum, but the market is not yet ready to realise its full potential.

Automation is also advancing. We are still some way from it becoming mainstream, but all operators are now experimenting with it in one form or another. 

These barriers often take time to come down. Over the past ten years, we have learned that change in this industry takes far longer than we initially expected. 

But even back in 2016, we were convinced this would work. That it would be revolutionary. And it is revolutionary. As Earth Science Analytics marks its tenth anniversary, we remain just as certain of that today as we ever were. 

 


 First EAGE Annual Conference as Earth Science Analytics, Copenhagen, 2018 

 

 

 

 


Eirik Larsen

Eirik Larsen is a co-founder of Earth Science Analytics. He has more than 20 years' experience in the E&P industry, where he has held various technical and managerial roles working with exploration, field development, production and research. He holds an MSc and a PhD in geology from the University of Bergen.
