By doing this Artificial/Intelligent series, I’m trying to learn a bit about what AI has to offer, as well as the ways in which it's terrifying to me as a journalist.
It's not unique to the writing industry. I think every industry is concerned about the rise of generative AI. My analogy is that this is the equivalent of what the calculator did to maths. You still have mathematicians; it's just that you no longer work the same way, and I think it's the same with these tools. They're tools in the end. They're not intelligent, despite their name.
To start us off, could you explain a little bit about yourself to our readers and the kind of work you do?
My background is in geology, so I'm an Earth scientist. My research is on two things: climate change and past climates of the Earth, but also the energy transition. The transition to cleaner energy has a lot to do with the subsurface and with geology. Think, for instance, of geothermal energy, hydrogen storage, or even just carbon capture and storage.
I've been using AI as a tool since around 2015 as a way to unbias the interpretation of unstructured data – essentially to gain quantitative information from images quickly and with less bias.
One of the reasons I wanted to talk to you is that there is this perception, rightly or wrongly, that AI is a net negative when it comes to the environment. I wanted to get your take on the ways in which that is or isn't true.
Let's start with the negative, which is energy consumption. Data centres now consume more energy, in terms of terawatt hours, than, say, the entire healthcare system in the US. That consumption is increasing because we rely more and more on AI, and that comes with concerns about the source of the energy. A lot of the data centres are trying to use renewable energy, which is great, but because renewable energy is intermittent, it means that at night they have to revert to fossil fuels. The second negative is cooling, because it's done with water, and anywhere you have water scarcity – which is more and more the case – that's also a negative.
But I don't necessarily think AI is overall net negative. I think these are engineering problems that have solutions. And also, AI has a lot to do with helping us manage the planet and manage climate change. Let me give you a few examples.
So part of what I do is work with satellite imagery. Satellites capture a lot of information at least once a week, if not once a day. You have an image of a given point on the surface of the planet, and analysing this manually is just impossible, so we have decades of information that's basically sitting there and has only been partially used. But now, thanks to the automation that deep learning gives you, you can actually gain insight from that data, and you can then act on this insight to make decisions about what makes sense in terms of environmental policies or climate change remediation.
Also, one thing that's maybe less known is that one application of deep learning is to serve as a cheap and fast simulation tool. For instance, weather prediction matters enormously because it has an impact on travel and energy. The problem with predicting the weather is that it's an enormous task and you need a supercomputer. This takes so long that you can only make a prediction every six hours. So in between predictions, what do you do? The solution for many years has been what are known as surrogate models, also known as reduced order models. Often – or now I would say almost exclusively – these are deep learning models.
How does it work? Well, imagine that you have a good physics model. You can use its output as training data. Those [deep learning] surrogates operate 1,000 times faster than the full physics model, which means that now you can have half-hourly predictions of what the weather will be on your phone. That can actually save energy, so it's a complex issue and it depends what you do.
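The surrogate-model idea described above can be sketched in a few lines: run an expensive simulation to generate input/output pairs, fit a cheap learned model to those pairs, then answer new queries with the cheap model. This is a minimal toy sketch – the "physics" here is a hypothetical relaxation equation integrated with many small time steps, and a cubic polynomial stands in for the deep neural network a real surrogate would use; all names and numbers are illustrative, not from the interview.

```python
import numpy as np

def expensive_model(forcing):
    """Hypothetical stand-in for a costly physics simulation: relax a
    'temperature' toward a forcing-dependent equilibrium using 20,000
    tiny explicit-Euler time steps."""
    target = forcing + 0.3 * forcing**2   # toy nonlinear equilibrium
    T, dt = 0.0, 1e-4
    for _ in range(20_000):
        T += dt * (target - T)            # explicit Euler stepping
    return T

# 1. Run the expensive model to generate training data.
X_train = np.linspace(0.0, 2.0, 40)
y_train = np.array([expensive_model(x) for x in X_train])

# 2. Fit a cheap surrogate to those input/output pairs
#    (a cubic polynomial here; a neural network in practice).
surrogate = np.poly1d(np.polyfit(X_train, y_train, deg=3))

# 3. The surrogate answers new queries with a single polynomial
#    evaluation instead of 20,000 integration steps.
print(float(surrogate(1.25)))
print(expensive_model(1.25))
```

The two printed values agree closely, while the surrogate evaluation is orders of magnitude cheaper – the same trade-off that lets a learned weather emulator fill the gaps between supercomputer runs.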
In your own work, do you feel any kind of conflict between helping the environment by doing the research you're doing, but on the other hand, the compute power is having a negative impact?
Anything you do that uses energy will have a negative impact on the environment, right? If you send an icebreaker to the Arctic to understand climate change, that has a negative impact on climate change itself. It's always a little bit of a conundrum. You have to think about what you do and how it could ultimately help with the environment. So, for instance, when we do some research on the energy transition or characterising the subsurface or characterising satellite imagery to monitor climate change, I actually think that it's a net positive. Ultimately, once the algorithms are trained and once the insight is gained, we can have an order of magnitude greater positive impact on the environment.
Now, if you use ChatGPT to do a web search, for instance, that is a negative impact – and there's no reason to do it. You can use Google, right? I wouldn’t trust ChatGPT for my search anyway, but you have to be aware that you're using a technology that's extremely power-hungry. It's a question of how you use it.
In your field – if I can turn it to the creative industries – that's another really interesting one. If you use AI in creative ways to create a product that is then viewed by millions of people, I think that's a fair use. But most people use it to create silly pictures of cats or their mother-in-law looking like a witch. This is fun, but it is what fuels this power consumption. It's not just business. It's also the private use of AI.
But I don't think things are black and white, so you can't say: "Well, researchers can use the tool, but the general public shouldn't be using it." Firstly, the reason we have an explosion in AI tools right now is in part the prospect of commercialisation. It really motivates companies because they make a lot of money. The other thing is that AI is coming into the workplace, and the sooner you can use these tools – even if it's in a playful way – the sooner you're aware of what the tools are and what their limitations are. That's better for the next generation, or even for our generation, in the workplace. You also have to put that into the balance.
So actually, if we're looking for a solution, it is not about controlling who uses the tool and for what purpose. The solution is to find ways to do less power-hungry computing and to really promote that transition to sustainable, renewable energies that are less intermittent and more usable. That's part of my research in that space, so I have a vested interest of course, but I really think that we should fix the problem at the source. AI is here to stay, so whether we like it or not, people will use it to do funny cat pictures and whatnot. But we shouldn't put the blame on the consumer. We should put the blame on the system, and try to fix the system.
On that note, is there work happening in the AI sector to mitigate these environmental impacts that we've discussed?
There is lots of work happening. It's not my own work, so I can't speak intelligently about all of it. But I know there is a lot of work on computers and chips that would use less power, or that would compute in very different ways that might be more efficient. There is, for instance, a tendency to look at biological computing, because the way our brains compute things is much more efficient and much less energy-hungry. We just need a sandwich, and we can operate and do very complex calculations.
There's, of course, a tonne of work on energy transition, with solar panels being one of the fastest growing industries in terms of efficiency. Batteries as well are a big, big market. It's mostly fuelled by electric vehicles, by the way, but you know it has implications for anything that uses energy. There's also work on the networks themselves, to use much smaller models or easier-to-train models. If you need less time for training, you need less energy.
The optics are not good around using that much energy. It's also expensive, so there’s an intrinsic motivation by the companies to reduce their footprint because they want to make money ultimately. Spending it on the electric bill is not the greatest use of their resources.
AI companies are increasingly trying to make use of renewable energy (Credit: Sebastian Ganso/Pixabay)
You made the point about optics and I think that's really interesting. Do you think the issue of environmental impact is overstated in discussions about AI, or is it kind of pitched at the right level?
I think it's pitched at the right level, and the reason I think it's pitched at the right level is that if you look at the trends, from 2015 to 2024, I think that the energy usage has roughly doubled in that period. That's just for those data centres. So we're looking at a sector that's very much intersectional – it's in every industry – and that has doubled its consumption of electricity. They are using renewable energy, which is good, but they're also using all the renewable energy available to the market – which means that if they are using it, others are not using it.
We should think about it. Universities are definitely thinking about it. It’s a movement in the sector anyway. I don't think it's doom and gloom. I think there are solutions, but I also don't think we should sweep that under the rug.
There's a final question I'm asking everyone I speak to for this series. It's a more general one. Should I be worried about AI?
Well, there's a famous line about AI, which is that you're not going to be replaced by AI, but you're going to be replaced by a human who knows how to use AI. I think it's true. I think it is a shift in society. It's a tool that offers you a lot of possibilities. It's also a tool that has a lot of traps where you shouldn’t over-trust it.
So I think you should not be worried about AI per se, but you should be worried about not moving with the trend and being basically trained on how to efficiently use that tool. You can waste a lot of time using those tools inefficiently – I’ve done that – so you have to learn how to be efficient with them. But ignoring them is also dangerous.