Transcript
00:00To talk more about AI and its implications,
00:02Dr. Sasha Luccioni joins me now on set.
00:05She's a leading scientist in AI ethics and sustainability
00:08with a decade of research and industry expertise.
00:11She's also the climate lead at Hugging Face,
00:14a global startup in open source artificial intelligence.
00:17Sasha, it's such a pleasure to have you with us
00:19on the program today.
00:20Pleasure's mine.
00:21First of all, could you explain a bit your research?
00:23Introduce us to the research that you do
00:25and the work you do.
00:26So my research really focuses on understanding
00:29the environmental impacts of artificial intelligence.
00:31So the energy that it uses,
00:32the carbon emissions it produces,
00:34the water usage, because currently with the boom
00:37of artificial intelligence around the world,
00:38we're seeing that the planetary impacts are real
00:41and are growing, and we need to know more about them.
00:45Now, a big part of your research
00:46is sustainability practices in AI.
00:49And as a part of your work,
00:50you evaluate the carbon emissions of AI models.
00:53This is not necessarily what we think of first
00:56when we think of artificial intelligence,
00:57we just think of something online.
00:59Yeah, it's really interesting because it's not material.
01:02When we talk with a chatbot or with a voice assistant,
01:04we don't really see where it's running,
01:05but it's running on a data center somewhere
01:08using energy that was sometimes generated with coal,
01:11sometimes with natural gas.
01:12So that really generates carbon emissions.
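The point being made here, that the same workload has a very different footprint depending on the energy mix powering the data center, can be sketched as a back-of-envelope calculation. All figures below are rough illustrative ballpark values, not numbers from the interview:

```python
# Illustrative estimate: CO2 emissions of an AI workload depend on the
# electricity consumed and on how that electricity was generated.

# Approximate grid carbon intensity in grams of CO2 per kWh.
# These are rough public ballpark figures for illustration only;
# real intensities vary by region and by hour.
CARBON_INTENSITY_G_PER_KWH = {
    "coal": 820,
    "natural_gas": 490,
    "low_carbon_grid": 60,  # e.g. a largely nuclear/hydro mix
}

def emissions_kg(energy_kwh: float, source: str) -> float:
    """Estimate CO2 emissions in kilograms for a given energy use."""
    return energy_kwh * CARBON_INTENSITY_G_PER_KWH[source] / 1000.0

# Hypothetical energy use for a training or serving workload:
workload_kwh = 1000.0
for source in CARBON_INTENSITY_G_PER_KWH:
    print(f"{source}: {emissions_kg(workload_kwh, source):.0f} kg CO2")
```

Running the same hypothetical 1,000 kWh workload on a coal-heavy grid yields more than ten times the emissions of a low-carbon grid, which is the asymmetry the interview goes on to discuss.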
01:14And these data centers are getting bigger and bigger.
01:16The AI models themselves are getting bigger and bigger,
01:18and they're used in more and more applications, right?
01:21From navigation to chatbots to web search engines.
01:24And so all of this is growing in a way
01:27that's so far away from where we are
01:28that we have trouble actually even understanding
01:31how big it is.
01:32Now, there was an opinion piece
01:34in the French newspaper Le Monde by a group of NGOs.
01:38It was titled,
01:39AI is accelerating ecological disaster,
01:42reinforcing injustice, and worsening power concentration.
01:46They raised the question of whether AI
01:47can truly serve the general interest
01:49when it's driven by governments
01:51and tech giants, essentially.
01:54In your opinion, how do we even begin
01:56to address these issues, these concerns?
01:58I think the start of everything is transparency.
02:01Currently, we're operating with a lack of information,
02:04a lack of understanding of impacts,
02:06a lack of understanding of data, where it's coming from.
02:08So I think that in order to empower users
02:11and NGOs and nonprofits,
02:12we really need to improve the transparency
02:14and legislation is gonna be a key part of that.
02:17And actually, that's why the AI Action Summit
02:19is so impactful because it brings together
02:22groups of people that normally don't cross paths, right?
02:24Tech CEOs, government leaders, researchers,
02:27and nonprofit and citizen groups.
02:30And so bringing everyone in the same room
02:32and getting us to talk is really the first step.
02:35Now, we have to talk about ethics
02:37because it's the one thing that keeps coming up
02:39every time we talk about artificial intelligence,
02:41and rightly so.
02:43There are the privacy concerns,
02:44but there's also the question of inequality
02:47stemming from lack of accessibility to AI.
02:50In your opinion, what are the biggest ethics concerns
02:54in the face of rising AI?
02:56So actually, we recently wrote a paper
02:58with a researcher from INRIA and Meredith Whittaker,
03:01who's the president of the Signal Foundation.
03:03And it's about topics around ethics,
03:05sustainability, and profit.
03:07And actually, they're all really linked
03:08because currently we see bigger and bigger models
03:11trained on more and more data that's our data,
03:13really data that's gathered from the internet.
03:15So there's a surveillance capitalism part of things,
03:17and with bigger and bigger environmental impacts.
03:19And so I think that it's all really linked
03:21because, for example, a huge model is hard to understand,
03:25and it also can discriminate in ways
03:28that we don't understand because we can't study it.
03:30And so there's all these questions of who owns the models,
03:33whose data it is, who's represented in that data.
03:36And also, in terms of cost, the more expensive the model,
03:39the fewer people who can afford to use it, right?
03:41And so smaller models can be owned
03:43by people from the Global South, academia,
03:46whereas bigger models tend to be created
03:48and deployed by big tech companies.
03:50So really, for me, focusing on bias, ethics,
03:53and kind of frugality or efficiency,
03:55they're all really linked.
03:57And as you very well may know already,
04:00there's the data center deal that's been struck
04:03between France and the UAE.
04:06It's obviously a huge project
04:08with billions and billions of euros invested.
04:12In economic terms, it may be an advantage,
04:15but from an environmental point of view,
04:17it's going to just majorly pollute.
04:19So if the data centers are in France,
04:21actually the energy here is at least low carbon.
04:23It's not quite renewable, but it's low carbon.
04:25So there's that.
04:26But I think that currently there's this emphasis
04:29on sovereign compute, on owning data centers in countries.
04:33And I think that often goes
04:35against sustainability principles.
04:37So it's true that I think that if we keep building
04:40these big data centers, it may just make things worse.
04:42But if we think about alternative ways of, for example,
04:45distributing compute or making it more efficient,
04:48more kind of democratized as well, then we can do better.
04:52Of course, that doesn't come with the same pizzazz
04:54as announcing a multi-billion dollar deal.
04:57But saying, oh, hey, we took some empty warehouses
05:00in urban areas that were disused,
05:02we're building some smaller data centers,
05:03and we're recycling the heat so that it heats people's homes,
05:06that's a lot more frugal and sustainable as a solution.
05:10But of course, it's not as sexy.
05:12It's not as sexy, but it's important though.
05:14Right, exactly.
05:16Now, I think we need to talk about how AI
05:18can be a force of good,
05:19because we don't want it to sound too dystopian
05:21and we can't dismiss the advantages that AI could bring.
05:25It can be deployed to perhaps tackle climate change,
05:28disaster relief, even wildlife conservation.
05:31Can you give us more examples of what these might look like?
05:34Definitely, I'm actually part of a volunteer organization
05:37called Climate Change AI.
05:38And for five years now, we've been bringing together people,
05:41funding grants, funding projects, organizing workshops.
05:44And essentially, there are so many ways
05:45in which AI can be a force for climate action,
05:48everything from designing new batteries.
05:50So for example, coming up with new combinations of molecules
05:52that are better than the lithium ion batteries
05:54that we currently use, biodiversity monitoring.
05:57So for example, in Canada, I work with a team
06:00of entomologists to monitor moths
06:02in areas where there's no data.
06:04And so we have these AI-enabled cameras
06:06that actually detect new species
06:08that we never even knew existed, and we can track them.
06:11And then of course, everything with remote sensing.
06:15You can even monitor the health of coral reefs from space
06:18using AI and LIDAR imagery and satellite imagery together.
06:22You can actually, without having to like dive and go check,
06:24you can actually see how coral reefs are doing.
06:25So there's all this good out there and it's not all bad,
06:29but I think it's important to understand
06:30that AI is not a single technology.
06:32There's so many different approaches
06:33and what we hear about ChatGPT
06:35and all these DeepSeek models,
06:40And those climate positive models are usually smaller,
06:42very task-specific,
06:43but we don't really hear about them as much.
06:46Lastly, if you will, what responsibilities are
06:49or should be at the forefront of governments and businesses
06:53who increasingly need to create AI systems
06:56which are ethical, transparent, and sustainable?
06:59I mean, what do they need to be doing differently
07:01that they're not doing now?
07:02I think we should be asking for more accountability
07:04from tech companies because especially
07:06if governments are adopting these tools
07:08as part of their internal or external processes,
07:12we need to make sure that, for example,
07:14we know how much energy they're using,
07:15we know what data they were trained on,
07:16we know how the data that we put into these models
07:19is being used, right?
07:20Is it a privacy violation?
07:22I think currently governments and people,
07:25citizens alike, are often taking it for granted.
07:27Oh, we're gonna use this tool
07:27because it's supposedly the best one, right?
07:29Because some tech company said
07:31that it has the best accuracy,
07:32but we should be really questioning this discourse more
07:36and asking enough questions
07:38so we can make informed decisions
07:39without just believing that it's the best and greatest.
07:43Indeed, very important indeed.
07:45Dr. Sasha Luccioni, thank you so much
07:47for being with us on the program today.
07:48It's a pleasure to have you with us.
07:49Thank you for having me.