For years, major car companies have been working to add autonomous features to vehicles on the path to fully automated driving (think lane departure warnings, exterior cameras, and smart cruise control). Tesla in particular is betting big on the technology; Elon Musk has said the company will be “worth basically zero” if it can’t get a handle on self-driving features.
Recent fatal crashes involving self-driving Teslas have heightened safety concerns around autonomous driving, but a new study by the Massachusetts Institute of Technology (MIT) identifies a different risk: carbon emissions from the computing power these vehicles require.
A global fleet of autonomous vehicles could exceed the carbon emissions of all data centers today, which already account for a staggering 0.3% of global emissions. In effect, autonomous vehicles would duplicate the energy the world currently consumes to run websites, streaming services, data storage, and all the other functions of the internet.
“Making a car move autonomously takes a lot of computation,” says Soumya Sudhakar, a graduate student in aeronautics and astronautics at MIT and first author on the study. “In data centers today, a lot of the workload is AI-driven. Autonomous vehicles will also [rely on] AI to interact with the environment, understand what’s going on, and avoid obstacles.”
MIT researcher Soumya Sudhakar presenting her findings at TEDx Boston.
(Credit: TEDx YouTube)
“These are really large amounts of emissions when we are trying to get to net zero emissions as a society,” says Sudhakar in a TEDx talk. She notes that emissions from activities like listening to podcasts, working from home, and streaming Netflix are easy to put out of sight and out of mind.
“One reason we don’t think about the direct connection between computing and emissions is that a lot of those emissions get abstracted away into data centers. But those centers consume electricity and produce a lot of emissions,” she says.
To quantify carbon emissions from autonomous vehicles, the researchers considered four factors:
- The number of autonomous vehicles on the road. (There are 1.2 billion cars today.)
- How many hours per day each vehicle is driven.
- The computing power needed to support those vehicles for that amount of time per day.
- The carbon intensity of the power source (gas, coal, wind, or solar).
“On its own, that looks like a deceptively simple equation,” Sudhakar says. “But each of those variables contains a lot of uncertainty because we are considering an emerging application that is not here yet.”
Equation to quantify the carbon emissions of autonomous vehicles.
(Credit: TEDx Boston)
The results are revealing. To keep things simple, the team used the average carbon intensity of electricity in 2020. They estimated a maximum of 1 billion autonomous vehicles on the road (the majority of the 1.2 billion vehicles in use today), each driving for one hour a day with an 840-watt onboard computer.
With those factors, autonomous vehicle emissions from computing alone would be equivalent to those of all data centers today, or 0.3% of global emissions, roughly the annual emissions of the entire country of Argentina. With a more powerful, 3,100-watt computer, the vehicles would emit more than double that, or about 1% of global emissions.
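The study's four-factor equation amounts to a back-of-the-envelope calculation that can be sketched in a few lines of Python. The 0.475 kg CO2 per kWh figure below is an assumed 2020 world-average grid intensity, not a number taken from the study itself:

```python
def annual_emissions_mt(vehicles, hours_per_day, power_watts, kg_co2_per_kwh):
    """Annual CO2 from onboard computing, in megatonnes.

    vehicles       -- number of autonomous vehicles on the road
    hours_per_day  -- hours each vehicle is driven per day
    power_watts    -- power draw of the onboard computer
    kg_co2_per_kwh -- carbon intensity of the electricity source
    """
    kwh_per_year = vehicles * hours_per_day * (power_watts / 1000) * 365
    return kwh_per_year * kg_co2_per_kwh / 1e9  # kg -> megatonnes

# Baseline scenario from the study: 1 billion vehicles, 1 hour/day, 840 W.
# 0.475 kg CO2/kWh is an assumed world-average grid intensity for 2020.
baseline = annual_emissions_mt(1e9, 1, 840, 0.475)
```

Plugging in the 3,100-watt computer instead of the 840-watt one more than triples the power term, which is why the estimate jumps past double the baseline.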
“After seeing the results, this makes a lot of sense, but it is not something that is on a lot of people’s radar,” says Sertac Karaman, associate professor of aeronautics and astronautics and director of the Laboratory for Information and Decision Systems (LIDS). “These vehicles could actually be using a ton of computer power. They have a 360-degree view of the world, so while we have two eyes, they may have 20 eyes, looking all over the place and trying to understand all the things that are happening at the same time.”
Tesla’s proprietary AI chip, called Dojo.
(Credit: Tesla)
The big question is: “What can we do about this now so it doesn’t happen in the future?” The researchers outlined four main focus areas.
The first is decarbonization of the electrical grid. This will dramatically reduce carbon impact, “especially if you assume all autonomous vehicles will also be electric vehicles,” says Sudhakar, referring to the increasing adoption of electric vehicles across the world.
The second is improving the efficiency of the chips themselves, “so we can do more computation with less power.” Sudhakar says the rate of improvement in chip efficiency has slowed in recent years. This may be something for Musk to consider in the development of Tesla’s Dojo chip, an ultra-powerful microchip on which the company plans to build its AI future.
Tesla AI data centers with Dojo chips.
(Credit: Tesla)
The third way to reduce computing-related emissions is to program the vehicles to cut idling time and other energy-consuming behaviors that human drivers exhibit.
Finally, the fourth focus area is getting the computer to stop “thinking” when it’s no longer worth it, reducing unnecessary computations. “We do this all the time as humans,” says Sudhakar. “When we want coffee, we could sit and think for a really long time about what the path to the coffee shop is—the traffic, weather, time of day, intersections—we could come up with a lot of simulations, but [at some point] thinking too much may offset any potential gains.”
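The coffee-shop example describes what roboticists call anytime computation: keep refining a plan only while each extra round of “thinking” buys a meaningful improvement. A minimal sketch of that stopping rule, with entirely hypothetical plan costs and threshold, might look like this:

```python
def should_keep_thinking(history, floor=0.5):
    """Keep refining only while the last iteration improved the plan
    (e.g., estimated travel time) by more than `floor`. The threshold
    and cost units here are illustrative, not from the study."""
    if len(history) < 2:
        return True
    return (history[-2] - history[-1]) > floor

# Hypothetical plan cost after each successive refinement:
# big early gains, then diminishing returns.
costs = [120.0, 95.0, 88.0, 86.5, 86.2, 86.1]

computed = [costs[0]]
for c in costs[1:]:
    computed.append(c)
    if not should_keep_thinking(computed):
        break  # further computation would cost more than it saves
# `computed` holds only the refinements actually run before stopping.
```

The planner stops one step early here: the final refinement in `costs` is never computed, because the gain before it had already dropped below the threshold. Teaching a vehicle's computer to recognize that point is what saves the energy.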
She believes robots can be programmed to understand their total carbon emissions, and use machine learning to manage their own carbon footprint from computing. But that’s something automakers like Tesla, or self-driving chip manufacturers like Qualcomm, would need to tell their systems to consider.