During last night’s earnings call with investors, Elon Musk threw out an all-time late-night dorm room bong sesh of an idea: what if AWS, but for Tesla?

Musk, who loves to riff on earnings calls, compared the unused compute power of millions of idle Tesla vehicles to Amazon’s cloud service business. If they’re just sitting there, he mused, why not put them to good use to run AI models? (Also, have you ever really looked at your hands? No, I mean really looked?)

“There’s a potential… when the car is not moving to actually run distributed inference,” Musk said. “If you imagine the future perhaps where there’s a fleet of 100 million Teslas and on average, they’ve got like maybe a kilowatt of inference compute. That’s 100 gigawatts of inference compute, distributed all around the world.”
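
For what it's worth, the arithmetic in that quote does add up, if you take Musk's hypothetical numbers at face value. Here's the back-of-envelope version (the figures are his, not measured):

```python
# Back-of-envelope check of Musk's hypothetical figures (not measured values).
fleet_size = 100_000_000          # "a fleet of 100 million Teslas"
compute_per_car_kw = 1            # "maybe a kilowatt of inference compute" per car
total_gw = fleet_size * compute_per_car_kw / 1_000_000   # kilowatts -> gigawatts
print(f"{total_gw:.0f} GW of distributed inference compute")  # -> 100 GW
```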

So, to summarize, you buy a Tesla. It’s your property. But Musk wants to freely use the unused compute power in your vehicle for… something? Possibly AI-related? Hopefully not the blockchain. (Tesla is an AI company now, by the way. Musk said so himself during the call.)

Would Tesla pay you for this? Unclear. After all, this is Musk at his most hypothetical. Still, I wouldn’t put it past him to just try to take compute power from his customers’ vehicles without consent or compensation. GM was giving your driving data to insurance companies without your consent! Baby, it’s a free-for-all.

But before we can even treat this as a serious idea, we need to figure out if it’s even possible. I reached out to Sam Anthony, former chief technology officer at Perceptive Automata, a now-defunct company that built modules for self-driving cars to allow them to do “theory of mind” tasks.

Anthony said, as a concept, it’s “perfectly possible” to split large computing tasks out over lots of small nodes. We’ve seen it done with Bitcoin mining or Folding@home, a distributed computing project to develop new therapeutics. But just because something is possible doesn’t necessarily make it a good idea.
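
The general pattern Anthony is describing (chop a big job into chunks and farm them out to whatever nodes happen to be available) is simple enough to sketch. Here's a toy version in Python; the node names and the dummy workload standing in for real inference are made up for illustration:

```python
# Toy sketch of splitting a big job across idle nodes. The node names and the
# dummy workload are illustrative; real distributed inference is far more involved.
from concurrent.futures import ThreadPoolExecutor

def run_on_node(node, chunk):
    # Stand-in for shipping a chunk of work to a remote node and getting a result back.
    return node, sum(x * x for x in chunk)

work = list(range(1_000))
nodes = [f"node-{i}" for i in range(4)]
chunks = [work[i::len(nodes)] for i in range(len(nodes))]  # one chunk per node

with ThreadPoolExecutor() as pool:
    results = list(pool.map(run_on_node, nodes, chunks))

print(results)
```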

Anthony said there are two main issues that make cars — and electric cars, in particular — imperfect nodes for a distributed computing project. First, you have to rely on the car’s battery, or if it’s plugged in, the charging station’s energy source, for power. And that power usually doesn’t come free, with owners paying retail rates for electricity. Second, connectivity and speed are a “big issue” in distributed computing, Anthony said.
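
To put the first point in perspective, here's a rough, purely illustrative cost sketch for a single car contributing a kilowatt of overnight compute. The electricity rate and the hours are assumptions, not figures from Anthony or Tesla:

```python
# Illustrative owner-side electricity cost; the rate and hours are assumptions.
power_draw_kw = 1.0               # Musk's per-car inference figure
hours_per_night = 8               # assumed overnight window, parked and plugged in
retail_rate_usd_per_kwh = 0.16    # assumed average US residential rate

nightly_cost = power_draw_kw * hours_per_night * retail_rate_usd_per_kwh
print(f"~${nightly_cost:.2f} per night, ~${nightly_cost * 30:.2f} per month, paid by the owner")
```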

“Inference in particular is a part of the [machine learning] workflow where your speed is essential,” he added. “You’re not doing a ton of offline inference overnight, you’re answering questions as they’re asked — this is the big inference issue the AI companies are running into right now — which makes the connectivity and availability issues of cars (which, you know, move around) even more of an issue.”
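
A quick comparison makes that latency point concrete. All of the numbers below are assumed ballpark figures, not measurements:

```python
# Assumed ballpark latencies for serving one online inference request.
datacenter_rtt_ms = 0.5    # GPUs on the same high-speed fabric
cellular_rtt_ms = 50.0     # round trip to a parked car over LTE/5G, optimistic
model_compute_ms = 30.0    # time to actually run the model, same either way

print(f"data center: {datacenter_rtt_ms + model_compute_ms:.1f} ms")
print(f"car fleet:   {cellular_rtt_ms + model_compute_ms:.1f} ms, before retries or dropped nodes")
```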

In Musk’s mind, the distributed network would only work when the cars are parked or otherwise immobile. Still, Anthony argues that no one would willingly create a distributed computer architecture out of millions of car ECUs (electronic control units) unless they were somehow forced to do it.

“It’s somebody with a very weird looking hammer imagining the existence of deeply implausible nails,” he said.

To be sure, computer scientists have been trying to create fast computers out of many small, idle nodes for a very long time. One of the earliest examples was SETI@home, in which Berkeley researchers thought they could find extraterrestrial life by tapping a volunteer network of distributed computers to analyze radio data. So why not a Tesla@home?

For one thing, the more geographically distributed the nodes, the harder it is to get them to work in concert with one another, said Phil Koopman, a professor of electrical and computer engineering at Carnegie Mellon University who co-authored a book about supercomputers.

Like Anthony, Koopman acknowledged that the project could work as long as the vehicles were plugged in while computing to avoid draining the battery. Good Wi-Fi was also a necessary component, so the Tesla would likely need to be parked at home overnight for the distributed network to function properly. But even then, you’d likely run into obstacles while growing the project in order to make it useful for AI computing.

“Scalability to that size is always challenging and rarely succeeds to the degree it is worth doing that instead of building a data center,” Koopman said. “The devil is in the details, so I’d want to see some serious experimental confirmation it is viable.” 

Musk loves to pontificate on what’s possible in a future overrun by autonomous connected vehicles. Things like a 24/7 robotaxi service in which your vehicle is out earning you passive income while you sleep sound awesome in theory. But when the rubber meets the road, Musk’s big ideas tend to deflate.

“For now it is an interesting idea,” Koopman said, “but we need to keep in mind that most cool ideas like this do not pencil out to be practical.”
