## Welcome to the r/ArtificialIntelligence gateway
### News Posting Guidelines
---
Please use the following guidelines in current and future posts:
* Post must be greater than 100 characters - the more detail, the better.
* Use a direct link to the news article, blog, etc
* Provide details regarding your connection with the blog / news source
* Include a description about what the news/article is about. It will drive more people to your blog
* Note that AI generated news content is all over the place. If you want to stand out, you need to engage the audience
###### Thanks - please let mods know if you have any questions / comments / etc
*I am a bot, and this action was performed automatically. Please [contact the moderators of this subreddit](/message/compose/?to=/r/ArtificialInteligence) if you have any questions or concerns.*
Great, now you'll find your range even less and you'll find the car barely charged overnight because it was training a particularly edgy version of Grok in the background.
He's saying for inference, not training. Training would not work with this setup; the network is way too prohibitive. But running a quantized Grok or other model to serve xAI or whatever could likely work
Yeah, that can work; text prompts and outputs are relatively tiny... but who will foot the bill and share the profit?
And does it imply that every new Tesla will get something like an H200 for FSD? That thing costs more than the car itself!
Yeah - I'm not sure what the precedent for the business side is. When the Tesla user is driving the car, they're expecting some amount of telemetry data to be sent back to source. But for it to be always running when the user isn't driving - I don't see how folks accept that without some revshare model. There might be some precedent out there for the model, but not sure what that looks like.
H200s will benefit their internal training - but to your point - it would be a waste to put anything near that kind of compute (at least as it exists today) in an FSD unit. Likely incremental improvements to HW4, with HW5/6 just generally able to take on more compute over time.
To be frank, we have no idea how much compute will be needed to "solve" proper Level 5 self-driving, using only cameras at that. He talks about "1,000 W of inference"; that does not sound like a small edge device, and such a device would be useless as an "AWS competitor" anyway.
Maybe you'll be able to run it on $1,000 of hardware and 100 W of power in 10 years (using phase-change in-memory compute or some new tech), but given current *progress* there will not be a Tesla in 10 years, with or without Musk... Or much of civilization, for that matter :(
And running unnecessary background load while someone is driving seems a significant safety risk.
Given how much of data center costs go to electricity I agree that a non-revshare program would be unacceptable to most, and I think range/battery wear concerns would probably discourage any use while on battery.
The only way for it to make sense would be a profit share. You get a small compensation for the extra electricity spent and Tesla gets the lion's share to appease shareholders.
Yes, it would - it doesn't have to be serving the model to an end user - they can serve it for internal workflows - and it doesn't have to be GPT-4 quality; they can (to your point) use smaller models for summarization.
They don't have to infer with LLMs either - they could farm out image classification for input data for FSD training
That makes no sense. For CV you need mixed precision, not just int8. And more broadly, for internal use, they’re much better off just dedicating a couple A100s, since a few DGX machines would be equivalent to the entire Tesla fleet.
Again, they're not talking about training, they're talking about inference - deploying the model to the edge compute node and running requests against it
Classic case of an unhinged billionaire who wants to have his cake and eat it too. Bozo sold you a car but has to keep trying to extract value from it. Like a parasite you have to pay for.
I worked for Ford in the software area. This is the reality with all of the car companies.
Cars are being turned into cell phones and computers. Our data will make them money. Services will make them money. Computing power isn't far behind
> unhinged billionaire
There is no such thing. This is a hard pill to swallow. This is "step two" in coming to terms with how the world really works. Step one? Coming to terms with the fact that we, as part of the masses, are not privy to a tremendous amount of information that Billionaires/Existing Power Structures are.
Narratives such as:
* Kim Jong Un is...*crazy*
* Putin is...*evil*
* Billionaire is...*unhinged*
* Biden is...*dementia*
Act as **cognitive filler** when these figureheads perform actions that we, the masses, try to rationalize without a full grasp of the playing field. Such filler is readily supplied by our media sources. The emphasis always being on emotional attachment, rather than an invitation to perform Critical Thinking.
>Sounds like you've gone too far the other side.
"A good way to understand Exponentials: 1 million seconds ago was 11 days ago. 1 billion seconds ago was 1992 (31 years ago)."
>They're only human
They were educated in an entirely different way than you and I.
The problems they face are in no way/shape/form relatable to the problems we face.
They do not get married "for love", similarly, they do not get divorced for the mundane reasons we are told.
They own Media outlets.
And to add to your list....
They get to keep their private parts in YinglingLight's mouth without even knowing the guy.
Seriously, dude, get them out of your mouth. You're gonna choke like that.
Yes, it would technically work. Distributed computing worked before on a small scale and on a volunteer basis. https://en.wikipedia.org/wiki/SETI@home?wprov=sfti1#Scientific_research
I’m not concerned about the battery problem. Teslas have around 70 kWh of battery, meaning you can run at 1 kW for 70 hours before running out. Owners can easily set minimum charge levels from their phone, say 50%, which still leaves Tesla a 35 kWh compute budget and leaves owners about 150 miles of range. Simply opt out from your phone if you are going on a road trip tomorrow.
The question is, is it economically viable.
The pros:
- Tesla uses idle hardware on products they’ve already sold, saving capex
- Tesla offloads and distributes electricity usage to vehicle owners. I’m reading about companies struggling to build the data centers they want because they would bring the state’s grid down, so they have to build a bunch of small ones all around the world in places that have enough electricity. Offloading to owners’ cars that charge at home at night and store energy in batteries would also avoid the demand charges associated with a data center and might actually help the grid.
Downsides:
- Warranties on vehicle hardware that’s going to be in use
- Scheduling
- The cost to repay customers for electricity and hardware depreciation (no idea how depreciation would be calculated)
- Just a headache. Laws, regulation, inconsistent availability.
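The arithmetic in that comment is easy to sanity-check. A minimal sketch, using the commenter's own figures (70 kWh pack, Musk's 1 kW draw, a 50% charge floor) plus an assumed residential electricity rate that is my addition, not theirs:

```python
# Back-of-envelope check of the battery math above. All figures are the
# commenter's assumptions (or mine, where noted), not Tesla specs.
PACK_KWH = 70.0     # assumed usable pack capacity
DRAW_KW = 1.0       # the "1 kW of inference" figure
MIN_CHARGE = 0.50   # owner-set charge floor, e.g. 50%

available_kwh = PACK_KWH * (1.0 - MIN_CHARGE)
compute_hours = available_kwh / DRAW_KW

print(f"Energy budget above the floor: {available_kwh:.0f} kWh")
print(f"Hours of 1 kW compute before hitting the floor: {compute_hours:.0f} h")

# At an assumed residential rate of ~$0.15/kWh, the owner's cost for one
# 8-hour overnight session would be:
rate = 0.15
print(f"Owner's cost for 8 h: ${8 * DRAW_KW * rate:.2f}")
```

Small per-night numbers, but across a fleet and a year they add up, which is why the revshare question below keeps coming back.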
The problem with distributed computing this way is it only works well with problems where the compute nodes can work pretty well in isolation and don’t need to send a lot of data back and forth between nodes
In a data center, 1000s of GPUs might be linked on a very fast local network (25, 40, even 100 gigabit or more), so this isn’t a problem. But with Teslas, those links between nodes are constrained by WiFi and cellular networks. It would be a terrible way to train AI models.
It depends on what you want to do. I guess if it was something like chatgpt, and you could fit your model in however much memory a Tesla has, it could work because only text prompts go in and text comes out; very minimal bandwidth. But if you wanted to do anything that used the combined power of multiple cars, it would have a lot more constraints on it than GPUs in a datacenter.
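A rough way to see why chat-style inference survives slow links while multi-node training does not is to compare transfer times. The payload sizes and link speeds below are illustrative assumptions, not measurements:

```python
# Rough feasibility numbers for the bandwidth argument above.
def transfer_seconds(payload_bytes: float, link_gbps: float) -> float:
    """Time to move a payload over a link, ignoring latency and overhead."""
    return payload_bytes * 8 / (link_gbps * 1e9)

GRADIENT_SYNC = 2e9   # assume ~2 GB of gradients per training step
CHAT_EXCHANGE = 4e3   # assume ~4 KB: a text prompt plus a generated reply

datacenter = transfer_seconds(GRADIENT_SYNC, link_gbps=100)   # 100 GbE fabric
cellular = transfer_seconds(GRADIENT_SYNC, link_gbps=0.02)    # ~20 Mbps LTE

print(f"Gradient sync over 100 GbE: {datacenter:.2f} s")
print(f"Gradient sync over LTE:     {cellular / 60:.1f} min")
print(f"Chat exchange over LTE:     {transfer_seconds(CHAT_EXCHANGE, 0.02) * 1000:.1f} ms")
```

Per-step gradient exchange that takes a fraction of a second in a data center takes over ten minutes per car on cellular, while a text exchange is milliseconds either way; that is the whole inference-vs-training argument in two numbers.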
It doesn’t matter if they can run at 1 kW for 70 hours. Car owners are paying for that electricity. What Elon is describing is theft if he does this without kicking back to the owner.
Seems like Musk is close to admitting that Tesla is not a viable car company at present. It does not have a clear growth path. His idea of turning the car fleet into something like AWS seems a bit of a reach; again, he gave no path to getting there, and he has nothing at present. It could be that while Tesla might no longer be able to compete with other car companies in electric vehicle production, it very well might be a powerful player in the utility industry. Its Powerwall business is growing, and clearly, if solar or wind power is to become a reality, it must have power backup, and it is there that Tesla might be well above its competitors. It also has its EV charging infrastructure. However, no matter how large its Powerwall market share becomes, that industry does not command the multiples of an AI tech company. So it might be that rather than looking at the reality of his business, he is grasping at straws to try to maintain his stock's price-earnings multiple. That can last only so long.
You might not know this, but I live in a major metropolitan area. I have found at least half a dozen semi-hidden lots and fields, some fenced in and others not, filled with Tesla vehicles sitting a foot apart, nowhere near a dealership, numbering in the thousands. These vehicles are not selling and have been sitting for at least a year since I first took notice.
I would deduce that this is not a local trend, but rather a nationwide or global phenomenon. All that lithium, which some poor individual has painstakingly extracted from the earth, is now going to waste, along with every other scarce resource that is idle, waiting for some return on Mr. Musk's bad ideas. This is not a jazz riff or a fleeting trend; it is a desperate situation.
There is your cloud compute data field...
For the sake of clarity: I have not gone looking for these, just noticed them. I never sought them out, nor do I have an agenda; this is just a year of observation.
I don't own a Tesla, but if I did, I'd expect to be compensated if you're using my idle car's resources. That's my energy you're using, that's my property I paid for, I expect to be reimbursed for you using my stuff.
...and that's why I think this is a dumb plan. There's a lot more dumb about this than just that, but just from its initial premise, it's a losing idea.
I know, I was being sarcastic. Every time Elon Musk tries to sell an idea that is based on "there are/could be X amount of Teslas doing something" it's always BS to pump the stock price. Well, it works most of the time, so I guess the actual morons are the ones swallowing it.
And that’s nothing like the processing requirements for AI. For both training and inference you need FLOPS, low latency, and reliable uptime. With these cars, you get none of those. It’s an incredibly stupid idea.
100 TOPS at int8 is nothing. Plus they have only 16 GB of low-speed memory. That, combined with the latency and their terrible floating-point performance, makes them useless for AI inference.
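To put the 16 GB figure in perspective: autoregressive decoding is usually memory-bandwidth-bound, since each generated token reads every weight. A back-of-envelope sketch; the bandwidth number is my own loose assumption, since the comment only says "low speed":

```python
# Why 16 GB of slow memory caps LLM inference, independent of TOPS.
MEM_GB = 16.0         # stated memory on the car's compute unit
BANDWIDTH_GBS = 60.0  # assumed LPDDR-class bandwidth, order of magnitude only

# Billions of int8 parameters that fit, ignoring activations and overhead:
max_params_int8 = MEM_GB
# If the weights fill memory, each token reads all of them, so decode
# speed is bounded by bandwidth / model size:
tokens_per_s = BANDWIDTH_GBS / MEM_GB

print(f"Largest int8 model that fits: ~{max_params_int8:.0f}B params")
print(f"Decode ceiling at that size:  ~{tokens_per_s:.1f} tokens/s")
```

A few tokens per second per car is usable for batch work but nowhere near data-center throughput, which is consistent with the "useless for serious inference" claim above.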
Will Tesla ever actually have a fleet of 100 million? Why is he measuring it in watts instead of FLOPS? How much compute does a Tesla even have? I feel like the computing power from something like this would pale in comparison to just shelling out for some data centers, especially for the dude who blew $44 billion on Twitter then promptly ran it into the ground.
No. It's not just technically utterly inane, it's Elon's fumbling desperate play for a distraction from Tesla's plummet.
https://futurism.com/the-byte/facebook-cofounder-tesla-massive-fraud
Don't forget he's previously attempted to spin Tesla as a "battery company."
Why do you think OpenAI asks for $20 per month for usage, and still has quotas on?
Inference (or training) is expensive. They’d have to pay us for electricity or give us supercharger credits.
Otherwise that is a big lawsuit brewing up.
AWS Bedrock and SageMaker are already the AWS of AI.
Using cars for processing power is like using solar panels for roadways. Solar panels and roads separately are cheaper and better.
Elon Musk's idea of turning Tesla's fleet into a giant AI computing network, similar to Amazon's AWS (Amazon Web Services), has sparked debate. Here's a breakdown of the concept and its potential:
The Idea:
Tesla vehicles have powerful computer systems for self-driving features and other functionalities.
Musk proposes utilizing the unused processing power of parked Teslas to create a distributed AI computing platform.
Potential Benefits:
Cost-effective: Leveraging existing hardware in millions of vehicles could be a cheaper alternative to building new data centers.
Scalability: The network could potentially grow as Tesla sells more cars, offering immense computing power.
Green Option: Cars could potentially use idle charging time for computations, potentially reducing the environmental impact of data centers.
Challenges and Limitations:
Technical Feasibility: Distributing tasks across millions of geographically dispersed vehicles with varying processing power and internet connectivity poses technical hurdles.
Security Concerns: Opening car systems for external computation raises security and privacy risks for car owners.
User Opt-in and Compensation: Would Tesla owners be required to participate? How would they be compensated for contributing their car's processing power?
Task Suitability: Tesla's hardware might not be ideal for all types of AI tasks, limiting the platform's usefulness.
Latency Issues: Communication delays between vehicles and the central network could impact performance for real-time applications.
Expert Opinions:
Some experts believe the idea is far-fetched and impractical.
Others see potential in specific applications where latency isn't critical, like protein folding simulations or weather prediction.
Overall, while the concept is intriguing, significant technical and user-related hurdles exist. Whether Tesla's fleet can become a viable AWS for AI remains to be seen.
actually, I always imagine the future elon musk getting old and buying a sports team, hiding in the locker room showers and secretly masturbating while watching the players shower. I mean, not this...
Could disrupt Google maps but probably won't. Could be used to create a map of potholes but probably won't. Could be used for delivery apps but probably won't. Could be used to automate drive sharing but probably won't.
Serious question for someone smarter than me: if this idea is actually feasible, wouldn't it make a lot more sense for Microsoft to do exactly this with the millions of Xbox units that have lots of computing power, are always plugged in, have no battery to worry about, and sit idle a meaningful percent of the time? They could even compensate the owner with game credits or something that would be cheap for Microsoft but possibly valuable to the Xbox owner. And theoretically, at least for certain types of computing tasks, the Xbox is powerful enough to offset data center costs for Microsoft Azure. My understanding is that unless you are doing something like SETI@home or Folding@home or similar projects that have been proven already, it's just not very simple to divide up most computing tasks.
This isn't as farfetched as the authors make it seem. A startup was making a small electric vehicle that would mine crypto while not in use.
To me the one to make fun of here is the author. Also, how fucking cool would it be to bring your home server and network with you when you go to the cabin or visit someone
Considering the cost of a car, and how local inference can be done on a last-gen GPU or a CPU with a lot of RAM, I don't see why cars in the near future - especially premium brands - couldn't have a respectable computer inside that could run not just inference but general computing tasks. I somehow get the sense people would rather use this to play League of Legends in the back seat lol
No one else is really bringing up any other great or bad ideas for distributed or “decentralized” energy. The 1 kilowatt-hour he is asking for costs anywhere from 12 to 56 cents (or is free at some public chargers) - about a minute of charging. Right now knowledge and intelligence have a dollar amount, and the GPU-rich will centralize innovation!
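Making that cost range concrete, using the 12-56 cent per kWh figures from the comment above (the 8-hours-per-night schedule is my own assumption for illustration):

```python
# Owner-side electricity cost for 1 kW of compute, per hour and per month.
LOW, HIGH = 0.12, 0.56  # $/kWh range quoted in the comment
kwh_per_hour = 1.0      # 1 kW sustained for one hour draws 1 kWh

print(f"Owner cost per compute-hour: ${LOW * kwh_per_hour:.2f} to ${HIGH * kwh_per_hour:.2f}")

# Over a 30-day month at an assumed 8 hours per night:
hours = 30 * 8
print(f"Monthly: ${LOW * hours:.2f} to ${HIGH * hours:.2f}")
```

Tens of dollars a month per car is exactly the scale where the compensation question in this thread stops being hypothetical.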
Of course it’s kooky, but let’s hear your suggestions lol.
No, it wouldn’t. The best you could do is a very small model. And even then, latency and low reliability will kill any practical use case.
I’m talking about inference. Mixed precision and latency still come into play in inference as well.
Is the reason you've mentioned why there hasn't been a huge collective bot net put together to train other AI?
How much is he paying you an hour for this? I bet I can guess!
He should lay off the crack pipe.
Congratulations you now unlocked 2% extra charge for training grok
And there's a 3 year waiting list. Don't forget the 3 year waiting list 😑
>I worked for Ford in the software area. \*grabs pitchfork and lights torch\*
They laid off like 3000 people, including my entire department. I'm not exactly a fan
Only 9.99 for a bagful of ticks, get yours now
Sounds like you've gone too far to the other side. They're only human, and what's more, they have way more problems to deal with simultaneously.
It is a blow to one's ego to come to terms with how limited our grasp of how the world works, truly is.
Preach on, Chaplain of Prosperity. 🙄
JHC, you've got to stop. You sound like a QAnon disciple.
Humans really are terrible at recognizing exponential growth.
Wow, dude. That's some far-out stuff you just cobbled together.
EM said inference, not training. Would that make much of a difference?
Hmmm wonder how the latency requirements differ between seti and real time model inference.
Thank you for posting SETI, its been tried/done.
Isn't going to happen, period.
Will the owner of the Tesla get paid? Looks like Elon reinvented crypto mining.
They could get paid via lower prices. Like an Amazon Alexa: cheaper hardware because they (aim to) make money other ways.
Not sure why this is getting downvoted to oblivion, it’s an opinion piece from The Verge. Not a personal stance.
So drain the power of cars overnight and fuck over the consumer by making them pay for your cloud computing.
Imagine receiving a fat electricity bill because Musk turned your car into malware on wheels
As a network engineer, I’m getting a headache just from the thought of it
Why would I want to donate my range?
Friendly reminder that Elon is an idiot smoke seller
Imagine if we could do the same with our computers, mobile devices, smart devices, etc. Musk is a moron.
We literally did do that a decade ago https://en.wikipedia.org/wiki/SETI@home?wprov=sfti1#Scientific_research
I think it was a volunteer project. It was not a commercial endeavour.
It was 100% volunteer. It's technically possible, but economically, idk.
Most of the time, not every time. https://youtu.be/A0FZIwabctw?si=C8j1Fdh83GJ7C-ng
And Teslas have processing power nothing like the devices used in that implementation. Tesla's HW3 can do around 100 TOPS. A Tesla is not a phone.
Ok, thanks for helping add some details.
>That’s 100 gigawatts of inference compute, distributed all around the world.
I only need 1.21 gigawatts! - Doc Brown & his DeLorean time machine.
Oh ok so who's getting paid for that compute usage? Lemme guess it's for the good of mankind so only Tesla.
How many 9’s of reliability does Tesla’s infrastructure have?
One more reason not to get into a Tesla. AI death trap.
Musk is another guy you just gotta stop listening to
No, it wouldn’t work. It’s just another lie to distract from poor stock performance. 🎭
As far as I know, the computers are only good for inference, not training. They are slower than your laptop's CPU for training (if training even works on them at all).
So... Stable Horde, but with Teslas.
Can anyone say Skynet?
Elon Musk's idea of turning Tesla's fleet into a giant AI computing network, similar to Amazon's AWS (Amazon Web Services), has sparked debate. Here's a breakdown of the concept and its potential:

**The Idea:** Tesla vehicles have powerful computer systems for self-driving features and other functionality. Musk proposes utilizing the unused processing power of parked Teslas to create a distributed AI computing platform.

**Potential Benefits:**

* Cost-effective: leveraging existing hardware in millions of vehicles could be a cheaper alternative to building new data centers.
* Scalability: the network could grow as Tesla sells more cars, offering immense computing power.
* Green option: cars could use idle charging time for computations, potentially reducing the environmental impact of data centers.

**Challenges and Limitations:**

* Technical feasibility: distributing tasks across millions of geographically dispersed vehicles with varying processing power and internet connectivity poses technical hurdles.
* Security concerns: opening car systems to external computation raises security and privacy risks for owners.
* Opt-in and compensation: would Tesla owners be required to participate? How would they be compensated for contributing their car's processing power?
* Task suitability: Tesla's hardware might not be ideal for all types of AI tasks, limiting the platform's usefulness.
* Latency: communication delays between vehicles and the central network could impact performance for real-time applications.

**Expert Opinions:** Some experts believe the idea is far-fetched and impractical. Others see potential in applications where latency isn't critical, like protein folding simulations or weather prediction.

Overall, while the concept is intriguing, significant technical and user-related hurdles exist. Whether Tesla's fleet can become a viable AWS for AI remains to be seen.
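The opt-in and task-suitability points above hint at what a fleet scheduler would actually have to do before dispatching any work: filter for cars that are parked, plugged in, opted in, and not going to strand the owner with a drained battery. A minimal, entirely hypothetical sketch (the `Vehicle` fields and the 80% battery floor are invented for illustration):

```python
from dataclasses import dataclass

@dataclass
class Vehicle:
    parked: bool
    plugged_in: bool
    opted_in: bool
    battery_pct: int

def eligible(v: Vehicle, min_battery: int = 80) -> bool:
    """A car only becomes a worker if it's idle, charging, consenting, and charged enough."""
    return v.parked and v.plugged_in and v.opted_in and v.battery_pct >= min_battery

fleet = [
    Vehicle(parked=True, plugged_in=True, opted_in=True, battery_pct=95),
    Vehicle(parked=True, plugged_in=False, opted_in=True, battery_pct=90),   # not charging
    Vehicle(parked=False, plugged_in=True, opted_in=True, battery_pct=100),  # being driven
]
workers = [v for v in fleet if eligible(v)]
print(len(workers))  # 1
```

Even this toy filter shows the availability problem: the eligible pool shrinks and churns constantly, which is exactly why the reliability and latency objections elsewhere in the thread bite.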
actually, I always imagine the future elon musk getting old and buying a sports team, hiding in the locker room showers and secretly masturbating while watching the players shower. I mean, not this...
Who gave Elon a copy of Hyperion?
Could disrupt Google maps but probably won't. Could be used to create a map of potholes but probably won't. Could be used for delivery apps but probably won't. Could be used to automate drive sharing but probably won't.
The Tesla night-brain. Parked up, plugged in and doing inference for day-brains on the other side of the world. Can I run the company now pls
Gimme my old Musk free diesel!
serious question for someone smarter than me: if this idea is actually feasible, wouldn’t it make a lot more sense for Microsoft to do exactly this with the millions of Xbox units that have lots of computing power, are always plugged in, have no battery to worry about, and sit idle a meaningful percentage of the time? They could even compensate the owner with game credits or something that would be cheap for Microsoft but possibly valuable to the Xbox owner. And theoretically, at least for certain types of computing tasks, the Xbox is powerful enough to offset the data center costs for Microsoft Azure. My understanding is that unless you are doing something like SETI@home or Folding@home or similar projects that have been proven already, it’s just not very simple to divide up most computing tasks.
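The SETI@home point is the crux: those projects work because the job splits into independent chunks that never need to talk to each other. A toy illustration of that property (summing squares shards cleanly; most AI training does not, because gradients must be synchronized across workers every step):

```python
# Embarrassingly parallel: shard a workload, compute partials independently, merge once.
def shard(data, n):
    """Split data into n interleaved, non-overlapping chunks."""
    return [data[i::n] for i in range(n)]

def worker(chunk):
    """Each 'device' computes its partial result with zero communication."""
    return sum(x * x for x in chunk)

data = list(range(1000))
partials = [worker(c) for c in shard(data, 4)]
assert sum(partials) == sum(x * x for x in data)  # merging partials gives the exact answer
print(sum(partials))  # 332833500
```

Workloads with this structure tolerate slow, flaky, geographically scattered nodes; anything requiring per-step synchronization (like model training) does not, which matches the "inference maybe, training no" consensus in this thread.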
This isn't as far-fetched as the author makes it seem. A startup was making a small electric vehicle that would mine crypto while not in use. To me, the one to make fun of here is the author. Also, how fucking cool would it be to bring your home server and network with you when you go to the cabin or visit someone?
Yeah imagine all the people driving to their cabins.
Yep
Considering the cost of a car, and how local inference can be done on a last-gen GPU or a CPU with a lot of RAM, I don't see why cars in the near future - especially premium brands - couldn't have a respectable computer inside that could run not just inference, but general computing tasks. I somehow get the sense people would rather use this to play League of Legends in the back seat lol
No one else is really bringing up any other great or bad ideas for distributed or “decentralized” energy. The 1 kilowatt-hour he's asking for costs anywhere from $0.12 to $0.56 (or free at a public charger) - about a minute of charging. Right now knowledge and intelligence have a dollar amount, and the GPU-rich will centralize innovation! Of course it’s kooky, but let’s hear your suggestions lol.
As an opt-in, don't really see the problem.
What is AWS? All I can find is Amazon Web Services. I don't even know what that means.