Guy whose cars run into stopped fire trucks thinks he’s an expert on computer vision.
Someone died because one of those cars thought the broadside of a white semi trailer was the sky and drove under it
I thought it mistook the semi that was crossing a divided highway for an overpass and attempted to drive under it?
This is exactly how Ed Truck died. His capa was detated from his head.
But was it a “stealthy” fire truck??
Yes, but the car only had the low-light cameras, not enough rudimentary AI
Speaking of fire trucks, has anyone here ever read the emergency response procedures for Teslas in severe accidents? When I was a volunteer we gave them a look over.
If I remember right, depending on the model they recommend up to 8,000 gallons (~30k liters) to keep an overheating battery’s temp stable in case of fire or exposure to high heat. I’ll link the resource page here.
Our engine holds 700 gallons (~2.6k liters) and the typical tanker in our area holds 2,000 (~7.5k liters); rough conversions are sketched below.
That’s a house fire level response for a single electric vehicle. Just getting that much water moved to a scene would be challenging. We have tankers, but how many city departments can move that much water? You don’t see hydrants on highways. And foam is not effective like it is for normal car fires. The future will be interesting for firefighters.
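For anyone checking those conversions, a quick sketch (assuming US gallons, since that’s what our specs use):

```python
# Rough unit conversions for the water figures above (US gallons assumed).
US_GALLON_TO_L = 3.785

for label, gallons in [("Tesla guidance (up to)", 8000),
                       ("our engine", 700),
                       ("typical local tanker", 2000)]:
    litres = gallons * US_GALLON_TO_L
    print(f"{label}: {gallons:,} gal ≈ {litres:,.0f} L ≈ {litres / 1000:.1f} m³")

# 8,000 gal ≈ 30,280 L ≈ 30 m³, i.e. roughly a backyard swimming pool.
```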
I found a link on how the Austrian fire services handle this. The fire is extinguished first, then the remains of the car are put into a special roll-off container (Abrollbehälter, AB) and driven to a gravel pit, where the container is flooded with 21,000 litres of water.
That’s interesting. Tesla says the cars shouldn’t be submerged but I wonder if there’s any serious consequence if you did?
After this procedure, the car is irreparably damaged, if this was your question.
As far as safety goes, it is a big battery, after all
30,000 liters is 30 m³, which is a backyard swimming pool full of water.
Now imagine a house on fire with a Tesla in the garage, or a multiple-vehicle accident. Now you need that much more water.
Now imagine having to use 30,000 litres of water for every Tesla/EV on fire while facing extreme drought conditions caused by global warming.
lookin at you Cali
It is especially important to understand that Tesla’s struggles with navigation are entirely a result of Elon refusing to equip them with LiDAR. This isn’t some “The tech is really new and really complicated and we’re still figuring it out” problem. There is a very good solution to most collision avoidance scenarios, but Elon refuses to let them use it because he’s an idiot.
They also sometimes lock people inside and burn them to death.
https://www.mirror.co.uk/news/world-news/terrified-friends-burned-death-tesla-34087725
For those doing the maths at home:
An F-35 that obligingly flies top-towards-you (not exactly something you can count on, but hey, maybe it’s turning) is all of 10 m across.
An AIM-120C can very comfortably hit a target at 100km.
At that range, the F-35 takes up about 21 arcseconds, or 0.006 degrees (quick sanity check below). That’s roughly the size of this period, seen from 3 meters away.
[ . ]
Good luck spotting that in a sky of roughly the same colour, full of other objects.
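If anyone wants to sanity-check the numbers, here’s a rough sketch (the 10 m and 100 km figures are the ones above, not official specs):

```python
# Angular size of a ~10 m target at ~100 km (the numbers quoted above).
import math

target_size_m = 10.0       # rough planform width of an F-35 seen top-on
range_m = 100_000.0        # comfortable AIM-120C engagement range

angle_rad = 2 * math.atan(target_size_m / (2 * range_m))   # ≈ 1e-4 rad
print(f"{math.degrees(angle_rad):.4f} degrees "
      f"≈ {math.degrees(angle_rad) * 3600:.0f} arcseconds")
# ~0.0057 degrees ≈ 21 arcseconds: about a period on a page read from ~3 m away.
```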
You can place cameras anywhere; they don’t need to be right next to what is being targeted. Nearer ranges will allow the AI to misidentify at much higher rates than at the max standoff range of an AIM-120C.
I don’t think you were getting enough credit for ‘misidentify’.
Pffffffff
I can see that bright white dot against the dark mode background on my maximum brightness screen with ease! Therefore your argument is invalid!
with a big enough screen i bet an AI camera could see it too
“I said AI sir!”
Yeah but what about the AI? Have you thought about the AI that would be running it, which never misses, and would totally be a useful existing thing? 😉
And if it isn’t, just frankenstein another AI against it. The solution to lacking AI is more AI, obviously.
Just for reference: JWST has an optical resolution of 0.07 arcseconds. It’s a 6.5-metre (roughly 21-foot) mirror though, not something you’d put inside a missile guidance package.
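That figure is basically the diffraction limit, which depends on mirror size and wavelength; a quick sketch, assuming a ~2 µm near-IR wavelength and the 6.5 m mirror:

```python
# Rayleigh diffraction limit: theta ≈ 1.22 * wavelength / aperture.
import math

wavelength_m = 2e-6     # assumed near-IR observing wavelength
aperture_m = 6.5        # JWST primary mirror diameter

theta_rad = 1.22 * wavelength_m / aperture_m
print(f"{math.degrees(theta_rad) * 3600:.3f} arcseconds")   # ≈ 0.077, close to the quoted 0.07
```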
JWST operates in space, i.e. there is no atmospheric blur to take into account.
Oh yeah I’m not suggesting we make a missile with JWST mounted on the front!
Well but I am!
Although, we would still need to get it back here… Okay, so first we send two more rockets after it! One to return it on and one with a human engineer on board to pack it back up.
I mean we can hardly have it return while unpacked. That would damage the delicate heat baffles! And we need those to shield it from the rocket engine at the back of our missile so it doesn’t start targeting itself because it no longer knows where it is/isn’t…
Holy shit. I just realised that the reason they’re building the ELT is so they can mount it on a missile and shoot down an F-35 at some point.
Magnifying glass makes things bigger, checkmate! 🔍🔍
And then there’s dealing with the F-35 itself: even if you managed to lock on and target it, it will have countermeasures you have to contend with.
Yeah, sure. But that doesn’t matter if you point the AI at it with a really good zoom lens, though. And then you have a ton of them, pointed in all directions, like the compound eye of a fly. F-35 spotted.
Sir, our air defence is down!
Is it hackers?
No sir, it’s cloudy.
If a fighter jet is within visual range of a camera, it’s already too late. And that’s if there aren’t any clouds.
you’re not thinking like a Musk, not if the government pays the subscription and contract for his early-warning camera drone balloon swarm thing or something, something they could run on ketamine or something.
imagine it, Smithers: my electrical spy drones running all day long! And on the government’s dime!
This is part of the reason air defences mostly rely on radar and other parts of the electromagnetic spectrum, measured from at least 3 locations and triangulated to build a precise map of objects in the sky. But just like cameras, that doesn’t work when the objects in question are too high or hidden behind obstacles. From there you can send countermeasures to the intercept coordinates and then arm them to search for nearby objects via infrared.
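Just to illustrate the triangulation part, a toy 2-D sketch with two made-up sites (real systems use three or more, plus altitude):

```python
# Minimal 2-D sketch of fixing a target from bearings taken at two radar sites.
import math

def cross(u, v):
    return u[0] * v[1] - u[1] * v[0]

def triangulate(site_a, bearing_a, site_b, bearing_b):
    """Intersect the two bearing rays; bearings are angles in radians from +x."""
    da = (math.cos(bearing_a), math.sin(bearing_a))
    db = (math.cos(bearing_b), math.sin(bearing_b))
    ab = (site_b[0] - site_a[0], site_b[1] - site_a[1])
    t = cross(ab, db) / cross(da, db)        # distance along the ray from site A
    return (site_a[0] + t * da[0], site_a[1] + t * da[1])

# Hypothetical sites 50 km apart, target actually at (30, 40) km.
a, b, target = (0.0, 0.0), (50.0, 0.0), (30.0, 40.0)
bearing_a = math.atan2(target[1] - a[1], target[0] - a[0])
bearing_b = math.atan2(target[1] - b[1], target[0] - b[0])
print(triangulate(a, bearing_a, b, bearing_b))   # ≈ (30.0, 40.0)
```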
Using the visible part of the electromagnetic spectrum is pretty much useless in modern weapons. I remember even a tank operator’s display being totally janky, because they don’t use normal cameras either; perhaps because they wanted data to train machines to do it instead of human operators? Idk, didn’t make sense to me.
His fucking obsession with computer vision. He’s so convinced he’s right he forgot that clouds exist… and his cars plow straight into obstacles.
Yeah, the “lidar is useless” guy whose cars are consistently crashing into things when visibility is bad is telling us that he can do the same thing with missile targeting systems… Sounds like a great idea
Yeah, well, missiles are supposed to crash into things. The right things? Not his job.
Also night
And a plane at altitude is too small for wide-field cameras, which means scanning the sky with narrow-FOV detectors.
And F-35s are really fast. By the time you recognize and can target it, it’ll fly behind a cloud or something. So not only do you need to make a really fast rocket w/ vision-based AI integrated, it also needs to be able to detect said plane at great distances, as well as maneuver well enough to see it as it exits clouds and whatnot. That’s a lot more complicated than slapping radar on something with heat tracking at close distances.
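To put a rough number on “really fast”, a toy calculation (the FOV, ranges and speed here are assumptions, not anyone’s actual specs):

```python
# Rough sense of how long a fast jet stays inside one narrow-FOV camera's view.
import math

speed_mps = 550.0            # roughly Mach 1.6 at altitude
fov_deg = 2.0                # a fairly narrow telephoto field of view
for range_km in (10, 30, 60):
    swath_m = 2 * range_km * 1000 * math.tan(math.radians(fov_deg / 2))
    print(f"at {range_km} km the FOV spans ~{swath_m / 1000:.1f} km "
          f"→ crossed in ~{swath_m / speed_mps:.0f} s")
# Even at 60 km that's only a few seconds per camera to notice anything.
```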
this has existed for over 2 decades now btw https://en.wikipedia.org/wiki/EuroFIRST_PIRATE
Wouldn’t matter for IR vision, which the F-35 already has.
My friend bought a red car specifically so it could be seen by Tesla’s cameras.
I have bad news for him. They can’t see firetrucks reliably.
No no. See, guessing objects from flat images is much better than using math and lidar. Especially if you have a flawed LLM model.
Given how advanced our math and knowledge of radar is, it is literally stupid to use them.
See, radar, lidar, and math give you 3D objects.
Oh, wait. It is the other way around.
He’s not; otherwise he would know that “low light sensitivity” cameras aren’t “sensitive in low-light conditions” but “with lower than normal light sensitivity”.
In an imaginary world where cameras are way more expensive, he’d absolutely be pushing LiDAR in cars. The metrics he cares about are cost and marketability (cool factor), or money for short.
someone tell ellen about beyond visual range
Elon Degenerate
Watch as he dances under every low bar
Easy. Just build a giant ball of telescopes and ban water
Edit: Kinda forgot about the horizon. Ban the earth too. Better yet ask the physicists to borrow their ideal frictionless vacuum.
MY LENS IS DIRTY REEEeee oh wait I found alcohol.
That fucker really thinks he’s so smart when all he does is constantly demonstrate what an idiot he is.
His rise really is symbolic of the rot that has taken hold of our society. Truly, our most degenerate moron has risen to the top of the shit pile.
Look up the Peter principle
Elon Musk is an idiot
Says the guy that produces AND designed the Cybertruck
Look. Just because people hope not to see it and actively avoid looking in its general direction does not make the Cybertruck invisible.
Oh, if only… then the owners could be invisible too… One can dream
“It’s a shit design” is rich coming from the guy whose company can’t get panels to line up on a car.
“It’s a shit design” Says the man responsible for this:
For some reason my kids love them, I just don’t see it. It’s unique, I guess…
I don’t have kids, but when I was a kid I loved Spaghetti-Os and that candy that comes in a toothpaste tube but is literally just gelatinous sugar syrup. I probably would’ve loved the cybertruck too.
It’s a simple design, like a boxcar you’d race with your dad at the local Boy Scouts event. It appeals to children who don’t understand how airflow works and just like seeing big bulky tank-like things. To them, it looks like a Tonka toy.
But in the real world, things like fluid dynamics are important.
It also weighs enough that it can’t even pull off being decent by being light like old Jeeps. Sure, they were literally brick-shaped, but they could be moved by like 4 guys with relative ease.
It’s in Fortnite
Well, my kids don’t play Fortnite, at least not at our house (family rule). So if that’s the case, it’s because their friends told them about it.
Also… fighters are fast; the point is you should fire the missile before you can see it.
The F-35 is built for engagements beyond the horizon, as in the target is blocked by the curvature of the earth.
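The geometric horizon is easy to ballpark (ignoring refraction, terrain and weather; the heights here are made-up examples):

```python
# Purely geometric line-of-sight horizon: d ≈ sqrt(2*R*h_sensor) + sqrt(2*R*h_target).
import math

R = 6_371_000.0  # mean Earth radius, metres

def horizon_km(h_sensor_m, h_target_m):
    return (math.sqrt(2 * R * h_sensor_m) + math.sqrt(2 * R * h_target_m)) / 1000

print(f"{horizon_km(10, 100):.0f} km")     # ~47 km: low-flying jet vs. a 10 m camera mast
print(f"{horizon_km(10, 10_000):.0f} km")  # ~368 km: high-flying jet, but then it's a dot (see above)
```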
Light sensitive cameras and rudimentary AI…
Pssh, all you need is a gravity lens to bend the light, problem solved!
So we use a black hole to bend space time and look into the past where it isn’t. We know where it isn’t in the past.
but have you considered that if i can’t see it then it doesn’t exist?
Hmm, that’s a good point, like air, right? Total scam.
dont even get me started on air
Looks like being rich and surrounding yourself with yes-people is the #1 cause of sitting confidently at the top of the Dunning-Kruger curve.
He’s riding that curve like one of those surfing dogs.
I think the better analogy is that he has set up camp on top of Mount Stupid and ain’t moving from it.
People will pilgrimage to the peak of Mount Stupid to pray at the altar of his HYPEoxia.
Can’t these things aerosolize you from beyond the fucking horizon? How helpful are those AI powered low light cameras when they’re phase transitioned by a missile launched from a hundred miles away?
You’d need a camera network spanning the entire battlefield. And it’d need telephoto lenses at the very least, because stealth fighters are high and small. And it’d need to stay connected after an initial missile exchange.
I don’t buy for a moment that nobody in the Pentagon has thought of this, and explained why it’s not a dealbreaker in a classified report.
Telephoto lenses have a narrow field of view. You’d want very high-resolution wide-angle sensors. Or maybe a combination of the two, where the wide-angle cameras spot interesting things for the narrow-angle ones to look closer at.
The difference between the two would be like when they went from U2 spy planes to satellite imagery, going from thin strips of visibility to “here’s the hemisphere containing most of Russia”.
The trick being that wide angle and high resolution means very high expense, and probably a lot of power and ruggedness tradeoffs. For a satellite that’s fine, for this application I kind of think a cluster of narrow-view cameras would be way cheaper and more practical.
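A rough pixel budget shows why (the goal of putting ~10 pixels across a 10 m jet at 50 km is my own assumption, not a real requirement):

```python
# Rough pixel budget: wide-angle vs. narrow-angle sensor, same target resolution.
import math

pixel_angle = (10.0 / 50_000.0) / 10          # ≈ 2e-5 rad per pixel

for name, fov_deg in [("wide angle", 90.0), ("narrow telephoto", 2.0)]:
    px_per_axis = math.radians(fov_deg) / pixel_angle
    print(f"{name}: ~{px_per_axis:,.0f} px per axis, "
          f"~{px_per_axis ** 2 / 1e6:,.0f} megapixels for a square sensor")
# wide angle: ~78,540 px/axis → ~6,000+ MP; narrow: ~1,745 px/axis → ~3 MP
```

Gigapixels of wide-angle glass versus a bunch of 3 MP telephotos is basically the whole tradeoff in one line.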
BVR and over-the-horizon radar have been around for decades.
We’re talking about stealth jets here, though…
They don’t give much of a conventional radar return. Which is why Musk even brought up his definitely-new definitely-original idea.
“laughably easy to take down fighter jets”
yeah all you have to do is ban the kid running the elon jet twitter. Seems easy enough to me.
Nah man, just use AI with night cameras. It’s never cloudy or foggy anyway.
rookie mistake, can’t believe i forgot about this.
Let me guess, he’s got an alternative to sell?
He has Tesla’s “Full Self Driving” system, which works with AI and cameras.
He probably wants to just upload his software to US fighter jets for, say, $20 million per unit.
well there you’ve already told your first lie
They used air bunnies. S’all good.
It might actually cost way more than that; changing a jet’s balance and attaching things to it requires major modifications. But him selling the systems alone for that much sounds about right.
Maybe the real plan is to intentionally fuck it up and weaken the US Military. Wouldn’t put it past the Trump admin.
The real plan is to claim they did it (or tried to) and pocket that sweet DOD money.
The least credible defense of all: “We tried (not really).”
Sucking up to Putin. The next thing he’ll say is that all 2 Su-57s in existence are much more advanced.