Tesla's Safety Sensor Suite Goes Camera Only, Adopts "Tesla Vision"

Ultrasonic close-range sensors have joined radar in the trash bin over at Tesla, which going forward will build its cars with only cameras onboard to power its driver assistance and collision avoidance features. To put this in context, virtually every other automaker offering similar features (adaptive cruise control, lane-keep assist, blind-spot monitoring, pedestrian detection, collision warning, automated emergency braking, and so on) relies on a blend of forward (and sometimes rearward) facing radar, cameras, and ultrasonic sensors, which can detect objects at various distances and, crucially, serve as overlapping backups for one another. Tesla thinks it can do it all with only cameras.

You'll recall that Tesla first made waves like this when it declared its electric cars no longer needed radar sensors for determining what's ahead of the vehicle. Cameras alone would be fine. Again, most other automakers keen on expanding driver assistance features toward ever-more autonomy, such as hands-free highway driving systems that steer, brake, and accelerate for you (albeit only on pre-mapped highways), believe that having several redundancies built in via an overlapping set of sensors is the safer choice.

Besides redundancy, wherein, say, a radar sensor picking up an object can be confirmed by a camera "seeing" the same object or vice versa, there is a case to be made for different, specialized sensors. Longer-range detection is radar's specialty, while cameras detect things such as road markings and the shapes of objects, helping discern between people, bicycles, other cars, and more. The 12 ultrasonic sensors per car that Tesla thinks it no longer needs are designed for close-range detection of other cars or objects; often thought of as "parking sensors," on Teslas these also provide a near-field, 360-degree picture of what's around the car at any given moment. That would seem to be key for, say, Tesla's Autopark self-parking feature and its Smart Summon feature, in which a car can leave its parking space and drive itself, with nobody inside, across a parking lot to the person who "summoned" it via the Tesla app.
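To make the cross-checking idea concrete, here's a toy sketch in Python. The Detection fields and agreement thresholds are invented for illustration; no automaker's actual sensor-fusion logic is this simple. The point is just that with two sensor types, one can vouch for the other; with cameras alone, there's no second opinion.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str         # "radar", "camera", or "ultrasonic" (illustrative)
    range_m: float      # distance to the object, meters
    bearing_deg: float  # angle off the car's nose, degrees

def confirmed(detections, max_range_gap=2.0, max_bearing_gap=5.0):
    """Treat an object as real only if two *different* sensor types
    report it at roughly the same range and bearing."""
    for i, a in enumerate(detections):
        for b in detections[i + 1:]:
            if (a.sensor != b.sensor
                    and abs(a.range_m - b.range_m) <= max_range_gap
                    and abs(a.bearing_deg - b.bearing_deg) <= max_bearing_gap):
                return True
    return False

# A radar return backed up by a camera detection of the same object:
print(confirmed([Detection("radar", 48.0, 1.5),
                 Detection("camera", 47.2, 2.1)]))  # True

# Camera-only: no second sensor type exists to cross-check against.
print(confirmed([Detection("camera", 47.2, 2.1)]))  # False
```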

Going forward, sans ultrasonic sensors, Teslas will rely on one rear-facing camera, a camera on each front fender, a sideways-facing camera on each B-pillar, and a trio of forward-facing cameras for all of their driver assists, collision warnings, and automated driving systems. That's eight cameras. Based on how Tesla describes those cameras and each one's location on the car, we're assuming the rear- and side-facing units handle blind-spot monitoring (detecting cars in or approaching the Tesla's blind spot), which also feeds the Autopilot driver assist's automated lane-change feature, as well as parking maneuvers. Well, those cameras will do all of that eventually, but not until an unspecified future date.
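For reference, that eight-camera layout sketched as a simple Python data structure. The facing directions and feature mappings are our inference from Tesla's public descriptions and the article above, not a confirmed spec.

```python
# Eight cameras as described above; which features each feeds is our
# assumption, not a published Tesla mapping.
CAMERAS = {
    "rear":         {"count": 1, "facing": "rearward",
                     "assumed_feeds": ["parking maneuvers", "blind-spot monitoring"]},
    "front fender": {"count": 2, "facing": "rearward along each side",
                     "assumed_feeds": ["blind-spot monitoring", "automated lane changes"]},
    "B-pillar":     {"count": 2, "facing": "sideways",
                     "assumed_feeds": ["cross traffic", "parking maneuvers"]},
    "windshield":   {"count": 3, "facing": "forward",
                     "assumed_feeds": ["collision warning", "adaptive cruise", "lane keeping"]},
}

assert sum(spec["count"] for spec in CAMERAS.values()) == 8
```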

As Tesla transitions to building cars without ultrasonic sensors, it will disable or limit several features on the new, camera-only cars it delivers (those features will keep working on cars already built and sold with the ultrasonics):

- Park Assist, which warns of surrounding objects at parking speeds
- Autopark, the automated parallel and perpendicular parking feature
- Summon, which nudges the car forward or backward via the Tesla app
- Smart Summon, which drives the car across a parking lot to its summoner

Not coincidentally, these missing features all involve low-speed, close-proximity operation. We figure Tesla has some work left to do making its cameras better at "seeing" objects right up against the car. Just as it's difficult for a camera to judge distant objects (is that tiny speck a car hurtling toward us, or a fly a few feet away?), detecting things at the margins of the cameras' views, in this case near the corners of the car, isn't as straightforward as detecting objects squarely in a camera's field of view.
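To see why the car's corners are awkward, consider a crude 2-D visibility check. The mounting points, headings, and fields of view below are made up for illustration; the takeaway is that an object at a bumper corner can fall outside every camera's view even when overall coverage looks complete on paper.

```python
import math

def in_fov(cam_xy, heading_deg, fov_deg, point_xy, min_range_m=0.3):
    """Crude 2-D check: is a point inside a camera's horizontal field of
    view? Points closer than min_range_m count as unseen, standing in
    for the blind zone right against the bodywork."""
    dx = point_xy[0] - cam_xy[0]
    dy = point_xy[1] - cam_xy[1]
    if math.hypot(dx, dy) < min_range_m:
        return False
    bearing = math.degrees(math.atan2(dy, dx))
    off_axis = (bearing - heading_deg + 180) % 360 - 180  # wrap to [-180, 180)
    return abs(off_axis) <= fov_deg / 2

# Hypothetical mounts (meters; car nose at origin, +x forward, +y left):
forward_cam = ((-1.0, 0.0), 0.0, 60.0)    # behind windshield, looking ahead
fender_cam  = ((-0.8, 0.9), 180.0, 90.0)  # left fender, looking back

corner_obj = (0.1, 1.0)  # a low curb stop just off the front-left corner

print(in_fov(*forward_cam, corner_obj))  # False: outside the 60-degree cone
print(in_fov(*fender_cam, corner_obj))   # False: ahead of a rear-facing camera
```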

Tesla says the ultrasonic sunsetting coincides with the launch of "Tesla Vision," its "vision-based occupancy network—currently used in Full Self-Driving (FSD) Beta—to replace the inputs generated by USS." Unpacking that in English: Tesla thinks the crowdsourced data harvested from cars running a test version of its mislabeled "Full Self-Driving" driver assistance feature can help its software "learn" and therefore improve over time. Put another way, owners who've signed up as beta testers for FSD are essentially providing Tesla with more test miles, more real-world scenarios for its software to chew over. Tesla says this "occupancy network will continue to improve rapidly over time." The same ethos now applies to the camera-only strategy, with Teslas so equipped theoretically improving their "vision" over time as more driving scenarios are sent up to Tesla's cloud computing network for analysis, with improvements arriving later as over-the-air software updates.
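Tesla hasn't published how its occupancy network works internally, but the concept it outputs is easy to sketch: a grid of "is this spot around the car occupied?" estimates that firm up as more camera frames come in. Below, a toy 2-D version in Python with made-up grid sizes and update rule; the real thing is a learned 3-D neural network, not this hand-rolled filter.

```python
import numpy as np

GRID = 40      # cells per side
CELL_M = 0.25  # 0.25 m cells -> a 10 m x 10 m patch around the car
occupancy = np.zeros((GRID, GRID))  # estimated probability each cell is occupied

def update(occupancy, x_m, y_m, p, lr=0.5):
    """Blend a new occupancy estimate for the cell containing (x_m, y_m),
    measured from the car's center, into the grid."""
    i = int((x_m + GRID * CELL_M / 2) / CELL_M)
    j = int((y_m + GRID * CELL_M / 2) / CELL_M)
    if 0 <= i < GRID and 0 <= j < GRID:
        occupancy[i, j] = (1 - lr) * occupancy[i, j] + lr * p

# Successive camera frames grow confident that something sits 1.5 m
# ahead and slightly to the left of the car:
for _ in range(5):
    update(occupancy, 1.5, 0.4, p=0.9)

print(round(occupancy[26, 21], 2))  # 0.87, approaching 0.9 over repeated frames
```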

The ultrasonic abandonment kicks off now, in October 2022, with the Model 3 sedan and Model Y crossover, Tesla's best-sellers, built for "North America, Europe, Middle East, and Taiwan," followed next year by the larger Model S and Model X. Cars ordered with Autopilot, Enhanced Autopilot, and Full Self-Driving capability will have those features active. Tesla says its models with Tesla Vision-powered active safety features, sans ultrasonic sensors, retain the same safety ratings as before. Given Tesla's history of pushing boundaries with its driver assistance features (specifically Autopilot, which has had trouble detecting objects), removing redundancies from the system sure is an odd look.