


Tesla’s ‘Full Self-Driving’ beta is here, and it looks scary as hell


This week, Tesla began pushing its “Full Self-Driving” (FSD) update to a select group of customers, and the first reactions are now beginning to roll in. The software, which enables drivers to use many of Autopilot’s advanced driver-assist features on local, non-highway streets, is still in beta. As such, it requires constant monitoring while in operation. Or as Tesla warns in its introductory language, “it may do the wrong thing at the worst time.”

Frankly, this looks terrifying — not because it seems erratic or malfunctioning, but because of the way it will inevitably be misused.

Early reactions to the software update range from “that was a little scary” to full-throated enthusiasm for CEO Elon Musk’s willingness to let his customers beta-test features that aren’t ready for wide release. This willingness has helped Tesla stay at the forefront of electric and autonomous vehicle technology, but it also presents a huge risk to the company, especially if those early tests go wrong.


A Tesla owner who goes by the handle “Tesla Raj” posted a 10-minute video on Thursday that purports to show his experience with FSD. He says he used the feature while driving down “a residential street… with no lane markers,” something Tesla’s Autopilot previously could not do.

Right off the bat, there are stark differences in how FSD is presented to the driver. The visuals displayed on the instrument cluster look more like training footage from an autonomous vehicle, with transparent orange boxes outlining parked cars and other vehicles on the road and icons that represent road signs. The car’s path is depicted as blue dots stretching out in front of the vehicle. And various messages pop up that tell the driver what the car is going to do, such as “stopping for traffic control in 75 ft.”

The car also made several left- and right-hand turns on its own, which Raj described as “kind of scary, because we’re not used to that.” He also said the turns were “human-like,” insofar as the vehicle inched out into the opposite lane of traffic to assert itself before making the turn.

Another Tesla owner who lives in Sacramento, California, and tweets under the handle @brandonee916 posted a series of short videos that claim to show a Tesla vehicle using FSD to navigate a host of tricky driving scenarios, including intersections and a roundabout. These videos were first reported by Electrek.

The vehicles in both Tesla Raj’s and @brandonee916’s tests are driving at moderate speeds, between 25 and 35 mph, a range that has been very challenging for Tesla. Musk has said Tesla Autopilot can handle high-speed driving with its Navigate on Autopilot feature and low speeds with its Smart Summon parking feature. (How well Smart Summon works is up for debate, given the number of Tesla owners reporting bugs in the system.) The company has yet to allow its customers hands-off driving on highways, as Cadillac does with its Autopilot competitor, Super Cruise. But these medium speeds, where the vehicle is more likely to encounter traffic signals, intersections, and other complexities, are where Tesla has encountered a lot of difficulties.

For now, FSD is only available to Tesla owners in the company’s early access beta-testing program, but Musk has said he expects a “wide release” before the end of 2020. The risk, obviously, is that Tesla’s customers will ignore the company’s warnings and misuse FSD to record themselves performing dangerous stunts — much like they have done for years and continue to do on a regular basis. This type of rule-breaking is to be expected, especially in a society where clout-chasing has become a way of life for many people.

Tesla has said Autopilot should only be used by attentive drivers with both hands on the wheel. But the feature is designed to assist a driver, and it’s not foolproof: there have been several high-profile incidents in which some drivers have engaged Autopilot, crashed, and died.

“Public road testing is a serious responsibility and using untrained consumers to validate beta-level software on public roads is dangerous and inconsistent with existing guidance and industry norms,” said Ed Niedermeyer, communications director for Partners for Automated Vehicle Education, a group that includes nonprofits and AV operators like Waymo, Argo, Cruise, and Zoox. “Moreover, it is extremely important to clarify the line between driver assistance and autonomy. Systems requiring human driver oversight are not self-driving and should not be called self-driving.”

