Video blogger releases Tesla "Full Self-Driving" accident video

Tesla has been publicly testing "Full Self-Driving" (FSD), and Musk once proudly claimed that the FSD beta had never been involved in an accident. Now a video blogger has released footage of just such an accident.

Tesla CEO Elon Musk has claimed that the FSD (Full Self-Driving) beta has made no mistakes since it went into use. The day before, Tesla shareholder Ross Gerber said on Twitter that in the more than a year since FSD's release there had not been a single crash or injury, while some 20,000 people died in US traffic accidents over the same period. Musk retweeted this and answered flatly: "Yes."

Two days ago, Musk's confident answer was called into question. A video blogger named AI Addict shared a video of himself trying out FSD, shot in San Jose, California, a city of roughly a million people. According to the video description, the blogger had updated to FSD Beta v10.10 (2021.44.30.15), pushed in early February. In the video the vehicle performs well under FSD control most of the time, but at about 3 minutes 30 seconds, after completing a right turn, it fails to recognize the bollards at the edge of the road and drives straight into one, startling everyone in the car. Fortunately the driver was watching the road and intervened at once, preventing a more serious accident.

To be fair, Tesla has consistently reminded users, both on its website and in the FSD feature description, that FSD is in early beta testing and must be used with extra caution: it may make the wrong decision at the worst possible time, so drivers must keep their hands on the steering wheel and pay close attention to the road at all times, ready to act immediately, especially at blind corners, at intersections, and on narrow roads. This incident suggests the reminder is still necessary.

The safety of Tesla's FSD Beta has been questioned before, and different versions have shown different problems. The early 9.0 release, for example, failed to recognize roadside trees, bus lanes, and one-way streets, and would drive straight into parking spaces after turning.

Outside critics have also charged that by delivering an unfinished feature directly to consumers, Tesla is being irresponsible in letting people use the system on real roads.

Tesla's driver-assistance system recognizes external objects through cameras and reacts accordingly. In general it can identify roadside obstacles, traffic cones, special lane markings, and stop signs. For now the paid feature is available only to owners in the United States.

FSD Beta was officially released in late October 2020. Interested owners can apply in the mobile app, and Tesla decides whether to enable the feature based on the owner's recent driving score; in practice most owners who apply can use it. Tesla monitors and collects driving data along the way and uses it to tune and upgrade the system, which is why FSD updates ship so quickly. The autonomous-driving industry generally holds that lidar combined with high-definition maps is the path to self-driving; Tesla has done the opposite, abandoning lidar entirely and choosing to trust what its cameras see. Three days after the video was uploaded, Tesla had still not said whether the accident was in fact caused by FSD failing to recognize the row of green bollards at the roadside.

On February 1, 2022, the National Highway Traffic Safety Administration (NHTSA) asked Tesla to recall nearly 54,000 electric vehicles because their "rolling stop" function broke the law: it let a Tesla pass through an all-way stop intersection at up to 5.6 miles per hour. US traffic law treats such a stop sign like a red light: when an intersection has one, the driver must stop at the white line for 3 seconds and check the intersection before continuing, even if no cars or people are around. Because Tesla's software guided vehicles through at low speed instead, the cars were recalled.

Musk later explained that the function poses no safety problem. He said: "The car just slowed to about 2 mph (3.2 km/h). If the view is clear and there are no cars or pedestrians, it continues on."
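For reference, the imperial and metric speeds quoted in this story line up as simple unit conversions; a quick Python check (the constant is the exact miles-to-kilometers factor):

    # Convert the speeds quoted in this story from mph to km/h.
    MPH_TO_KMH = 1.609344  # exact: 1 mile = 1.609344 km

    for mph in (5.6, 2.0):
        print(f"{mph} mph = {mph * MPH_TO_KMH:.1f} km/h")

    # Prints:
    # 5.6 mph = 9.0 km/h   (the "9 kilometers per hour" rolling-stop speed cited later)
    # 2.0 mph = 3.2 km/h   (the speed Musk quotes above)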

In mid-January, Musk had just been praising FSD Beta, saying there had been no mistakes since the program began a year earlier. Now an owner's test footage has given Musk a public slap in the face: because of FSD Beta's misjudgment, the vehicle drove straight into a roadside bollard. The accident was not serious, in that the vehicle was only lightly damaged and no one was hurt; whether it counts as an "accident" by Musk's earlier standard is unclear.

NHTSA has also issued a reminder: "To date, no commercially available vehicle is capable of driving itself; every car requires the driver to be in control. Although advanced driver-assistance features can help drivers avoid collisions and traffic accidents, drivers must pay attention to driving and road conditions at all times."

Tesla has been publicly testing "Full Self-Driving" (FSD Beta), and last month Musk proudly declared that the FSD beta had never had an accident since its launch a year earlier.

Recently, however, a video surfaced of a Tesla running the FSD beta in what is believed to be the first accident since the beta's launch.

In the published video, the Tesla makes a normal right turn, but its line drifts after the turn and it drives straight into the plastic bollards separating the traffic lane from the bicycle lane, drawing cries and even swearing from the people in the car. The driver immediately took over the vehicle and stepped on the brakes.

Fortunately, the bollard was plastic and the driver reacted in time: the front of the car was slightly damaged, and nothing else was harmed.

The owner said the incident has been submitted to Tesla's feedback department.

It is worth noting that a Tesla Model Y owner had previously complained to the National Highway Traffic Safety Administration (NHTSA) that his vehicle had an accident when using FSD Beta, but this complaint has not been confirmed so far.

On February 10, a YouTube user named AI Addict provided the first video evidence that Tesla's "Full Self-Driving" (FSD) suite is not as safe as Elon Musk claims.

AI Addict's video was shot during a drive through downtown San Jose, California. It is the first footage to show a Tesla Model 3 with the FSD system active colliding with a bicycle-lane bollard, and it indicates that FSD was directly responsible for the accident.

As the video shows, the latest version of Tesla's driver-assistance software, FSD Beta 10.10, steered the Model 3 toward the barrier separating the bicycle lane from the traffic lane. Although the driver hurriedly braked and wrenched the steering wheel, the AI-driven vehicle still hit the barrier hard.

More worryingly, the video also shows the Tesla apparently running a red light due to the software, trying to drive along rail tracks, and later attempting to enter a tram lane.

The driving video is about 9 minutes long, and the crash occurs at about the 3.5-minute mark. The YouTube user appears to be in the driver's seat with his hands near the steering wheel, with a passenger in the back seat. In the video he can be heard saying, "We crashed, but I had the brake pressed all the way down!" The passenger replies, "I can't believe the car didn't stop."

Musk claimed last month that there had never been an accident in the FSD test program, but the new video provides evidence to the contrary. In AI Addict's recording, the driver's hands can be seen merely resting against the steering wheel rather than gripping it; the wheel's movement is controlled entirely by FSD as the car follows the route preset on the map.

FSD is an advanced driver-assistance package for Tesla vehicles. It is a Level 2 automated-driving system, meaning it can control the car's speed and direction and allows the driver to take their hands off the wheel briefly, but the driver must monitor the road at all times and be ready to take over the vehicle. The technology is an optional extra at purchase and includes several automated-driving functions, all of which still require the driver's supervision.

Just before the crash, the driver can be seen grabbing at the steering wheel, but it was too late. The damage to the vehicle was not serious, although the bollard, which was knocked completely flat, left paint marks on the front bumper.

Shortly before this video was released, Tesla was forced to recall nearly 54,000 electric cars and SUVs because their FSD software allowed them to roll through stop signs. According to the document issued by the US safety regulator on February 1, Tesla has disabled the function via an over-the-air software update. After two meetings with officials of the National Highway Traffic Safety Administration (NHTSA), Tesla agreed to recall the affected vehicles.

Many drivers are helping Tesla test the FSD system, which allowed vehicles to pass through stop-sign intersections at up to 9 kilometers per hour. In AI Addict's test, the system may not have realized that a section of the bicycle lane was closed off. Through much of the video, the Model 3 appears to have trouble detecting the green bicycle-lane bollards scattered around central San Jose, steering toward them several times.

The video appears to highlight other shortcomings of FSD as well; the software is still in beta, so it should improve before a wide launch. At one point the car's AI tries to direct it to turn left onto a busy main road even though a truck is approaching. That said, FSD did seem to do a good job of "patiently" waiting for pedestrians to cross the road and of avoiding other vehicles.

At the end of 2021, Tesla said that nearly 60,000 owners were participating in the FSD Beta program. The program is open only to Tesla owners specially selected by the company and to drivers with a perfect safety score of 100 (out of 100). However, an investor revealed last year that his safety score was only 37, yet he was still allowed to use FSD Beta.

These safety scores are determined by driving behavior, with drivers allowing Tesla to monitor how they drive through in-car software. Tesla warns that any driver using a Level 2 driver-assistance system must be ready to intervene at any moment.
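Tesla publishes its own Safety Score methodology, which is not reproduced here; the following is only a minimal sketch of how a behavior-based score and eligibility gate of this general kind could work. The factor names, weights, and threshold are illustrative assumptions, not Tesla's actual formula:

    # Hypothetical sketch of a behavior-based safety score and beta gate.
    # All factors, weights, and the threshold are illustrative assumptions,
    # NOT Tesla's published Safety Score formula.
    from dataclasses import dataclass

    @dataclass
    class DrivingStats:
        hard_braking_per_1000_km: float      # sudden hard decelerations
        aggressive_turns_per_1000_km: float  # high lateral acceleration
        collision_warnings: int              # forward collision warnings
        forced_disengagements: int           # system-forced assist cutoffs

    def safety_score(s: DrivingStats) -> float:
        """Map monitored driving behavior to a 0-100 score (higher = safer)."""
        penalty = (2.0 * s.hard_braking_per_1000_km
                   + 1.5 * s.aggressive_turns_per_1000_km
                   + 3.0 * s.collision_warnings
                   + 10.0 * s.forced_disengagements)
        return max(0.0, 100.0 - penalty)

    def eligible_for_beta(s: DrivingStats, threshold: float = 100.0) -> bool:
        """Gate beta access on the computed score, as the article describes."""
        return safety_score(s) >= threshold

    careful = DrivingStats(0.0, 0.0, 0, 0)
    print(safety_score(careful), eligible_for_beta(careful))  # 100.0 True

Under a gate like this, a reported score of 37 would fall far short of the threshold, which is what makes the investor's claimed access to FSD Beta surprising.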

FSD is an upgraded version of Autopilot, Tesla's driver-assistance system, which has been involved in many fatal accidents. So far, 12 fatal accidents involving the system, along with a series of non-fatal crashes, have been verified.

Although Musk asserted in January that FSD Beta's safety record was impeccable, NHTSA received its first complaint in November 2021 claiming that FSD had caused a Tesla driver to crash. The accident happened in Brea, California, when a Tesla Model Y with FSD software activated drove into the wrong lane and was hit by another car.

Coincidentally, newly released surveillance footage shows that in August 2021, a Tesla Model Y with Autopilot activated crashed into a highway-patrol car parked beside US Highway 64, sending the police car spinning out of control into a deputy sheriff standing at the roadside. The Tesla then swept across a road sign, rolled down a slope, and finally hit a tree. The driver was reportedly Dr. Devainder Goli, who was watching a movie on his phone while the car drove itself.

Sam Abuelsamid, an expert on autonomous vehicles, said: "These are not autonomous driving systems; they are driver-assistance systems. All of them are designed for the driver to be watching the road, so drivers should always stay focused. In fact, in Tesla's system the driver's hands are supposed to be on the steering wheel at all times. Manufacturers often exaggerate what their systems can do and avoid talking about what they cannot do."