Tesla owner tests Autopilot on his kids, YouTube removes video


The video violates YouTube’s rules that prohibit content that harms the emotional and physical well-being of minors.

YouTube removed a Tesla video on the grounds that it shows children in a situation considered dangerous to their physical and emotional integrity, American outlet CNBC reports.


The offending video, originally posted on the Whole Mars Catalog YouTube channel, was intended to demonstrate the reliability of Full Self-Driving (FSD), an autonomous driving system, and its ability to stop a Tesla vehicle when a child is in its path. In the video, Tad Park, a Tesla owner and investor in the brand founded by Elon Musk, carries out the experiment with his own children.

Violation of the rules on the safety of minors

In the first test, one of his children stands in the middle of a road while Tad Park, driving a Tesla Model 3, drives toward him. The investor then repeats the experiment, this time with the boy crossing the street. In both cases, the vehicle detects his presence and stops at a reasonable distance.


However, as stated on its support page, YouTube sets specific rules regarding content involving younger people. Videos that “endanger the physical and emotional well-being of minors” including “dangerous stunts, dares, or pranks” are prohibited.

Ivy Choi, a spokesperson for YouTube, confirmed this position: according to her, the video uploaded by Whole Mars Catalog violated the platform's policies against content harmful and dangerous to minors.

“I would trust it with the lives of my children”

“I have already tried the beta version of Full Self-Driving. I would trust it with the lives of my children,” Tad Park responded. “I was sure it would spot them. I was also in control of the steering wheel, so I could brake at any moment,” he said.

Separately, Tad Park told CNBC that the car never exceeded eight miles per hour (about 13 kilometers per hour) and that he made sure it recognized the child. Although removed from YouTube, the video remains available on Twitter, which has not commented on the matter.

In response to the controversial images, the National Highway Traffic Safety Administration (NHTSA) issued a statement warning against experimentation with automated driving technologies. “No one should risk their life, or anyone else’s life, to test the performance of automotive technology,” the agency told Bloomberg.

FSD: a technology still a work in progress?

Tesla’s FSD software does not make a vehicle fully autonomous. It is available in the United States for an additional $12,000, or via a subscription of $199 per month. To access the beta version of FSD, a driver must achieve a certain safety score determined by Tesla.

This option allows the driver to enter a destination to which the vehicle then travels using Autopilot, Tesla's advanced driver-assistance system (ADAS). However, the driver must keep their hands on the wheel at all times and be ready to take back control at any moment.

The technology has raised concerns. In early August, the California Department of Motor Vehicles accused Tesla of misrepresenting its Autopilot and FSD systems. According to the agency, the names of the two features and Tesla's descriptions of them wrongly imply that they allow vehicles to operate fully autonomously.

In June, the NHTSA published crash data for driver-assisted vehicles for the first time. The agency found that Tesla vehicles using Autopilot were involved in 273 accidents between July 20, 2021 and May 21, 2022.

Author: Louis Mbembe
Source: BFM TV
