Elon Musk’s Statements About Autopilot Could Be ‘Deepfakes,’ Tesla Defence Lawyers Tell Court

Tesla’s lawyers have claimed that statements made by CEO Elon Musk about the company’s Autopilot software cannot be trusted because they could be deepfakes. The argument was made in response to a lawsuit blaming the company for the death of Apple engineer Walter Huang in a 2018 crash. The Huang family’s attorneys are seeking to depose Musk about his comments on the software’s safety.

Tesla stated that Musk could not recall the details of such claims and that, given his high public profile, he is especially likely to be the subject of deepfakes. The judge in the case, however, found the argument “deeply troubling” and ordered Musk to give a limited, three-hour deposition about his statements.

The use of deepfake technology in litigation raises questions about the reliability of public figures’ statements and the potential for evading accountability.

Walter Huang died in 2018 when the Tesla Model X he was driving crashed. His family’s attorneys allege that Tesla’s driver-assist software was responsible for his death, and they sought to depose Musk about his public statements on the software’s safety. In response, Tesla’s lawyers argued that Musk’s statements could be deepfakes and therefore should not be trusted. The argument was made to avoid having Musk deposed under oath in the lawsuit.

The judge in the case, however, rejected Tesla’s argument, finding it concerning that public figures like Musk could invoke the possibility of deepfakes to avoid responsibility for their statements. The judge tentatively ordered Musk to give a limited deposition on his statements about the software’s safety.

While California judges often issue tentative rulings, they are usually finalized with few changes after the hearing. The lawsuit is set to go to trial on July 31.

Tesla’s argument highlights the potential for deepfake claims to be used to evade accountability, which could have significant implications for the legal system as the technology becomes more widespread.
