When technology doesn't work and lives are at risk
Dave Touretzky, 29 Mar 2021 00:42 PDT

A new article in Road & Track warns that "Tesla's 'Full Self Driving'
Beta Is Just Laughably Bad and Potentially Dangerous".

  https://www.roadandtrack.com/news/a35878363/teslas-full-self-driving-beta-is-just-laughably-bad-and-potentially-dangerous/

  In a 13-minute video posted to YouTube by user "AI Addict," we see a
  Model 3 with FSD Beta 8.2 fumbling its way around Oakland. It appears
  hapless and utterly confused at all times, never passably imitating a
  human driver. Early in the video, the front-seat passenger remarks at
  the car's correct decision to pass a bunch of double-parked cars
  rather than waiting behind them—but the moment of praise is cut short
  when the car parks itself right on the center line while trying to get
  into a left-turn lane.

The video is embedded in the article, so you can see for yourself.  The
article concludes:

  That leads to videos like this, where early adopters carry out
  uncontrolled tests on city streets, with pedestrians, cyclists, and
  other drivers unaware that they're part of the experiment. If even one
  of those Tesla drivers slips up, the consequences can be deadly.

  All of this testing is being carried out on public roads, for the
  benefit of the world's most valuable automaker, at basically zero
  cost. We've reached out to Tesla for comment on the video, but the
  company has no press office and does not typically respond to
  inquiries.

If you're teaching a lesson on AI ethics, instead of wasting time on the
fictional and unrealistic trolley problem, it might be better to talk
about the ethics of Tesla releasing this unsafe software, and of states
permitting its use by untrained consumers.

-- Dave Touretzky