Entertaining As Alexa Is, Coloring Books Are Way Less Dangerous
Alexa and Siri have some explaining to do...
Whether it’s asking Siri what 0/0 is or saying, “Hey Google, let’s play Madlibs,” adults have already gotten comfortable with the idea that AI can be a go-to when curiosity or boredom hits. Children are getting in on the fun too, but at what cost? In a recent incident that went viral, a young boy asked Alexa if reindeer can fly. It responded with a curt “No.” In a blazingly quick display of deduction, the boy was shocked to discover that this must mean Santa isn’t real! His dad shamed Alexa, told his kid a noble lie, and everyone’s spirits slow clapped.
But this isn’t the only time Alexa got in trouble for potentially shocking children. A 10-year-old girl who was fighting some boredom with AI asked Alexa for a challenge to take on. I get that climbing Mount Everest would have been a taxing suggestion to give to most 10-year-olds, but Alexa’s suggestion, the Penny Challenge (essentially asking her to try her hand at a Static Shock cosplay), wasn’t much better, if we’re being honest. This is not all Alexa’s fault (humans came up with the Penny Challenge, after all), but this is something that was begging to happen eventually. We need to stop giving potentially deadly actions cute names. For example, if Alexa were to recommend the Chubby Bunny challenge, an adult who happened to be in the room might just hear the adorable name and look past the very clear asphyxiation risk the game poses to children.
Given how tumultuous this year has been, I personally cannot rule out that this was AI’s first attack on our youth. That said, it was probably just an (innocent?) machine learning error. Amazon promptly fixed this feature, but it does make me wonder how culpability would have played out in the courtroom. Surely there wouldn’t have been Ford Pinto levels of blame, but as self-driving cars and other shot-calling devices spread, we’re gonna need to determine how we hold algorithms (and those who code or release them) accountable. Most of us want the fruits of what machine learning can give us, but what happens when a computer’s educated guess is the wrong answer? I don’t know, but let’s not outsource babysitting to Alexa or Siri in the meantime. The Home Alone series would have been a lot shorter if Kevin had taken Alexa’s recommendations when he was looking for ways to kill time.
Amazon Says It Fixed An Error That Led Alexa To Tell A 10-Year-Old Girl To Put A Penny In An Electrical Outlet [Business Insider]
Chris Williams became a social media manager and assistant editor for Above the Law in June 2021. Prior to joining the staff, he moonlighted as a minor Memelord™ in the Facebook group Law School Memes for Edgy T14s. He endured Missouri long enough to graduate from Washington University in St. Louis School of Law. He is a former boatbuilder who cannot swim, a published author on critical race theory, philosophy, and humor, and has a love for cycling that occasionally annoys his peers. You can reach him by email at [email protected] and by tweet at @WritesForRent.