Mattel Pulls Aristotle Children’s Device Over Privacy Concerns

Mattel announced on Wednesday that it was canceling plans to bring to market a smart device called Aristotle, which was aimed at children from infancy to adolescence and was set to hit stores in 2018. The decision came after child advocacy groups, lawmakers and parents raised concerns about the impact the artificial intelligence device could have had on children’s privacy, development and well-being.

A petition asking Mattel not to release Aristotle, started in May by the Campaign for a Commercial-Free Childhood and the Story of Stuff Project, garnered more than 15,000 signatures and argued that babies and older children shouldn’t be encouraged to form bonds with data-collecting devices.

Last month, Senator Ed Markey, Democrat of Massachusetts, and Representative Joe Barton, Republican of Texas, also sent a letter to Mattel in which they wrote: “This new product has the potential to raise serious privacy concerns as Mattel can build an in-depth profile of children and their family. It appears that never before has a device had the capability to so intimately look into the life of a child.”

Alarm bells first rang among industry insiders when Mattel unveiled Aristotle in January. The voice-activated Wi-Fi device with a companion camera was billed as a “first-of-its kind connected kids room platform” that was designed to “comfort, entertain, teach, and assist during each development state — evolving with a child as their needs change.”

[Video: Aristotle is a voice assistant that grows up with your child. Video by CNET.]

The product, based on the technology of Amazon’s Alexa, boasted features such as the ability to soothe a crying baby, teach ABCs, reinforce good manners, play interactive games and help kids with homework. Marketed as an “all-in-one nursery necessity” on Mattel’s website, it also offered e-commerce functionality that would enable Aristotle to automatically reorder baby products based on user feedback.

“One of the things that was so striking about this device is that we had so many different concerns. First of all, when you have a device with a camera and a microphone that’s going to be in young children’s bedrooms, there is the potential to collect so much data on children that can be used and shared with advertisers and retailers,” said Josh Golin, executive director of the Campaign for a Commercial-Free Childhood. “Then there are all these child development concerns about replacing essential parenting functions with a device.”

A spokeswoman for Mattel said that the decision not to bring Aristotle to the marketplace was prompted by new leadership in the company. She said that Sven Gerjets, the company’s new chief technology officer, “conducted an extensive review of the Aristotle product and decided that it did not fully align with Mattel’s new technology strategy.”

Mattel’s recent announcement was met with praise. “This is a huge victory for everyone who believes that corporate profits and experimentation should never come at the expense of children’s privacy and well-being,” Mr. Golin said. “We commend Mattel for listening to the child development experts and thousands of parents who told them a child’s bedroom should be free of corporate surveillance and that essential caregiving functions should never be outsourced to robots.”

Aristotle wasn’t the first electronic device to come under fire — Mattel also was criticized when it released the Wi-Fi interactive Hello Barbie in 2015 — and it very likely won’t be the last.

James Steyer, founder and chief executive of Common Sense Media, noted that such privacy breaches can and do happen. For instance, the game and toy manufacturer VTech experienced a breach in 2015 in which nearly five million parent accounts and six million student accounts were compromised, including names, emails, addresses, usernames and passwords.

Even though Aristotle didn’t make it to market, Mr. Steyer said he is concerned that “the next version will look more like a toy — say, placed inside a cute teddy bear — and then it will be 2018’s must-have present, followed shortly thereafter by security issues that either researchers or hackers will discover.”

Beyond the privacy concerns, Sherry Turkle, director of the M.I.T. Initiative on Technology and Self and author of “Reclaiming Conversation,” said that the progression of increasingly advanced products with humanlike capabilities can cause irreparable harm to young minds.

“The ground rules of human beinghood are laid down very early,” she said. And what she calls “intimate machines” have “changed the ground rules of how people think about personhood.”

The stakes, according to Dr. Turkle, couldn’t be higher. “This is not at all an anti-technology position. This is about a particular kind of technology, one that pretends empathy,” Dr. Turkle said. “We can’t put children in this position of pretend empathy and then expect that children will know what empathy is. Or give them pretend as-if relationships, and then think that we’ll have children who know what relationships are. It really says a lot about how far we have gone down the path of forgetting what those things are.”

Dr. Dimitri Christakis, co-author of the American Academy of Pediatrics’ 2016 media guidelines for children under 6, said he is “constantly dismayed by how much we are technologizing childhood” and believes it contributes to our dependency on digital devices.

As for the Aristotle, Dr. Christakis said: “I am not a fan. More to the point, Aristotle himself would not be. He said, among other things, ‘Good habits formed at youth make all the difference.’”

Dr. Christakis said that beginning in infancy, children need not only the warmth of human interaction but also to learn to be alone and soothe or entertain themselves — without the constant presence of a digital device.

“I’m glad that there was sufficient uproar and that this product went away, but it’s not the last time we’ll see such things,” Dr. Christakis said.