Say Hello to Your Friends: Robots Can Now Tattle on Your Babysitter's Bad Attitude

Image: Getty

If you’re a fan of hiding cameras in teddy bears, wait until you get a load of the robot that scans social media to tattle on your babysitter’s bad attitude.

According to The Washington Post, Predictim, an AI company that scans babysitters’ social media feeds for red flags, can tell whether your babysitter is a drug user, a bully, or just disrespectful with a bad attitude—all from their Instagram posts:

Predictim’s executives say they use language-processing algorithms and an image-recognition software known as “computer vision” to assess babysitters’ Facebook, Twitter and Instagram posts for clues about their offline life. The parent is provided the report exclusively and does not have to tell the sitter the results.


The system, which ranks a sitter from one to five in each category, is opt-in for the childcare provider, of course, but the parents get a notification if the individual declines. And if the candidate isn’t cheerful enough on Twitter, the robot will tattle on that too:

The risk ratings are divided into several categories, including explicit content and drug abuse. The start-up has also advertised that its system can evaluate babysitters on other personality traits, such as politeness, ability to work with others and “positivity.”

Predictim also doesn’t give parents any insight into how those numbers are calculated, reducing a babysitter’s potential for being a drug-using bully to an impenetrable score:

When one babysitter’s scan was flagged for possible bullying behavior, the unnerved mother who requested it said she couldn’t tell whether the software had spotted an old movie quote, song lyric or other phrase as opposed to actual bullying language.


For now, the company is targeting the mommy blog space, launching ads that claim Predictim can prevent “every parent’s worst nightmare.” One outlines the case of an eight-year-old girl in Kentucky who was severely injured by her babysitter:

“Had the parents of the little girl injured by this babysitter been able to use Predictim as part of their vetting process,” a company marketing document says, “they would never have left her alone with their precious child.”


But what it’s really doing is preventing people like Malissa Nielsen, a 24-year-old early childhood education major, from finding jobs they’re incredibly qualified for:

After she learned that the system had given her imperfect grades for bullying and disrespect, she was stunned. She had believed she was allowing the parents to review her social media, not consenting to having an algorithm dissect her personality. She also hadn’t been told of the results for a test that could cripple her only source of income.

“I would have wanted to investigate a little. Why would it think that about me?” Nielsen said. “A computer doesn’t have feelings. It can’t determine all that stuff.”


I’m predicting that Mary Anne, Dawn, and Jessi could make it past Predictim, but Kristy and Claudia would definitely be out.




Childcare is physically, mentally, and emotionally demanding work. Experience and reputation are extremely important, and no matter how much of either you have, you still have to fight hard for decent pay and any kind of sick days or benefits. It’s also incredibly undervalued work, and despite my well over a decade of experience and great references, I’m always competing with 17-year-olds and too often getting flak for asking for higher pay than you’d give a teen watching their little brother.

I understand parental anxieties, but *no one* should be expected to be a happy, mild-mannered, Stepford-style Mary Poppins in their personal life, day in and day out, no matter the circumstances. It’s inhumane to ask that of any person, and there’s not a nanny or babysitter in America who makes enough money to deserve this.