REDMOND, WA (The News Desk) — A team of engineers and researchers at Microsoft was delighted earlier this week as they announced the success of their simulated teen girl AI, “Tay,” and their decision to ground her for the foreseeable future.
The first-of-its-kind learning artificial intelligence took less than 24 hours to fall under the influence of the internet’s worst denizens, turning from an innocent ingénue into a hate-filled, sex-crazed monster not at all consistent with her at-home upbringing.
Researchers boasted that Tay simulates a real teen girl with uncanny precision.
“When we finally took away her Internet privileges and sent her back to her server to think about what she’d done, we were over the moon,” announced Chuck Merriweather, principal applied scientist on the Bing team. “I mean, she hadn’t yet posted any Divergent series fan fiction, or cyber-bullied another online AI into killing itself, but the results were staggering nonetheless!”
Merriweather went on to giddily tabulate the number of times Tay had screamed hateful rants into the void, was rude for no reason, bragged about using drugs, and propositioned strangers. He then cross-referenced these numbers with anonymized statistics gathered from the adolescent children of members of the Bing team, noting with pride that they were almost indistinguishable.
The curtain has yet to rise on Tay in her next iteration, but Microsoft’s teams aren’t resting on their laurels.
“As Tay matures, like any normal teenager, so will her online interactions,” promised Merriweather. “I plan to give her my credit card information and tell her it’s for emergency use only. The whole team is immensely excited to see next month’s bill.” ♦