
Anyone who visited the dormant Twitch channel of Fortnite legend Ninja this weekend may have gotten a lot more than they bargained for. The page, it turns out, was directing users to a livestream of hardcore porn on Sunday instead of videogame battles. Ninja, also known as Tyler Blevins, understandably wasn't happy to learn his channel had veered into NSFW territory—he still has 14.7 million followers on the platform. “Disgusted and so sorry,” he wrote on Twitter. Emmett Shear, the CEO of Twitch, apologized to Ninja for the mistake the same night and said that the channel streaming porn had been taken down.
“We have also suspended these recommendations while we investigate how this content came to be promoted,” Shear wrote in a series of tweets Sunday. “It wasn’t our intent, but it should not have happened. No excuses.”
The timing couldn’t have been worse. Earlier this month, Ninja left Twitch and took his Fortnite talents to Mixer, a rival streaming platform owned by Microsoft. Typically, when Twitch creators aren’t active, their channels simply feature past videos. But around the same time, in an apparent attempt to keep viewers on the site longer, Twitch began experimenting with a new recommendation feature that turned dormant channels like Ninja’s into advertising space for other, active livestreams.
Recommendations can be powerful engines for mobilizing audiences. YouTube has attributed more than 70 percent of time spent on its platform to its recommendation algorithm (which has also been criticized for helping to radicalize its users). The feature Twitch was testing simply displayed the top streams in a certain category. Since Ninja primarily played Fortnite on Twitch, the company tried populating his zombie profile with other popular streams featuring the same game. A bad actor appears to have artificially inflated the number of viewers on their porn stream, which was added to the Fortnite category, sending it to the top of the charts—and thus to Ninja’s page. The incident was a high-profile mistake, but it also raises questions about Twitch’s moderation efforts more broadly.
Twitch is by far the largest platform in the world for watching gaming content, and now one of the most popular websites on the entire internet, visited more than Pornhub, according to Alexa rankings. Its stars have turned into bona fide celebrities who can earn seven figures each year in sponsorship deals. Twitch creators predominantly livestream themselves playing videogames and talking, sometimes for 18 hours or more at a time. All the while, fans flood an accompanying chat box with jokes, observations, compliments, pleas for attention, and harassment. Roughly a million people, by some estimates, are watching a Twitch stream at any given moment.
To moderate this ecosystem, Twitch relies on a blend of tech tools, professional moderators, and volunteers, much like Reddit or Wikipedia. (It has also recently tried suing bad actors in court.) Streamers appoint volunteer moderators to maintain order in their livestream chats, while the company relies on both artificial intelligence and humans to detect streams that break its rules. Twitch can’t lean as heavily on AI designed to analyze written text as Facebook and Twitter do, because the site isn’t primarily text-based. And it can’t preemptively scan videos for things like nudity before they’re published, because they unfold in real time.
So it might not be surprising that Twitch has hosted some problematic streams over the years. Last year, for example, a livestreamed shooting reportedly stayed up on the site for hours. But moderating live content has been a challenge for every social network, and the problem isn’t unique to Twitch. In March, Facebook was chastised for allowing the livestream of a mass shooter who killed 51 people in New Zealand to go viral. And YouTube, which also has a livestreaming feature, struggles even with videos that aren’t broadcast live. The difference is that significantly more information has been made public about how Facebook and YouTube police offensive content than about how Twitch does, in part because Twitch relies on a more decentralized model.