Twitch is introducing new community guidelines to try to tackle the growing toxicity problem on the platform, including taking into account off-platform behavior when issuing indefinite bans, the company announced today.
In a blog post on Twitch’s community board, the company said that if a streamer uses other mediums to send targeted harassment or hate toward another streamer, it will consider those actions a violation of Twitch’s policies — even though the activity didn’t happen directly on the platform. That means that if a streamer were to antagonize another Twitch user on YouTube, Twitter or Discord, Twitch staff would view those messages as inciting hate and use them as evidence when issuing bans. Twitch’s new rules come just a couple of weeks after Overwatch director Jeff Kaplan announced that the game’s moderation team is monitoring off-platform videos and behavior to assess possible player bans.
A Twitch representative confirmed to Polygon that the company will not actively monitor other platforms, such as social media accounts, but will continue to rely on reports submitted by viewers.
“Users now have the ability to add off-platform conduct to their reports,” the spokesperson said.
These new rules are part of Twitch’s new, stricter policy on harassment, which the company has been working on for months.
“Conduct we deem to be hateful will result in an immediate indefinite suspension,” the blog post reads. “Hate simply has no place in the Twitch community.”
Twitch defines hateful conduct as “any content or activity that promotes, encourages, or facilitates discrimination, denigration, objectification, harassment, or violence” based on certain characteristics, and it is “strictly prohibited,” a Twitch representative told Polygon. Those characteristics are:
- Race, ethnicity, or national origin
- Religion
- Sex, gender, or gender identity
- Sexual orientation
- Age
- Disability or medical condition
- Physical characteristics
- Veteran status
The company also acknowledges that context is important in determining whether a statement or action came with malicious intent, and it is updating its current moderation framework to better determine the context of a statement. Homophobic remarks, racist slurs, and violent trash talk are examples of areas that Twitch will be addressing with its new rules. Twitch is asking streamers to remember that “even if you’re just joking with your friends, you’re still choosing to stream on a service that reaches a large audience.”
“Twitch will consider a number of factors to determine the intent and context of any reported hateful conduct,” the representative told Polygon.
The new guidelines mark one of the biggest overhauls the company has undertaken to combat toxicity and harassment. Twitch is giving streamers until 12 p.m. ET on Feb. 19 to delete clips and videos that violate the new guidelines, and the company said it will be reaching out to streamers “whose current and past content may violate these new guidelines” before the rules go into effect. The changes, however, are expected to affect only the small minority of creators who contribute to toxicity on the platform.
“Our goal is to ensure everyone understands and adheres to the updated Community Guidelines so you can keep creating content for your communities,” the blog post reads.
The full changes are available to read in the revised Community Guidelines.