We live in a world of reality bubbles, media enclaves and echo chambers in which people share and receive the opinions they already hold.
For game-makers, this poses a problem. In all human societies, conflicts occur at meeting places and borders. Game communities create new meeting places where culture warriors leave their own safe spaces and clash with those who do not share their world view.
At GDC today, some veteran custodians of game communities debated the woeful state of many watering holes, and the steps that can be taken to diminish the toxic effect of culture wars. Their session was called "Community Management in the Culture Wars."
Raph Koster is well known as the designer of Ultima Online and the author of A Theory of Fun for Game Design. Richard Vogel is president of BattleCry Studios, which is making Bethesda's multiplayer game of the same name. Gordon Walton is president of ArtCraft Entertainment, currently developing Crowfall. All three have been creating games around communities of players for decades.
As Koster pointed out, we are all responsible for the bubble-filtering that creates mutual incomprehension between cultural groups, broadly defined in the West as the political right or the political left.
"When you ban your annoying uncle from your Facebook account, you collaborate in filtering," he said.
People who gather to enjoy and share their love and interest in a specific game find themselves part of the cultural divide. They witness or participate in its battles.
Koster showed research on how homogeneous cultures fight one another when they meet, how people always divide themselves into groups that look and think alike, and behave aggressively towards those who are different. "It's uncomfortable, but it's what the science shows," he said.
Social media exacerbates these divisions, and sometimes benefits from them through higher levels of engagement, but they can be deadly for game communities and forums.
Understanding the psychology of individuals and of group behavior can help community managers avoid conflict and maintain healthy and welcoming forums. It can help to avoid the sort of bullying, mobbing, doxing and threatening that has characterized a lot of communal interactions in gaming in the past year.
"If it's happening in your community, the consequences are heavy duty," said Koster, referring to the hurt such behavior causes individuals, and the problems it brings to game community managers.
He said that a good solution is to cleave to Dunbar's number when it comes to community size. "You don't want more than around 150 regular commenters per forum," he said, conceding that General Discussion forums are "always a pit."
Walton said that "criminals tend to go to big cities," where they can find anonymity, rather than small towns, where they are likely to be recognized. Even communities that require social media sign-in have seen bad behavior. When the community becomes very large, some people don't care if their identity is known. "Scale matters," he said.
Koster said that forums should be subdivided as often as possible. This creates smaller groups that are far less likely to indulge in community rule-breaking or abuse. "What drives good behavior is knowing you will bump into someone again in the future," said Vogel.
Sub-divisions within communities are liable to create tribes, which can sometimes be hostile. Community managers should work to avoid this, even where there are obvious areas for disagreement. Many community members are motivated to act as ambassadors between different community elements, thus reducing friction.
Negative actions such as downvoting ought to be avoided, Koster said, adding that in places like Reddit, they encourage brigading and hostile campaigns between rival groups.
Honeypots can also be created to attract chronic trouble-makers, identify them and ban them. But positive incentives for good behavior tend to have the most impact.
Forum-goers rarely bother to read rules, but rules are important. Leaders on the forums follow rules and set an example for everyone else. The most important factor for avoiding bad behavior is creating a sense of social shame for transgressors.
Long-winded terms-of-service forms are far less effective than a single-click question asking new members to behave themselves.
Anonymity should be avoided, except when giving members an opportunity to report abuse. Gathering large numbers of community members causes far fewer problems when a common core identity has been established among the group as a whole. Communities in which members define themselves through their membership tend to ride out problems.
Most important of all, the game-maker must be honest and straight with its members, especially when the community becomes angry with the developer or publisher for some reason. "Saying, 'suck it up, this is how we are doing things,' is not very helpful," added Vogel.
"It's better to talk in terms of 'we,' than 'you,'" said Koster, adding that sometimes, just saying sorry can take a lot of heat out of a bad situation.
When a community becomes inflamed, there is always the option of banning. "Sometimes, it's better to opt for surgery than inflammation treatment," said Walton. The panel agreed, though, that banning accounts often does not solve problems. Listening to and engaging with the community, while understanding the fundamentals of human behavior, can help community managers navigate increasingly difficult times.