Brain wave: The PhDs changing games

Using player data to polish video games.

You don’t know them. You’ve never met them. They like it that way. But a group of trained experts is changing what we know about how to make and play games.

They’re user research experts. They’re mostly psychologists. Many have PhDs, although they have diverse backgrounds. They have the latest tech at their disposal, and they understand the way you play games better than you do.

"Perception is subjective," says Celia Hodent, the head of UX, or user experience, at Epic Games. Hodent is one of these user research experts. Her statement is something of a philosophy — these experts are trying as hard as they can to temper subjectivity in design.

They don’t trust your gut.

They want data.

They use biometrics to study the way players’ eyes move. They take huge amounts of information created by players — information that can now be tracked due to better technology — and decide what missions to add or take away.

Whether it’s using player data to create new objectives in Assassin’s Creed or erasing abusive messages sent online in League of Legends, data and psychology are empowering design like never before.

Nick Yee, who worked at Ubisoft on its user research team and has now started his own consultancy in the same line of work, says the industry is changing dramatically as a result of this movement.

"This is very much a brave new world."


The brain, icons and you

"User experience is a philosophy ... it’s about having empathy with the audience."

Celia Hodent said these words to a room at the Game Developers Conference in 2014 — and they were met with loud applause. It was a positive reception for a design ethos Hodent has championed for years, although it hasn’t come as easily as she would have liked.

"I’ve had a lot of pushback," she says.

Hodent’s work centers on UX. It involves taking everything the player sees — the interface, the icons, even object placement — and viewing it through a psychological prism.

Unfortunately, Hodent says, some designers, especially those who have been making successful games for decades, see this approach as unnecessary. Why not go with the gut?

"Very often it’s a tiny thing that can make a huge amount of difference," she says.

Hodent, who worked at Ubisoft and LucasArts before joining Epic, is currently working on the title Fortnite. The game hasn’t even been released, but it’s already been through rigorous testing.


Especially on things you might not expect.

Like icons.

"Icons can be a huge thing," she says. "And not every icon is going to be intuitive. You don’t want people tripping up on those."

It may seem insignificant, Hodent says, but because everything in a game can be subjective, the wrong color can ruin a player’s experience.

In one Fortnite test, players couldn’t figure out how to craft the right ammunition for the corresponding weapon. Hodent’s tests led designers to change the icons.

"We have a mechanic in the game that is actually about harvesting and making resources, and when you start harvesting, there’s a little indication that tells you if you’re going to destroy the object faster," she says.

"We had a lot of trouble with it. It seems like such a small detail, but if you’re having trouble conveying that type of information, it doesn’t make the experience any fun."

The fix? The developers simply changed the colors, and players’ responses changed entirely.

While it may sound simple, there are complex processes behind these types of changes that, Hodent says, designers may not have the experience to appreciate.

"Depending on your background, you’re not going to perceive things the same way."

Hodent points to something called Gestalt psychology, which she says has a huge impact on how players perceive games. It’s complex, but part of this school of thought suggests the mind understands stimuli as being grouped rather than as individual parts. Basically, it’s easier to understand things if they’re portrayed as symmetrical or "clean."

Hodent says for games, this has a huge impact. Colors, lines and shapes automatically form meaning. "It’s why some players recognize characters based purely on color," she says. "We know the color red is a warning. It means something dangerous. We have all these codes making us perceive things differently.

"It’s really important when you’re a designer to check whether your audience is perceiving what you’ve designed. It’s very possible that they don’t see what you see."


Other solutions come from simply being aware of psychological principles. Hodent says the latest research indicates players aren’t as good at multitasking as they might think.

As a consequence, Fortnite developers had to rethink a series of complicated messages that initially appeared during combat.

"We’re gaining an incredible amount of insight on how people behave by bringing them in and looking at how they feel about a game," she says.

Hodent and her colleagues believe the impact of these changes can be huge. Early intervention gives developers the ability to remove frustrations in the player’s path — frustrations that, left unchecked, could cause them to abandon the game too early.

Remember, Hodent says. Everything is subjective.

Shouting the loudest often wins in design, Hodent says. Her job is to take a step back and take personal perspective out of the issue.

"I’m here to say that if we take a particular action, it may be perceived in the way not intended, so I’d eliminate that option.

"We have to make fast decisions, and sometimes I get pushback."

The science of playtesting is very much still an art.

"The more I work with people, the more they’re excited to test something. It gives you a good feeling to know you helped make a game better, and you have the data to back you up," says Hodent.

"We see players having a lot more fun, and that’s just so satisfying — to know that you’ve improved someone’s experience so much," she says.


Money can’t buy me fun

With so many gaming companies introducing in-game purchases in recent years, psychologists at major studios warn of one danger: using player data to push people to simply spend more money without enhancing the gaming experience.

Ben Lewis-Evans, a consultant at the firm Player Research, which conducts user research and analysis, says the player’s experience is paramount.

"It sounds corny, but we fight for the players," he says.

"That’s across the board. How are they using the game, how fun is it and what experience are they having?"

That’s also the stated goal of Ben Fisher, the design director at Radiant Worlds, the developers behind the creative sandbox MMO SkySaga. Fisher hired Player Research to conduct user research testing — and it’s completely changed the direction of the game.

"We’ve run tests ourselves, but it always helps to have outside data to back that up," he says. "You need the specific knowledge of how the brain works to see if people can understand your visual information."


The team at Player Research is made up of psychologists and neuroscience experts. Some have worked internally at development studios, including Ubisoft, and the team has worked with developers and publishers including Sony and EA.

Lewis-Evans and his team can find solutions to problems, Fisher says, that he couldn’t reach himself.

In particular, Fisher knew players had a problem with simulation sickness, but his team couldn’t figure out how to fix it. And it’s not the type of area where developers can easily experiment with different options; doing so is incredibly resource intensive and expensive.

Lewis-Evans and his team were able to identify the fixes: adjusting the field of view and camera height, smoothing camera motion and player movement, and rethinking the HUD presentation.
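One of those fixes, smoothing camera motion, can be sketched with a simple exponential moving average. This is an illustration of the general technique, not SkySaga's actual code; the alpha value and the sample data are invented.

```python
def smooth(samples, alpha=0.2):
    """Exponentially smooth a stream of camera positions.

    alpha close to 0 -> heavy smoothing (gentle but laggy camera);
    alpha close to 1 -> light smoothing (responsive but jittery).
    """
    smoothed = []
    current = samples[0]
    for s in samples:
        # Move a fraction of the way toward the new raw sample.
        current = current + alpha * (s - current)
        smoothed.append(current)
    return smoothed

# A jittery vertical camera trace: the smoothed version swings far less.
raw = [0.0, 1.0, -1.0, 1.0, -1.0, 0.0]
out = smooth(raw)
print(max(out) - min(out) < max(raw) - min(raw))  # True: jitter reduced
```

The trade-off is exactly the comfort-versus-responsiveness balance user researchers would test with players: heavier smoothing reduces the rapid motion associated with simulation sickness but makes the camera feel sluggish.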

"I’ve worked on 60 game projects during my time at Player Research ... we can talk about what we’ve learned from other games and what solutions may work."

Lewis-Evans’ tests were extensive: play tests, user research, ongoing player diaries and multiplayer tests for different stages of the game.

"Based on the feedback we gave, things would be updated, and it was useful to give this iteration and this iterative feedback."

But Fisher ran into more problems. In particular, players weren’t using a feature critical to the enjoyment of the game. There are multiple reasons why players might ignore something, and it’s Lewis-Evans’ job to figure out why.

"We made some wide reaching adjustments over our first couple of stages to smooth out the early play experience and make the moment-to-moment gameplay of the core loop more interesting," says Fisher.

Fisher says developers can often feel as though they have the answers for why play testers come up against brick walls. But he says the default solution for the designer — to simply fix the problem — isn’t what’s needed during a play test.


"It’s to find out why the game has put them into a negative state of mind," says Fisher.

And while Hodent has said her struggle has been to have user testing accepted by designers, Lewis-Evans says he faces a similar challenge: making sure designers understand that this isn’t "design by committee."

"We’re not just talking about what we want," he says. "We’re pointing to very concrete examples of things going wrong in our games, or things not going as intended, that we want to have fixed."

"You see arguments about game design as being an art or science," says Lewis-Evans. "I think it’s both."

"It’s just a matter of getting what the designer wants."

This problem isn’t limited to smaller studios. Jonathan Dankoff, the user research project manager at Ubisoft, says this line of work still attracts plenty of skepticism.

"One of the important differences we need to understand about user research is that ... often people think this is the film studio model, where we test it when it’s done, then we say to go back and change it to make certain things more palatable.

"That’s not the case at all. It’s our job to be there from day one and help the designers change their ideas to course correct over time."

That’s exactly what Ubisoft has spent the past 12 years improving.

Edward Kenway spies the fluke of a whale from his pirate ship in Assassin’s Creed 4: Black Flag Image: Ubisoft Montreal/Ubisoft

Psychology in the big leagues

Dankoff, who has a background in marketing rather than a PhD, heads up the user testing and research labs at Ubisoft, where scientists and other professionals apply these psychological principles to massive franchises like Assassin’s Creed.

Within the Ubisoft team, data analysts, mathematicians and biometrics specialists all work to understand psychological data in their games. It’s a collaborative effort, and that’s the way the studio wants it. The user research employees sit in design teams.

Using advanced tech like head tracking, the studio is able to obtain info from players in order to adjust their strategies.

For instance, in the Assassin’s Creed series, the user testing team debuted a feature wherein players are actually asked to give each mission a rating out of five stars.

"There are super interesting learnings we can use from that to see which missions our players like, and then match that information with their behavioral data," says Dankoff.
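The kind of matching Dankoff describes can be sketched as a simple aggregation: join each mission's star ratings with behavioral telemetry such as completion and time spent. The record shape, mission names and field names below are invented for illustration; Ubisoft's actual pipeline is not public.

```python
from collections import defaultdict

# Hypothetical per-player telemetry events, one per mission attempt.
events = [
    {"mission": "Prison Break", "stars": 5, "completed": True,  "minutes": 18},
    {"mission": "Prison Break", "stars": 4, "completed": True,  "minutes": 22},
    {"mission": "Diving Bell",  "stars": 2, "completed": False, "minutes": 35},
    {"mission": "Diving Bell",  "stars": 3, "completed": True,  "minutes": 40},
]

def summarize(events):
    """Group events by mission and report rating alongside behavior."""
    buckets = defaultdict(list)
    for e in events:
        buckets[e["mission"]].append(e)
    report = {}
    for mission, rows in buckets.items():
        report[mission] = {
            "avg_stars": sum(r["stars"] for r in rows) / len(rows),
            "completion_rate": sum(r["completed"] for r in rows) / len(rows),
            "avg_minutes": sum(r["minutes"] for r in rows) / len(rows),
        }
    return report

print(summarize(events)["Diving Bell"]["avg_stars"])  # 2.5
```

Seen side by side, a low average rating combined with a low completion rate and long play times is the sort of signal that flags a mission like the diving bell gate for closer study.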


Part of Dankoff’s job is to determine what players aren’t doing well. In Assassin’s Creed 4: Black Flag, players needed to obtain a Diving Bell before they could progress in the game. The developers wanted to use this as a "gate" to make sure players understood core concepts before they moved forward.

But the data showed players couldn’t afford the bell, and they became exasperated.

"Using a few consecutive studies, we were able to identify the parts of the economic loop that players didn’t understand or engage with and advertise them better," says Dankoff.

It’s important, he says, to use the data as a guide, not a bible. The data is only as good as the minds interpreting it, and it can be interpreted wrongly.

"Simple data-driven teams would have just reduced the price of the bell to the amount that players have when they get there and called it a day, as it fixed the problem and players could progress again and everything is fine in the world."

The real solution, he says, was more organic. The data was used to buff up the tutorials about the economic systems, rather than solving a symptom of the problem.

"This resulted in greater overall comprehension of the systems, a rewarding experience and challenge for the player, and it didn’t compromise the design intent of the creative team."

User research labs at Ubisoft are now in their 12th year — the company is getting quite comfortable with embedding user research within its design teams.

"We are science informing art ... and there is never a right answer. All we can do is help people make better decisions with better information," says Dankoff.

League of Legends

A higher purpose

Sometimes those better decisions have to come with a little coercion.

Jeffrey Lin at Riot Games has spent the last few years on a quest to stamp out hate and toxic speech in the company’s game, League of Legends.

He’s winning.

Armed with massive amounts of data, Lin has helped confine homophobic, sexist and racist language to just 2 percent of League of Legends matches. Years of experiments had shown players were 320 percent more likely to quit if they were exposed to such hateful speech.

Small changes made a big difference. A new team-building mechanism stopped players from rushing to grab the best characters, and communication was streamlined by showing which champions had been picked only after every member of a team had made their decisions.


Players who won their games after these changes saw a 23 percent drop in negative communications. When they lost, they saw a massive 36 percent fall. Gamers who used positive speech were more likely to end up winning.

More recently, Lin has been using these huge amounts of data to build an artificial intelligence system that automatically detects antisocial activity and warns the players responsible, or bans them if they continue.

During its first month of use, this artificial intelligence system reduced toxicity in games by 40 percent.
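As a toy illustration of the idea behind such a system, learning from labeled chat logs rather than hand-written rules, the sketch below trains a tiny Naive Bayes text classifier on a few labeled lines and flags a new message. Riot's production system is far more sophisticated; the training data, labels and code here are invented.

```python
import math
from collections import Counter

# Hypothetical labeled chat lines (label "toxic" or "ok").
training = [
    ("good game well played", "ok"),
    ("nice save thanks team", "ok"),
    ("uninstall you are trash", "toxic"),
    ("report this idiot trash team", "toxic"),
]

def train(rows):
    """Count words per label and messages per label."""
    words = {"ok": Counter(), "toxic": Counter()}
    counts = Counter()
    for text, label in rows:
        counts[label] += 1
        words[label].update(text.split())
    return words, counts

def classify(text, words, counts):
    """Pick the label with the highest log-probability score."""
    vocab = len(set(w for c in words.values() for w in c))
    scores = {}
    for label in counts:
        total = sum(words[label].values())
        # Log prior plus log likelihood with add-one smoothing.
        score = math.log(counts[label] / sum(counts.values()))
        for w in text.split():
            score += math.log((words[label][w] + 1) / (total + vocab))
        scores[label] = score
    return max(scores, key=scores.get)

words, counts = train(training)
print(classify("trash team uninstall", words, counts))  # toxic
```

A real moderation pipeline would train on millions of lines, handle misspellings and context, and feed its verdicts into the warning-and-ban flow the article describes, but the core loop is the same: labeled examples in, probability scores out.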

"I think the first few years here showed that ... we can solve these types of problems," he says.

"There are best practices even in the smallest of features."

This type of result motivates Lin, who has a PhD in psychology, to do more. Having access to a living community like League of Legends allows him to uncover all sorts of behavior.

"Working at Valve and now here, you find that you can answer some age-old questions about psychology about communities and societies and how they evolve," he says.

Lin’s work at Valve included biometrics testing on games like Left 4 Dead 2. After Lin joined Riot, he went on a hiring spree to recruit scientists who were hardcore gamers.

"We’ve come to a culture where we are very scientifically rigorous around all major decisions," he says.

That culture has spread — just a few years ago, the data scientists were in a separate building. Now they’re spread throughout the company, sitting next to engineers and designers, all attempting to make incremental progress on a game that is essentially a living organism.

If a designer has an idea? They need a hypothesis, and then it’s put to the test.

"There’s an expectation here now that you can’t just back up your arguments with your opinion as a designer," says Lin.


"We teach our scientists that you need a hypothesis. It could be that you think the best game length is 35 minutes and that you think this based on motivation theory. So you need a framework to either confirm or deny."
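One common framework for "confirming or denying" a hypothesis like this is a two-proportion z-test, comparing, say, quit rates between a 35-minute match cohort and a longer variant. This is a generic statistical sketch, not Riot's method, and the cohort sizes and quit counts are invented.

```python
import math

def two_proportion_z(quit_a, n_a, quit_b, n_b):
    """Return the z statistic for H0: the two quit rates are equal."""
    p_a, p_b = quit_a / n_a, quit_b / n_b
    pooled = (quit_a + quit_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical A/B data:
# 35-minute cohort: 120 of 2,000 players quit.
# 50-minute cohort: 190 of 2,000 players quit.
z = two_proportion_z(120, 2000, 190, 2000)
print(abs(z) > 1.96)  # True: reject H0 at the 5% level
```

Here the shorter-match cohort quits significantly less often, so the motivation-theory hypothesis survives this test; a z inside the ±1.96 band would have meant the data couldn't distinguish the two designs.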

But there’s a problem, Lin says, with going too far with this type of thinking. While a game will be able to act more as a "live service" and access huge amounts of data, Lin fears the day when designers start to think of the data as "god."

"There’s a concern that studios and designers will lean more toward data-driven decisions," he says. "They may not have answers, or a hypothesis, but do some analytics and try to figure out what the answer should be.

"And that’s a scary science, because you don’t know what you’re looking for. Designing games in that way could be a very dangerous path."

"It does make it harder for indies to keep up with that because they don’t have the same kinds of resources and the same kinds of labs," he says.

Ultimately, however, while data science and psychology are changing the way games are made, Lin, Hodent and others admit there’s only so much they can do.

Nick Yee — whose firm has created "player profiles" based on massive surveys to inform developers — says the challenge in the future will be making games that are entertaining and informed by data and psychology without being exploitative.

"We’ve seen that dance on the edge with social games," he says.

"We’re interested in how we create more engaging games. I think part of that is just trying to understand players better."