Hybrid model reveals people act less rationally in complex games, more predictably in simple ones

Ingrid Fadelli
contributing writer

Sadie Harley
scientific editor

Robert Egan
associate editor

Throughout their everyday lives, humans make a wide range of decisions that can impact their well-being, health, social connections, and finances. Understanding human decision-making is a key objective of many behavioral science studies, as it could in turn help devise interventions that encourage people to make better choices.
Researchers at Princeton University, Boston University and other institutes used machine learning to predict the strategic decisions of humans in various games. Their paper, published in Nature Human Behaviour, shows that a deep neural network trained on human decisions could predict the strategic choices of players with high levels of accuracy.
"Our main motivation is to use modern computational tools to uncover the cognitive mechanisms that drive how people behave in strategic situations," Jian-Qiao Zhu, first author of the paper, told Âé¶¹ÒùÔº.
"Traditionally, the Nash equilibrium has served as the standard model for rational strategic interaction–assuming certain idealized conditions–but it often falls short in explaining how people play these games.
"This gap has inspired a surge of behavioral models over the past few decades in the field of behavioral game theory and beyond, each attempting to account for the systematic deviations from Nash's predictions that we observe in human behavior. However, there's still no clear consensus on which models best capture these deviations, or under what conditions they succeed."
This study by Zhu and his colleagues builds on earlier behavioral game theory literature, combining large-scale experiments with computational models trained on human strategic decisions.
The team's objective was to compare existing models of decision-making and potentially identify new ones that better describe how people make decisions during strategic interactions with others.
"We used deep neural networks to help us explore the space of possible functions that map what a player sees (i.e., the game matrix) to what the player does (i.e., their strategic choices)," explained Zhu.
"A good mapping, in this context, would naturally lead to a strong descriptive model of human behavior. However, a key challenge associated with using neural networks as a model of human behavior is that they're not easily interpretable."

To improve the interpretability of the machine learning model they used, the researchers restricted the types of predictions it could make about human decisions. This allowed them to introduce theory-based elements into the model that outlined assumptions about the typical behavior of humans.
"Ultimately, we found that a hybrid model—combining a classical behavioral model (the quantal response model) with a neural network—could match the performance of the best unconstrained neural network," said Zhu.
"You can think of that unconstrained model as an approximate upper bound on how much human behavior is explainable, because in principle, it could approximate any function. In our hybrid model, the key innovation is that the noise parameter in the quantal response function is predicted by a neural network.
"This means the level of behavioral noise is no longer fixed—it depends on the specific game, or context, that the player is facing."
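The idea behind the hybrid model can be illustrated with a minimal sketch. In the classical quantal (logit) response model, choice probabilities are a softmax of expected payoffs with a single precision parameter; the paper's innovation is to let that parameter vary with the game. The payoff matrix, the `predicted_lam` mapping, and the "payoff spread" complexity proxy below are illustrative stand-ins for the paper's trained neural network, not the authors' actual implementation.

```python
import numpy as np

def quantal_response(payoffs, lam):
    """Quantal (logit) response: choice probabilities are a softmax of
    expected payoffs. Low lam = noisy, near-random play; high lam =
    near-rational best responding."""
    z = lam * np.asarray(payoffs, dtype=float)
    z -= z.max()  # subtract max for numerical stability
    p = np.exp(z)
    return p / p.sum()

# Classical quantal response: one fixed noise level for every game.
fixed_lam = 1.0

# Hybrid model (sketch): precision is predicted from the game itself.
# A toy stand-in for the paper's neural network: games with larger
# payoff spreads are treated as "easier", yielding higher precision.
def predicted_lam(payoff_matrix):
    gap = np.ptp(payoff_matrix)  # payoff spread as a crude complexity proxy
    return 0.5 + gap             # hypothetical mapping, not the paper's network

# A 2x2 game's payoffs for the row player.
game = np.array([[3.0, 0.0],
                 [1.0, 1.0]])
# Expected payoff of each row action against a uniform opponent.
row_expected = game.mean(axis=1)

p_fixed = quantal_response(row_expected, fixed_lam)
p_context = quantal_response(row_expected, predicted_lam(game))
print(p_fixed, p_context)  # context-dependent precision sharpens the choice
```

With a context-dependent precision, the same functional form predicts near-rational play in games the network scores as simple and noisier play in games it scores as complex, which is exactly the behavioral pattern the study reports.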
In their paper, the researchers offer an interpretation for the noise predicted by their neural network based on contextual information. Specifically, they interpret this context-dependent noise as a reflection of how complex a game is perceived to be by a player.
Essentially, the team suggests that people behave more rationally when playing games they perceive as easier. In contrast, when playing more complex games, people's choices could be influenced by various other factors, so the "noise" affecting their behavior increases.
As part of their future studies, the researchers would also like to shed more light on what makes a game "complex" or "easy." This could be achieved using the context-dependent noise parameter that they integrated into their model as a signature of "perceived difficulty."
"Our analysis provides a robust model comparison across a wide range of candidate models of decision-making," said Zhu.
"We now have strong evidence that introducing context-dependence into the quantal response model significantly improves its ability to capture human strategic behavior. More specifically, we identified key factors in the game matrix that shape game complexity: considerations of efficiency, the arithmetic difficulty of computing payoff differences, and the depth of reasoning required to arrive at a rational solution."
The findings of this recent study also highlight how lightly many people approach strategic decisions, which could leave them vulnerable to parties seeking to sway them toward irrational choices.
Once they gather more insight into what factors make games and decision-making scenarios more challenging for people, Zhu and his colleagues hope to start devising new behavioral science interventions aimed at prompting people to make more rational decisions.
"We're excited about this work, which demonstrates how modern computational tools can help uncover psychological theories of human behavior," added Zhu.
"The immediate next step will be to test whether our findings generalize beyond the matrix games used in our experiments. Real-world strategic interactions—such as those in war, diplomacy, elections, or market competitions—are often far more complex and nuanced.
"We hope to gain deeper insights into how people make decisions in these more realistic and high-stakes settings by leveraging and developing the modern toolkit."
More information: Jian-Qiao Zhu et al, Capturing the complexity of human strategic decision-making with machine learning, Nature Human Behaviour (2025).
Journal information: Nature Human Behaviour
© 2025 Science X Network