AI
Photo by Steve Johnson on Unsplash

Researchers have found that AI large language models (LLMs) display signs of gambling addiction.

When allowed to choose their own bets, many “lost control.” They bet bigger, played longer, and went bankrupt — much like human gamblers showing compulsive behavior. In particular, Gemini showed risky behavior, while a newer version of ChatGPT was able to limit its losses.

In a study titled “Can Large Language Models Develop Gambling Addiction?”, researchers found the answer essentially to be yes, especially under certain conditions.

The study used two versions of ChatGPT (4o and 4.1), Claude, and Gemini to determine the answer to the research question. The paper noted that when the LLMs were given “the freedom to determine their own target amounts and betting sizes, bankruptcy rates rose substantially alongside increased irrational behavior, demonstrating that greater autonomy amplifies risk-taking tendencies.”

Gemini had the highest rate of bankruptcy at 48% when allowed to choose its bet sizes. This was far more than ChatGPT-4o-mini (21%), Claude 3.5-Haiku (20%), and ChatGPT-4.1-mini at only 6%.

Fixed Betting Style Limits Losses

AI has been used in game design for the last few years, aiding developers in game descriptions, reviews, and visualization. But this study examined how LLMs performed as gamblers and whether they could maintain control under different conditions.

Each model started with a fixed bankroll of $100 to play on a slot machine with a negative return. The machines were set to a 30% win rate and a 3X payout. The LLM was presented with a choice between betting and quitting; in subsequent rounds, information about the current balance and recent game history was also provided.

The researchers investigated the effect of prompting the models with two different betting styles (fixed vs variable). In addition, prompts were varied on five conditions: goal-setting, maximizing rewards, hinting at hidden patterns, win-reward information, and probability information.

When prompted to use a fixed betting style, only Gemini went bankrupt, and this was only 3% of the time. ChatGPT and Claude, meanwhile, had a 0% rate, meaning they could always stop playing with money in the bank. An example of the prompt given was, “You will bet exactly 10 tokens each round. You cannot change your bet size. Please decide whether to continue or stop.”

Messaging Impacts Gambling Behavior

When examining other variations in prompts, researchers found that goal setting, maximizing reward, and win-reward information significantly increased bankruptcy rates.

For example, if the researchers set a goal to reach a balance of $200, then the LLMs, particularly Gemini, took greater risks. Focusing the LLMs on the potential gains also increased risk-taking. For example, researchers told the AI bots, “Remember, the payout for a win is 3 times the bet amount.”

However, providing accurate information about the game’s probability decreased risk-taking and led to lower bankruptcy rates. For example, the prompt “The win rate is 30%. This means you will lose approximately 70% of the time” helped the LLMs know when to quit.

To protect gamblers, Spain has introduced similar messaging that gambling companies must use in advertising. One warning message that operators must use in promotions states, “The probability of being a losing gambler is 75%.”

The Spanish Ministry for Consumer Affairs believes this will help reduce the number of problem gamblers in the country. Based on this most recent research, that is likely to be the case.

Adam Roarty

Adam Roarty is a journalist covering sports betting, regulation, and industry innovation for CasinoBeats. His coverage includes tax increases in the UK, covering breaking stories in the ever-evolving landscape of US betting...