Learning Objectives

  • Distinguish between reinforcement schedules

Remember, the best way to teach a person or animal a behavior is to use positive reinforcement. For example, Skinner used positive reinforcement to teach rats to press a lever in a Skinner box. At first, the rat might randomly hit the lever while exploring the box, and out would come a pellet of food. After eating the pellet, what do you think the hungry rat did next? It hit the lever again, and received another pellet of food. Each time the rat hit the lever, a pellet of food came out. When an organism receives a reinforcer each time it displays a behavior, it is called continuous reinforcement. This reinforcement schedule is the quickest way to teach someone a behavior, and it is especially effective in training a new behavior. Let’s look back at the dog that was learning to sit earlier in the module. Now, each time he sits, you give him a treat. Timing is important here: you will be most successful if you present the reinforcer immediately after he sits, so that he can make an association between the target behavior (sitting) and the consequence (getting a treat).

Once a behavior is trained, researchers and trainers often turn to another type of reinforcement schedule—partial reinforcement. In partial reinforcement, also referred to as intermittent reinforcement, the person or animal does not get reinforced every time they perform the desired behavior. There are several different types of partial reinforcement schedules (Table 1). These schedules are described as either fixed or variable, and as either interval or ratio. Fixed refers to the number of responses between reinforcements, or the amount of time between reinforcements, which is set and unchanging. Variable refers to the number of responses or amount of time between reinforcements, which varies or changes. Interval means the schedule is based on the time between reinforcements, and ratio means the schedule is based on the number of responses between reinforcements.
Table 1. Reinforcement Schedules
| Reinforcement Schedule | Description | Result | Example |
| --- | --- | --- | --- |
| Fixed interval | Reinforcement is delivered at predictable time intervals (e.g., after 5, 10, 15, and 20 minutes). | Moderate response rate with significant pauses after reinforcement | Hospital patient uses patient-controlled, doctor-timed pain relief |
| Variable interval | Reinforcement is delivered at unpredictable time intervals (e.g., after 5, 7, 10, and 20 minutes). | Moderate yet steady response rate | Checking Facebook |
| Fixed ratio | Reinforcement is delivered after a predictable number of responses (e.g., after 2, 4, 6, and 8 responses). | High response rate with pauses after reinforcement | Piecework: factory worker paid for every x number of items manufactured |
| Variable ratio | Reinforcement is delivered after an unpredictable number of responses (e.g., after 1, 4, 5, and 9 responses). | High and steady response rate | Gambling |

Now let’s combine these four terms. A fixed interval reinforcement schedule is when behavior is rewarded after a set amount of time. For example, June undergoes major surgery in a hospital. During recovery, she is expected to experience pain and will require prescription medications for pain relief. June is given an IV drip with a patient-controlled painkiller. Her doctor sets a limit: one dose per hour. June pushes a button when the pain becomes difficult to manage, and she receives a dose of medication. Since the reward (pain relief) only occurs on a fixed interval, there is no point in exhibiting the behavior when it will not be rewarded.

With a variable interval reinforcement schedule, the person or animal gets the reinforcement based on varying amounts of time, which are unpredictable. Say that Manuel is the manager at a fast-food restaurant. Every once in a while someone from the quality control division comes to Manuel’s restaurant. If the restaurant is clean and the service is fast, everyone on that shift earns a $20 bonus. Manuel never knows when the quality control person will show up, so he always tries to keep the restaurant clean and ensures that his employees provide prompt and courteous service. His efforts to provide prompt service and keep the restaurant clean are steady because he wants his crew to earn the bonus.

With a fixed ratio reinforcement schedule, there are a set number of responses that must occur before the behavior is rewarded. Carla sells glasses at an eyeglass store, and she earns a commission every time she sells a pair of glasses. She always tries to sell people more pairs of glasses, including prescription sunglasses or a backup pair, so she can increase her commission. She does not care whether the person really needs the prescription sunglasses; Carla just wants her commission. The quality of what Carla sells does not matter because her commission is not based on quality; it’s only based on the number of pairs sold. This distinction in the quality of performance can help determine which reinforcement method is most appropriate for a particular situation. Fixed ratios are better suited to optimize the quantity of output, whereas a fixed interval, in which the reward is not quantity based, can lead to a higher quality of output.

In a variable ratio reinforcement schedule, the number of responses needed for a reward varies. This is the most powerful partial reinforcement schedule. An example of the variable ratio reinforcement schedule is gambling. Imagine that Sarah—generally a smart, thrifty woman—visits Las Vegas for the first time. She is not a gambler, but out of curiosity she puts a quarter into the slot machine, and then another, and another. Nothing happens. Two dollars in quarters later, her curiosity is fading, and she is just about to quit. But then, the machine lights up, bells go off, and Sarah gets 50 quarters back. That’s more like it! Sarah gets back to inserting quarters with renewed interest, and a few minutes later she has used up all her gains and is $10 in the hole. Now might be a sensible time to quit. And yet, she keeps putting money into the slot machine because she never knows when the next reinforcement is coming. She keeps thinking that with the next quarter she could win $50, or $100, or even more. Because the reinforcement schedule in most types of gambling has a variable ratio schedule, people keep trying and hoping that the next time they will win big. This is one of the reasons that gambling is so addictive—and so resistant to extinction.
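To make the four schedules concrete, here is a minimal Python sketch (not from the text) that treats each schedule as a small decision rule applied to a stream of responses. The specific numbers (a 60-second interval, a 1-in-10 ratio, one response per second) are arbitrary choices for illustration, and the variable-ratio rule is written as a fixed per-response probability, a common way to approximate it in simulation.

```python
import random

def simulate(schedule, n_responses=200, seconds_per_response=1.0):
    """Run one responder against a reinforcement schedule and count rewards."""
    rewards = 0
    seconds_since_reward = 0.0      # time since the last reward
    responses_since_reward = 0      # responses since the last reward
    pending_delay = None            # used only by the variable-interval rule
    for _ in range(n_responses):
        seconds_since_reward += seconds_per_response
        responses_since_reward += 1
        if pending_delay is None:
            pending_delay = random.uniform(10, 110)  # unpredictable wait, ~60 s on average
        if schedule(seconds_since_reward, responses_since_reward, pending_delay):
            rewards += 1
            seconds_since_reward = 0.0
            responses_since_reward = 0
            pending_delay = None
    return rewards

# Each schedule is a rule over (time since reward, responses since reward, pending delay).
fixed_interval    = lambda t, n, d: t >= 60                  # first response after a set 60 s
variable_interval = lambda t, n, d: t >= d                   # first response after a varying wait
fixed_ratio       = lambda t, n, d: n >= 10                  # every 10th response
variable_ratio    = lambda t, n, d: random.random() < 0.10   # on average 1 in 10, unpredictably

if __name__ == "__main__":
    for name, rule in [("fixed interval", fixed_interval),
                       ("variable interval", variable_interval),
                       ("fixed ratio", fixed_ratio),
                       ("variable ratio", variable_ratio)]:
        print(f"{name:17s} rewards in 200 responses: {simulate(rule)}")
```

Continuous reinforcement would simply be `lambda t, n, d: True`. The reward counts come out similar within each pair; what differs in real behavior, as Table 1 above and the extinction discussion below describe, is the pattern of responding each schedule encourages and how persistent that responding is once rewards stop.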

In operant conditioning, extinction of a reinforced behavior occurs at some point after reinforcement stops, and the speed at which this happens depends on the reinforcement schedule. In a variable ratio schedule, the point of extinction comes very slowly, as described above. But in the other reinforcement schedules, extinction may come quickly. For example, if June presses the button for the pain relief medication before the allotted time her doctor has approved, no medication is administered. She is on a fixed interval reinforcement schedule (dosed hourly), so extinction occurs quickly when reinforcement doesn’t come at the expected time. Among the reinforcement schedules, variable ratio is the most productive and the most resistant to extinction. Fixed interval is the least productive and the easiest to extinguish (Figure 1).

Connect the Concepts: Gambling and the Brain

Skinner (1953) stated, “If the gambling establishment cannot persuade a patron to turn over money with no return, it may achieve the same effect by returning part of the patron’s money on a variable-ratio schedule” (p. 397).

Figure 2. Some research suggests that pathological gamblers use gambling to compensate for abnormally low levels of the hormone norepinephrine, which is associated with stress and is secreted in moments of arousal and thrill. (credit: Ted Murphy)

Skinner uses gambling as an example of the power and effectiveness of conditioning behavior based on a variable ratio reinforcement schedule. In fact, Skinner was so confident in his knowledge of gambling addiction that he even claimed he could turn a pigeon into a pathological gambler (“Skinner’s Utopia,” 1971). Beyond the power of variable ratio reinforcement, gambling seems to work on the brain in the same way as some addictive drugs. The Illinois Institute for Addiction Recovery (n.d.) reports evidence suggesting that pathological gambling is an addiction similar to a chemical addiction (Figure 2). Specifically, gambling may activate the reward centers of the brain, much like cocaine does. Research has shown that some pathological gamblers have lower levels of the neurotransmitter (brain chemical) known as norepinephrine than do normal gamblers (Roy, et al., 1988). According to a study conducted by Alec Roy and colleagues, norepinephrine is secreted when a person feels stress, arousal, or thrill; pathological gamblers use gambling to increase their levels of this neurotransmitter. Another researcher, neuroscientist Hans Breiter, has done extensive research on gambling and its effects on the brain. Breiter (as cited in Franzen, 2001) reports that “Monetary reward in a gambling-like experiment produces brain activation very similar to that observed in a cocaine addict receiving an infusion of cocaine” (para. 1). Deficiencies in serotonin (another neurotransmitter) might also contribute to compulsive behavior, including a gambling addiction.

It may be that pathological gamblers’ brains are different than those of other people, and perhaps this difference may somehow have led to their gambling addiction, as these studies seem to suggest. However, it is very difficult to ascertain the cause because it is impossible to conduct a true experiment (it would be unethical to try to turn randomly assigned participants into problem gamblers). Therefore, it may be that causation actually moves in the opposite direction—perhaps the act of gambling somehow changes neurotransmitter levels in some gamblers’ brains. It also is possible that some overlooked factor, or confounding variable, played a role in both the gambling addiction and the differences in brain chemistry.

Glossary

continuous reinforcement: rewarding a behavior every time it occurs
fixed interval reinforcement schedule: behavior is rewarded after a set amount of time
fixed ratio reinforcement schedule: set number of responses must occur before a behavior is rewarded
operant conditioning: form of learning in which the stimulus/experience happens after the behavior is demonstrated
variable interval reinforcement schedule: behavior is rewarded after unpredictable amounts of time have passed
variable ratio reinforcement schedule: number of responses differ before a behavior is rewarded

I spent part of last week on vacation from science in Las Vegas, where I thankfully avoided financial ruin due to some fortunate combination of genes, math awareness, and a wife who has no interest in gambling. Sure, I dabbled a bit in games of chance, but as soon as I got a little bit ahead on the blackjack tables I ran for my life, knowing that the probability would even out hard in the long run. For those concerned about the financial well-being of Sin City, they still managed to turn a profit on us, thanks to the low-return temptations of fine dining and French circus acts set to Beatles megamixes. But most of our time was spent on the free entertainment of people-watching and stuff-watching, observing row after row of people almost hypnotically at work on loud, noisy slot machines amid fake New York, Paris and Venice scenery.

It doesn’t take a PhD in neurobiology to conclude that slot machines are designed to lure people into a money-draining repetition, just as it doesn’t take expertise in the casino business to realize slots are absurdly profitable – there’s a reason why they outnumber table games 100-to-1. But I wanted to go back to the scientific literature to confirm a faint glimmer of information I retained from graduate school, specifically that slot machines are masterful manipulators of our brain’s natural reward system. Every feature – the incessant noise, the flashing lights, the position of the reels and the sound of the coins hitting the dish – is designed to hijack the parts of our brain built for the pursuit of food and sex and turn them into a river of quarters. Or so I remember.

Fortunately, there is a robust amount of research into why slot machines are so addictive, despite paying out only about 75% of what people put in. They are, some scientists have concluded, the most addictive of all the ways humans have designed to gamble, because pathological gambling develops faster in slots players and more money is spent on the machines than on other forms of gambling. In Spain, where gambling is legal and slot machines can be found in most bars, more than 20.3 billion dollars was spent on slots in 2008 – 44% of the total money Spaniards spent on gambling that year.
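That roughly 75% return is, by itself, enough to guarantee the house wins over time; the only question is how long a bankroll lasts. Below is a deliberately crude sketch, not a model of any particular machine: a quarter-per-spin game paying a single fixed prize, with a made-up 5% hit rate sized so the long-run return is 75%. Real machines have many prize tiers and different hit frequencies.

```python
import random

def play_session(bankroll=100.0, bet=0.25, payout_rate=0.75, hit_prob=0.05, seed=None):
    """Crude slot session: every spin costs `bet`; with probability `hit_prob`
    it pays a prize sized so that the long-run return equals `payout_rate`.
    Returns the number of spins before the bankroll can no longer cover a bet."""
    rng = random.Random(seed)
    prize = payout_rate * bet / hit_prob   # here: 0.75 * 0.25 / 0.05 = $3.75
    spins = 0
    while bankroll >= bet and spins < 100_000:
        bankroll -= bet
        if rng.random() < hit_prob:
            bankroll += prize
        spins += 1
    return spins

if __name__ == "__main__":
    sessions = sorted(play_session(seed=s) for s in range(1000))
    print("median spins before a $100 bankroll is gone:", sessions[len(sessions) // 2])
```

On average each spin gives back 75 cents of every dollar wagered, so the bankroll drifts steadily toward zero; the occasional prize only delays the end while making the decline feel unpredictable.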

That data was published earlier this month by a psychologist from the Universidad de Valencia named Mariano Choliz in the Journal of Gambling Studies. Yes, such a publication exists! In the background section of the paper, Choliz outlines the tricks that slot machines use to keep people feeding them:

  • Operating on a random payout schedule, but appearing to be a variable payout; i.e. fooling the player into thinking that the more money they play, the more likely they are to win.
  • “The illusion of control” in pressing buttons or pulling a lever to produce the outcome.
  • The “near-miss” factor (more on this below).
  • Increased arousal (where the sounds and flashing lights come in).
  • Able to be played with very little money; the allure of “penny” slots.
  • And perhaps most importantly, immediate gratification.

This last point is the subject of Choliz’s experiment, which put a group of ten pathological gamblers in front of two different slot machines. One machine produced a result (win or lose) 2 seconds after the coin was virtually dropped (it was a computer program); the other delayed the result until 10 seconds after the gambler hit play. In support of the immediate gratification theory, gamblers played almost twice as long on the 2-second machines as they did on the 10-second machines…even though the 10-second machines paid out more money on average!

Choliz concluded that the immediacy of the reward was part of what kept people at slot machines, making them so addictive. The quick turnaround between action and reward also allows people to get into a repetitious, uninterrupted behavior, which Choliz compares to the “Skinner boxes” of operant conditioning – the specialized cages where rats hit a lever for food or some other reward. It seems like a cruel comparison, but after my three days walking through the casinos, not an inaccurate one.
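A back-of-the-envelope way to see why the 2-second machine is the better Skinner box: the shorter the result delay, the more action-outcome pairings fit into the same stretch of play. The 3-second re-bet time below is an invented number purely for illustration.

```python
def cycles_per_minute(result_delay_s: float, rebet_time_s: float = 3.0) -> float:
    """Complete play-and-outcome cycles per minute for a given result delay."""
    return 60.0 / (result_delay_s + rebet_time_s)

for delay in (2, 10):
    print(f"{delay:>2}-second result delay: {cycles_per_minute(delay):.1f} plays per minute")
```

Under these assumptions the fast machine delivers roughly three times as many conditioning trials per minute, each with the outcome arriving almost on top of the action, which fits Choliz’s conclusion about immediacy.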

Another trick up the slot machine’s sleeve was profiled earlier this year by a group of scientists from the University of Cambridge. In the journal Neuron, Luke Clark and colleagues examined the “near-miss” effect, the observation that barely missing a big payout (i.e. two cherries on the payline while the third cherry is just off) is a powerful stimulator of gambling behavior.

The Cambridge researchers put their subjects in an fMRI machine to take images of their brains while they played a two-reel slot machine game. When the players hit a match and won money, the reward systems of the brain predictably got excited – the activation of the areas classically associated with responding to food or sex that I mentioned earlier. When players got a “near-miss,” they reported it as a negative experience, but also reported an increased desire to play! That feeling matched up with activation of two brain areas commonly associated with drug addiction: the ventral striatum and the insula (smokers who suffer insular damage suddenly lose the desire to smoke).

Clark and co. conclude that near-misses produce an “illusion of control” in gamblers, exploiting the credo of “practice makes perfect.” If you were learning a normal task such as hitting a baseball, a “near-miss” foul ball would suggest that you’re getting closer – it’s better than a complete whiff, after all. But for a slot machine, where pulling the lever has no impact on the reels other than to start them moving and start the internal computer calculating, a “near-miss” is as meaningless as any miss.
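How often does chance alone serve up these meaningless near-misses? A toy model makes the point: three independent reels, each showing one of 10 equally likely symbols on the payline (real machines weight their reel stops, so this is only an illustration), with “two of a kind” standing in for the near-miss.

```python
import random
from collections import Counter

def spin(n_symbols=10, rng=random):
    """One spin of a toy three-reel machine: each reel independently lands on
    one of n_symbols equally likely symbols on the payline."""
    return [rng.randrange(n_symbols) for _ in range(3)]

def classify(reels):
    a, b, c = reels
    if a == b == c:
        return "win"         # three of a kind
    if a == b or b == c or a == c:
        return "near-miss"   # two of a kind, the third 'just off'
    return "miss"

if __name__ == "__main__":
    counts = Counter(classify(spin()) for _ in range(1_000_000))
    total = sum(counts.values())
    for outcome in ("win", "near-miss", "miss"):
        print(f"{outcome:9s} {counts[outcome] / total:6.2%}")
```

Even with nothing rigged, two-of-a-kind outcomes turn up roughly 27 times as often as genuine wins in this setup, so a brain that reads near-misses as progress gets a steady stream of false encouragement.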

Nevertheless, it’s this type of “cognitive distortion,” as Clark and colleagues name it, that makes slot machines such effective manipulators of our brains. Those massive, gaudy casino-hotels that I wore out a pair of shoes strolling through last week weren’t just built on a crafty use of probability; they were built on an exploitation of brain functions we are only just beginning to understand.