The real decision as to whether someone will buy your services or products…or you…often has very little to do with your actual service or product.
More often, it has to do with how people make decisions…what causes their behavior.
This is especially true when people are experiencing information overload or are simply overwhelmed when you are talking with them.
Today you’ll see nine categories of “biases” that cause people to make the decisions they make. What’s cool is that most of these are mental short cuts that are typically ignored by people in the field.
Each factor can be directly applied to influencing others and selling your work.
Here’s a quick example and a key to understanding everything that follows:
If Paul has a belief about politics, Paul will see and process only further information that is consistent with that belief.
Paul will disregard all other information no matter how important and accurate it is.
I tried to get the son of one of my friends to play catch with me a few weeks ago.
“No, Kevin, it’s dangerous.”
(It was summertime. This photo is from a game of catch I played with my son 4 years ago.)
“Explain.”
“When I was a little kid I got hit in the head with a baseball.”
And my young friend develops a phobia of playing catch because of a one-trial learning experience he had as a kid. “Catch” becomes risky, so as life goes on he never plays catch again. Every time he’s faced with an invitation, a mental short cut brings him instantly to “no.” This is how people “make their decisions.”
That’s a mental short cut, and those mental short cuts evolved to keep humans alive.
More importantly, know that asking someone to do something they are phobic of can be akin to asking them to put their life at risk, or to risk experiencing great pain, for a very tiny payoff.
Some of the kids who do learn to play catch go on to play football.
When a football player catches a ball on the sideline during a game and there is a question (a close call) as to whether he is in bounds or out of bounds, there is typically no doubt in the mind of any individual fan watching the replay. Yet fans will view the same replay and come to different decisions, based upon their biases. The team you are cheering for caught the ball in bounds. The team you want to lose caught it out of bounds. It’s that simple. You see what you want to see until you practice seeing through a “real lens.”
How do people perceive you? Your products? Your service?
What happens when someone is deciding to buy you, your services or your product? The VERY SAME biases that go into whether the player was in bounds, Paul’s opinion about politics, or my friend’s decision about playing catch are the SAME BIASES that people filter information through to decide whether they will buy from YOU, whether they will do business with YOU, whether they will even TALK with you.
Good or bad, there are certain types of information that cause people to decide what they are going to do.
Mental Short Cuts Not Commonly Discussed by Others
U.S. military research reveals nine core categories of “bias” in processing information. (There are more than nine; these are very important ones.) That means there are nine different categories of information that stop people from thinking about what matters in making good decisions.
All of these factors are crucial for influence because decision making is the outcome of every attempt to influence another person. (You want them to make a decision.)
Simply put, if you want someone to buy you and your products, and you are their best option, you must take advantage of the processes that cause people to make decisions.
Research into how people make decisions while under pressure could help the U.S. military improve training for its leaders and lead to better decision-support systems. Studies have shown that when people process information, they develop unconscious strategies, mental short cuts – or biases – that simplify their decisions. The Georgia Tech Research Institute (GTRI) is revealing how these biases affect people when they’re dealing with lots of information – and little time to form conclusions.
“The immediate application for this research is to develop training programs to improve decision-making,” said Dennis Folds, a principal research scientist in GTRI’s Electronic Systems Laboratory. “Yet our findings could also help design new types of decision-support systems.” The research indicated that nine different kinds of biases can lead to errors in judgment when people are dealing with a lot of information. Meanwhile, the error rate was not as high as researchers expected for individuals under time pressure. Also, the study revealed that subjects who were trained to spot conditions that lead to decision-making biases were better at detecting “false-alarm opportunities.”
The Army Research Institute funded Folds to conduct a series of experiments that combined a high volume of data with time pressures. The experiments simulated the changing reality of military decision-makers. Commanders today communicate more directly with field personnel. The amount and variety of information at their disposal has escalated with sources ranging from real-time sensors and voice communications to archived data. The result can be ambiguous, disjointed information rather than integrated, organized reports.
“This puts far greater pressure on leaders, who must make faster decisions while sifting through more data,” Folds noted. In his experiments, he considered previous research on seven specific biases that affect individuals who must wrestle with large amounts of data. The bullets below are from his findings; the elaboration under each finding is mine.
- Absence of evidence. Missing, relevant information is not properly considered.
People make decisions without having all the information they need to make a great decision. They might try one perfume, like it, buy it, and then buy it for the rest of their lives. Or maybe it’s looking at 5 houses instead of 25 when you go to buy a house. There is a limit of course, but people often make decisions despite missing information.
- Availability. Recent events or well-known conjecture provide convenient explanations.
Listen to conversations and you’ll hear people attribute bizarre things to The President or The Democrats. “The stock market is up, Trump must be doing a good job.” “Under Trump we have the lowest unemployment rate in history.” People said the same thing about Obama, and of course it’s a major stretch to tie unemployment to one person, even when that person is the President.
- Oversensitivity to consistency. People give more weight to multiple reports of information, even if the data came from the same source.
If 10 people say they believe the same thing, people are more inclined to believe the 10 vs. the 1 who says the opposite. After all, if one person is different from the group, the group must be right. More interesting perhaps is that the same person can repeat a message 10 times and it is received in much the same way as 10 different people saying that message. Repetition is the mother of learning, whether what’s learned is right or not.
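A toy probability sketch makes the trap concrete. The numbers here are invented for illustration (a 50% starting belief and reports that are 4 times more likely if the claim is true; neither figure is from Folds’ research): ten truly independent reports should shift belief enormously, while one source repeating itself ten times logically counts only once.

```python
def posterior(prior, likelihood_ratios):
    """Update a prior probability using a list of evidence likelihood ratios."""
    odds = prior / (1 - prior)
    for lr in likelihood_ratios:
        odds *= lr  # each INDEPENDENT piece of evidence multiplies the odds
    return odds / (1 + odds)

# Assumption (illustrative only): a report is 4x more likely if the claim is true.
LR = 4.0

ten_independent_sources = posterior(0.5, [LR] * 10)  # ten separate witnesses
one_source_repeated = posterior(0.5, [LR])           # one witness, repeated ten times

print(f"Ten independent reports: {ten_independent_sources:.4f}")
print(f"One report repeated:     {one_source_repeated:.4f}")
```

The oversensitivity-to-consistency bias is treating the second case as if it were the first: repetition from a single source gets weighed like fresh, independent confirmation.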
- Persistence of discredited information. Information once deemed relevant continues to influence even after it has been discredited.
Tell a story today, and even if it is found to be wrong and “corrected” tomorrow, it doesn’t matter: people will remember and believe the story they heard first, even in the light of intense discrediting.
- Randomness. People perceive a causal relationship when two or more events share some similarity, although the events aren’t related.
Imagine a couple goes on a vacation. They have an argument the last day of their trip. When it comes time to plan the next trip one of the two says, “…as long as we don’t have a fight on the last day….”
- Sample size. Evidence from small samples is seen as having the same significance as larger samples.
Bloggers quote research from sample sizes as small as 10 or 12 people as if it matters in supporting a claim that “something works.” What you discover is that larger sample sizes are important. Once you have a large sample size, you are looking for a representative group of people. If you wanted to say “Americans think” or “Americans believe,” you’d want a number of people from each state, each culture, each gender. Then you have a sample that matters. And sometimes you discover that what is true for men doesn’t apply to women, and so on.
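A short simulation shows why a sample of 10 can’t carry the same weight as a sample of 1,000. The 30% “true” agreement rate and the poll sizes are made-up numbers for illustration:

```python
import random

random.seed(42)  # reproducible runs

POPULATION_RATE = 0.30  # assume 30% of the population actually agrees

def poll_range(sample_size, trials=1000):
    """Simulate many polls of the same population; return min and max observed rates."""
    rates = []
    for _ in range(trials):
        agrees = sum(random.random() < POPULATION_RATE for _ in range(sample_size))
        rates.append(agrees / sample_size)
    return min(rates), max(rates)

for n in (10, 1000):
    low, high = poll_range(n)
    print(f"sample size {n:4d}: observed agreement ranged from {low:.0%} to {high:.0%}")
```

With 10 respondents the observed rate swings wildly from poll to poll; with 1,000 it stays close to the true 30%. That spread is exactly why small-sample “evidence” deserves far less weight than the sample-size bias grants it.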
- Vividness. When people perceive information directly, it has greater impact than information they receive secondhand — even if the secondhand information has more substance.
Being there live and in person is far, far more valuable than the other person getting the same information online or from a friend. The closer you are to the person you are talking with, the more impact the information has.
To test the effects of these biases, Folds had experiment subjects view an inbox on a computer screen containing a variety of text messages, maps, photographs and video and audio recordings. Subjects (the majority being Georgia Tech ROTC students) were instructed to report certain military situations, such as incidents of sniper fire or acts of suspected sabotage. They were not to report other events, such as normal accidents in an urban area unrelated to enemy activity.
To decide whether or not an event should be reported, subjects reviewed a series of messages that contained both bona fide evidence and information created to trigger the biases that cause poor decisions. In each trial, subjects were allowed enough time to spend an average of 20 seconds per data element, plus one additional minute for reporting; they were also asked to attach information that supported their decision.
In the first experiment, all seven biases appeared, with the greatest number of errors caused by vividness and oversensitivity to consistency. In addition, Folds discovered two new biases that can hinder the quality of rapid decisions:
- Superficial similarity. Evidence is considered relevant because of some superficial attribute, such as a key word in a message title. For example, a hostage situation might have been reported earlier, and then another message shows up in the inbox with the word “hostage” in its header, although the message’s actual content has nothing to do with hostages.
- Sensationalist appeal. Items containing exaggerated claims or threats influence a decision-maker even when there is no substance to the content.
Claims or threats that wire into the fears of specific groups of people carry a lot of weight with those people.
Folds was surprised at how well subjects could perform the task while under pressure, he said. Although he expected an accuracy rate of about 50 percent, subjects correctly reported 70 percent of incidents.
In a second experiment, researchers divided subjects into two groups, using one as a control group while training the other group how to spot conditions that spark decision-making biases. Subjects who received training were able to detect about twice as many “false-alarm opportunities” as the control group.
The biggest difference between the two groups involved “persistence of discredited information” and “small sample” biases. Forty-eight percent of trained subjects were able to recognize when a “persistence” bias existed compared to 18 percent of the control group. Fifty percent of trained subjects caught the “sample-size” traps versus 11 percent of the control group. Although training helped participants recognize when traps existed, it didn’t help them identify the specific bias. “When subjects were under pressure to make decisions rapidly, the distinctiveness of the categories fell apart,” Folds explained. “That’s significant, because it helps us tailor training efforts.”
The experiments also revealed what kind of information is meaningful to decision-makers, Folds noted. Software designed especially for the trials tracked when subjects opened a document for the first time and when they went back for a second or third look. The amount of time subjects spent reviewing data, along with the data they attached to reports, showed a decided preference for text messages over other formats.
Where can you find more information like this? The only place we know that consolidates the newest influence research studies:
Science of Influence (V.2): The Master’s Home Study Course
by Kevin Hogan
You don’t have to have mastered or even listened to Volume 1 to fully enjoy and master the content of Volume 2. All new and independent of the other 5 sets in the series, this program is unique.
Here are just some of the nuggets you will learn when you receive the 12 CDs in Volume 2 of the series:
- The newest research studies on what affects the persuasion process
- The One Question that someone MUST say “Yes” to every time!
- Learn what may be the single most important element of influence you have ever been introduced to. I have NEVER released this information on audio, video or in manual form
- Discover how skeptical and non-skeptical people perceive and respond to persuasive messages in a VERY different fashion. (Hint: If you don’t know this information you will automatically lose almost 1/4 of all of your encounters.)
- Ethical techniques to hypnotically enter another person’s mind and reshuffle their deck!
- Discover whether the desire to gain or the fear of loss is TRULY the far greater motivator, and how to harness that power in your persuasive messages.
- The one way that reciprocity can blow up and completely backfire.
- How to prepare your unconscious mind to always present the right body language at the right time.
- There is one KEY factor in making your clients’ decisions permanent: Here it is!
- How to specifically use Hypnotic Confusion in influential messages.
- The most effective non-coercive way to gain compliance on record.
- How do you create metaphors…based upon the person/audience you are speaking to?
- So much more!
Science of Influence: Tactics and Techniques of Persuasion