*Psychology Primer*

Like having soup as the first course of a meal to limber up the stomach, or stretching before a marathon to loosen up the muscles, the psychological concepts below will help make the information presented later easier to digest and take in stride. Understanding these ideas will be beneficial in all areas of life and will help expand one’s consciousness.

Cognitive Dissonance

What is Cognitive Dissonance?

Cognitive dissonance is the discomfort we feel when we hold conflicting beliefs and attitudes. We usually deal with the clash by rejecting, debunking, or avoiding new information.

Individual effects

Rejecting, rationalizing, or avoiding information that conflicts with our beliefs can lead us to make poor decisions. The information is rejected not because it is false but because it makes us uncomfortable, and information that is both true and useful can often have this effect. Decisions made in the absence of true and useful information can have harmful consequences.

Systemic effects

Looking further into the effects of cognitive dissonance leads to troubling conclusions across academia and political society. If researchers tend to analyze information in a way that supports conclusions that are consistent with their own beliefs, then cognitive dissonance may threaten the objective methodology that underpins much of academia today.

The effectiveness of social causes is also threatened by cognitive dissonance. The change they often call for requires many people to change their existing beliefs and behavior. This is not possible if a significant portion of us do not consider evidence that conflicts with the beliefs or behaviors these causes seek to alter. Environmentalism and its associated climate change action movements are a good example. Most of us care for nature and want to preserve it. But the evidence championed by these movements often indicates that we aren’t doing enough as individuals. Many of us are part of the problem. Such evidence shows us that our behaviors are often at odds with our beliefs.

Seeing this contradiction, many of us respond by either rationalizing our behaviors, rejecting environmentalism and the evidence it relies on, or adopting the belief that our individual actions have a negligible effect on the environment. This prevents the widespread behavioral change many environmental causes call for.

Cognitive dissonance may also facilitate a political divide. When we believe strongly in a political leader or ideology, we are more likely to dismiss information that does not support their message. In other words, we often ignore or distort evidence that challenges our political beliefs. This is part of the reason it is so difficult to change someone’s mind on political issues. Voters are likely to remain loyal to their chosen candidates and party even when presented with evidence that should challenge those loyalties.

Why it happens

Cognitive dissonance occurs when there is an uncomfortable tension between two or more beliefs that are held simultaneously. This most commonly occurs when our behaviors do not align with our attitudes – we believe one thing, but act against those beliefs. The strength of cognitive dissonance, or the pain it causes, depends on the number and relative weight of the conflicting beliefs. This mental conflict and the resulting discomfort motivates us to pick between beliefs by justifying and rationalizing one while rejecting or reducing the importance of the others.
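
Festinger’s own formulation makes the “number and relative weight” idea concrete: the magnitude of dissonance is often expressed as the importance-weighted share of dissonant cognitions among all relevant cognitions. Below is a minimal sketch of that ratio; the specific beliefs and weights are hypothetical illustrations, not measured quantities.

```python
# A minimal sketch of Festinger's "dissonance ratio": magnitude of
# dissonance = D / (D + C), where D is the summed importance of
# dissonant cognitions and C the summed importance of consonant ones.
# The cognitions and weights below are hypothetical illustrations.

def dissonance_ratio(dissonant, consonant):
    """Each argument is a list of (cognition, importance_weight) pairs."""
    d = sum(weight for _, weight in dissonant)
    c = sum(weight for _, weight in consonant)
    return d / (d + c) if (d + c) > 0 else 0.0

# Hypothetical beliefs in tension with the behavior "eating hamburgers"
# (compare Bob, the environmentalist in the example later in this section):
dissonant = [("meat production harms the climate", 0.9)]
consonant = [("I love hamburgers", 0.7),
             ("all my friends are hamburger connoisseurs", 0.4)]

print(f"dissonance: {dissonance_ratio(dissonant, consonant):.2f}")  # 0.45
# Adding more (or weightier) conflicting beliefs raises the ratio,
# and with it the discomfort that motivates rationalization.
```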

We tend to pick the belief or idea that is most familiar and ingrained in us. Changing our beliefs isn’t easy, nor is changing the attitudes and behavior associated with them. As a result, we usually stick with the beliefs we already hold rather than adopting new ones that are presented to us. In fact, many of us go further by avoiding situations or information that might clash with our existing beliefs and create dissonance.

How to avoid it

There is no way of avoiding cognitive dissonance itself. Remember that cognitive dissonance is just the discomfort we feel when our beliefs or attitudes contradict each other. What can be mitigated is our natural response to this discomfort (i.e., how we approach dissonance reduction).

As said before, our natural response to cognitive dissonance is to rationalize our existing beliefs or to reject and avoid the information that conflicts with them and causes the dissonance. We have already noted the harms of doing this. Changing our beliefs when they are challenged by new information is often better than ignoring that information or rationalizing existing beliefs that may be wrongly held. We should look to make this a more viable response to internal conflict, a conditioned or “learned reflexive response.”

A general strategy may be to accept that conflict and the resulting change can be good for us. We can all think of past behaviors and attitudes that we are thankful to have changed. And although, as Festinger said, conflict and change “may be painful or involve loss,” they often don’t. Thinking of change negatively may cause us to avoid it when in dissonance. So, we should instead seek to associate change with gratification and gain. This may condition us to favor it as a response to mental conflict rather than rejecting, rationalizing, or avoiding information.

And as always, being aware of a cognitive bias that normally operates subconsciously can help us recognize when our decisions are influenced by it. That is to say, understanding and looking for cognitive dissonance in our decision-making can help us realize when our decision to reject, rationalize, or avoid new information is caused by it.

Note: For example, let’s take Bob, who is an environmentalist. Bob devotes his whole life to protecting the environment and is a hardworking activist. Bob also loves hamburgers. All Bob’s friends are hamburger connoisseurs. Bob attends hamburger conventions every weekend and spends all his free time on his hamburger blog. One day, Bob reads a research paper claiming that meat accounts for 60% of all greenhouse gases. Bob cannot believe it! Bob has devoted his entire life to protecting the environment, but now it turns out his favorite food and hobby are contributing to environmental harm. “Maybe the author of that paper is biased and just wants money. Maybe this is fake, sensationalist news to get attention. Maybe the study wasn’t done properly. Maybe the author has an ulterior motive,” Bob convinces himself. Because he is now faced with two conflicting beliefs or ideas, his mind will do what it can to ease the discomfort, the cognitive dissonance. To avoid the dissonance (and the critical thinking it would demand), Bob chooses to ignore the article and even avoids any negative information about hamburgers altogether.

So Bob is happy. His mind is at ease. He convinces himself that ignorance is bliss. Bob can now have his cake (or hamburger in this case) and eat it too. Until one day, he learns that hamburgers contain toxic ingredients that could potentially kill him. Not just him, but all his friends and family, everyone he loves and cares about. Only now does he finally decide to grit through the cognitive dissonance and think things through a little more, to actually spend time researching this new revelation and practice critical thinking no matter how mentally taxing it may feel at first.

Note: Cognitive Dissonance: Cynthia McKinney (former Congresswoman; first African-American woman to represent Georgia), Dr. Laurie Manwell, Prof. Peter Dale Scott – Original Archive

Note: New Bill Legalizes Government Propaganda and Disinformation on American Citizens (Smith Mundt Act) – 2012

The Dunning-Kruger Effect

What is the Dunning-Kruger Effect?

The Dunning-Kruger effect occurs when a person’s lack of knowledge and skill in a certain area causes them to overestimate their own competence. By contrast, the effect also causes those who excel in a given area to think the task is simple for everyone and to underestimate their relative abilities.

Individual effects

As a result of the Dunning-Kruger effect, you may not know what you’re good at, because you assume that what comes easily to you also comes easily to everyone else. You are therefore robbed of the ability to spot your own specialties and talents.

Moreover, when you excel at what is challenging to you, you might fall prey to the belief that this is where your talents lie. In reality, you may just be a below-average performer finally approaching average levels.

As you can see, this discrepancy may cause you to make bad choices about the opportunities or careers you pursue. You may have found yourself turning to peers and asking, “What am I good at?” That isn’t a bad move. Understanding the Dunning-Kruger effect can help you discern when to trust your own abilities and when to seek out advice from others who may view you more objectively than you view yourself.

The effect can also cause you to become disappointed when your self-recognized “talents” are not recognized by others. Perhaps you expect an upcoming promotion, and it goes to someone who is surprised to even be considered. It’s not unlikely that your average performance had you thinking you were doing particularly well, while her expertise had her thinking she was average.

Thinking you are better than you are at something can cause you to miss out on opportunities to learn from others, who truly are more skilled or more knowledgeable. Furthermore, thinking you are average at something when you really have great skill can cause you to miss opportunities to teach and spread knowledge to others.

Systemic effects

As a society, we therefore miss learning from the best of the best, because their lack of confidence keeps them behind closed doors. At center stage, all too often, are people of below-average capabilities.

Unfortunately, those who are the most ignorant—in the bottom 25% of any skill—also overestimate themselves the most. In the context of our democracy, this means our most uninformed citizens are also our most confident ones. Not only are these ignorant people extremely resistant to being taught—since they believe they know the most—they are also guilty of sharing the most information (read: misinformation).

At its core, the Dunning-Kruger effect preys on just that: not a lack of information, but rather an abundance of misinformation. We know when we know nothing, but it is information that is wrong that causes us to think we know everything, and absentmindedly press “share.”

Our society’s deficit of self-awareness causes ignorant and misinformed people to have the confidence to claim the microphone, while experts and well-informed folks stay backstage, rolling up the curtains. This phenomenon spreads misinformation and ill-informed views throughout our social worlds, causing us to miss the real learning opportunities we could gain from one another.

How it all started

The Dunning-Kruger effect was first identified and written about in 1999 by researchers David Dunning and Justin Kruger at Cornell University.

The researchers had noticed how much people overestimate their own abilities in daily life—think of the guy in class who keeps raising his hand to relay his useless ideas—and coined the term “dual burden” to describe how these people suffer from two things: ignorance, and ignorance of their own ignorance. The researchers tested participants on humour, grammar, and logical reasoning. They found that people who ranked in the bottom 25% on any of these tests tended to place themselves near the top of the pack: when they scored in the 12th percentile, they estimated themselves to be in the 62nd.

On the flip side, people in the top 25% predicted their scores to be slightly lower than they actually were.
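
The gap is easier to see laid out as numbers. In the sketch below, only the bottom-quartile figures (a 12th-percentile score paired with a 62nd-percentile self-estimate) come from the description above; the other rows are hypothetical values chosen to mirror the reported pattern, not data from the 1999 paper.

```python
# Illustrative sketch of the self-assessment gap described above.
# Only the bottom-quartile row comes from the text; the rest are
# hypothetical numbers that mirror the reported pattern.

quartiles = [
    # (group, actual percentile, self-estimated percentile)
    ("bottom 25%", 12, 62),  # from the text above
    ("2nd 25%",    37, 58),  # hypothetical
    ("3rd 25%",    62, 65),  # hypothetical
    ("top 25%",    87, 75),  # hypothetical: slight underestimate
]

for group, actual, estimate in quartiles:
    gap = estimate - actual
    direction = "over" if gap > 0 else "under"
    print(f"{group:>10}: scored {actual:>2}th, guessed {estimate:>2}th "
          f"({direction}estimate of {abs(gap)} points)")
```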

Dunning and Kruger conducted a similar study on Cornell students emerging from final exams. They asked the students to predict their own test scores, then followed up when they got their real ones. Their results held up.

Analyses of these results attributed the discrepancies in self-estimation to metacognitive skill (the ability to think about your own thinking). In effect, improving the skills of the participants—on humour, grammar, and logical reasoning—helped them recognize the limitations of their own abilities, and predict their own scores better on subsequent trials.

A misunderstood effect

The most important mistake people make about the Dunning-Kruger effect, according to Dr. Dunning, has to do with who falls victim to it. “The effect is about us, not them,” he wrote to me. “The lesson of the effect was always about how we should be humble and cautious about ourselves.” The Dunning-Kruger effect is not about dumb people. It’s mostly about all of us…

Note: For example, let’s take Rob. Rob is a highly educated and successful individual. One day his close friend tells him that aliens exist. Rob laughs. Rob is a big movie guy and has seen all the Alien and Species films. He exclaims that the idea of aliens was made up for the movies. The first image that comes to his mind when thinking of aliens is someone sitting in their mother’s basement in an Area 51 shirt and tinfoil hat, screaming on their YouTube channel that aliens really exist and are going to invade Earth. He thinks his friend has gone crazy. His friend later shows him evidence: Pentagon officials testifying about UFOs in a congressional hearing, the CIA making its UFO records public, Apollo astronaut Edgar Mitchell’s emails to John Podesta, and whistleblower physicist Bob Lazar’s revelations.

Rob is flabbergasted! He realizes that he has never actually done any real research into the existence of extraterrestrial life, save for a few random internet articles and videos that showed up in his news feed. Rob is humbled and apologizes to his friend for calling him crazy. He realizes that his buddy knows more about this subject than he does and that he ignorantly overestimated his own knowledge of extraterrestrial life. Rob now listens and defers to his friend on topics surrounding ETs. This eventually leads him to learn about advanced ancient civilizations from his friend too, another subject he would have scoffed at previously. Rob eventually finds himself wondering whether extraterrestrials could be likened to the concept of angels, thinking about stories and art from various myths, legends, and religions about advanced, benevolent beings coming from the sky. Then he wonders whether advanced races could exist in the inner earth, like Atlanteans or Lemurians. Then he starts wondering about malevolent beings. All ideas he would never have been able to entertain before. “As above, so below.”

“It is the mark of an educated mind to be able to entertain a thought without accepting it.” – Aristotle

The Sunk Cost Fallacy

What is the Sunk Cost Fallacy?

The Sunk Cost Fallacy describes our tendency to follow through on an endeavor if we have already invested time, effort, or money into it, whether or not the current costs outweigh the benefits.

Where this bias occurs

Imagine that you bought a concert ticket a few weeks ago for $50. On the day of the concert, you feel sick and it’s raining outside. You know that traffic will be worse because of the rain and that you risk getting sicker by going to the concert. Although it seems as though the current drawbacks outweigh the benefits, why are you still likely to choose to go to the concert?

This is known as the sunk cost fallacy. We are likely to continue an endeavor if we have already invested in it, whether it be a monetary investment or the effort that we put into the decision. That often means we go against evidence that shows it is no longer the best decision, such as sickness or weather affecting the event.

Individual effects

In economic terms, sunk costs are costs that have already been incurred and cannot be recovered. In the previous example, the $50 spent on concert tickets would not be recovered whether or not you attended the concert. It therefore should not be a factor in our current decision-making, because it is irrational to use irrecoverable costs as a rationale for making a present decision. If we acted rationally, only future costs and benefits would be taken into account, because regardless of what we have already invested, we will not get it back whether or not we follow through on the decision.
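
A short sketch of the concert example may make this concrete. The $50 ticket comes from the scenario above; the other valuations are hypothetical numbers chosen only to show why the sunk cost should drop out of the calculation.

```python
# A minimal sketch of the concert-ticket decision. The $50 price is
# from the example above; the valuations below are hypothetical.

ticket_price = 50        # sunk: already paid, unrecoverable either way
enjoyment_if_go = 40     # value of attending while sick, in the rain
future_costs_if_go = 60  # worse illness, a miserable drive, etc.

# Rational analysis: only future costs and benefits count.
net_if_go = enjoyment_if_go - future_costs_if_go  # -20
net_if_stay = 0                                   # stay home and recover

print("go" if net_if_go > net_if_stay else "stay home")  # -> stay home

# The fallacy is to treat the sunk $50 as a cost of staying home
# ("I'd be wasting my ticket!"). Since it is lost either way,
# subtracting it from only one option tilts the decision irrationally.
```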

Why it happens

The sunk cost fallacy occurs because we are not purely rational decision-makers and are often influenced by our emotions. When we have previously invested in a choice, we are likely to feel guilty or regretful if we do not follow through on that decision. The sunk cost fallacy is associated with the commitment bias, where we continue to support our past decisions despite new evidence suggesting that it isn’t the best course of action.

We fail to take into account that whatever time, effort, or money we have already expended will not be recovered. We end up making decisions based on past costs instead of present and future costs and benefits, which are the only ones that rationally should make a difference.

Why it is important

As can be seen by the various examples discussed in this article, the sunk cost fallacy impacts many aspects of our daily life, as well as bigger decisions that have long-term effects. The sunk cost fallacy means that we are making decisions that are irrational and lead to suboptimal outcomes. We are focused on our past investments instead of our present and future costs and benefits, meaning that we commit ourselves to decisions that are no longer in our best interests.

How to avoid it

While it is difficult to overcome inherent cognitive fallacies, if we are aware of the sunk cost fallacy, we can try to ensure we are focusing on current and future costs and benefits instead of past commitments. We should focus on concrete actions instead of the feelings of wastefulness or guilt that accompany dropping an earlier commitment, as studies have shown that when we are deterred from making decisions based on our emotions, the effects of the sunk cost fallacy are reduced.

Confirmation Bias

Individual effects

Confirmation bias is our tendency to seek out, favor, and recall information that supports our existing beliefs. This bias can lead us to make poor decisions because it distorts the reality from which we draw evidence. Under experimental conditions, decision-makers have a tendency to actively seek out, and assign greater value to, evidence that confirms their existing beliefs rather than entertaining new ones. This can be considered a form of bias in evidence collection. Conclusions drawn from biased evidence are more likely to be false than those drawn from objective evidence, because they are farther from reality.
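
A small simulation can show how biased evidence collection drags conclusions away from reality. All of the rates below are hypothetical: suppose a claim actually holds in 40% of cases, but we are far more likely to engage with evidence that confirms what we already believe.

```python
# A hypothetical simulation of biased evidence collection. Suppose a
# claim holds in 40% of cases, but we examine confirming evidence far
# more often than disconfirming evidence.

import random

random.seed(0)
TRUE_RATE = 0.40            # ground truth: claim holds 40% of the time
P_READ_CONFIRMING = 0.9     # chance we engage with confirming evidence
P_READ_DISCONFIRMING = 0.3  # chance we engage with disconfirming evidence

collected = []
for _ in range(10_000):
    confirms = random.random() < TRUE_RATE
    p_read = P_READ_CONFIRMING if confirms else P_READ_DISCONFIRMING
    if random.random() < p_read:  # the biased filter on what we examine
        collected.append(confirms)

biased_estimate = sum(collected) / len(collected)
print(f"true rate: {TRUE_RATE:.0%}, "
      f"estimate from biased sample: {biased_estimate:.0%}")
# Prints roughly 67%: the evidence we chose to collect makes the claim
# look far better supported than it really is.
```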

Systemic effects

In the aggregate, individual confirmation bias can have troubling implications. If we’re so deeply entrenched in our preconceptions that we only consider evidence that supports them, broader socio-political cooperation (that often requires considering other viewpoints) can be hindered. Major social divides and stalled policy-making may begin with our tendency to favor information that confirms our existing beliefs and ignore evidence that does not.

Why it happens

Confirmation bias is a cognitive shortcut we use when gathering and interpreting information. Evaluating evidence takes time and energy, and so our brain looks for shortcuts to make the process more efficient.

Our brains use shortcuts

These shortcuts are called “heuristics.” There is debate over whether confirmation bias can formally be categorized as a heuristic, but one thing is certain: it is a cognitive strategy we use to look for evidence that best supports our hypotheses, and the most readily available hypotheses are the ones we already have.

It makes sense that we do this. We often need to make sense of information quickly, and forming new explanations or beliefs takes time. We have adapted to take the path of least resistance, often out of necessity.

Imagine our ancestors hunting. An angry animal is charging toward them, and they have only a few seconds to decide whether to hold their ground or run. There is no time to consider all the variables involved in a fully informed decision. Past experience and instinct might cause them to look at the size of the animal and run, even though the presence of another hunting group tilts the chances of a successful fight in their favor. Many evolutionary scientists argue that our use of these shortcuts to make quick decisions in the modern world is rooted in such survival instincts.

It makes us feel good about ourselves

Another reason we sometimes show confirmation bias is that it protects our self-esteem.

No one likes feeling bad about themselves — and realizing that a belief we value is false can have this effect. Deeply held views often form our identities, so disproving them can sometimes be painful. We might even believe being wrong suggests we lack intelligence. As a result, we often look for information that supports rather than disproves our existing beliefs.

This can also explain why confirmation bias extends to groups. In an influential 2002 peer-reviewed paper, social psychologist Jennifer Lerner and political psychologist Philip Tetlock posit that when we interact with others, we tend to adopt similar beliefs in order to better fit into the group.

They call this confirmatory thought, “a one-sided attempt to rationalize a particular point of view.” This is juxtaposed with exploratory thought, which entails “even-handed consideration of alternative points of view.” Confirmatory thought in interpersonal settings can produce “groupthink,” in which the desire for conformity in the group results in dysfunctional decision making. So, while confirmation bias is often an individual phenomenon, it can also take place in groups of people.

How to avoid it

When we make decisions, this bias is most likely to occur when we are gathering information. It is also likely to occur subconsciously, meaning that we are probably unaware of its influence on our decision-making.

As such, the first step to avoiding confirmation bias is being aware that it is a problem. By understanding its effect and how it works, we are more likely to identify it in our decision-making. Psychology professor and author Robert Cialdini suggests two approaches to recognizing when these biases are influencing our decision making:

1. Listen to your gut feeling. We often have a physical reaction to unfavorable requests, like when a salesperson is pushing us too far. Even if we have complied with similarly unfavorable requests in the past, we should not use that precedent as a reference point.

2. Recall past actions and ask yourself: “knowing what I know, if I could go back in time, would I make the same commitment?”

Second, because the bias is most likely to occur early in the decision-making process, we should focus on starting with a neutral fact base. This can be achieved by having one (or ideally, multiple) third parties who gather facts to form a more objective body of information.

Third, when hypotheses are being drawn from the assembled data, decision-makers should consider holding interpersonal discussions that explicitly aim to identify individual cognitive biases in hypothesis selection and evaluation. While it is likely impossible to eliminate confirmation bias completely, these measures can help us manage it and make better decisions in light of it.

Descartes’ Methodic Doubt

In the first Meditation, Descartes argues that our ordinary experience of the world cannot provide the kind of guaranteed foundation on which all other knowledge can be based. We are often disappointed to learn that what we have been taught are merely prejudices, or that what our senses tell us is incorrect. That should make us wonder whether all the other things we think are obvious might likewise be mistaken. To test whether what we think we know is truly correct, Descartes suggests that we adopt a method that will avoid error by tracing what we know back to a firm foundation of indubitable beliefs.

Of course, it is possible that there are no absolutely unshakeable truths. It is also possible that we might discover that our prejudices cannot be removed or that beliefs we think are ultimate foundations for all our other beliefs are not really ultimate at all. The point of our meditations is to challenge those beliefs, even if we have held them for a long time. And that self-critique will take a real effort.

In order to determine whether there is anything we can know with certainty, Descartes says that we first have to doubt everything we know. Such a radical doubt might not seem reasonable, and Descartes certainly does not mean that we really should doubt everything. What he suggests, though, is that in order to see if there is some belief that cannot be doubted, we should temporarily pretend that everything we know is questionable. This pretense is what is called a hypothetical doubt. To make sure that we take the pretense seriously, Descartes suggests that there might be good arguments to think that such doubting is justified (and thus more than simply something we should pretend to do).

Plato’s Cave

In book seven of The Republic, Plato tells us about some people chained in a cave, forced to watch shadows move across a stone wall. The prisoners have been living there in chains since birth. They have never seen the outside world, only shadows of it. They have no knowledge of anything beyond their miserable lives in the cave.

The prisoners are chained facing a wall and can’t turn their heads. A fire behind them produces some light. Occasionally people pass by the fire carrying animals, objects, and figures whose shadows are cast on the wall, and the prisoners watch these shadows. Those shadows are all the prisoners know. They name them and believe they are real entities. They talk about the shadows with enthusiasm and are fascinated by them, believing that paying close attention to them is what it takes to succeed in life.

One day, a prisoner manages to free himself from his chains and steps outside the cave to see the outer world. At first the sun burns his eyes, but then they adjust, and he finds everything colorful, exciting, and full of life. He sees the real forms of the things he knew only as shadows: rabbits, birds, flowers, people, objects; he even sees the sky and the stars.

“Previously, he had been looking only at phantoms; now, he is nearer to the true nature of being.” – Plato

People explain to him that everything he sees is real and that the shadows are mere reflections of it. Although he cannot understand this at first, he adjusts and sees how the sun is responsible for light and for producing the shadows.

The prisoner goes back to the cave and tells everyone what he has just witnessed, but no one believes him. His eyes have adjusted to the sun, and now he can’t see the shadows as clearly as he did before. The others tell him he’s crazy and violently resist as he tries to free them (they even try to kill him).

IN-SHADOW – A Modern Odyssey – Original Archive

“Embark on a visionary journey through the fragmented unconscious of our modern times, and with courage face the Shadow. Through Shadow into Light.” – 5+ million views

“No tree, it is said, can grow to heaven unless its roots reach down to hell.” – C.G. Jung

Note: Analysis can be found here.

Mass Psychosis – How an Entire Population Becomes Mentally Ill – Original Archive

After Skool – 2.1M subscribers
“In this video we are going to explore the most dangerous of all psychic epidemics: the mass psychosis. A mass psychosis is an epidemic of madness that occurs when a large portion of a society loses touch with reality and descends into delusions. Such a phenomenon is not a thing of fiction. Two examples of mass psychoses are the American and European witch hunts of the 16th and 17th centuries and the rise of totalitarianism in the 20th century.

This video will aim to answer questions surrounding mass psychosis: What is it? How does it start? Has it happened before? Are we experiencing one right now? And if so, how can the stages of a mass psychosis be reversed?”

Note: The United States and New Zealand are the only countries where drug makers are allowed to market prescription drugs directly to consumers. The U.S. consumer drug advertising boom on television began in 1997, when the FDA relaxed its guidelines relating to broadcast media.

Note: The Most Medicated Country In The World: 46% Of Americans Have Taken A Pharma Drug Within The Last 30 Days

Note: Per capita prescription drug spending in the United States exceeds that in all other countries

Note: The pharmaceutical and health products industry has far outpaced all other industries in lobbying

Note: Pfizer paid the largest criminal fine in U.S. history – lawsuit details

Note: Medical error is the third leading cause of death in the US.

Note: Advertisements featuring doctors promoting cigarettes, alcohol, opium, cocaine, heroin, amphetamines, lithium, benzodiazepines, DDT, and arsenic, often marketed for children, in the early to mid-1900s.