Authority/Respect and Agenda-type Control

Could moralistic thinking cause people to act unethically?  I think so.  Is it guaranteed to do so?  No.

Moral thinking is culturally or religiously based.  Ethical thinking is founded on innate human abilities: sensory input (pleasure/pain), mental input (emotional pleasure/pain, concepts of fairness/reciprocity, and empathy), rational thought, and good questions.

It is hard, although not impossible, to use the ethical foundations of harm/care and fairness/reciprocity to do bad things.  The saying, “The road to hell is paved with good intentions,” rings true from time to time.  Typically, the bad that results from trying to do good is the result of unintended consequences.

The elements of morality that diverge from ethics, however, are much more easily used for bad than for good.  We will address some concerns with Authority/Respect first.

After World War II, Yale psychologist Stanley Milgram conducted a famous study (1963), the “Milgram Experiment,” which bears great importance on what psychologists refer to as “Authoritarian Behavior.”

Direct from Wikipedia:

The Milgram experiment was a series of social psychology experiments conducted by Yale University psychologist Stanley Milgram, which measured the willingness of study participants to obey an authority figure who instructed them to perform acts that conflicted with their personal conscience.   The experiments began in July 1961, three months after the start of the trial of Nazi war criminal Adolf Eichmann in Jerusalem. Milgram devised the experiments to answer this question: “Could it be that Eichmann and his million accomplices in the Holocaust were just following orders? Could we call them all accomplices?”

Milgram summarized the experiment in his 1974 article, “The Perils of Obedience”, writing:

The legal and philosophic aspects of obedience are of enormous importance, but they say very little about how most people behave in concrete situations. I set up a simple experiment at Yale University to test how much pain an ordinary citizen would inflict on another person simply because he was ordered to by an experimental scientist. Stark authority was pitted against the subjects’ [participants’] strongest moral imperatives against hurting others, and, with the subjects’ [participants’] ears ringing with the screams of the victims, authority won more often than not. The extreme willingness of adults to go to almost any lengths on the command of an authority constitutes the chief finding of the study and the fact most urgently demanding explanation.

Ordinary people, simply doing their jobs, and without any particular hostility on their part, can become agents in a terrible destructive process. Moreover, even when the destructive effects of their work become patently clear, and they are asked to carry out actions incompatible with fundamental standards of morality, relatively few people have the resources needed to resist authority.

Results

Before conducting the experiment, Milgram polled fourteen Yale University senior-year psychology majors as to what they thought would be the results. All of the poll respondents believed that only a few (average 1.2%) would be prepared to inflict the maximum voltage. Milgram also informally polled his colleagues and found that they, too, believed very few subjects would progress beyond a very strong shock.

In Milgram’s first set of experiments, 65 percent (26 of 40) of experiment participants administered the experiment’s final 450-volt shock, though many were very uncomfortable doing so; at some point, every participant paused and questioned the experiment, some said they would refund the money they were paid for participating in the experiment. Only one participant steadfastly refused to administer shocks before the 300-volt level.

Later, Prof. Milgram and other psychologists performed variations of the experiment throughout the world, with similar results although unlike the Yale experiment, resistance to the experimenter was reported anecdotally elsewhere.  Moreover, Milgram later investigated the effect of the experiment’s locale on obedience levels, (e.g. one experiment was held in a respectable university, the other in an unregistered, backstreet office in a bustling city; the greater the locale’s respectability, the greater the obedience rate). Apart from confirming the original results, the variations have tested variables in the experimental set-up.

Dr. Thomas Blass of the University of Maryland, Baltimore County performed a meta-analysis on the results of repeated performances of the experiment. He found that the percentage of participants who are prepared to inflict fatal voltages remains remarkably constant, 61–66 percent, regardless of time or place.

There is a little-known coda to the Milgram Experiment, reported by Philip Zimbardo: none of the participants who refused to administer the final shocks insisted that the experiment itself be terminated, nor left the room to check the health of the victim without requesting permission to leave, as per Milgram’s notes and recollections, when Zimbardo asked him about that point.

Milgram created a documentary film titled Obedience showing the experiment and its results. He also produced a series of five social psychology films, some of which dealt with his experiments.

[end of Wikipedia excerpt]

Authoritarian behavior, it seems, is something the majority of people throughout the world are susceptible to.  Authoritarian Personalities are a bit less common and come in three main categories: those who are highly susceptible to authority, those who desperately want to be in a position of authority, and those referred to as “double highs” because they both want to be in a position of authority and are very susceptible to people of greater authority than themselves.  (Reference: “Conservatives Without Conscience” by John Dean.)

One distinguishing characteristic of people with Authoritarian Personalities is that they place a very high importance on their personal or group agendas.  In-group/Loyalty is a complementary trait of those who hold authority in high regard.  Their personal submission to the agenda, or ‘the mission,’ can obscure their personal set of beliefs so much that they can be asked to bear any cost, and any wrong to others can be justified, to accomplish the mission.

While Authoritarian Personalities are to some extent a personality disorder, Milgram’s experiments and subsequent ones tell us that the majority of the population is very susceptible to the demands of authority, and will cooperate with authority figures to a very unsettling degree.  In an effort to make this issue a bit more personally relatable for those who question their own susceptibility to such pressures, I will broaden the definition of authority.  Most people think of authority as a person who has a will.  I would argue that authority need not be a person and need not even have a will.

A meme can be the authority figure.  Memes can, and often do, take the form of an agenda.  A simple statement of ‘truth’ can become a sacred value and therefore carry authority inherent within the idea or command.  A Sacred Value is a belief or value to which a person or group of people attribute almost infinite worth, to the point where they are willing to die for it.

Given the ease with which people are willing to do bad or evil things to other people (or living creatures) under the influence of authority, I would strongly argue that authority is not a foundation of ethical truth but rather a very strong method of control that can pervert or suspend one’s own internal ethics in a highly negative way.  Use of authority over others is one of the strongest means of perverting otherwise ethical people.

Jonathan Haidt makes a good argument for authority as a tenet of morality, and I accept it as such.  Authority over people is something governments and religions have as pillars of their foundations; respect for authority is an integral part of most cultures – starting with your parents.  I’m not advocating being disrespectful, but I am advocating questioning authority.  Authority figures have to earn respect, and their commands have to make sense.  Unquestioning obedience to any person or idea is a recipe for disaster.  No group of people or society has ever suffered from being too inquisitive or from demanding good reasons for compliance with initiatives.

Authority is not a tenet of ethics.  People or groups in a position of authority should utilize ethics.  A person’s or group’s authority is derived from their members’ willingness to grant them that authority.  Authority over someone should be used with great restraint.  The only time individuals can truly lose authority over themselves is if they have committed an unethical act.  We have historically set up institutions such as courts to deal with these situations.  Punishment of unethical behavior is justified and ethical.  Concepts of fairness/reciprocity and harm/care (ethics) are what justice is based on, and the punishment should fit the crime.


2 thoughts on Authority/Respect and Agenda-type Control

  1. Most people have an unsettling amount of deference to authority, but do we know why some people are more or less likely to defer to authority than others? Is it learned/cultural? Genetic? Can someone with an authoritarian personality today learn to become less authoritarian over time, and would they be likely to ever make that choice?

    • The nature vs. nurture debate (genetics versus culture) is a false dichotomy. Both are at play, and to some extent both affect each other. Germany, Italy, and Japan were all fascist countries 50 years ago. Today, Germany is the leader of the free world, the Japanese have raised generations of “ambassadors for Peace,” and Italy is reasonably liberal; and the US is becoming more fascist by the day. So in one to two generations, the cultures of all of these countries have flipped.

      People can train themselves to think with a more liberal mindset and develop better ethical understanding, and so can cultures. Regardless of your natural wiring, you can be culturally conditioned to bend the arc of your genetic fate. Humans are unique in that we are organic computers that can self-program. While we are born with some hard wiring and software, we can add, subtract, and modify our software by sheer will or cultural conditioning.
