Difference between revisions of "Propaganda"
From Gender and Tech Resources
Propaganda is "''any form of communication in support of national objectives designed to influence the opinions, emotions, attitudes, or behavior of any group in order to benefit the sponsor, either directly or indirectly''". Governments have always tried to control people. Those in authority want control of the people's hearts, minds and allegiances, and block or censor dissident voices. Probably every conflict is fought on at least two grounds: the battlefield and the minds of the people via propaganda. The "good guys" and the "bad guys" can often both be guilty of misleading their people with distortions, exaggerations, subjectivity, inaccuracy and even fabrications in order to receive support and a sense of legitimacy. The good guise and the bad guise. Black and white. A fight for supremacy, for government. The king is dead, long live the king. We can learn how their game is played in order to deal with (counteract) these propaganda wars <ref>The Semantics of "Good" & "Evil" http://theanarchistlibrary.org/library/robert-anton-wilson-the-semantics-of-good-evil</ref>.
The term "propaganda" first came into common use in Europe as a result of Pope Gregory XV creating the Congregation for the Propagation of the Faith, a commission of cardinals charged with spreading the faith and regulating church affairs in heathen lands. A College of Propaganda was set up to train priests for the missions. The word came into common use again when World War I began. No matter the word used, the battle for our minds is as old as human history. The Greeks had games, theater, assembly, law courts, and festivals for propagandising ideas and beliefs. The conflict between kings and Parliament in England was a struggle in which propaganda was involved. Propaganda was one of the weapons used in the movement for American independence, and in the French Revolution. The discipline of public relations (PR) started as a profession after the First World War, as the commercial benefits of careful propaganda were realised.
In western nations, most people seem to believe propaganda happens, but only in other nations. Meanwhile the western military-industrial machine is rife with propaganda: some of it easy to detect, some harder to detect, and some whose source and purpose are hard to pin down even when detected.
[[File:Brainwash.jpg|320px|thumb|right]]

== The Institute for Propaganda Analysis ==
In 1936 Boston merchant Edward Filene helped establish the short-lived Institute for Propaganda Analysis, which sought to educate Americans to recognise propaganda techniques. Its seven propaganda methods have become something of a standard.
=== Bandwagon ===
The 'bandwagon' pumps up the value of 'joining the party'. Bandwagon is a fallacy, or mistake, in argumentation. Related to the emotional appeal in persuasion, or pathos, the 'bandwagon' approach involves convincing a readership that the majority of people agree with the writer's argument. This technique suggests that just because a large majority of people agree, the reader should, too. The bandwagon plays heavily on the human need for belonging, making the group a desirable place to be.

Commercial writers often make statements like "Over 5 million people have called…," adding the name of a company. This approach works because of the social pressure of majority opinion. Or an author states, "Everyone is doing whatever it takes to make himself happy. When you recognise that, you don't feel guilty for doing what everyone else is also doing." This approach works because the author argues that what everyone is doing is correct, equating popularity with truth. Review writers use it when they inform their audience that a book or a song has been number one for several weeks, adding "Check it out." If the readers do not, they risk being left behind.
=== Card-stacking ===
Card-stacking (alias Ignoring the Counterevidence, One-Sided Assessment, Slanting and Suppressed Evidence) builds a highly biased case for a position (and cause). In 'card-stacking', deliberate action is taken to bias an argument: opposing evidence is buried or discredited, whilst the case for one's own position is exaggerated at every opportunity. Thus the testimonial of supporters is used, but not that of opponents. Coincidences and serendipity may be artificially created, making deliberate action seem like random occurrence. Things 'just seem to happen'.

It is by no means always fallacious to present a one-sided argument. For example, it is not a defense attorney's job to present the evidence for a defendant's guilt; that is the job of the prosecutor. Nor can we expect a salesman to list "what could possibly go wrong", or politicians to give us all sides of a story in an election campaign. As is usual with fallacies, we have to take the context of the argument into consideration. One-sidedness is fallacious in contexts where we have a right to demand objectivity. IMHO, two such contexts are news stories and scientific writing. Slanting in a news story or scientific production may lead the reader into drawing false conclusions, which means that the story is a booby trap and the reader's reasoning inadvertently fallacious.
=== Glittering Generalities ===
Glittering generalities uses power words to evoke emotions, replacing rational argument and clear evidence. This is a combination of the generalisation fallacy, where one thing is applied to another thing, hypnotic talk that puts people into a light trance (darkened rooms and flashing lights are telltale signs of that), and the use of nouns that give a sense of substance while there is none.

''Ladies and Gentlemen, it is with the greatest pleasure that I welcome you to this most anonymous of operations. We are gathered here together on twitter and facebook on the brink of a worldwide collapse to which we must all rise in concert [raises fist], for all that is necessary for the triumph of evil is that good people do nothing, which I will never do and I know you will never accept.''
=== Name-calling ===
'Name-calling' (alias Mud Slinging, alias Demonisation) is apparently a much used political practice <ref>Name Calling: An intricate map of who's insulted who http://laphamsquarterly.org/comedy/charts-graphs/name-calling</ref> and one of the easiest to do: Take a random person and denigrate them. Show that you can and will do this to any opponents. You can do it to an apparently strong person, to demonstrate that you are not afraid and will take on and defeat even the powerful. You can do it to a weak person, to show that nobody is safe from your ire. You can do it to an ordinary-next-door person, to show that 'people like you' are not safe either. And apparently, name-calling is one of the most common tactics people use to hurt or disparage others. Most people who indulge in name-calling know that the label or name they choose to describe another individual is not factually accurate. Likewise, it's normal to feel hurt and defensive when a person starts labeling you or calling you names. When you are on the receiving end you can lose in two ways: the first is in loss of self-esteem (if you let it), and the second is in discovering that after the insult, no path of resolution is being offered to you (if you need the other party for that).

And there is also a much darker version of name-calling, where the name-caller has '''not''' abandoned intelligent conversation in favor of an emotional outburst, but intentionally facilitates demonisation. Mud sticks, as we all know (a fallacy right there, but hey). Name-calling associates a target (group) with something that is despised or is inferior in some way. Now, if anyone associates with the target (group), the mud will also stick to them. The more the target becomes socially isolated, the more people will avoid the target (group). The result is a spiral of isolation, neutralising opponents and sending a chilling warning to those who might defend the target (group) or follow in that person's path.
=== Plain Folks ===

[[File:Spin.jpg|320px|thumb|right|Dees Illustration http://www.deesillustration.com/]]

'Plain folks' makes a leader seem ordinary to increase trust and credibility. IOW, 'plain folks' attempts to short-circuit reasoning by asserting that the arguer is just like you and therefore you should believe them:

* Wear ordinary plain clean clothes, no designer gear, no 'flashy' messages. When visiting particular groups, you can dress to show you are like them.
* Use simple words, simple grammar and short sentences. Pause regularly. When talking with a particular group of people, use their lingo, but do not use local dialects and professional jargon. Perhaps use just a few of the local words, or parallel words that boil down to 'I may not be you, but I'm so like you it is the same thing, anyway'. ''I know tax increases are a bad idea. I pay taxes too.''
* Do 'normal' things. Be seen doing chores around the house. Go out running. Walk the dog. Play with your kids. Appear interested in things and people. Be surprised. Be normal. Send those 'I'm like you' body language signals.

This seems to have been integrated into "perception management" (more on that in [[Psychological warfare]]).

In 2011, Frank Luntz, a Republican strategist and a US expert on crafting the perfect political message, said, "I'm so scared of this anti-Wall Street effort. I'm frightened to death. They're having an impact on what the American people think of capitalism." Luntz then offered tips on how Republicans could discuss the grievances of the Occupiers, and help the governors better handle all these new questions from constituents about "income inequality" and "paying your fair share." Yahoo News sat in on the session, and counted 10 do's and don'ts from Luntz covering how Republicans should fight back by changing the way they discuss the movement <ref>How Republicans are being taught to talk about Occupy Wall Street http://news.yahoo.com/blogs/ticket/republicans-being-taught-talk-occupy-wall-street-133707949.html</ref>. And The Young Turks followed up on that <ref>Leaked: Republicans Scared of Occupy Wall Street https://www.youtube.com/watch?v=7B3Fw5TPJK8</ref>.
=== Testimonial ===
In Testimonial (alias Appeal to Authority or Celebrity), the testimony of an independent person is seen as more trustworthy. Appealing to celebrity is one of the most common forms of fallacious appeal to authority. Celebrity endorsement of products is so common that we hardly notice it or wonder why a sports(wo)man is trying to sell us underwear or a cause, or an actor medicine. ''I'm not a doctor, but I play one on TV.''

Not only do celebrities endorse products they have no expertise with, they are also not disinterested: they are getting paid to do so. Testimonials need not be true or honest. You can pay people to say pretty much anything, and some will be happy to say whatever you like for a suitable sum. Be careful about paying, even for genuine support: if it is found out, it will devalue the testimonial and possibly be seen in a very negative way. Some advertisers attempt to use doublespeak to disguise the fact that their spokespersons are paid. They put words like 'compensated endorsement' in small print at the bottom of the screen, apparently hoping that the viewers will not understand that these eleven-letter words mean that the spokesperson is paid.
=== Transfer ===
Transfer works by association with trusted others, expertise or ideas. Nowadays, much advertising is wordless, consists only of images, is not about the product being sold, and is devoid of evidence, even fallacious evidence. Marlboro, for example, associates the image of the strong, masculine, independent, fearless cowboy with its cigarettes by constant conjunction. You can't fault their reasoning, because there is none, but you can fault them for that.

Transfer is also used by infiltrators, who associate with other people or groups that already have high trust and credibility (more on that in [[Psychological warfare]]).
== Logical analysis ==
The following are examples of common fallacies, with a logical analysis of the argument <ref>Logically Fallacious http://www.logicallyfallacious.com/index.php/logical-fallacies</ref>.
=== Red Herring ===
A 'Red Herring' (alias Smoke Screen, Wild Goose Chase, Beside the Point, Misdirection [form of], Changing the Subject, False Emphasis, Chewbacca Defense, Irrelevant Conclusion, Irrelevant Thesis, Clouding the Issue, Ignorance of Refutation, Judgmental Language [form of]) raises an issue that is unimportant to a claim.

Analysis: How important is the issue raised to the claim?
<strong>Logical Form:</strong>

Argument A is presented by person 1.

Person 2 introduces argument B.

Argument A is abandoned.

<strong>Example:</strong>

M: A 'Red Herring' (alias Smoke Screen, Wild Goose Chase, Beside the Point, Misdirection [form of], Changing the Subject, False Emphasis, Chewbacca Defense, Irrelevant Conclusion, Irrelevant Thesis, Clouding the Issue, Ignorance of Refutation, Judgmental Language [form of]) raises an issue that is unimportant to a claim in an argument.

K: There is no such fish species as a "red herring"; rather, it refers to a particularly pungent fish (typically a herring, but not always) that has been strongly cured in brine and/or heavily smoked.

M: It was used for training hunting dogs not to leave the scent path of foxes they were following, yes.

K: But why did they hunt foxes in the first place?

<strong>Explanation:</strong> K has successfully derailed this conversation, via fallacious digressions, into a deep, existential discussion on fox hunts.
=== Straw Man ===
The Straw Man happens when one side attacks a position, the 'straw man', not held by the other side, then acts as though the other side's position has been refuted.

Analysis: Compare the arguer's version of the argument to which he or she is responding with that argument as originally presented. Be skeptical when an arguer represents the argument of another.

<strong>Logical Form:</strong>

Person 1 makes claim Y.

Person 2 restates person 1's claim (in a distorted way).

Person 2 attacks the distorted version of the claim.

Therefore, claim Y is false.

<strong>Example:</strong>

T: Biological evolution is both a theory and a fact.

E: That is ridiculous! How can you possibly be absolutely certain that we evolved from pond scum!

T: Actually that is a gross misrepresentation of my assertion. I never claimed we evolved from pond scum. Unlike math and logic, science is based on empirical evidence and, therefore, a scientific fact is something that is confirmed to such a degree that it would be perverse to withhold provisional consent. The empirical evidence for the fact that biological evolution does occur falls into this category.

<strong>Explanation:</strong> E has ignorantly mischaracterized the argument by a) assuming we evolved from pond scum (whatever that is exactly), and b) assuming "fact" means "certainty".

'''Note:''' At times, an opponent might not want to expand on the implications of his or her position, so making assumptions might be the only way to get the opponent to point out that your interpretation is not accurate; they will then be forced to clarify.
=== False Dilemma ===
The 'False Dilemma' leads people into having to make a choice between two (or more) options when other options, more favourable to or fitting for them, are available.

Analysis: Ask whether there are other options not mentioned.

<strong>Logical Form:</strong>

Either X or Y is true.

Either X, Y, or Z is true.

<strong>Example (two choices):</strong>

L: Red pill or blue pill?

<strong>Explanation:</strong> As Obi-Wan Kenobi so eloquently puts it in Star Wars episode III, "Only a Sith deals in absolutes!"

'''Note:''' There may be cases when the number of options really is limited. If an ice cream man has just chocolate and vanilla left, it would be a waste of time insisting he has mint chocolate chip.

<strong>Tip:</strong> Be conscious of how many times you are presented with false dilemmas, and how many times you present yourself with false dilemmas.
=== Slippery Slope ===
The 'slippery slope' is also known as the 'Argument of the Beard' and the 'Fallacy of the Beard'. There are two types of fallacy referred to as 'slippery slopes': 'Non Causa Pro Causa' and 'Vagueness'.

'''Logical form Non Causa Pro Causa''':

If A happens, then by a gradual series of small steps through B, C,…, X, Y, eventually Z will happen, too.
Z should not happen.
Therefore, A should not happen, either.

'''Explanation''': This type of argument is by no means invariably fallacious, but the strength of the argument is inversely proportional to the number of steps between <strong>A</strong> and <strong>Z</strong>, and directly proportional to the causal strength of the connections between adjacent steps. If there are many intervening steps, and the causal connections between them are weak, or even unknown, then the resulting argument will be very weak, if not downright fallacious.
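The relationship between chain length, link strength and argument strength can be illustrated with a toy model (my own sketch, not from any formal source): if each causal link "this step leads to the next" is treated as a probability, the strength of the whole chain is at most the product of the link strengths, which shrinks quickly as steps are added or links weakened.

```python
# Toy model of slippery-slope strength: treat each causal link
# "step i leads to step i+1" as a probability between 0 and 1, and
# approximate the strength of the whole chain A -> ... -> Z as the
# product of those probabilities. (Illustrative only; real arguments
# are not this tidy.)

def chain_strength(link_strengths):
    """Multiply the causal strengths (0..1) of adjacent links."""
    strength = 1.0
    for s in link_strengths:
        strength *= s
    return strength

# A short chain of strong links makes a fairly strong argument...
short_strong = chain_strength([0.9, 0.9])   # 0.81
# ...while a long chain of weak links makes a very weak one.
long_weak = chain_strength([0.6] * 8)       # about 0.017
```

So an eight-step slide where each step is only "more likely than not" ends up far below any reasonable threshold of plausibility, matching the explanation above.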
+ | |||
+ | '''Logical form Vagueness''': | ||
+ | |||
+ | A differs from Z by a continuum of insignificant changes, and there is no non-arbitrary place at which a sharp line between the two can be drawn. | ||
+ | Therefore, there is really no difference between A and Z. | ||
+ | |||
+ | A differs from Z by a continuum of insignificant changes with no non-arbitrary line between the two. | ||
+ | Therefore, A doesn't exist. | ||
+ | |||
+ | '''Explanation''': Though these two fallacies, Non Causa Pro Causa and Vagueness, are distinct, and the fact that they share a name is unfortunate, they often have a relationship which may justify treating them together: semantic slippery slopes often form a basis for causal slippery slopes. In other words, people often think that a causal slide from <strong>A</strong> to <strong>Z</strong> is unavoidable because there is no precise, non-arbitrary dividing line between the two concepts. For example, opponents of abortion often believe that the legality of abortion will lead causally to the legality of infanticide, and one reason for this belief is that the only precise dividing line between an embryo and a newborn baby is the morally arbitrary one of birth. For this reason, causal slippery slopes are often the result of semantic ones. | ||
+ | |||
+ | For an example see below in Examples of critical thinking. | ||
== Sleeper effect ==
Relatively recently there has been much talk of (and research on) the 'sleeper effect'. The impact of a persuasive message will generally tend to decrease over time. A sleeper effect takes place when the <em>effects of a persuasive message are stronger after more time has passed</em>.
== (Counter) moves ==

=== Detecting deceptive propaganda ===
Detecting deceptive propaganda becomes much easier with critical thinking and reasoning skills:

[[File:Government-declassify-everything.jpg|320px|thumb|right]]

* Learn the fallacies, memorise the fallacies, recognise the fallacies.
* When presented with an argument, locate the claim and the evidence that supports it, trace the source, and assess the credibility of the data presented.
* Look for signs of the obvious fallacies (see above).
* Work to understand why the arguer feels the evidence warrants the claim, and apply analysis.
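As a playful illustration of those first steps (my own sketch; the cue phrases are invented examples, and real propaganda detection cannot be automated this naively), one can mechanically flag phrases that often accompany two of the IPA techniques described above:

```python
# Naive illustration only: flag phrases that often accompany two of
# the IPA techniques above. A match is a cue for closer reading, not
# proof of propaganda.

CUES = {
    "bandwagon": ["everyone is", "millions of people", "don't be left behind"],
    "false dilemma": ["either", "you're with us or"],
}

def flag_cues(text):
    """Return the techniques whose cue phrases appear in `text`."""
    lowered = text.lower()
    return sorted(
        technique
        for technique, phrases in CUES.items()
        if any(phrase in lowered for phrase in phrases)
    )

print(flag_cues("Over 5 million people have called. Don't be left behind!"))
# prints ['bandwagon']
```

The point of the toy is the workflow, not the word list: locate the claim, notice the cue, then do the actual human work of tracing the source and assessing the evidence.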
+ | |||
+ | === Use controlled folly === | ||
+ | It is a "mission impossible" to try to convince anybody with reason. Everyone with strong ideologies, and that means the overwhelming majority of people, only wants to see, hear and read what they already believe. | ||
You can try humour. Humour is the art of the incongruous. Seeing, hearing or reading something that conflicts with preconceptions is already incongruous, and the "normal" reaction seems to be to reject the new information. Humour provides a way for new information to get into the brain, where it may be considered. ''Also, note that authorities can not easily deal with not being taken serious.'' | You can try humour. Humour is the art of the incongruous. Seeing, hearing or reading something that conflicts with preconceptions is already incongruous, and the "normal" reaction seems to be to reject the new information. Humour provides a way for new information to get into the brain, where it may be considered. ''Also, note that authorities can not easily deal with not being taken serious.'' | ||
=== Educate for immunisation ===
Educating people to think critically, for at least some partial immunisation, can be an excellent counter-strategy against the current mess we seem to be in, but I haven't encountered it (or found it planned) on an institutional scale in educational services (yet).
== Examples of critical thinking ==

=== Banning of Golden Dawn ===
In September of 2013 there was talk of banning Golden Dawn <ref>Calls to ban Greek far-right party after murder of anti-fascist rapper http://rt.com/news/greek-rapper-funeral-tension-074/</ref> (see [[timeline merchants of death]]).

This target could not have been chosen better if authorities wanted to introduce people to banning and proactive arrests becoming "normal" in Greece (and Europe). According to Donald Black in "The Behavior of Law", our use of the law (and also what we find acceptable in law keeping) is governed by three qualifications:
* The degree of intimacy we have with a defendant, i.e., we will invoke the law more often (and prosecution is more likely) if we view the defendant as an outsider versus a family member, neighbor, or friend.
* Cultural distance, i.e., our use of the law will increase if the defendant is of a different race or religion.
* Conventionality, i.e., if we participate in the culture of the majority we're more likely to view the state as an advocate, e.g., whites versus blacks or the middle class versus the lower class.

Banning Golden Dawn could lead to:
* A possible slippery slope situation (today it's "them", tomorrow it's "us").
* What is banned likely going underground, where we can not see what's brewing (any more).
* Other, possibly more important and immediate, problems not getting solved because we are distracted from them: <em>a perverse coalition of "socialist modernisers" and far-right nationalists, who are governing Greece ostensibly to safeguard its "European perspective"</em> <ref>The problem is the Greek government, not golden dawn https://greekleftreview.wordpress.com/2013/09/19/the-problem-is-the-greek-government-not-golden-dawn/</ref>.

Writing that does not mean I think Golden Dawn members and affiliates are free to do as they please! These are indeed out-of-control thugs. How about making clear that actions like bullying or coercing others into submission to one's ideas, and actions like murder, are not acceptable, <strong>not by anyone</strong>, and giving perpetrators sentences fitting the crime in each case?
== Resources ==

=== News and watchdogs ===
* FAIR http://fair.org/
* PR Watch http://www.prwatch.org/
* SpinWatch http://www.spinwatch.org/
* RationalWiki http://rationalwiki.org/wiki/Main_Page
* Fallacy Files http://www.fallacyfiles.org/
=== Books ===
'''Fallacies'''

* How to Detect Propaganda (pdf) - Adapted from: The Institute for Propaganda Analysis, 1937 http://www.mindivogel.com/uploads/1/1/3/9/11394148/how_to_detect_propaganda.pdf
* Love is a Fallacy (pdf) - story by Max Shulman, p.10 https://www.dartmouth.edu/~aporia/spring08.pdf
* Propaganda and debating techniques http://www.orange-papers.org/orange-propaganda.html
* Logically Fallacious http://www.logicallyfallacious.com/index.php/logical-fallacies

'''Sleeper Effect'''

* The Influence of Speaker Credibility on Information Recall (pdf) - Michael E. Corrie http://www.uwlax.edu/urc/JUR-online/PDF/2003/corrie.pdf
* Detecting and Explaining the Sleeper Effect (pdf) - Darlene B. Hannah & Brian Sternthal http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.455.3259&rep=rep1&type=pdf
* The Sleeper Effect in Persuasion: A Meta-Analytic Review (pdf) - Tarcan Kumkale and Dolores Albarracín http://home.ku.edu.tr/~tkumkale/sleeper.pdf
=== Documentaries ===
* The "Red Herring" and "Straw Man" Fallacy (difference) https://www.youtube.com/watch?v=exdK7Lirngg
* Chipotle Scarecrow (Honest version) https://www.youtube.com/watch?v=nYZgWYZlAZU

== Related ==
* [[Covert operations]]
* [[Psychological warfare]]
* [[Timeline merchants of death]]
* [[Timeline masters of the internet]]

== References ==
Latest revision as of 19:22, 9 June 2015
Propaganda is "any form of communication in support of national objectives designed to influence the opinions, emotions, attitudes, or behavior of any group in order to benefit the sponsor, either directly or indirectly". Governments have always tried to control people. Those in authority want control of the people’s hearts, minds and allegiances, and block or censor dissident voices. Probably every conflict is fought on at least two grounds: the battlefield and the minds of the people via propaganda. The “good guys” and the “bad guys” can often both be guilty of misleading their people with distortions, exaggerations, subjectivity, inaccuracy and even fabrications, in order to receive support and a sense of legitimacy. The good guise and the bad guise. Black and white. A fight for supremacy, for government. The king is dead, long live the king. We can learn how their game is played for how to deal with (counteract) these propaganda wars [1].
The term “propaganda” first came into common use in Europe as a result of Pope Gregory XV creating the Congregation for the Propagation of the Faith. This was a commission of cardinals charged with spreading the faith and regulating church affairs in heathen lands. A College of Propaganda was set up to train priests for the missions. The word came into common use again when World War I began. No matter the word used, the battle for our minds is as old as human history. The Greeks had games, theater, assembly, law courts, and festivals for propagandising ideas and beliefs. The conflict between kings and Parliament in England was a struggle in which propaganda was involved. Propaganda was one of the weapons used in the movement for American independence, and in the French Revolution. The discipline of public relations (PR) started as a profession after the first world war as the commercial benefits of careful propaganda were realised.
In western nations, most people seem to believe propaganda happens, but only in other nations. Meanwhile the western military-industrial-machine is rife with propaganda that is easy to detect, and some propaganda that is harder to detect, or when detected, it can be hard to find the source of it, and exactly why it is done.
Contents
The Institute for Propaganda Analysis
In 1936 Boston merchant Edward Filene helped establish the short-lived Institute for Propaganda Analysis which sought to educate Americans to recognise propaganda techniques. It's seven propaganda methods have become somewhat of a standard.
Bandwagon
The 'bandwagon' pumps up the value of 'joining the party'. Bandwagon is a fallacy, or mistake, in argumentation. Related to the emotional appeal in persuasion, or pathos, the 'bandwagon' approach involves convincing a readership that the majority of people agree with the writer's argument. This technique suggests that just because a large majority of people agree, the reader should, too. The bandwagon plays heavily on the human need for belonging, making the group a desirable place to be.
Commercial writers often make statements like "Over 5 million people have called…," adding the name of a company. This approach works because of the social pressure of majority opinion. Or an author states, "Everyone is doing whatever it takes to make himself happy. When you recognise that, you don't feel guilty for doing what everyone else is also doing." This approach works because the author argues that what everyone is doing is correct, equating popularity with truth. Review writers use it when they inform their audience that a book or a song has been number-one for several weeks, adding "Check it out." If the readers do not, they risk being left behind.
Card-stacking
Card-stacking (alias Ignoring the Counterevidence, One-Sided Assessment, Slanting and Suppressed Evidence) builds a highly biased case for a position (and cause). In 'card-stacking', deliberate action is taken to bias an argument: opposing evidence is buried or discredited, while the case for one's own position is exaggerated at every opportunity. Thus the testimonial of supporters is used, but not that of opponents. Coincidences and serendipity may be artificially created, making deliberate action seem like random occurrence. Things 'just seem to happen'.
It is by no means always fallacious to present a one-sided argument. For example, it is not a defense attorney's job to present the evidence for a defendant's guilt; that is the job of the prosecutor. Nor can we expect a salesman to list "what could possibly go wrong", or politicians to give us all sides of a story in an election campaign. As is usual with fallacies, we have to take the context of the argument into consideration. One-sidedness is fallacious in contexts where we have a right to demand objectivity. IMHO, two such contexts are news stories and scientific writing. Slanting in a news story or scientific publication may lead the reader into drawing false conclusions, which means that the story is a booby trap and the reader's reasoning inadvertently fallacious.
Glittering Generalities
Glittering generalities uses power words to evoke emotions, replacing rational argument and clear evidence. It combines the generalisation fallacy, where what holds for one thing is applied to another; hypnotic talk that puts people into a light trance (darkened rooms and flashing lights are telltale signs of that); and the use of nouns that give a sense of substance where there is none.
Ladies and Gentlemen, it is with the greatest pleasure that I welcome you to this most anonymous of operations. We are gathered here together on twitter and facebook on the brink of a worldwide collapse to which we must all rise in concert [raises fist], for all that is necessary for the triumph of evil is that good people do nothing, which I will never do and I know you will never accept.
Name-calling
'Name-calling' (alias Mud Slinging alias Demonisation) is apparently a much-used political practice [2] and one of the easiest to do: take a random person and denigrate them. Show that you can and will do this to any opponents. You can do it to an apparently strong person, to demonstrate that you are not afraid and will take on and defeat even the powerful. You can do it to a weak person, to show that nobody is safe from your ire. You can do it to an ordinary-next-door person, to show that 'people like you' are not safe either. Name-calling is also one of the most common tactics people use to hurt or disparage others. Most people who indulge in name-calling know that the label or name they choose to describe another individual is not factually accurate. Likewise, it’s normal to feel hurt and defensive when a person starts labeling you or calling you names. When you are on the receiving end you can lose in two ways: first in loss of self-esteem (if you let it), and second in discovering that, after the insult, no path of resolution is being offered to you (if you need the other party for that).
And there is also a much darker version of name-calling where the name-caller has not abandoned intelligent conversation in favor of an emotional outburst, but intentionally facilitates demonisation. Mud sticks, as we all know (a fallacy right there, but hey). Name-calling associates a target (group) with something that is despised or is inferior in some way. Now, if anyone associates with the target (group), the mud will also stick to them. The more the other person or persons become socially isolated, the more people will avoid the target (group). The result is a spiral of isolation neutralising opponents and sending a chilling warning to those who might defend the target (group) or follow in that person's path.
Plain Folks
'Plain folks' makes a leader seem ordinary to increase trust and credibility. In other words, 'plain folks' attempts to short-circuit reasoning by asserting that the arguer is just like you and therefore you should believe them:
- Wear ordinary plain clean clothes, no designer gear, no 'flashy' messages. When visiting particular groups, you can dress to show you are like them.
- Use simple words, simple grammar and short sentences. Pause regularly. When talking with a particular group of people, use their lingo, but do not use local dialects and professional jargon. Perhaps use just a few of the local words, or parallel words that boil down to 'I may not be you, but I'm so like you it is the same thing, anyway'. I know tax increases are a bad idea. I pay taxes too.
- Do 'normal' things. Be seen doing chores around the house. Go out running. Walk the dog. Play with your kids. Appear interested in things and people. Be surprised. Be normal. Send those 'I'm like you' body language signals.
This seems to have been integrated into "perception management" (more on that in Psychological warfare).
In 2011, Frank Luntz, a Republican strategist and a US expert on crafting the perfect political message, said, "I’m so scared of this anti-Wall Street effort. I’m frightened to death. They’re having an impact on what the American people think of capitalism." Next Luntz offered tips on how Republicans could discuss the grievances of the Occupiers, and help the governors better handle all these new questions from constituents about "income inequality" and "paying your fair share." Yahoo News sat in on the session, and counted 10 do’s and don’ts from Luntz covering how Republicans should fight back by changing the way they discuss the movement [3]. And The Young Turks followed up on that [4].
Testimonial
In Testimonial (alias Appeal to Authority or Celebrity) the testimony of an independent person is seen as more trustworthy. Appealing to celebrity is one of the most common forms of fallacious appeal to authority. Celebrity endorsement of products is so common that we hardly notice it or wonder why a sports(wo)man is trying to sell us underwear or a cause. Or an actor medicine. I'm not a doctor, but I play one on TV.
Not only do celebrities endorse products they have no expertise with; their endorsements are not disinterested either, as they are getting paid to make them. Testimonials need not be true or honest. You can pay people to say pretty much anything, and some will be happy to say whatever you like for a suitable sum. Be careful about paying, even for genuine support: if it is found out, it will devalue the testimonial and possibly be seen in a very negative way. Some advertisers attempt to use doublespeak to disguise the fact that their spokespersons are paid. They put words like 'compensated endorsement' in small print at the bottom of the screen, apparently hoping that viewers will not understand that these words mean the spokesperson is paid.
Transfer
Transfer works by association with trusted others, expertise or ideas. Nowadays, much advertising is wordless: it consists only of images, is not about the product being sold, and is devoid of evidence, even fallacious evidence. Marlboro, for example, associates the image of the strong, masculine, independent, fearless cowboy with its cigarettes by constant conjunction. You can't fault their reasoning, because there is none, but you can fault them for that.
Transfer is also used by infiltrators, who associate themselves with people or groups that already have high trust and credibility (more on that in Psychological warfare).
Logical analysis
The following are examples of common fallacies with a logical analysis of argument [5].
Red Herring
A 'Red Herring' (alias Smoke Screen, Wild Goose Chase, Beside the Point, Misdirection [form of], Changing the Subject, False Emphasis, Chewbacca Defense, Irrelevant Conclusion, Irrelevant Thesis, Clouding the Issue, Ignorance of Refutation, Judgmental Language [form of]) raises an issue that is unimportant to a claim.
Analysis: How important is the issue raised to the claim?
Logical Form:
Argument A is presented by person 1.
Person 2 introduces argument B.
Argument A is abandoned.
Example:
M: A 'Red Herring' (alias Smoke Screen, Wild Goose Chase, Beside the Point, Misdirection [form of], Changing the Subject, False Emphasis, Chewbacca Defense, Irrelevant Conclusion, Irrelevant Thesis, Clouding the Issue, Ignorance of Refutation, Judgmental Language [form of]) raises an issue that is unimportant to a claim in an argument.
K: There is no fish species called "red herring"; rather, the term refers to a particularly pungent fish (typically a herring, but not always) that has been strongly cured in brine and/or heavily smoked.
M: It was used for training hunting dogs not to leave the scent path of foxes they were following, yes.
K: But why did they hunt foxes in the first place?
Explanation: K has successfully derailed this conversation, via fallacious digressions, into a deep, existential discussion of fox hunts.
Straw Man
Straw Man happens when one side attacks a position, the 'Straw Man', not held by the other side, then acts as though the other side's position has been refuted.
Analysis: Compare the arguer's version of the argument to which he/she is responding to the argument as originally presented. Be skeptical when an arguer represents the argument of another.
Logical Form:
Person 1 makes claim Y.
Person 2 restates person 1’s claim (in a distorted way).
Person 2 attacks the distorted version of the claim.
Therefore, claim Y is false.
Example:
T: Biological evolution is both a theory and a fact.
E: That is ridiculous! How can you possibly be absolutely certain that we evolved from pond scum!
T: Actually that is a gross misrepresentation of my assertion. I never claimed we evolved from pond scum. Unlike math and logic, science is based on empirical evidence and, therefore, a scientific fact is something that is confirmed to such a degree that it would be perverse to withhold provisional consent. The empirical evidence for the fact that biological evolution does occur falls into this category.
Explanation: E has ignorantly mischaracterised the argument by a) assuming we evolved from pond scum (whatever that is exactly), and b) assuming “fact” means “certainty”.
Note: At times an opponent might not want to expand on the implications of his or her position, so making assumptions might be the only way to get them to point out that your interpretation is not accurate; they will then be forced to clarify.
False Dilemma
The 'False Dilemma' leads people into having to make a choice between two (or more) options when other options more favourable to or fitting for them are available.
Analysis: ask whether there are other options not mentioned.
Logical Form:
Either X or Y is true (when in fact further options exist).
Either X, Y, or Z is true (when in fact further options exist).
Example (two choices):
L: Red pill or blue pill?
Explanation: As Obi-Wan Kenobi so eloquently puts it in Star Wars episode III, “Only a Sith deals in absolutes!”
Note: There may be cases when the number of options really is limited. If an ice cream man just has chocolate and vanilla left, it would be a waste of time insisting he has mint chocolate chip.
Tip: Be conscious of how many times you are presented with false dilemmas, and how many times you present yourself with false dilemmas.
Slippery Slope
The 'slippery slope' (alias 'Argument of the Beard' and 'Fallacy of the Beard'). Two distinct fallacies are referred to as 'slippery slopes': 'Non Causa Pro Causa' and 'Vagueness'.
Logical form Non Causa Pro Causa:
If A happens, then by a gradual series of small steps through B, C,…, X, Y, eventually Z will happen, too. Z should not happen. Therefore, A should not happen, either.
Explanation: This type of argument is by no means invariably fallacious, but the strength of the argument is inversely proportional to the number of steps between A and Z, and directly proportional to the causal strength of the connections between adjacent steps. If there are many intervening steps, and the causal connections between them are weak, or even unknown, then the resulting argument will be very weak, if not downright fallacious.
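The point about the number and strength of the intervening steps can be made numerically with a toy model (my own sketch, not from the source: it assumes each causal link is independent and holds with the same probability):

```python
# Toy model: treat each step in a slippery-slope chain as an independent
# causal link that holds with a fixed probability.
def chain_strength(step_probability, steps):
    """Probability that A actually leads to Z through `steps` links."""
    return step_probability ** steps

# Even fairly strong links decay quickly over a long chain.
for steps in (1, 3, 10):
    print(steps, round(chain_strength(0.9, steps), 3))
```

With ten links each holding 90% of the time, the chance that A really leads to Z is only about 35%, which is why long chains of weak causal connections make weak arguments.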
Logical form Vagueness:
A differs from Z by a continuum of insignificant changes, and there is no non-arbitrary place at which a sharp line between the two can be drawn. Therefore, there is really no difference between A and Z.
A differs from Z by a continuum of insignificant changes with no non-arbitrary line between the two. Therefore, A doesn't exist.
Explanation: Though these two fallacies, Non Causa Pro Causa and Vagueness, are distinct, and the fact that they share a name is unfortunate, they often have a relationship which may justify treating them together: semantic slippery slopes often form a basis for causal slippery slopes. In other words, people often think that a causal slide from A to Z is unavoidable because there is no precise, non-arbitrary dividing line between the two concepts. For example, opponents of abortion often believe that the legality of abortion will lead causally to the legality of infanticide, and one reason for this belief is that the only precise dividing line between an embryo and a newborn baby is the morally arbitrary one of birth. For this reason, causal slippery slopes are often the result of semantic ones.
For an example see below in Examples of critical thinking.
Sleeper effect
Relatively recently there has been much talk (and research) about the 'sleeper effect'. The impact of a persuasive message will generally tend to decrease over time; a sleeper effect occurs when the effect of a persuasive message instead grows stronger as time passes.
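One common explanation in the research is differential decay: memory of the discounting cue (for instance, a low-credibility source) fades faster than memory of the message itself, so net persuasion can rise before it falls. A minimal numerical sketch, with parameter values that are my own and purely illustrative:

```python
import math

# Differential-decay sketch: both the message impact and the discounting
# cue decay exponentially, but the cue decays faster, so net persuasion
# (message minus cue) first rises -- the sleeper effect -- then falls.
def net_persuasion(t, message=1.0, cue=0.8,
                   message_decay=0.05, cue_decay=0.5):
    return (message * math.exp(-message_decay * t)
            - cue * math.exp(-cue_decay * t))

for t in (0, 2, 6, 30):
    print(t, round(net_persuasion(t), 3))
```

In this sketch persuasion at t = 6 exceeds persuasion at t = 0, even though both components only ever decay.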
(Counter) moves
Detecting deceptive propaganda
Detecting deceptive propaganda becomes easier with critical thinking and reasoning skills:
- Learn the fallacies, memorise the fallacies, recognise the fallacies.
- When presented with an argument, locate the claim and the evidence that supports it, trace the source and assess credibility of the data presented.
- Look for signs of the obvious fallacies (see above).
- Work to understand why the arguer feels the evidence warrants the claim and apply analysis.
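The checklist above can be captured as a simple worksheet. The sketch below is my own illustration (the field names and red-flag rules are invented for the example, not a standard tool):

```python
from dataclasses import dataclass, field

# Hypothetical worksheet for the checklist above: record the claim, the
# evidence offered, its source, and any fallacies spotted, then list
# the red flags that remain.
@dataclass
class ArgumentCheck:
    claim: str
    evidence: list = field(default_factory=list)
    source: str = ""
    fallacies_spotted: list = field(default_factory=list)

    def red_flags(self):
        flags = []
        if not self.evidence:
            flags.append("no evidence offered for the claim")
        if not self.source:
            flags.append("evidence cannot be traced to a source")
        flags.extend("possible fallacy: " + f for f in self.fallacies_spotted)
        return flags

check = ArgumentCheck(claim="Everyone is switching to Brand X",
                      fallacies_spotted=["bandwagon"])
print(check.red_flags())
```

An argument that yields an empty list of red flags is not thereby true, of course; the worksheet only forces the claim, evidence and source to be made explicit before analysis.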
Use controlled folly
It is a "mission impossible" to try to convince anybody with reason. Everyone with strong ideologies, and that means the overwhelming majority of people, only wants to see, hear and read what they already believe.
You can try humour. Humour is the art of the incongruous. Seeing, hearing or reading something that conflicts with preconceptions is already incongruous, and the "normal" reaction seems to be to reject the new information. Humour provides a way for new information to get into the brain, where it may be considered. Also, note that authorities cannot easily deal with not being taken seriously.
Educate for immunisation
Educating people to think critically, for at least some partial immunisation, can be an excellent counter-strategy against the current mess we seem to be in, but I haven't encountered it (or found it planned) on an institutional scale in educational services (yet).
Examples of critical thinking
Banning of Golden Dawn
In September of 2013 there was talk of banning Golden Dawn [6] (See timeline merchants of death).
This target could not have been chosen better if authorities wanted to introduce people to banning and proactive arrests becoming "normal" in Greece (and Europe). According to Donald Black in "The Behavior of Law", our use of the law (and also what we find acceptable in law keeping) is governed by three qualifications:
- The degree of intimacy we have with a defendant, i.e., we will invoke the law more often (and prosecution is more likely) if we view the defendant as an outsider versus a family member, neighbor, or friend.
- Cultural distance, i.e., our use of the law will increase if the defendant is of a different race or religion.
- Conventionality, i.e., if we participate in the culture of the majority we’re more likely to view the state as an advocate, e.g., whites versus blacks or the middle class versus the lower class.
Banning Golden Dawn could lead to:
- A possible slippery slope situation (today it's "them", tomorrow it's "us").
- What you ban will likely go underground, where we cannot see what’s brewing.
- Other, possibly more important and immediate, problems are not getting solved because we are distracted from them: a perverse coalition of “socialist modernisers” and far-right nationalists, who are governing Greece ostensibly to safeguard its “European perspective” [7].
Writing this does not mean I think Golden Dawn members and affiliates are free to do as they please! These are indeed out-of-control thugs. How about making clear that actions like bullying or coercing others into submission to one's ideas, and actions like murder, are not acceptable, not by anyone, and giving perpetrators sentences fitting the crime in each case.
Resources
News and watchdogs
- FAIR http://fair.org/
- PR Watch http://www.prwatch.org/
- SpinWatch http://www.spinwatch.org/
- RationalWiki http://rationalwiki.org/wiki/Main_Page
- Fallacy Files http://www.fallacyfiles.org/
Books
Fallacies
- How to Detect Propaganda (pdf) - Adapted from: The Institute for Propaganda Analysis, 1937 http://www.mindivogel.com/uploads/1/1/3/9/11394148/how_to_detect_propaganda.pdf
- Love is a Fallacy (pdf) - story by Max Shulman, p.10 https://www.dartmouth.edu/~aporia/spring08.pdf
- Propaganda and debating techniques http://www.orange-papers.org/orange-propaganda.html
- Logically Fallacious http://www.logicallyfallacious.com/index.php/logical-fallacies
Sleeper Effect
- The Influence of Speaker Credibility on Information Recall (pdf) - Michael E. Corrie http://www.uwlax.edu/urc/JUR-online/PDF/2003/corrie.pdf
- Detecting and Explaining the Sleeper Effect (pdf) - Darlene B. Hannah & Brian Sternthal http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.455.3259&rep=rep1&type=pdf
- The Sleeper Effect in Persuasion: A Meta-Analytic Review (pdf) - Tarcan Kumkale and Dolores Albarracín http://home.ku.edu.tr/~tkumkale/sleeper.pdf
Documentaries
- The "Red Herring" and "Straw Man" Fallacy (difference) https://www.youtube.com/watch?v=exdK7Lirngg
- Chipotle Scarecrow (Honest version) https://www.youtube.com/watch?v=nYZgWYZlAZU
Related
- Covert operations
- Psychological warfare
- Timeline merchants of death
- Timeline masters of the internet
References
- ↑ The Semantics of “Good” & “Evil” http://theanarchistlibrary.org/library/robert-anton-wilson-the-semantics-of-good-evil
- ↑ Name Calling: An intricate map of who's insulted who http://laphamsquarterly.org/comedy/charts-graphs/name-calling
- ↑ How Republicans are being taught to talk about Occupy Wall Street http://news.yahoo.com/blogs/ticket/republicans-being-taught-talk-occupy-wall-street-133707949.html
- ↑ Leaked: Republicans Scared of Occupy Wall Street https://www.youtube.com/watch?v=7B3Fw5TPJK8
- ↑ Logically Fallacious http://www.logicallyfallacious.com/index.php/logical-fallacies
- ↑ Calls to ban Greek far-right party after murder of anti-fascist rapper http://rt.com/news/greek-rapper-funeral-tension-074/
- ↑ The problem is the Greek government, not golden dawn https://greekleftreview.wordpress.com/2013/09/19/the-problem-is-the-greek-government-not-golden-dawn/