
The Logical Fallacy Guide

Sahil Bloom

Welcome to the 242 new members of the curiosity tribe who have joined us since Wednesday. Join the 57,887 others who are receiving high-signal, curiosity-inducing content every single week.


Visualization credit: @drex_jpg

If you’ve been reading this newsletter, you know that I like to say that humans are fascinating creatures.

We possess the capacity to accomplish remarkably complex feats of technology and engineering, and subsequently fall victim to the most obviously flawed basic logic.

Logical fallacies—errors in reasoning that undermine the quality of an argument—are classic examples of this fact.

The Merriam-Webster Dictionary defines fallacy as a false or misleading idea. A logical fallacy, therefore, can simply be thought of as logic based on a false or misleading idea.

Unfortunately, unless you went to law school—or took a robust philosophy course load in college—you’ve likely been minimally exposed to them in a formal context.

Accordingly, we frequently fall victim to logical fallacies—our own emotional, psychological, and intellectual blindspots create the cracks and we fall right into them.

There is no such thing as a perfect logician, but we can all strive to cover our blindspots and craft better arguments. As with the study of cognitive biases—which I’ve written about in recent pieces—the first step in avoiding logical fallacies is developing an awareness of them.

In that vein, today’s piece will cover 20 common logical fallacies to learn, identify, and avoid.

Without further ado, let’s dive in…

Ad Hominem

Latin for “to the person”—an ad hominem attack is an attack on the individual rather than the argument.

Instead of addressing the argument—its structure, logic, and merits—the offender attempts to refute the opposition on the basis of personal characteristics.

It may be overt—openly attacking the person’s character or personality—or covert—subtly doing the same—but it always focuses on the person, not the argument.

Often referred to as “mud-slinging” in political circles, if you’ve ever watched a political debate or political campaign ads, you’re already familiar with this one. It’s all too common on Twitter and in other online discourse, as well.

Example

Candidate 1: “…and this is why I believe we need to implement a much more aggressive set of climate change regulations.”

Candidate 2: “I’m sorry, but are we really expected to believe anything coming from a known liar who cheated on his college entrance exams to get to this position?”

The offender (Candidate 2) has attacked Candidate 1 the individual, rather than the argument itself.

The Texas Sharpshooter

The name of this fallacy is based on a fable:

A Texan fires a gun multiple times at a barn wall. He then walks over to the bullet-riddled wall and paints a target around the closest cluster of bullet holes to create the appearance of impressive marksmanship.

Think of this as cherry-picking—selecting and highlighting evidence that supports the conclusion and systematically ignoring evidence that may refute it.

Example

“Tara is a really impressive and successful restaurateur. Her restaurant on Park Avenue is always full and gets really high ratings on Yelp.”

This may be true, but it ignores the fact that Tara’s five other restaurant openings have failed. The cherry-picked data—the successful Park Avenue restaurant—is used to draw a broad conclusion about Tara’s quality as a restaurateur that may be inaccurate.

Sunk Cost Fallacy

A favorite of behavioral economics.

Sunk costs are the economic costs already invested in an activity that cannot be recovered. Money spent on non-refundable flights and hotels, time invested in a project, or energy put towards a relationship all qualify as sunk costs.

The fallacy is found in thinking that you should continue with something on the basis of all that you've put in, with no regard given for future costs or likelihood of ultimate success.

The reality: sunk costs are irrecoverable, so they should not factor into decisions about the future.

Example

“The hopes for the space project appear slim, but we have already invested so much, so we have to finish.”

If the space project appears unlikely to achieve its stated objectives, any additional investment in the project may be irrational. The fact that a lot has been invested in it has no bearing on what should rationally be invested in its future.

“I really don’t want to go on this vacation—I’m so busy at work—but I already paid for the flights, so I might as well go.”

The cost of the flights is a sunk cost. If going on the vacation is going to bring you negative utility, that should be the only factor in your decision of whether or not to make the trip.

Bandwagon Fallacy

An assumption of truth on the basis of the majority of people believing it to be true.

"Everyone believes X, so obviously X is true."

The assumption is typically made without regard for the qualifications or ability of the people in question to validate the claim.

Common in the workplace, where a collective belief—“this is how everyone does it”—can lead to broken processes and systems.

Example

"Well, the majority of people we talk to say that integrating chip design and manufacturing is important, so clearly you should get onboard so we can continue forward with our integrated solution, rather than switching to something modular.”

The over-reliance on the majority—with no regard for whether these people are qualified to make this decision—may lead to poor judgement. This is where first principles thinking goes to die.

Straw Man

Set up a straw man to tear down.

The offender ignores the actual argument and replaces it with a flimsy, distorted, easily refuted argument—a straw man.

By replacing a strong argument with a weak one, the offender creates the illusion of an easy, swift victory.

When done effectively, it is an intellectual sleight-of-hand that leaves observers with the impression that the offender has “won” the argument, when the reality is much different.

Politicians are often expert practitioners of the straw man fallacy.

Example

Candidate 1: “I believe we should focus on our local climate initiatives and hire a team to track and manage our various projects.”

Candidate 2: “So let me get this straight, you’re advocating for our town to take our taxpayer dollars away from our schools and safety programs and redirect them towards expensive McKinsey consultants making flowcharts of our climate projects?”

Candidate 2 effectively repositions the argument as taking money away from schools and safety programs—both of which are important to the community—which makes it easier to attack and refute.

The Appeal to Authority

The over-reliance on the perspective of an "expert" to support the legitimacy of an argument.

Over-reliance is a dangerous game. The qualifications of the authority figure in the field in question must be considered.

As a rule of thumb, the support of an authority figure can be a feature—but not a central pillar—of an argument.

Example

“Well our CEO and board all think that acquiring our competitor is what we should do, so it must be the right move.”

The CEO and board may not have all of the on-the-ground details that the troops possess, so they may not be qualified to make a judgement on the matter.

At a minimum, the decision to acquire the competitor must be grounded in additional evidence beyond the stated belief of the CEO and board.

Post Hoc Ergo Propter Hoc

Latin for “after this, therefore because of this”—this flawed logic follows a simple structure:

  • Event B followed Event A
  • Therefore Event B must have been caused by Event A

The reality: Just because B followed A, doesn’t necessarily mean that B was caused by A.

Correlation ≠ Causation.

Example

“The crickets begin chirping just before the sun goes down. The crickets must make the sun go down.”

Easy to spot when taken to the extreme, but more challenging to identify when somewhere in the middle.

Personal Incredulity

You are unable to understand or believe something, therefore you argue that it cannot be true.

Complex topics often require significant upfront work to understand, so an inability to comprehend something cannot be used to argue the illegitimacy of a claim.

Example

“The idea that we will ever mine for minerals on asteroids is completely ridiculous. I’m no scientist, but there is no way that will ever be cost effective or efficient, so it will never happen.”

The person making the claim is using an inability to understand the science and cost curve improvement potential as a basis for arguing it will never happen. They have self-proclaimed a lack of qualification to cast judgement—“I’m no scientist”—and have allowed personal incredulity to infiltrate their logic.

(Note: asteroid mining may well become a reality.)

The False Dilemma

Presenting two choices or alternatives when there are many more that exist. The extremes are presented with no regard for the potential shades of grey.

The false dilemma ignores nuance and lends itself to extreme positions and outcomes. When used, it typically reduces the potential for compromise, as the two options are painted as being extremely far apart.

An all-too-common fallacy in the polarizing world of politics.

Example

“The choice here is simple—either you love freedom and liberty or you love tyranny and oppression. Your call.”

If this line seems stolen from a political campaign, it’s probably because it is. Clearly there is more nuance, but the false dilemma creates a scenario where people have to choose a side.

Dangerous and polarizing.

Burden of Proof

The offender uses the inability of the opponent to provide evidence that a claim is false as justification that the claim is true.

The lack of refuting evidence cannot be considered to be the same as supporting evidence.

The burden of proof lies with the person making the claim to provide supporting evidence—see Hitchens’ Razor.

Example

Tim: “Paranormal activity is clearly real.”

Elizabeth: “No it’s not. Come on, that’s ridiculous.”

Tim: “You haven’t given me one piece of evidence that it’s not real, so it’s clearly real!”

Tim is making the claim and using the lack of refuting evidence as support for the validity of his claim. Elizabeth can use Hitchens’ Razor—the burden of proof is clearly unmet, so no argument is required to dismiss Tim’s argument.

The Red Herring

The term “red herring” comes from the world of hunting dog training:

Hunting dog trainers needed a good way to train the dogs to stay focused on a target scent. The dogs had a keen sense of smell, but they were also easily distracted. In order to train this focus, the trainers developed a strategy—one that involved a smelly fish.

The kippered herring is a reddish-brown fish with a pungent smell. During training, the kippered herring would be introduced as a distracting scent to test whether the dogs were able to stay on track.

From this origin, the "red herring" became synonymous with distraction.

The offender distracts from the argument with a seemingly related—but actually unrelated—point. The discussion is rerouted to a new argument where the offender is better-positioned to respond and win.

Example

Worker: “I can’t believe you’re only paying us $15 per hour. The market rate is much higher.”

Manager: “When I was working the warehouse, we had to work for $10 per hour in much worse conditions, so you have it pretty good in there now, if you ask me.”

The manager’s personal experience is unrelated to the market-rate point made by the worker. The manager has shifted the argument to a discussion of how hourly rates and working conditions have progressed, rather than the market-rate matter at hand.

No True Scotsman

The "appeal to purity" is characterized by the changing of the original argument to evade a counter-argument.

It is typically used to protect a hasty generalization by excluding the counter-argument that would bust the generalization.

Example

Jeremy: “A Scotsman never drinks scotch with soda.”

Charles: “I am a Scotsman and I drink scotch with soda.”

Jeremy: “Then you must not be a true Scotsman!”

Rather than acknowledge the counter-argument and evidence, the terms of the argument are changed to simply exclude the counter-argument.

Hasty Generalization

In short, jumping to conclusions.

When material, far-reaching, or wide-ranging conclusions are made on the basis of an immaterial, narrow body of evidence.

Insufficient evidence has been gathered or generated to justify the claimed conclusions. This does not mean the conclusions are definitively false—see The Fallacy Fallacy below—but it does mean that more work is required to prove them true.

Example

“The oldest man alive said that his secrets to a long life include smoking, drinking, and laughing with friends. I knew smoking and drinking weren’t bad for you!”

First off, this is actually a thing that happened. More importantly, this is clearly a hasty conclusion to draw on the basis of a sample size of 1. The evidence here is narrow and anecdotal at best, so no real conclusions can be drawn.

Non-Sequitur

The conclusion does not follow logically from the premises.

The evidence presented provides little or no actual support for the argument.

Example

“Charles ate fish for dinner and is well-spoken, so he must be a banker.”

The conclusion feels random and out of thin air relative to the evidence. This is a non-sequitur at its finest.

Tu Quoque

Latin for “you too”—an attempt to discredit an opponent’s argument by pointing out personal behavior as being inconsistent with their argument.

Targeting the hypocrisy of the opponent rather than the argument of the opponent.

Simple, yet oddly effective in political debates.

Example

Candidate 1: “The acceptance of $10 million in suspect funding truly calls into question the integrity of my opponent.”

Candidate 2: “Don’t question my integrity, look at all of the terrible things you have done!”

Candidate 2 is not addressing the actions noted by Candidate 1, instead choosing to attack the hypocrisy of Candidate 1 to discredit the argument.

It may be true that Candidate 1 has low integrity, but that has no bearing over the argument in question regarding Candidate 2’s integrity.

Slippery Slope

An argument that begins with a benign starting point before using a series of successive steps to get to a more radical, extreme end point.

The argument can feel compelling on the surface—no single step appears ridiculous—but the chain of steps taken together is highly improbable.

Think of this like a parlay bet—a bet that links together multiple bets or events into one. The more bets linked together, the lower the odds of the parlay hitting—and the higher the payout. Any one of the events may have a reasonable probability of occurring, but all of them occurring, and in the correct order, is nearly impossible.
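The parlay analogy can be made concrete with quick arithmetic. A minimal Python sketch (the 80% per-step probability is a hypothetical figure, chosen purely for illustration):

```python
# Each step in a slippery-slope chain is like one leg of a parlay bet:
# even if every step is individually likely, the probability of the
# whole chain occurring shrinks multiplicatively.
from math import prod

# Hypothetical per-step probabilities for a five-step chain.
step_probabilities = [0.8, 0.8, 0.8, 0.8, 0.8]

# Joint probability that every step in the chain occurs.
joint = prod(step_probabilities)

print(f"Each step: 80% likely; all five in sequence: {joint:.1%}")
# prints: Each step: 80% likely; all five in sequence: 32.8%
```

Five steps that are each 80% likely combine to roughly a one-in-three chance, and the odds fall further with every additional step—which is why the extreme end point of a slippery-slope argument is so much less probable than any individual step suggests.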

Example

Politician: “If we legalize low-level drug crime, the criminals will make legal profits, invest in more expansive illegal networks, recruit new teams, and expand into new geographies. So we have to stay tough on all crime, even low-level drug crime, because otherwise it will get out of hand rapidly.”

The first step—that legal profits would be made—is plausible and likely, but the successive steps are each an assumption with no evidence in support. The innocuous starting point has led to an extreme end point and the extreme end point is used as justification for the argument.

Begging the Question

Circular reasoning takes the general form of:

  • If A then B
  • If B then A

The argument is circular in the sense that it collapses on itself. It may get you stuck in a paradox—check out the card paradox.

“Begging the Question” is a form of circular reasoning in which the argument is presented in such a way that the conclusion is included in the premise. The premise of the argument assumes the truth of the conclusion.

Example

“Ghosts are real because I once felt and heard something that was definitely a ghost.”

Easy to identify. The logic collapses on itself.

Loaded Question

The “loaded question” is a question asked with a presumption built into the question—it is pre-loaded.

Typically intended to be inflammatory in nature.

The individual on the receiving end of the question is forced to respond to the presumption despite its potentially baseless, irrelevant nature. It puts them on their heels and in a position of weakness where they are unable to revert to the original discussion or argument.

Example

Terry and Clarence are both competing for Mara’s affection. The three are together in a study hall period, when the following exchange occurs:

Terry: “Hey Clarence, what’s happening with the football team? You’re the captain and we are struggling.”

Clarence: “We will be fine, but have you been able to talk to Coach yet about your drug use suspension?”

Clarence has asked a “loaded question” that puts Terry on his heels. It may be baseless, but now Terry is forced to respond to an accusation of drug use when the original discussion was on the lack of leadership on the football team.

Equivocation

Comes from the roots “equal” and “voice”—a single word or phrase can say two very different things.

Equivocation is a classic tactic of the accused. It occurs when the offender—the accused in this case—uses a word, phrase, or sentence in an intentionally misleading manner.

The word, phrase, or sentence sounds like it’s saying one thing, but is actually saying something else.

Example

Police Officer: “Did you steal that television from the house?”

Accused: “I would never take something that wasn’t mine!”

The accused may believe that the television is rightfully hers, so by saying she would never take something that wasn’t hers, she is denying guilt without explicitly stating that she did not take the television.

The Fallacy Fallacy

The meta logical fallacy.

Incorrectly assumes that a claim must be false if a fallacy was used to argue the claim. Just because someone has poorly argued a claim, does not mean the claim itself is definitively false.

It may still be true, albeit poorly argued!

Example

“Paul, with all due respect, your argument for the health benefits of veganism was clearly based on a hasty generalization, so it is false and you are wrong. Therefore, veganism is not healthy.”

The absence of evidence is not the evidence of absence. Paul may have failed to prove that veganism is healthy, but this doesn’t mean that it isn’t healthy.

Conclusion

There you have it—20 common logical fallacies to learn, identify, and avoid.

I hope the definitions and examples in this guide help you feel more well-equipped to craft great arguments (and refute bad ones).

The Logical Fallacy Guide

Sahil Bloom

Welcome to the 242 new members of the curiosity tribe who have joined us since Wednesday. Join the 57,887 others who are receiving high-signal, curiosity-inducing content every single week.

What’s a Rich Text element?

The rich text element allows you to create and format headings, paragraphs, blockquotes, images, and video all in one place instead of having to add and format them individually. Just double-click and easily create content.

Static and dynamic content editing

A rich text element can be used with static or dynamic content. For static content,

just drop it into any page and begin editing. For dynamic content, add a rich text field to any collection and then connect a rich text element to that field in the settings panel. Voila!

  • mldsa
  • ,l;cd
  • mkclds

How to customize formatting for each rich text

Headings, paragraphs, blockquotes, figures, images, and figure captions can all be styled after a class is added to the rich text element using the "When inside of"

nested selector

system.

Visualization credit: @drex_jpg

If you’ve been reading this newsletter, you know that I like to say that humans are fascinating creatures.

We possess the capacity to accomplish some complex feat of technology and engineering, and subsequently fall victim to the most obviously flawed base logic.

Logical fallacies—errors in reasoning that undermine the quality of an argument—are classic examples of this fact.

The Merriam-Webster Dictionary defines fallacy as a false or misleading idea. A logical fallacy, therefore, can simply be thought of as logic based on a false or misleading idea.

Unfortunately, unless you went to law school—or took a robust philosophy course load in college—you’ve likely been minimally exposed to them in a formal context.

Accordingly, we frequently fall victim to logical fallacies—our own emotional, psychological, and intellectual blindspots create the cracks and we fall right into them.

There is no such thing as a perfect logician, but we can all strive to cover our blindspots and craft better arguments. Similar to the study of cognitive biases—which I’ve written about recently here and here—the first step in avoiding logical fallacies is developing an awareness of them.

In that vein, today’s piece will cover 20 common logical fallacies to learn, identify, and avoid.

Without further ado, let’s dive in…

Ad Hominem

Latin phrase for "to the person”—an ad hominem attack is an attack of the individual rather than the argument.

Instead of addressing the argument—its structure, logic, and merits—the offender attempts to refute the opposition on the basis of personal characteristics.

It may be overt—openly attacking the person’s character or personality—or covert—subtly doing the same—but it always focuses on the person, not the argument.

Often referred to as “mud-slinging” in political circles, if you’ve ever watched a political debate or political campaign ads, you’re already familiar with this one. It’s all-too-common on Twitter and other online discourse, as well.

Example

Candidate 1: “…and this is why I believe we need to implement a much more aggressive set of climate change regulations.”

Candidate 2: “I’m sorry, but are we really expected to believe anything coming from a known liar who cheated on his college entrance exams to get to this position?”

The offender (Candidate 2) has attacked Candidate 1 the individual, rather than the argument itself.

The Texas Sharpshooter

The name of this fallacy is based on a fable:

A Texan fires a gun multiple times at a barn wall. He then walks over to the bullet-riddled wall and paints a target around the closest cluster of bullet holes to create the appearance of impressive marksmanship.

Think of this as cherry-picking—selecting and highlighting evidence that supports the conclusion and systematically ignoring evidence that may refute it.

Example

“Tara is a really impressive and successful restauranteur. Her restaurant on Park Avenue is always full and gets really high ratings on Yelp.”

This may be true, but it ignores the fact that Tara’s five other restaurant openings have failed. The cherry-picked data—the successful Park Avenue restaurant—is used to draw a broad conclusion about Tara’s quality as a restauranteur that may be inaccurate.

Sunk Cost Fallacy

A favorite of behavioral economics.

Sunk costs are the economic costs already invested in an activity that cannot be recovered. Money spent on non-refundable flights and hotels, time invested in a project, or energy put towards a relationship all qualify as sunk costs.

The fallacy is found in thinking that you should continue with something on the basis of all that you've put in, with no regard given for future costs or likelihood of ultimate success.

The reality: sunk costs are irrecoverable, so should not factor into the decision about the future.

Example

“The hopes for the space project appear slim, but we have already invested so much, so we have to finish.”

If the space project appears unlikely to achieve its stated objectives, any additional investment in the project may be irrational. The fact that a lot has been invested in it has no bearing on what should rationally be invested in its future.

“I really don’t want to go on this vacation—I’m so busy at work—but I already paid for the flights, so I might as well go.”

The cost of the flights is a sunk cost. If going on the vacation is going to bring you negative utility, that should be the only factor in your decision of whether or not to make the trip.

Bandwagon Fallacy

An assumption of truth on the basis of the majority of people believing it to be true.

"Everyone believes X, so obviously X is true."

The assumption is typically made without regard for the qualifications or ability of the people in question to validate the claim.

Common in the workplace, where a collective belief—“this is how everyone does it”—can lead to broken processes and systems.

Example

"Well, the majority of people we talk to say that integrating chip design and manufacturing is important, so clearly you should get onboard so we can continue forward with our integrated solution, rather than switching to something modular.”

The over-reliance on the majority—with no regard for whether these people are qualified to make this decision—may lead to poor judgement. This is where first principles thinking goes to die.

Straw Man

Setup a straw man to tear down.

The offender ignores the actual argument and replaces it with a flimsy, distorted, easily-refuted argument—a straw man.

By replacing a strong argument with a weak one, the offender creates the illusion of an easy, swift victory.

When done effectively, it is an intellectual sleight-of-hand that leaves observers with the impression that the offender has “won” the argument, when the reality is much different.

Politicians are often expert practitioners of the straw man fallacy.

Example

Candidate 1: “I believe we should focus on our local climate initiatives and hire a team to track and manage our various projects.”

Candidate 2: “So let me get this straight, you’re advocating for our town to take our taxpayer dollars away from our schools and safety programs and redirect them towards expensive McKinsey consultants making flowcharts of our climate projects?”

Candidate 2 effectively repositions the argument as taking money away from schools and safety programs—both of which are important to the community—which makes it easier to attack and refute.

The Appeal to Authority

The over-reliance on the perspective of an "expert" to support the legitimacy of an argument.

Over-reliance is a dangerous game. The qualifications of the authority figure in the field of question must be considered.

As a rule of thumb, the support of an authority figure can be a feature—but not a central pillar—of an argument.

Example

“Well our CEO and board all think that acquiring our competitor is what we should do, so it must be the right move.”

The CEO and board may not have all of the details that the troops on the ground have earned and possess, so may not be qualified to make a judgement on the matter.

At a minimum, the decision to acquire the competitor must be grounded in additional evidence beyond the stated belief of the CEO and board.

Post Hoc Ergo Propter Hoc

Latin for “after this, therefore because of this”—this flawed logic follows a simple structure:

  • Event B followed Event A
  • Therefore Event B must have been caused by Event A

The reality: Just because B followed A, doesn’t necessarily mean that B was caused by A.

Correlation ≠ Causation.

Example

“The crickets begin chirping just before the sun goes down. The crickets must make the sun go down.”

Easy to spot when taken to the extreme, but more challenging to identify when somewhere in the middle.

Personal Incredulity

You are unable to understand or believe something, therefore you argue that it cannot be true.

Complex topics often require significant upfront work to understand, so an inability to understand or comprehend something cannot be used to argue the illegitimacy of a claim.

Example

“The idea that we will ever mine for minerals on asteroids is completely ridiculous. I’m no scientist, but there is no way that will ever be cost effective or efficient, so it will never happen.”

The person making the claim is using an inability to understand the science and cost curve improvement potential as a basis for arguing it will never happen. They have self-proclaimed a lack of qualification to cast judgement—“I’m no scientist”—and have allowed personal incredulity to infiltrate their logic.

(Note: Asteroid mining might be real?)

The False Dilemma

Presenting two choices or alternatives when there are many more that exist. The extremes are presented with no regard for the potential shades of grey.

The false dilemma ignores nuance and lends itself to extreme positions and outcomes. When used, it typically reduces the potential for compromise, as the two options are painted as being extremely far apart.

An all-too-common fallacy in the polarizing world of politics.

Example

“The choice here is simple—either you love freedom and liberty or you love tyranny and oppression. Your call.”

If this line seems stolen from a political campaign, it’s probably because it is. Clearly there is more nuance, but the false dilemma creates a scenario where people have to choose a side.

Dangerous and polarizing.

Burden of Proof

The offender uses the inability of the opponent to provide evidence that a claim is false as justification that the claim is true.

The lack of refuting evidence cannot be considered to be the same as supporting evidence.

The burden of proof lies with the person making the claim to provide supporting evidence—see Hitchens’ Razor.

Example

Tim: “Paranormal activity is clearly real.”

Elizabeth: “No it’s not. Come on, that’s ridiculous.”

Tim: “You haven’t given me one piece of evidence that it’s not real, so it’s clearly real!”

Tim is making the claim and using the lack of refuting evidence as support for the validity of his claim. Elizabeth can use Hitchens’ Razor—the burden of proof is clearly unmet, so no argument is required to dismiss Tim’s argument.

The Red Herring

The term “red herring” comes from the world of hunting dog training:

Hunting dog trainers needed a good way to train the dogs to stay focused on a target scent. The dogs had a keen sense of smell, but they were also easily distracted. In order to train this focus, the trainers developed a strategy—one that involved a smelly fish.

The kippered herring is a reddish-brown fish with a pungent nose. During training, the kippered herring would be introduced as a distracting scent to test whether the dogs were able to stay on track.

From this origin, the "red herring" became synonymous with distraction.

The offender distracts from the argument with a seemingly related—but actually unrelated—point. The discussion is rerouted to a new argument where the offender is better-positioned to respond and win.

Example

Worker: “I can’t believe you’re only paying us $15 per hour. The market rate is much higher.”

Manager: “When I was working the warehouse, we had to work for $10 per hour in much worse conditions, so you have it pretty good in there now, if you ask me.”

The manager’s personal experience is unrelated to the market rate point made by the worker. The manager has shifted the argument to a discussion of how hourly rates and working conditions have improved over time, rather than the market-rate question at hand.

No True Scotsman

The "appeal to purity" is characterized by the changing of the original argument to evade a counter-argument.

It is typically used to protect a hasty generalization by excluding the counter-argument that would bust the generalization.

Example

Jeremy: “A Scotsman never drinks scotch with soda.”

Charles: “I am a Scotsman and I drink scotch with soda.”

Jeremy: “Then you must not be a true Scotsman!”

Rather than acknowledge the counter-argument and evidence, the terms of the argument are changed to simply exclude the counter-argument.

Hasty Generalization

In short, jumping to conclusions.

When material, far-reaching, or wide-ranging conclusions are made on the basis of an immaterial, narrow body of evidence.

Insufficient evidence has been gathered or generated to justify the claimed conclusions. This does not mean the conclusions are definitively false—see The Fallacy Fallacy below—but it does mean that more work is required to prove them true.

Example

“The oldest man alive said that his secrets to a long life include smoking, drinking, and laughing with friends. I knew smoking and drinking weren’t bad for you!”

First off, this is actually a thing that happened. More importantly, this is clearly a hasty conclusion to draw on the basis of a sample size of 1. The evidence here is narrow and anecdotal at best, so no real conclusions can be drawn.

Non-Sequitur

The conclusion does not follow logically from the premises.

The evidence presented provides little or no actual support for the argument.

Example

“Charles ate fish for dinner and is well-spoken, so he must be a banker.”

The conclusion feels random, as if pulled out of thin air relative to the evidence. This is a non-sequitur at its finest.

Tu Quoque

Latin for “you too”—an attempt to discredit an opponent’s argument by pointing out personal behavior as being inconsistent with their argument.

Targeting the hypocrisy of the opponent rather than the argument of the opponent.

Simple, yet oddly effective in political debates.

Example

Candidate 1: “The acceptance of $10 million in suspect funding truly calls into question the integrity of my opponent.”

Candidate 2: “Don’t question my integrity, look at all of the terrible things you have done!”

Candidate 2 is not addressing the actions noted by Candidate 1, instead choosing to attack the hypocrisy of Candidate 1 to discredit the argument.

It may be true that Candidate 1 has low integrity, but that has no bearing on the argument in question regarding Candidate 2’s integrity.

Slippery Slope

An argument that begins with a benign starting point before using a series of successive steps to get to a more radical, extreme end point.

The argument can feel compelling on the surface—no single step appears ridiculous—but chaining multiple steps together into a series is highly improbable.

Think of this like a parlay bet—a bet that links together multiple bets or events into one. The more bets linked together, the lower the odds of the parlay hitting—and the higher the payout. Any one of the events may have a reasonable probability of occurring, but all of them occurring, and in the correct order, is nearly impossible.
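The parlay intuition can be made concrete with a little arithmetic. Assuming, purely for illustration, that each step in a five-step chain has an 80% chance of occurring on its own and the steps are independent, the whole chain is far less likely than any single step:

```python
# Hypothetical numbers for illustration: each step in the chain
# is individually likely (80%), but the probability of the full
# chain is the product of the individual probabilities.
step_probability = 0.8
num_steps = 5

# Probability that all steps occur, in order.
chain_probability = step_probability ** num_steps

print(f"Each step alone: {step_probability:.0%}")
print(f"All {num_steps} steps together: {chain_probability:.1%}")
```

Even with generous odds for each individual step, the full chain comes in around 33%—and real slippery-slope arguments rarely justify anywhere near 80% per step.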

Example

Politician: “If we legalize low-level drug crime, the criminals will make legal profits, invest in more expansive illegal networks, recruit new teams, and expand into new geographies. So we have to stay tough on all crime, even low-level drug crime, because otherwise it will get out of hand rapidly.”

The first step—that legal profits would be made—is plausible and likely, but the successive steps are each an assumption with no evidence in support. The innocuous starting point has led to an extreme end point and the extreme end point is used as justification for the argument.

Begging the Question

Circular reasoning takes the general form of:

  • If A then B
  • If B then A

The argument is circular in the sense that it collapses on itself. It may get you stuck in a paradox—check out the card paradox.

“Begging the Question” is a form of circular reasoning in which the argument is presented in such a way that the conclusion is included in the premise. The premise of the argument assumes the truth of the conclusion.

Example

“Ghosts are real because I once felt and heard something that was definitely a ghost.”

Easy to identify. The logic collapses on itself.

Loaded Question

The “loaded question” is a question asked with a presumption built into the question—it is pre-loaded.

Typically intended to be inflammatory in nature.

The individual on the receiving end of the question is forced to respond to the presumption despite its potentially baseless, irrelevant nature. It puts them on their heels and in a position of weakness where they are unable to revert to the original discussion or argument.

Example

Terry and Clarence are both competing for Mara’s affection. The three are together in a study hall period, when the following exchange occurs:

Terry: “Hey Clarence, what’s happening with the football team? You’re the captain and we are struggling.”

Clarence: “We will be fine, but have you been able to talk to Coach yet about your drug use suspension?”

Clarence has asked a “loaded question” that puts Terry on his heels. It may be baseless, but now Terry is forced to respond to an accusation of drug use when the original discussion was on the lack of leadership on the football team.

Equivocation

Comes from the roots “equal” and “voice”—a single word or phrase can say two very different things.

Equivocation is a classic tactic of the accused. It occurs when the offender—the accused in this case—uses a word, phrase, or sentence in an intentionally misleading manner.

The word, phrase, or sentence sounds like it’s saying one thing, but is actually saying something else.

Example

Police Officer: “Did you steal that television from the house?”

Accused: “I would never take something that wasn’t mine!”

The accused may believe that the television is rightfully hers, so by saying she would never take something that wasn’t hers, she is denying guilt without explicitly stating that she did not take the television.

The Fallacy Fallacy

The meta logical fallacy.

Incorrectly assumes that a claim must be false if a fallacy was used to argue the claim. Just because someone has poorly argued a claim, does not mean the claim itself is definitively false.

It may still be true, albeit poorly argued!

Example

“Paul, with all due respect, your argument for the health benefits of veganism was clearly based on a hasty generalization, so it is false and you are wrong. Therefore, veganism is not healthy.”

The absence of evidence is not the evidence of absence. Paul may have failed to prove that veganism is healthy, but this doesn’t mean that it isn’t healthy.

Conclusion

There you have it—20 common logical fallacies to learn, identify, and avoid.

I hope the definitions and examples in this guide help you feel more well-equipped to craft great arguments (and refute bad ones).