Make sure you are sitting down ....

Many Internet forums have carried discussion of the Electric Universe hypothesis. Much of that discussion has added more confusion than clarity, due to common misunderstandings of the electrical principles. Here we invite participants to discuss their experiences and to summarize questions that have yet to be answered.

kiwi
Posts: 564
Joined: Wed Jun 23, 2010 3:58 pm
Location: New Zealand

Make sure you are sitting down ....

Unread post by kiwi » Sat Aug 11, 2012 9:48 pm

:?
A field guide to bullshit

13 June 2011 by Alison George
Magazine issue 2816
How do people defend their beliefs in bizarre conspiracy theories or the power of crystals? Philosopher Stephen Law has tips for spotting their strategies

You describe your new book, Believing Bullshit, as a guide to avoid getting sucked into "intellectual black holes". What are they?
Intellectual black holes are belief systems that draw people in and hold them captive so they become willing slaves of claptrap. Belief in homeopathy, psychic powers, alien abductions - these are examples of intellectual black holes. As you approach them, you need to be on your guard because if you get sucked in, it can be extremely difficult to think your way clear again.

But isn't one person's claptrap another's truth?
There's a belief system about water to which we all sign up: it freezes at 0 °C and boils at 100 °C. We are powerfully wedded to this but that doesn't make it an intellectual black hole. That's because these beliefs are genuinely reasonable. Beliefs at the core of intellectual black holes, however, aren't reasonable. They merely appear so to those trapped inside.

You identify some strategies people use to defend black hole beliefs. Tell me about one of them - "playing the mystery card"?
This involves appealing to mystery to get out of intellectual hot water when someone is, say, propounding paranormal beliefs. They might say something like: "Ah, but this is beyond the ability of science and reason to decide. You, Mr Clever Dick Scientist, are guilty of scientism, of assuming science can answer every question." This is often followed by that quote from Shakespeare's Hamlet: "There are more things in heaven and earth, Horatio, than are dreamt of in your philosophy". When you hear that, alarm bells should go off.

But even scientists admit that they can't explain everything.
There probably are questions that science cannot answer. But what some people do to protect their beliefs is to draw a veil across reality and say, "you scientists can go up to the veil and apply your empirical methods this far, but no further". Behind the veil they will put angels, aliens, psychic powers, God, ghosts and so on. Then they insist that there are special people who can see - if only dimly - through this veil. But the fact is that many of the claims made about things behind this veil have empirically observable consequences and that makes them scientifically testable.

How can science test these mysteries?
Psychologist Christopher French at Goldsmiths, University of London, ran an experiment into the effects of crystals to explore claims that holding "real" crystals from a New Age shop while meditating has a powerful effect on the psyche, more so than just holding "fake" ones. But French found no difference in participants using real and fake crystals. This was good evidence that the effect people report is down to the power of suggestion, not the crystals.

Of course, this study provoked comments such as: "Not being able to prove the existence of something does not disprove its existence. Much is yet to be discovered." This is just a smokescreen. But because the mantra "it's-beyond-the-ability-of-science-to-establish..." gets repeated so often, it is effective at lulling people back to sleep - even if they have been stung into entertaining a doubt for a moment or two.

Do you think mystery has a place in science?
Some things may be beyond our understanding, and sometimes it's reasonable to appeal to mystery. If you have excellent evidence that water boils at 100 °C, but on one occasion it appeared it didn't, it's reasonable to attribute that to some mysterious, unknown factor. It's also reasonable, when we have a theory that works but we don't know how it works, to say that this is currently a mystery. But the more we rely on mystery to get us out of intellectual trouble, or the more we use it as a carpet under which to sweep inconvenient facts, the more vulnerable we are to deceit, by others and by ourselves.

In your book you also talk about the "going nuclear" tactic. What is this?
When someone is cornered in an argument, they may decide to get sceptical about reason. They might say: "Ah, but reason is just another faith position." I call this "going nuclear" because it lays waste to every position. It brings every belief - that milk can make you fly or that George Bush was Elvis Presley in disguise - down to the same level so they all appear equally "reasonable" or "unreasonable". Of course, you can be sure that the moment this person has left the room, they will continue to use reason to support their case if they can, and will even trust their life to reason: trusting that the brakes on their car will work or that a particular drug is going to cure them.

Isn't there a grain of truth in this approach?
There is a classic philosophical puzzle about how to justify reason: to do so, it seems you have to use reason. So the justification is circular - a bit like trusting a second-hand car salesman because he says he's trustworthy. But the person who "goes nuclear" isn't genuinely sceptical about reason. They are just raising a philosophical problem as a smokescreen, to give them time to leave with their head held high, saying: "So my belief is as reasonable as yours." That's intellectually dishonest.

You say we should also be aware of the "but it fits" strategy. Why?
Any theory, no matter how ludicrous, can be squared with the evidence, given enough ingenuity. Every last anomaly can be explained away. There is a popular myth about science that if you can make your theory consistent with the evidence, then that shows it is confirmed by that evidence - as confirmed as any other theory. Lots of dodgy belief systems exploit this myth. Young Earth creationism - the view that the whole universe is less than 10,000 years old - is a good example. Given enough shoehorning and reinterpretation, you can make whatever turns up "fit" what the Bible says.

What about when people claim that they "just know" something is right?
Suppose I look out the window and say: "Hey, there's Ted." You say: "It can't be Ted because he's on holiday." I reply: "Look, I just know it's Ted." Here it might be reasonable for you to take my word for it.

But "I just know" also gets used when I present someone with good evidence that there are, say, no auras, angels or flying saucers, and they respond: "Look, I just know there are." In such cases, claiming to "just know" is usually very unreasonable indeed.

What else should we watch out for?
You should be suspicious when people pile up anecdotes in favour of their pet theory, or when they practise the art of pseudo-profundity - uttering seemingly profound statements which are in fact trite or nonsensical. They often mix in references to scientific theory to sound authoritative.

Why does it matter if we believe absurd things?
Often it causes no great harm. But the dangers are obvious when people join extreme cults or use alternative medicines to treat serious diseases. I am particularly concerned by psychological manipulation. For charlatans, the difficulty with using reason to persuade is that it's a double-edged sword: your opponent may show you are the one who is mistaken. That's a risk many so-called "educators" aren't prepared to take. If you try using reason to persuade adults the Earth's core is made of cheese, you will struggle. But take a group of kids, apply isolation, control, repetition, emotional manipulation - the tools of brainwashing - and there's a good chance many will eventually accept what you say.

Profile
Stephen Law is senior lecturer in philosophy at Heythrop College, University of London, and editor of the Royal Institute of Philosophy journal, Think. His latest book is Believing Bullshit: How not to get sucked into an intellectual black hole.
http://www.newscientist.com/article/mg2 ... ?full=true

Sparky
Posts: 3517
Joined: Tue Jul 20, 2010 2:20 pm

Re: Make sure you are sitting down ....

Unread post by Sparky » Tue Aug 14, 2012 6:53 am

Repeating is not an argument!!! :D
"It is dangerous to be right in matters where established men are wrong."
"Doubt is not an agreeable condition, but certainty is an absurd one."
"Those who can make you believe absurdities, can make you commit atrocities." Voltaire

phyllotaxis
Posts: 224
Joined: Wed Aug 10, 2011 3:16 pm
Location: Wilmington, NC

Re: Make sure you are sitting down ....

Unread post by phyllotaxis » Tue Aug 14, 2012 7:12 am

I think these are largely on target.
The key to accurate belief is continuing, even aggressive, comparison to discernible facts related to that belief.
The examples he gave are not what matters; the points about remaining skeptical and using reason to define and advocate beliefs seem sound to me.
PS- I was sitting when I read this

Kindest regards -

kiwi
Posts: 564
Joined: Wed Jun 23, 2010 3:58 pm
Location: New Zealand

Re: Make sure you are sitting down ....

Unread post by kiwi » Wed Aug 15, 2012 7:29 pm

Sparky wrote:Repeating is not an argument!!! :D
True story 8-)

phyllotaxis wrote:I think these are largely on target.
The key to accurate belief is continuing, even aggressive, comparison to discernible facts related to that belief.
The examples he gave are not what matters; the points about remaining skeptical and using reason to define and advocate beliefs seem sound to me.
PS- I was sitting when I read this

Kindest regards -
own medicine ... tastes funny, cheers Phyllo :idea:

jorgea
Posts: 1
Joined: Tue Oct 16, 2012 11:57 am

Re: Make sure you are sitting down ....

Unread post by jorgea » Tue Oct 16, 2012 12:00 pm

This is engaging material on how to hold one's own, regardless of which belief is at issue.

psi
Posts: 34
Joined: Thu Jan 22, 2009 7:29 pm

Re: Make sure you are sitting down ....

Unread post by psi » Wed Dec 12, 2012 7:08 am

I agree that this is a well-reasoned article, despite the perhaps tendentiously orthodox choice of pseudo-science examples (I'm willing to bet we have a *lot* more to learn about crystals....: ). But it seems to me there are some big things left out -- for example, when a theory depends on an ever-extending series of ad hoc band-aids to retain plausibility, or when the discourse supporting it routinely ignores contrary evidence, or has a history of marginalizing voices of dissent with labels simply because they are voices of dissent, etc.

Metryq
Posts: 513
Joined: Mon Dec 03, 2012 3:31 am

Re: Make sure you are sitting down ....

Unread post by Metryq » Sat Dec 22, 2012 6:18 am

In THE BIG BANG NEVER HAPPENED, Lerner describes the history of science and culture (politics, common belief systems, etc.) as swinging in tandem like a pendulum from one extreme to the next. Perhaps it is the power of Lerner's suggestion, but I see the link between today's mainstream "science" and political ideologies. Dogma contrary to experience is argued with religious fervor, and evidence is vigorously squelched with nonsensical reasoning like, "It didn't work last time because they weren't doing it right." (Or, "the laws of physics are different way out in deep space.")

The "tipping point" headlining the Thunderbolts site is more than just the science, and this may be the reason why some people resist it so forcefully. Kepler is one of my heroes because he was not merely fighting the belief systems of others—the ultimate test of his skepticism was in questioning his own beliefs.

Julian Braggins
Posts: 110
Joined: Mon Mar 17, 2008 11:13 pm

Re: Make sure you are sitting down ....

Unread post by Julian Braggins » Sat Oct 12, 2013 2:12 am

Perhaps 'water boils at 100 °C' wasn't the best analogy; it never does at my place (1000 m altitude). :lol:
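A rough check of that point, for anyone curious: a minimal Python sketch using the standard-atmosphere barometric formula and the Clausius-Clapeyron relation with textbook constants. The 1000 m figure is from the post above; everything else is an assumption, not something stated in the thread.

```python
import math

def pressure_at_altitude(h_m):
    """Standard-atmosphere barometric formula; returns pressure in Pa."""
    P0, T0, L = 101325.0, 288.15, 0.0065   # sea-level pressure (Pa), temperature (K), lapse rate (K/m)
    g, M, R = 9.80665, 0.0289644, 8.314    # gravity (m/s^2), molar mass of air (kg/mol), gas constant
    return P0 * (1.0 - L * h_m / T0) ** (g * M / (R * L))

def boiling_point_c(p_pa):
    """Clausius-Clapeyron estimate of water's boiling point (deg C) at pressure p_pa."""
    R, dH = 8.314, 40660.0                 # gas constant, heat of vaporisation of water (J/mol)
    T0, P0 = 373.15, 101325.0              # boiling point (K) at standard atmospheric pressure (Pa)
    return 1.0 / (1.0 / T0 - R * math.log(p_pa / P0) / dH) - 273.15

p = pressure_at_altitude(1000.0)           # roughly 90 kPa at 1000 m
print(f"{p / 1000:.1f} kPa -> water boils at about {boiling_point_c(p):.1f} deg C")
# expected output: 89.9 kPa -> water boils at about 96.6 deg C
```

So at 1000 m the estimate comes out around 96-97 °C rather than 100 °C, which is the point being made.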
