Phil Plait: Don't Be A Dick


In late July 2010, the Bad Astronomer and skeptic Phil Plait gave a thoughtful public presentation called "Don't Be A Dick," discussing how skeptics should present their skepticism to the public. There was a lot of positive reaction to the talk, but also some criticism.

Phil Plait - Don't Be A Dick.

I agree with Phil, partially because this is the method I use to engage with those who believe in some form of magic, and also because I agree with the arguments he uses to support his position. What are your thoughts?

One response so far

  • Epinephrine says:

    I disagree with Phil, on a number of issues. My reply has gotten quite long as I've listened and responded to the talk.

    He asks (I may be paraphrasing slightly, I don't have a transcript), with all this stacked against us, why would you choose to make it harder?

    Well, it hasn't been established that a confrontational tone makes it harder. Sure, there was a sampling of the audience, but respondents may not accurately recall all the things that actually contributed to changing their minds.

    One thing Phil said initially was that when believers are provided with evidence and an argument as to why their belief is wrong, it often ends up reinforcing that belief. So it would seem that politely arguing from facts isn't a very good way to start - at least not universally. Yet he says that we need to do targeted debunkings - why?

    Because it works pretty well on people who are on the fence. Those who aren't wholly into the camp.

    Why would mocking work? We're social animals, and we respond very strongly to being mocked. If someone is made to feel embarrassed by their behaviour, they are likely to change it. And nobody wants to join a group that is being laughed at. Yes, the anti-vaccinationist who is being called an ignorant parasite may not be swayed, but someone not on that bandwagon may see the anger, and realise that these people really are endangering the lives of those with weaker immune systems, and those too young to be vaccinated. And if they had been thinking about it, they might be shamed into reconsidering.

    I don't think a single approach is best - I think we need multiple approaches. We need celebrities encouraging people to think for themselves, we need to have a good laugh at those foolish enough to buy into "the Secret," we need to stigmatise reckless endangerment of the public - whether it is drunk driving (which we'd all agree about, skeptic or not) or anti-vaccination.

    Phil has some analogies that are cute, but they can all be stood on their heads. Sure, sometimes it's better to swing lightly at a nail. But you have to overcome static friction to move something - you can swing all you want, but if it isn't hard enough you'll never make any progress. There are nails that call for big hammers. And there are times when you may not want to swing a thousand times, if you can take care of it with a couple of solid swings.

    It's hard to reason someone out of a position they didn't reason themselves into. So why not try invective? Why not use guilt, mockery, derision? Limiting ourselves to polite discourse and reasoned argument probably won't work - it may work in some cases, but while some need the carrot, others respond to the stick - or to both. Good Cop, Bad Cop can work, and Phil is arguing that we should all be Good Cops.

    As much as Phil says that he lost his belief gradually, that nobody yelled at him - others may have benefited from that. I've seen ex-believers write in to thank a certain godless liberal blogger for having challenged them - that they wouldn't have dug into their beliefs without having been angry with him. And while it may seem like things are gradual, it's possible that the occasional bit of derision or mockery may have influenced that process.

    "Taking the low road doesn't help. It doesn't make you stronger, it doesn't make you look good, and it doesn't change anyone's minds."

    Evidence, please. Sure, that's what one would think at first blush, but science is useful precisely because it can lead us to conclusions we wouldn't have reached on intuition alone. So show me the evidence. Does seeing a believer in "toxic vaccines" excoriated, their arguments shredded, and their position made the target of a great deal of anger not have any effect on someone perhaps leaning toward not vaccinating? Perhaps it'll cement some people further in their beliefs, but maybe it'll make many others reconsider.

    I get the feeling that Phil is being anything but skeptical here - making claims that aren't based in evidence but on how he perceives it. You can't know unless you have done the experiment. Moreover, he may be thinking about the effect on a believer, while I am concerned not with those who have made it part of their belief system, but those that are still open to argument. Obviously, I only have anecdotes, but I know that even my own not-always-respectful approach has changed people's minds, because they've thanked me for it.

    "In times like these we don't need warriors, we need diplomats." Again, it's a nice sentiment, but where is the proof? Maybe we need both? Maybe we need warriors more than diplomats. I can come up with true-sounding statements that would seem to bolster the position for aggressiveness with no evidence to back them up, too. "Given a chance to choose sides, who among the undecided would choose the side that is the victim of so much anger?" Or: "people want to be liked - we all feel that way - it's nice when someone likes you, and it's hard to be disliked. That's why it's so important that we show our dislike of certain positions - loudly, strongly, and publicly."

    See, I'm not saying that those will work, or that they are correct - but they have a kernel of truth to them, on which one hangs the assumption. Phil is doing precisely this in his talk - using little kernels of truth and building off of them, and we know from the history of psychology that this type of argument is wrong. Intuition serves us poorly in predicting how people will react. One could quite incorrectly assume all of the following:
    1) That if there is a small chance of one witness helping you, a greater number of witnesses means that you are more likely to get help (false, this is the diffusion of responsibility)
    2) That one who is rewarded more will be more set in their beliefs (false, cognitive dissonance reinforces poorly rewarded beliefs)
    3) That people wouldn't perform acts to harm another, solely on the word of an authority (false, Milgram obedience experiments)
    4) That people will accurately judge objective things like the length of lines, as social pressure doesn't affect objective values (false, Asch's conformity studies)
    5) That people randomly assigned to act as prison guards or prisoners would behave fairly, knowing that it was only chance that put them in the roles (false, Zimbardo's Stanford prison experiment)

    Maybe I'm wrong, but it seems to me that Phil is falling into the same kind of irrational beliefs. While it may seem intuitive to suggest that you won't convince people with an aggressive argument, on what evidence is he basing his talk?

    I'm not convinced one way or another, and I'm virtually certain that we haven't hit on the optimal method. In most games (in the theoretical sense) there are mixed strategies that are optimal. The idea of a Nash equilibrium in this discussion, some balance between the stick and the carrot, is a tempting one for me. I don't think that Phil's approach is worthless or wrong, but I don't think that other approaches are either. The best payoff may come from a mixed strategy, in which both carrot and stick are present.
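    To make the mixed-strategy point concrete, here is a toy sketch with entirely made-up payoffs: treat outreach as a 2x2 zero-sum game between a persuader choosing carrot or stick and an audience type (fence-sitter or committed believer). The payoff numbers and the framing are my own assumptions for illustration; the math is just the standard equilibrium formula for a 2x2 game with no saddle point.

    ```python
    # Toy model: persuader's payoff matrix [[a, b], [c, d]], where rows are
    # the persuader's moves (carrot, stick) and columns are audience types
    # (fence-sitter, committed believer). All payoffs are hypothetical.

    def mixed_strategy_2x2(a, b, c, d):
        """Equilibrium probability of playing the first row, and the game's
        value, for a 2x2 zero-sum game with no pure-strategy saddle point."""
        denom = a - b - c + d
        p = (d - c) / denom            # P(play carrot)
        value = (a * d - b * c) / denom  # expected payoff at equilibrium
        return p, value

    # Assumed payoffs: carrot works well on fence-sitters (3) but not on the
    # committed (0); stick works moderately on both (1 and 2).
    p_carrot, value = mixed_strategy_2x2(3, 0, 1, 2)
    print(f"play carrot with probability {p_carrot:.2f}")  # 0.25 here
    ```

    With these particular numbers the optimal play is mostly stick with an occasional carrot, but the point is only that the best strategy is generally mixed: neither pure Good Cop nor pure Bad Cop is an equilibrium.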