Moral derpitude
Here’s one of my biases: I’m more disposed to agree with something if it can be framed in an intellectual manner.
So it was with this week’s Internet tempest-in-a-teapot, over the use of the word “derp.” After one commentator used the term to slam his opponent (as “derpy”), various people like me took to the web to debate its appropriateness. I was at first inclined to agree with Gawker’s Max Read, who said the word was juvenile, silly and vaguely offensive:
"Derp," a word for "stupidity," was not a particularly funny joke when it was a throwaway line in the Matt Parker-Trey Stone BASEketball. It didn't get funnier when it crossed over to 4chan and YTMND ten years ago, especially since message-board posters managed to turn it from a nonce word into one with connotations of disability.
It’s the sound of the word. Unlike calling someone “stupid” or an “idiot,” calling them “derpy” adds something of the low humor of the mimic — just imagine someone repeating everything you say, but replacing all your words with “herp derp herp derp.” (That’s exactly what one browser extension does.) Surely there are ways to conduct a debate that aren’t so demeaning to all parties involved.
But then I mostly changed my mind after reading economist Noah Smith offer a much more sophisticated definition of “derp” than “stupidity.” It has to do — notice I didn’t say this was a simpler definition — with Bayesian probability, a branch of statistics that deals with events (such as, say, most of real life) where the truth of a proposition is not certain.*
In Bayesian inference, you start your analysis of a question with a “prior belief” — what you think before you consider any evidence. This is shortened to your “prior.” Your “posterior belief” or “posterior” is what you conclude after considering the evidence. Smith:
What does it mean for a prior to be "strong"? It means you really, really believe something to be true. If you start off with a very strong prior, even solid evidence to the contrary won't change your mind. In other words, your posterior will come directly from your prior.
Having strong priors — strong a priori beliefs that you hold to even when the evidence suggests otherwise — is NOT necessarily irrational in Bayesian probability. In one example, if you are trying to determine whether your friend’s baby is a boy, a girl, or a dog, you would be justified in rejecting the third option based on your prior belief that humans can’t give birth to dogs, even if your only evidence, say, is a photo of a puppy.
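To make that concrete, here’s a minimal sketch of a single Bayesian update applied to the baby-or-dog example. The function and every number in it are my own illustrative assumptions, not anything from Smith’s post; the point is simply that an overwhelming (and justified) prior can absorb even strongly suggestive evidence.

```python
# A single Bayesian update: the posterior is proportional to prior times likelihood.
# All numbers are hypothetical, chosen only to illustrate the baby-or-dog example.

def posterior(prior, likelihood):
    """Return normalized posterior probabilities over the hypotheses."""
    unnormalized = {h: prior[h] * likelihood[h] for h in prior}
    total = sum(unnormalized.values())
    return {h: p / total for h, p in unnormalized.items()}

# Prior: humans essentially never give birth to dogs.
prior = {"boy": 0.4999999, "girl": 0.4999999, "dog": 0.0000002}

# Evidence: your friend sends a photo of a puppy. A puppy photo is very likely
# if the "baby" really is a dog, and unlikely (but not impossible) otherwise.
likelihood = {"boy": 0.01, "girl": 0.01, "dog": 0.99}

print(posterior(prior, likelihood))
# The probability of "dog" rises, but only to roughly 0.00002: the evidence
# moved the needle, and the strong prior still dominates.
```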
Using the example of people who believe that solar power will never be cost-competitive with fossil fuels, Smith says there are limits to how much we should tolerate people clinging to their priors:
But here's the thing: When those people keep broadcasting their priors to the world again and again after every new piece of evidence comes out, it gets very annoying. After every article comes out about a new solar technology breakthrough, or a new cost drop, they'll just repeat "Solar will never be cost-competitive." That is unhelpful and uninformative, since they're just restating their priors over and over. Thus, it is annoying. Guys, we know what you think already. English has no word for "the constant, repetitive reiteration of strong priors". Yet it is a well-known phenomenon in the world of punditry, debate, and public affairs. On Twitter, we call it "derp".
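Smith’s solar example is the same mechanism run over and over. Here’s a rough sketch, again with made-up priors, likelihood ratios, and a made-up six-year timeline, of how a moderately skeptical prior and a near-dogmatic one respond to repeated favorable evidence:

```python
# Repeated Bayesian updates: why a near-dogmatic prior barely moves.
# The priors, the likelihood ratio, and the six "years" are all invented for illustration.

def update(p_true, likelihood_ratio):
    """One Bayes update on P(hypothesis) given evidence with the stated likelihood ratio."""
    odds = p_true / (1.0 - p_true)
    odds *= likelihood_ratio          # favorable evidence multiplies the odds
    return odds / (1.0 + odds)

skeptic = 0.10       # "solar probably won't be cost-competitive, but maybe"
dogmatist = 1e-6     # "solar will NEVER be cost-competitive"

for year in range(1, 7):
    # Each year brings a cost-drop report that is three times likelier
    # if solar really is on track to become competitive.
    skeptic = update(skeptic, 3.0)
    dogmatist = update(dogmatist, 3.0)
    print(f"year {year}: skeptic {skeptic:.3f}, dogmatist {dogmatist:.6f}")

# After six rounds of the same evidence, the moderate skeptic is near 0.99,
# while the dogmatist has crept from one in a million to under one in a
# thousand, and (per Smith) keeps announcing the same conclusion anyway.
```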
There’s a certain elegance to that definition that appeals to me. Someone who is derpy is someone who steadfastly refuses to change their views in the face of conflicting evidence. (That’s not to say they have to adopt their opponents’ position; they could move to a more moderate position that’s less at odds with the evidence, or concede the validity of the evidence but proffer new evidence to bolster their own position.) I still don’t like “derp” as a word, but Smith is right that there’s no other word for that phenomenon.
UPDATE: One final, necessary element of the argument that I originally omitted: what separates derpiness as a concept from mere stubbornness is that someone who is derpy not only holds on to a belief in the face of conflicting evidence, but loudly keeps professing that original belief even after it has been contradicted again and again.
Now, of course, I’m second-guessing myself, wondering whether I’m only intrigued by this idea because it was expressed in a way that appeals to my intellectual vanity. Anyone have any other arguments, one way or the other, before I arrive at a posterior belief on the value of “derp”?
*Note: my grasp of Bayesian inference and related areas is very shaky, but I’ve read several intriguing pieces lately that rely heavily on it. If anyone knows a good layman’s introduction to the concept, I would be very grateful for the tip; I’d like to learn more without getting too deep into the math.