A month ago, the Washington Post’s Reid Wilson crunched the numbers and concluded that the most efficient tourism agency in the country was… Indiana’s:
From a public policy perspective, however, no state does a better job attracting visitors than Indiana. That’s because its tourism office gets a better return on its investment than any other state.
In fiscal year 2012-2013, Visit Indiana had $2.3 million to spend marketing the state, according to the U.S. Travel Association. That year, Indiana took in more than $8.3 billion in tourism revenue, the Census Bureau said. That means every tax dollar spent on marketing and promotion yielded $3,635 in economic activity.
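The ratio behind that claim is simple division. Here's a quick sketch using the rounded figures quoted above (the reported $3,635 presumably reflects unrounded totals, since the rounded inputs give a slightly smaller number):

```python
# Reproduce the Washington Post's return-on-investment arithmetic
# using the rounded figures quoted in the article.
budget = 2.3e6   # Visit Indiana marketing budget, FY 2012-13
revenue = 8.3e9  # Indiana tourism revenue, same year

roi = revenue / budget  # tourism dollars per promotional dollar
print(f"${roi:,.0f} in tourism revenue per promotional dollar")
```

With the rounded inputs this comes out to roughly $3,609 per dollar, close to the $3,635 reported.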
Indiana isn’t known for its tourism marketing — where I grew up, just over the border near Chicago, we saw a lot more advertising for Michigan than we did Indiana. But we’re talking efficiency. In 2013, Michigan spent nearly 12 times as much on tourism as Indiana, but got less than twice the tourist spending in return. Indiana doesn’t spend with the big boys, but it apparently gets good bang for its buck.
At the opposite end of the list were states like Alaska, Hawaii, Wyoming, Montana and my current home of South Dakota. Those states took in less than $225 in tourism spending for every dollar of promotion. Hawaii, certainly a tourist destination by any definition, was more than 30 times less efficient than Indiana.
But those inefficient states all have a few other things in common: they’re all low-population and all fairly remote.
That’s relevant because the tourism spending data from the U.S. Travel Association has a very loose definition: it’s all spending by U.S. citizens on trips where they stayed overnight or traveled more than 50 miles. That means if a resident of Brookings, S.D., travels 52 miles south to Sioux Falls for the afternoon, everything they spend there counts as tourism.
It’s a lot easier to drive 50 miles to go somewhere than to drive 500 miles or more. That leads to a conundrum for interpreting this tourism data: a state like New Jersey has a lot more people living in that 50-to-100-mile sweet spot than a state like Hawaii or South Dakota, even though the latter have attractions people will travel long distances to see.
In fact, a huge proportion of this tourism spending appears to be in-state travel. Take a look at the relationship between state population and tourism spending:
The more people a state has, the more “tourism” spending it has. In fact, there’s a pretty remarkable, nearly one-to-one relationship between population and tourism spending. Only a few states are outliers — noted tourist states Hawaii, Nevada and Florida have more tourism spending than you’d expect from their population, and Ohio, Pennsylvania and Texas have less.
Given that most state tourism agencies are focused on bringing money into the state rather than merely circulating it, this would make me leery of making any judgment about the efficiency of tourism promotion based on raw tourism spending.
(A minor note about nomenclature: we’re dealing with two confusingly similar terms here. I’ll use “spending” and “tourism spending” to refer to money spent by tourists in a state, and “budgets” and “promotional dollars” to refer to money spent by states trying to encourage tourism.)
Instead, a truer picture needs to control for the effects of population and capture these outliers. To understand what’s going on, we need to look at four different factors.
1. Who has the most tourist spending, period?
The states with the most raw tourism spending are California, Florida, Texas, New York and Illinois. I’m sure lots of people travel to those states for tourist purposes — California and Florida in particular have lots of attractions, as does New York City. But those states also have lots of their own residents, so tourism is a small percentage of the overall picture.
Meanwhile the lowest amounts of tourist spending are all states with small populations. In fact, some of them are known tourist destinations — Vermont (third worst) has ski slopes, South Dakota (sixth) has Mount Rushmore and the Black Hills, and Wyoming (seventh) has Yellowstone.
Nonetheless, all of those states have less than $3 billion in total tourist spending. That’s an order of magnitude less than the top five states.
Most tourism spending
- California ($75.5 billion)
- Florida ($48.4 billion)
- Texas ($43.3 billion)
- New York ($35.9 billion)
- Illinois ($25.1 billion)
Least tourism spending
- Delaware ($1.3 billion)
- Rhode Island ($1.6 billion)
- Vermont ($1.7 billion)
- Alaska ($1.7 billion)
- North Dakota ($1.8 billion)
As we know, though, “tourist spending” is highly dependent on population. So we’ll look at the fuller picture in a minute.
2. Which states spend the most money trying to attract tourists?
The biggest tourism budgets belong largely to the biggest states, which have the most money to spend. The one exception is Hawaii, for which tourism is a necessity rather than a hobby. Meanwhile, some states just don’t seem to care very much, spending less than $3 million on promotion. Among these states is Indiana, identified by Wilson as the most efficient at tourism promotion. Given that it has the largest population among the lowest-budget states, you can see where its efficiency comes from.
Washington State abolished its tourism department several years ago.
Biggest promotional budgets
- Hawaii ($75 million)
- Florida ($56.2 million)
- Illinois ($55.4 million)
- California ($50 million)
- Texas ($37.2 million)
Smallest promotional budgets
- Rhode Island ($710,000)
- Delaware ($2 million)
- Indiana ($2.3 million)
- Iowa ($2.4 million)
- Vermont ($2.9 million)
Those raw numbers only tell part of the story, though. As we know, both tourism spending and state budgets are correlated with population. So let’s control for population and look at per-capita data. This changes the picture radically.
3. Which states have the most per capita tourist spending?
Accounting for population in many cases flips the results upside down. The states with the smallest overall tourism spending were all small-population states, and many of those same states show up at the top of the list when it comes to tourist spending per resident.
The leaders in per-capita visitor spending are Nevada and Hawaii, two obvious destination states. After that come many of the small, rural western states Wilson identified as the least efficient in tourist promotion: Wyoming, Montana, North Dakota and South Dakota, along with Vermont.
The states with the lowest tourism spending per capita tend to be states with a fair amount of raw “tourist” spending. It’s just that they’ve got so many people that tourism isn’t as important as it is for the Hawaiis of the world.
Most tourism spending per resident
- Nevada ($8,473.62)
- Hawaii ($6,344.78)
- Wyoming ($4,161.43)
- Montana ($2,786.50)
- North Dakota ($2,755.03)
Least tourism spending per resident
- Ohio ($1,252.65)
- West Virginia ($1,269.47)
- Indiana ($1,289.66)
- Pennsylvania ($1,408.35)
- Michigan ($1,431.46)
Note that Indiana shows up on this list, too.
4. Which states spend the most money per resident attracting visitors?
Hawaii spends more raw money on tourism promotion than any other state, so it’s no surprise this small state is also the runaway leader with more than $55 in promotional spending per resident per year. It’s followed by Alaska ($23.94), Wyoming ($22.36), South Dakota ($14.98) and Montana ($12.43). Those are all small states, which inflates per capita numbers, but also remote states that have to work harder to persuade people to visit. Meanwhile, the states with the lowest per capita spending are Indiana, Ohio, Pennsylvania, Rhode Island and Georgia, all of which spend less than a dollar per resident trying to attract visitors.
Highest tourism promotion budget per resident:
- Hawaii ($55.13)
- Alaska ($23.94)
- Wyoming ($22.36)
- South Dakota ($14.98)
- Montana ($12.43)
Lowest tourism promotion budget per resident:
- Indiana ($0.35)
- Ohio ($0.43)
- Pennsylvania ($0.46)
- Rhode Island ($0.67)
- Georgia ($0.70)
All this leaves us ready to draw our final conclusion:
5. Which states have the most efficient tourism promotion departments, accounting for population?
What we’re looking at here is tourism spending per resident per budget dollar. Effectively, we’re taking Wilson’s original calculation (tourism spending per budget dollar) and dividing by population. These numbers are very small, so for legibility’s sake, I’ll present them as per capita spending per thousand promotional dollars.
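As a sketch of how the metric works, here’s the calculation for Rhode Island using the figures from this post. The population figure (roughly 1.05 million in 2013) is my own approximation, not from the article, so the result differs slightly from the number reported below:

```python
# Per-capita efficiency: tourism spending per resident,
# per thousand promotional dollars.
def efficiency(tourism_spending, population, promo_budget):
    per_capita = tourism_spending / population
    return per_capita / (promo_budget / 1_000)

# Rhode Island: $1.6 billion in tourist spending, $710,000 promo budget,
# and an assumed population of about 1.05 million.
ri = efficiency(1.6e9, 1.05e6, 710_000)
print(f"${ri:.2f} per resident per $1,000 of promotion")
```

With these inputs the result lands in the neighborhood of $2, consistent with the figure in the rankings.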
By this model, the actual most efficient state tourism agency is Rhode Island, which brings in $2.11 per resident per thousand tourism promotion dollars. Rhode Island, 48 miles long from north to south, probably doesn’t have much intra-state “tourism” spending by the U.S. Travel Association’s definition. Rhode Island has a meager $1.6 billion in tourist spending, second-smallest in the country, but it also has a small population and gets a lot of bang for the buck from its $710,000 annual tourism budget. It’s more than twice as efficient as the next state.
Vermont comes in number two, with 92 cents of tourism spending per resident per thousand promotional dollars. It’s followed by Iowa, Delaware, and Nevada.
Indiana, Wilson’s pick, is the sixth-most efficient state, with 57 cents per resident per thousand promotion dollars. That’s still pretty good, Hoosier State! Tourism’s not a big part of your economy, but you’re not throwing money down the drain trying to get that mediocre result.
The least efficient states by this metric are familiar: Illinois, California, Florida, Texas and Michigan. Hawaii is the sixth-worst, getting just 8.4 cents of spending per person for every thousand dollars it gives its tourism office. But that’s still better than Illinois’s dead-last 3.5 cents.
The other states Wilson called out for inefficiency are all over the map. Alaska is 34th at 14.2 cents. Wyoming is 11th with 33 cents. Montana is 18th with 22.6 cents, and South Dakota is 21st at 21.4 cents.
Here are the best and worst states at tourism promotion on this per capita basis. A full list is at the bottom.
Most efficient states, per capita tourism spending per promotional budget dollar:
- Rhode Island ($2.11)
- Vermont ($0.92)
- Iowa ($0.83)
- Delaware ($0.76)
- Nevada ($0.57)
Least efficient states, per capita tourism spending per promotional budget dollar:
- Illinois ($0.04)
- California ($0.04)
- Florida ($0.05)
- Texas ($0.05)
- Michigan ($0.05)
By way of comparison, here are the ratings by Wilson’s model:
Most efficient states, tourism spending per promotional budget dollar:
- Indiana ($3,635.60)
- Pennsylvania ($3,084.38)
- Ohio ($2,890.24)
- Georgia ($2,583.81)
- Iowa ($2,523.15)
Least efficient states, tourism spending per promotional budget dollar:
- Alaska ($101.23)
- Hawaii ($115.08)
- South Dakota ($174.24)
- Wyoming ($186.15)
- Montana ($224.15)
After the jump, a footnote and the full ranking of tourism department efficiency.
As chairman of the South Dakota Republican Party, Craig Lawrence said his No. 1 job is clear: finding people to run for office.
South Dakota Democratic Party executive director Zach Crago puts similar importance on finding candidates. And while the races for governor and Congress get most of the attention, it’s the lower-profile state legislative candidates who take most of the time.
“I am of the philosophy that the more candidates we have, the better,” Lawrence said.
By that standard, Republicans in South Dakota are doing better. Already commanding supermajorities in both houses of the Legislature, Republicans are contesting 15 more House seats than Democrats, and eight extra Senate seats. If every single Democratic candidate were to win, they’d still have fewer seats in the House than the Republicans’ current total of 53.
After candidate withdrawals in August, Democrats had slightly fewer candidates in each chamber than they did in 2012, though Northern State University political science professor Jon Schaff said it looks like Democrats are recruiting better candidates than in past years.
“Given the state of their party, this might be the best they can hope for right now,” he said. “That might be a kind of victory, if they can beat some vulnerable incumbent Republicans.”
Simply putting more people on the ballot doesn’t mean a party’s going to win. While they can’t win any race they don’t enter, many of the seats parties are conceding would have been difficult to win anyway.
For example, the eight Senate seats Democrats never even attempted to contest were in the 13 most-Republican districts in the state. In each one, registered Republicans outnumber Democrats by more than 2,000. The five more they abandoned at the withdrawal deadline all had popular Republican incumbents and in almost all cases heavy GOP leans.
Similarly, Republicans aren’t running any candidates in three of the four districts where Democrats have a voter majority — though they’re also conceding two districts where Republicans hold 1,000-voter registration majorities.
Contested vs. uncontested races
Although it’s never advantageous to sit out a race, it’s often been the case for Democrats that fielding a full slate of candidates in a Republican-heavy district hasn’t made much of a difference.
In 2012, for example, Democrats won 17 of the 70 House seats, or 24 percent. In the 39 races where Democrats ran a full slate of candidates, they won 10, or 26 percent — an almost imperceptible improvement.
In the 2008 and 2010 Senate races, Democrats actually did worse in the races where both parties contested seats. In 2010, for example, they won six seats overall — but three of those were uncontested. In the 22 races where Republicans and Democrats both ran a candidate, Democrats won just three.
The effect was the opposite in the 2012 Senate races and the 2010 and 2012 House races. There, Democrats did better in races they contested than they did overall. (The 2008 House race had essentially identical results.)
But overall, the differences were small. The biggest difference was a mere 3.5 percentage points.
Cause or effect
Setting aside the fact that parties can’t win races where they don’t run a candidate (and that, barring a major scandal, it’s always better to contest an additional marginal race), there are two different ways to think about the importance of candidate recruitment. Either:
- Having more candidates causes a political party to win more, because a party wins a certain percentage of the races it contests, so contesting more races means winning more; or
- Both the number of candidates and the number of races won are effects of the general political environment. When the environment is favorable to a party, more people become candidates and more candidates win; when the environment is unfavorable, fewer people become candidates and fewer people win.
Recent history is far from conclusive, but suggests the second interpretation is more accurate.
When Democrats feel like they’ve got a good chance of winning, they’re more likely to invest their time in running for office than if they feel they’re a long shot.
In 2008, a good Democratic year across the country, Democrats ran candidates for all 35 Senate seats and 66 of 70 House seats. Two years later, as Democratic morale plummeted and Republicans were on the offensive, 25 percent fewer Democrats were running in the general election.
“You have a lot of Democrats who look at some of these districts and say, ‘Wait a second, how is it possible for a Democrat to win in a district like this?’ ” Crago said. “Likewise, if you see where the Republicans are not recruiting candidates, you’re seeing the exact same thing from them.”
Lawrence saw the same thing.
“When you encourage somebody to run, you have to encourage them that there’s a potential likelihood that they can win,” Lawrence said. “Nobody likes to lose.”
On the national level, political science research dating back decades, including a landmark 1983 book by Gary Jacobson and Samuel Kernell, found that potential candidates tend to run for office only when they feel their odds of winning are best:
High-quality challengers enter Congressional races when their odds are best. This is determined not only by what is going on in the district, but also by national political tides. Thus, national-level phenomena influence whether high-quality challengers enter, which in turn influences voting within each district.
Looking at this year’s candidate numbers, Schaff noted improvement by the Democrats. But overall, he said, Democrats’ candidate counts look more like those from 2010 and 2012, when they took a drubbing, than 2006 and 2008, when Democrats did well.
“If you want to draw a conclusion from that, it’s that the political entrepreneurs in the Democratic Party are thinking that this year is going to be more like ’10 and ’12 and that, therefore, this is not a great year to run,” he said.
With apologies to Thomas Jefferson, the notion of self-evidence is one of the most dangerous concepts in modern American political discourse.
When something is self-evident, it doesn’t need to be justified. It’s obvious to anyone in their right mind who thinks about it.
In contrast, things can be true without being self-evidently so. I believe it’s better for government to be as transparent as possible. Someone could counter that government transparency should be limited because it threatens people’s privacy. That doesn’t change my mind — I’m convinced transparency is best. But I recognize that other points of view are at least potentially valid, and I don’t expect others to adopt my view because it’s obviously right.
But lots of people think their views about subjects like taxes, immigration, same-sex marriage or energy policy are self-evidently true. In some cases they may even be right! But treating something as self-evident is toxic to the polity, because it chokes off debate and leaves people unwilling to change their minds even when presented with convincing evidence that they’re wrong.
A particular mindset emerges when people who believe something is self-evidently true are confronted with people who hold a different position on that issue. How do you explain differences of opinion about something self-evident? There are only two options: the other person must be either stupid or malevolent.
That is: someone who doesn’t hold a “self-evident” view is either too stupid or uninformed to realize the obvious correct answer, or they realize that obvious correct answer and yet willfully act on behalf of a position they know to be wrong.
This applies even to people who don’t disagree with the self-evidence believer but merely don’t share their outrage about the situation at hand. For someone who believes so strongly in the capital-T Truth of a particular issue, even staying on the sidelines is unacceptable — something that can only be the result of stupidity or malevolence.
Stupidity or malevolence is in most cases a false binary. On most issues, well-informed and well-meaning people can come to different conclusions about the best goal and/or the best way to reach it. People can also have good, valid reasons for not sharing outrage (such as, I’ll humbly submit, trying to cover a controversial issue as a neutral journalist).
But, you might justifiably object, some things are self-evidently true. We shouldn’t entertain serious debate about subjects like the morality of genocide just for the sake of “civility.” And you’re probably correct. But the problem comes when you move from the clear-cut cases to muddier ground.
As human beings we’re fallible and prone to errors in judgment: We all know people who passionately believe something we think is ridiculous or wrong. We can even identify things we ourselves have been wrong about in the past. Is our current set of beliefs correct in a way that our past beliefs and others’ credos aren’t? It’s unlikely. This doesn’t mean the things we believe aren’t true or that we should reject them. All it means is we should, in holding to certain truths, acknowledge the possibility — however slim — that we’re wrong and our opponents are right.
Many questions have a true answer, but only a few of those truths are self-evident. Confusing the two has poisoned our political debate.
So let’s say you want to give the moon an atmosphere. I’ve got you covered.
Physics professor Gregory Benford wrote a piece for Slate about how one could theoretically slam ice comets into the moon as step one of a plan to make the moon habitable for human life. It’s likely only a few of you read it — it got about 1,600 shares on Facebook and 250 tweets, which is pretty good but even with a decent multiplier effect is far short of ubiquitous. But it’s the sort of thing I tend to stumble across, sipping from the firehose of information online through dozens of RSS feeds, Twitter and friends who also like gathering interesting or bizarre links.
I’ve been in the habit of sharing interesting stuff like that with a few friends, or posting the best on my Facebook page or Twitter. Now I’m trying something new: culling the best stuff I come across into regular miniature newsletters. I call it “Internet Flotsam” and you can subscribe below to get an assortment of interesting links in your inbox on a mostly-daily basis. I try to mix it up — some history and politics, some science and statistics, some pop culture and sports.
Here are a few excerpts from my recent letters, to give you a sense of whether this is the sort of thing you’re interested in:
- Having kids, keeping roommates: Two couples — one expecting kids — went in together on a joint 30-year mortgage. They’ll share the house and housework and raise their families together. The experiment is just starting, but all four people hope the experience will “curb feelings of isolation” and preserve the social living environment they loved while having roommates, even after the point (kids) when couples make their family their social focus.
- Twelve Wrong Men: “12 Angry Men” is the quintessential jury drama, an idealistic look at how one man’s dogged goodness helped persuade his fellow jurors to set aside their biases and save an innocent man from an unjust conviction. Except, Mike D’Angelo argues, the alleged criminal in “12 Angry Men” was almost certainly guilty.
- Ten years of sentences: Ever wish you could just make yourself put everything aside, sit down, and get some serious reading done? Daniel Genis found himself in just that situation and read 1,046 books in 10 years. (That’s about one every 3.5 days.) His secret? He was serving a 10-year sentence for armed robbery. ‘At first, Genis resisted “Ulysses,” but his father kept bringing it. “I argued that he wouldn’t have the willpower to get through it once he became a free man,” Alexander Genis told me.’
- The LeBron James of baseball: LeBron James is a beast at basketball, someone who can single-handedly turn a terrible team into a good one just by setting foot on the court. What would it mean for someone to be as dominant at baseball as James is at basketball? Jeff Sullivan tried to find out, and the answer is that he would be about two to four times better than any player in history — the best hitter ever, who’s also the best fielder ever, and maybe who has to also be the best pitcher ever. Why? Basketball has five players on the court who play both offense and defense. Baseball has nine (or 10) at a time, and one of them does most of the defensive work — but that player only plays every five games. Each player in baseball is simply less important to their team than an NBA star.
- Lifehacking: “How to make epic pancakes with your Japanese rice cooker.” I haven’t tried this yet — and don’t even own a rice cooker — but this looks amazing enough I may do both. (Confession: the best part for me at first glance is the minimal cleanup.)
- Terra nullius, terra meam: A Virginia man has founded his own African kingdom as a way to shame all the other dads who invented excuses when their daughters wanted to become princesses. Surprisingly, he may have firmer legal footing here than you’d expect, because of a centuries-old piece of international law (and a unique territorial dispute between Egypt and Sudan). It’s not going to work, of course, but it COULD work. (Also, I apologize for the Google Translated Latin.)
If you do subscribe, please give me feedback. This is only valuable for me if people are enjoying the links, so tell me what kind of stuff you like and don’t like to see in these letters, and what time of day is best for you to get them.
Alexis Madrigal at the Atlantic terms new media ventures like FiveThirtyEight, Vox, the Upshot and others “method journalism,” in that they’re primarily focused on how they report the news, rather than what news they report:
In a world where traditional beats may not make sense, where almost all marginal traffic growth comes from Facebook, where subscription revenue is a rumor, where business concerns demand breadth because they want scale… a big part of the industry’s response this year has been to create sites that become known by how they cover something rather than what.
FiveThirtyEight’s method is using data and statistics to cover the news. Vox is about explaining the news. Circa is about prioritizing news for viewing on mobile phones. The Upshot is about “plain-spoken, analytical journalism.”
As a reporter myself interested in exploring new ways to gather and produce journalism, I’ve been following these new ventures avidly. And yesterday, completely unintentionally, I conducted an unscientific experiment on my work blog into how readers respond to these various new approaches to journalism.
In the morning, I posted a FiveThirtyEight-style data analysis (in this case, I was literally inspired by an article FiveThirtyEight founder Nate Silver had written). After hearing an argument about why a gubernatorial candidate had lost his election, I cross-referenced Census data and election results to figure out whether dislike of “carpetbaggers” had swung serious votes:
If dislike of people without deep roots in the state contributed to Lowe’s defeat, then you’d expect him to do worse in counties where there were fewer transplants — who presumably don’t have the same value on deep local roots, since they lack them themselves…
… In this case, the correlation between Lowe’s support and the rate of out-of-state residents is essentially zero. You can see it for yourself: there’s no pattern there. Lowe did well in some counties with lots of transplants, and in some counties with few transplants. Among counties with similar levels of out-of-staters, Lowe’s vote share varied by as much as 50 points.
A few hours later, I posted a Vox-style article, explaining something that bewildered more than one person I know: why the closure of a short stretch of Interstate 29 caused authorities to announce a 455-mile detour.
Why would authorities tell motorists to go through Des Moines when they could drive on back roads through Iowa or Nebraska to avoid the flooding, at a much lower cost in added time and miles?
After I snarked a little bit on Twitter, a spokesperson with the state Emergency Operations Center called me up to explain.
“Federal law requires that the interstate system in all of America must be connected,” said Jonathan Harms, a spokesman with the Emergency Operations Center. “The DOT and the Federal Highway Administration needs to have some sort of route that collects all the interstates in America.”
Officials are, therefore, obligated by law to present an official detour that stays on the Interstate system. But they can also offer other, more direct detours. Like this one… that’s only an extra 20 miles and half an hour.
I had fun with both posts. But they got very different responses from the public.
The FiveThirtyEight-style post provoked a few intelligent responses from other close observers of South Dakota politics, but generally didn’t attract a wide audience. In its first day, it was shared on Facebook seven times — at least two of which were by me. Six weeks after it was posted, on July 28, it had racked up a grand total of 148 views.
The Vox-style post didn’t really spark any discussion. (There had been some Twitter discussion about the detour before my post on why the long detour was proposed, but nothing once I explained the reason for the seemingly absurd detour.)
But lots of people read and liked it. In its first day, it was shared on Facebook a full 344 times. And six weeks later, it has 4,759 total views and is my third most-viewed story of the past three months, behind two articles that got national attention.
As I mentioned above, two posts are in no way a scientific study of how people respond to data-heavy analysis vs. explanatory journalism. FiveThirtyEight has had plenty of posts go viral, while Vox has written some wonky analysis to go along with its usual explainers aimed at making the news understandable for ordinary people. But it’s not surprising to me that my unscientific experiment yesterday showed a lot more people are interested in having things explained to them in a straightforward manner than in deep dives into the numbers.
(This post has been updated with new data.)
Common sense is always the easiest way to avoid actually arguing the matter at hand. “It’s just a matter of common sense. It’s just so obvious.” Which works with people who already agree with you. But politics is largely the process of convincing people who don’t always agree with you.
This is a powerful statement even without its context, and I’m tempted to leave it at that. As a political journalist, I frequently hear appeals to “common sense” from politicians and ordinary citizens alike. It’s a time-honored tradition dating back to Thomas Paine. But appeals to common sense are also a way of shutting down debate, dismissing potential objections as nonsensical or elitist or both. The common-sensical argument may very well be the best argument, but it’s not the best argument because it seems intuitively correct. That’s a logical fallacy.
(None of this is to suggest that common sense doesn’t have value. It does — but it’s wanting as a logical argument. Similarly, criticism of common sense doesn’t imply the inverse: that something has value because it is complicated or counter-intuitive.)
Common sense is also, as Levin went on to say, “anti-political” because politics in a democratic form of government are based on the assumption that there are legitimate differences of opinion.
“If you say that if somebody doesn’t agree with you they have no common sense and they need not be listened to, it’s a great way to avoid the difficult work that is politics in a diverse country — or even in a town,” Levin said. “If only those who agree with you show up, and only those who show up matter, why do the hard work of convincing others?”
The particular context was Levin analyzing attempts to create “citizen courts” that could indict and judge people — often officials — for offenses against individual liberty. (This post is not to opine on the validity of these citizen courts or the broader “sovereign citizen” movement, which has frequently clashed with regular officials subscribing to mainstream legal theories.)
I wrote two articles about citizen courts in South Dakota:
- Citizens boards aim to rein in ‘corrupt’ officials (Argus Leader, Dec. 15, 2013)
- Activists trying to bring citizen grand juries to state, but are they legal? (Argus Leader, Dec. 20, 2013)
This week’s episode of HBO’s “Game of Thrones” included a brief moment in which one character discusses the value of honor in a fight. In so doing, it recalled one of the show’s more iconic earlier moments, and complicated that scene’s apparent message. Spoilers, as well as gruesome images and a brief discussion of rape, after the jump:
When most people think of Louisiana, they think of New Orleans, its most prominent city. But that might irk a typical Louisianan, who’s likely to live well outside the Big Easy. Only 350,000 people call New Orleans home, less than 8 percent of the state’s population and not much more than live in the capital, Baton Rouge.
In contrast, my current state of South Dakota evokes images of endless fields of prairie and farmland. But almost 20 percent of the Mount Rushmore State lives in the fast-growing city of Sioux Falls, a percentage point behind Chicago’s relation to Illinois.
Other states aren’t associated with a single city at all. Missouri is divided between St. Louis and Kansas City. The two biggest cities in Minnesota are so entwined they’re usually referred to by a joint name. And Alabama has not two queen cities but four: Birmingham, Montgomery, Mobile and Huntsville all have around the same population, and each is around twice as populated as any other city in the state.
These facts aren’t just idle curiosities. They reflect important principles in math, statistics and demography, laws of nature most of us follow unawares — or break just as obliviously.
Whether a dense urban area or a rural one, any state or country will tend to have a small number of very large cities and a large number of small towns. It’s a principle called the “power law,” and it shows up in a lot more than just cities. In languages, a few words like “the” are used a lot, while large numbers of words are never used at all. Similarly, in books, music and movies, a few hits get consumed by everyone, while the so-called “long tail” of more obscure works have just a few customers each. Scholars have even discovered that war casualties follow a power law — there’s only a small number of bloodbaths like World War II, but large numbers of little skirmishes and insurgencies that don’t even crack the front pages.
If you graph something that follows a power law, it will have an L-shape, with a spike at one side that drops sharply, then gently declines in a long tail:
But if you graph it over a logarithmic scale — such as 1, 10, 100, 1,000, 10,000, etc., where values rise exponentially with every tick — a power law distribution will instead resemble a straight line:
(Not everything follows the power law — for example, average heights for men or women follow a “normal” distribution, with a lot of people near the average and smaller numbers on both sides. Both the power law and the normal distribution are examples of different ways numbers can cluster.)
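The straight-line behavior on a logarithmic scale is easy to check numerically. Here’s a minimal sketch (my illustration, not anything from the original analysis) assuming a pure Zipf-style power law, where the k-th largest city has 1/k the population of the largest; the largest-city figure is made up:

```python
import math

# Illustrative sketch: under a pure power law (Zipf with exponent 1),
# the k-th largest city has population P1 / k.
largest = 2_700_000  # made-up figure, roughly Chicago-sized

ranks = range(1, 101)
populations = [largest / k for k in ranks]

# Plotted on log-log axes, these points form a straight line:
# log(pop) = log(P1) - log(rank), i.e. a constant slope of -1.
xs = [math.log(k) for k in ranks]
ys = [math.log(p) for p in populations]
slope = (ys[-1] - ys[0]) / (xs[-1] - xs[0])
print(round(slope, 6))  # -1.0
```

Real city data is noisier than this idealized line, of course — which is exactly how primate cities show up as points floating above the trend.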
But here’s the catch: the power law can be broken, even for things where it normally applies. There are other forces at work besides this tendency of some occurrences to involve a few big things and lots of little things. For example, while city size generally follows a power law, some countries have one city that’s disproportionately larger than the rest. A classic example is Paris, France’s cultural, political, economic and demographic center of gravity. If France followed the power law, Paris would still be the biggest city, but Marseilles or Lyons would be relatively larger and more powerful, a rival center of power and influence rather than a city on the country’s periphery. (Compare France to Italy, where Rome is big and important, but so is the business capital of Milan.)
Cities like Paris are called “primate cities” because they dominate their countries. The term, as coined by Mark Jefferson in 1939, just refers to cities that are “at least twice as large as the next largest city and more than twice as significant,” though its application can be somewhat subjective.
Most of the research on primate cities has happened at the national level, but the same logic can apply at a regional or sub-national level. New York City, the U.S.’s largest, isn’t a primate city for the whole country (despite the opinions of its residents). But no one can deny the dominance NYC exerts over its home state, 40 percent of whose population resides there. And that’s not even counting the influence of the Big Apple in parts of New Jersey and Connecticut.
I used data on city populations in each state to look at the role of each state’s largest city and how it relates to both the second-largest city and the rest of the state.
This ranges from New York, where NYC is 32 times larger than second-banana Buffalo, to Alabama, where the 212,000 people in Birmingham only slightly outnumber the 206,000 in Montgomery. As a percentage of the population, New York City and Anchorage, Alaska, both have more than 40 percent of their states’ populations, while Columbia, South Carolina, and Charleston, West Virginia, have less than 3 percent of theirs despite being the largest cities in their states.
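Jefferson’s twice-as-large rule of thumb is simple to apply in code. A sketch, using round numbers consistent with the figures quoted in this post (not exact census counts):

```python
def largest_to_second_ratio(populations):
    """Ratio of the largest city's population to the second-largest's."""
    top_two = sorted(populations, reverse=True)[:2]
    return top_two[0] / top_two[1]

# Rough figures matching the ratios quoted in this post.
new_york = [8_400_000, 261_000]        # NYC vs. Buffalo: about 32x
alabama = [212_000, 206_000, 199_000]  # Birmingham barely ahead of Montgomery

print(round(largest_to_second_ratio(new_york)))    # 32
print(round(largest_to_second_ratio(alabama), 2))  # 1.03
```

By Jefferson’s definition, a ratio above 2 flags a candidate primate city — New York clears the bar easily; Birmingham doesn’t come close.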
Top five cities with the most people compared to the second-largest city in their state:
Bottom five cities by share of state’s population:
- Charleston, 2.75% of West Virginia
- Columbia, 2.79% of South Carolina
- Newark, 3.13% of New Jersey
- Bridgeport, 4.08% of Connecticut
- Jacksonville, 4.33% of Florida
Bottom five cities with the most people compared to the second-largest city in their state:
- Birmingham, Alabama, 1.03 times larger than Montgomery
- Charleston, West Virginia, 1.04 times larger than Huntington
- Columbia, South Carolina, 1.05 times larger than Charleston
- Cheyenne, Wyoming, 1.06 times larger than Casper
- Memphis, Tennessee, 1.07 times larger than Nashville
Get the rest of the lists here.
I looked only at official city size, recognizing the shortcomings of this approach. The true strength of many urban centers doesn’t stop at the legal boundary but encompasses the great masses of residents and businesses in surrounding suburbs. In this sense, looking at metropolitan areas would be a better bet. But many metropolitan areas, as defined by the U.S. Census, spill over into other states, creating too much messiness for a state-focused analysis.
But being a primate city — or not — is about more than just beating #2. It means being the truly dominant center of a state or country. In a normal power-law situation, the second-largest city will have half the population of the largest, the third-largest will have one-third the people, and the fourth one-quarter, with populations descending in inverse proportion to rank.
So I pulled lists of the population of every city in each state from Wikipedia, and graphed them on a logarithmic scale. Remember that if something follows a power law, when graphed on a logarithmic scale it will appear to be a straight line.
That’s what we see looking at Missouri:
In contrast, Illinois displays a clear example of a primate city, with Chicago’s population far above the trend line evidenced by the rest of the state.
Other states don’t clearly fall into one category or another. Some have several big cities distinct from the general trend. Some have two cities near the top that also happen to be very near each other — both New Jersey and Minnesota reflect this situation. (Though New Jersey is probably better understood as being pulled between two different cities, neither within its borders: New York City and Philadelphia.) In other cases, the two cities are far apart: Sioux Falls and Rapid City are on opposite sides of the state, as are Seattle and Spokane, and Philadelphia and Pittsburgh. In those cases, distance could mitigate the “primate city” effect, leaving the distant second city more as a distinct provincial capital than as runner-up.
The right end of each graph isn’t as important as the left — in many cases the linear trend breaks down on the right due to the large number of smaller cities, but research has shown smaller towns don’t follow the power law distribution as consistently as cities over 10,000 people do. Different states also had wildly varying sample sizes, ranging from hundreds of cities to fewer than two dozen. Also, the y-axis of each graph varies, with the top of the y-axis representing the population of the state’s largest city, whether that’s eight million or 42,000.
Today is the beginning of Daylight Saving Time, which many people love to hate. Studies have shown there are both health and economic costs to Daylight Saving Time, and no one enjoys the beginning of DST, when we lose an hour. (I’m actually kind of partial to the end, when we gain an hour.) And I am told that any discomfort someone like myself feels from clock-changes is nothing compared to what parents of small children endure, since young kids are less able to adjust their body clocks to artificial factors like a time change.
But here’s the thing. All these downsides to Daylight Saving Time have nothing to do with whether the sun sets at 6 p.m. or 7 p.m. They’re about the fact that we change, in a single day, from one time to another. When it’s December, I think people are actually pretty content that the sun isn’t rising at 8:30 a.m. And I definitely appreciate it being bright late into the evening in the peak of the summer. It just really stinks for a few days each spring and fall to have to reconfigure one’s internal clock.
Daylight Saving Time is illustrative of a broader principle: In many cases, when we complain about changes, what really bothers us is not the new normal, but the transition to get to the new normal. Put another way: sometimes it’s not the result that’s painful, it’s the change itself.
Take, for example, a family that’s earning $100,000 per year. Then, suddenly, something changes and they’re earning just $70,000 per year. There’s nothing wrong about earning $70,000 per year. Lots of families earn that much or less and are still comfortable and happy. But the change to a $70,000 salary from a much higher one can be painful. (Think about this example when listening to a lot of political discussion about changes to benefits and tax rates. This theory of the Painful Change explains why people will react so strongly to the proposal that their tax rate or government benefit change to a new, less generous, level that seems to a dispassionate observer to be perfectly reasonable.)
I think of the Painful Change maxim, too, when reading commentary and debate about climate change. If a region’s climate becomes hotter and drier, that’s bad news for all the living things (humans included) who currently live there. It’s not necessarily bad news for life itself, which in the long term will adapt to the new normal, possibly with new species or new behaviors from old species. But it can be catastrophic for everything that had adapted to the old way. Life thrives in the climate of St. Louis and life thrives in the climate of Minneapolis, but if Minneapolis’ climate changes to be like St. Louis’, it’s not going to be pleasant for things already living in Minneapolis.
Don’t take this idea of the Painful Change to diminish the significance of this transitional agony. I’m not making a “Who Moved My Cheese” argument that we should just suck it up and accept negative change because the new situation is all that matters — though in many cases, graceful adaptation to change is exactly what’s called for. My point is that we should conceptually distinguish between journey and destination. Sometimes we have to endure painful changes to get to good results. (I’d put Daylight Saving Time in that category.) Sometimes painful changes lead to painful results. Similarly, pleasant changes can lead to good or bad situations. And sometimes the magnitude of the change outweighs the magnitude of the result — while it can be worth it to endure a terrible change to get to a much better place, it’s not worth enduring a terrible change for a trivial improvement in one’s situation.
The comfort of the transition doesn’t necessarily tell us anything about where we end up, and we should recognize that when we make decisions — or before we start complaining about turning our clocks forward in the spring.
A month ago, a map went viral showing the (allegedly) most popular television shows set in each state. South Dakota got “Deadwood.” Washington got “Frasier.” Maine, “Murder She Wrote.” Take a look at it here:
The map was produced by Business Insider, and at their site you can find justifications for the rankings.
It’s the sort of project designed as much to provoke argument as it is to settle them. And what interests me more than the map itself is one of those arguments I got into on a friend’s Facebook wall.
After I pointed out that (with one exception) the map-makers had excluded reality shows, someone I didn’t know got his dander up:
Wow I don’t think there is much else on TV anymore that is not a reality show. Seams like they are on par to put real actors out of business because they can pay these hicks peanuts compared to accomplished actors. Seams like a case of too many channels. I say we cut back to like 10 channels and our IQ would go up 40 points.
Oh and to prove my point 75% of the shows listed in the pictures are pre-1990′s and don’t exist anymore. So its more nostalgia then actually whats the “most popular”. And now its about impossible to find shows set in any state other than California, Texas, or New York. Even though ironically most of them are produced and filmed in British Columbia.
This included one major factual claim — that 75 percent of the shows were “pre-1990′s,” and from a quick glance over the map, it didn’t seem to be correct. So I opened up a spreadsheet.
After an hour or so of hand-entering data about TV shows that I later discovered Business Insider had already gathered, I had my result — and it proved my intuition right. Many, or even most, of the shows were new:
In fact, more than half had premiered in the 1990s or later. More than a third had come in this millennium. There were more shows from the 2000s than from the 1950s, 1960s and 1970s combined.
But the spreadsheet was more interesting than that. So even though my interlocutor fell silent at this point, I picked up his side of the discussion and imagined why the current television landscape might seem a vast wasteland. There’s one obvious answer: if you don’t have cable. Particularly, if you don’t have premium cable.
Because there is fantastic television being produced today, but a lot of it isn’t on ABC, NBC and CBS. To see great shows on the list like “Breaking Bad,” “Justified” and “Deadwood,” you needed to be watching AMC, FX or HBO. And if you aren’t — as I wasn’t, growing up in a broadcast-only household — it might seem like TV is nothing more than laugh-track comedies and singing competitions.
Of course, cable is a relatively recent phenomenon compared to broadcast TV:
But my erstwhile opponent did have one good point. There are a LOT of reality shows on TV, particularly compared to yesteryear.
It’s just that there are also a lot of non-reality shows on TV. Because there are more shows on TV, period.
That’s been the biggest impact of the cable revolution. There are more players producing TV programming than there were in the days of three or four or five networks. Much of it is awful. Some of it is fantastic. The challenge is separating the wheat from the chaff — but then, that’s the fundamental challenge of modern life, a world suffused with more options and information than any one person could possibly consume.
(Now, this is a flawed dataset. A better picture would come from a ranking of the best TV shows in history, not one that limits states like California and New York to just one show set there, while forcing obscure choices onto the list for states like North Dakota and New Hampshire. But it’s still an instructive exercise.)
For those curious, you can view the full spreadsheet here. Below is a list of all the networks and the number of shows on the top 50.
News that some rural Colorado counties are trying to secede from their increasingly urban and liberal state has revived talk of a historical curiosity: the attempt, during the Great Depression, to create a new state out of parts of northern Wyoming, western South Dakota and southern Montana.
The name of the state, which would have been America’s 49th, was proposed to be Absaroka.
Here’s what it would have looked like:
I’m not concerned here with discussing the wisdom of secession, or the practicalities thereof. What got me curious today was a simpler question: what would South Dakota’s politics be like if these counties, some of the most reliably Republican in the state, weren’t part of the South Dakota electorate?
An Absaroka-less South Dakota would be more Democratic than the current Mount Rushmore state — but only to a degree.
One quick shorthand method for calculating the partisan lean of a state is the Cook Partisan Voting Index. Basically it looks at shares of the presidential vote to calculate how much more Democratic or Republican a state is than the country as a whole.
Real South Dakota (RSD), for example, has a PVI of “R+10,” meaning it’s 10 percentage points more Republican than the country. California is D+9, meaning it’s 9 percentage points more Democratic than the country. Virginia is dead even, meaning its partisan lean exactly matches the country.
Fortunately, Absaroka would have split along county boundaries, so it’s relatively easy to calculate the PVI for Alternate South Dakota. In the 2008 presidential election, John McCain would have won 52.4 percent of the two-party vote (he actually got 54.2 percent of the two-party vote). In the 2012 presidential election, Mitt Romney would have won 57.5 percent (he really won 59.2 percent). Comparing that average, 55 percent Republican, to the national Republican share of 47.1 percent, Alternate South Dakota ends up as an R+7.8.
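That arithmetic is simple enough to sketch in a few lines (a simplified version of the PVI idea as described above, not Cook’s exact methodology):

```python
def partisan_lean(state_gop_shares, national_gop_share):
    """Average the state's two-party Republican share across elections
    and compare it to the national share; positive = leans Republican."""
    avg = sum(state_gop_shares) / len(state_gop_shares)
    return avg - national_gop_share

# Alternate South Dakota, using the two-party shares from this post.
lean = partisan_lean([52.4, 57.5], 47.1)
print(f"R+{lean:.2f}")  # R+7.85, i.e. R+7.8 rounded to one decimal
```

Running the same function with the real-life shares (54.2 and 59.2 percent) yields the roughly R+10 figure cited for actual South Dakota.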
In real life, South Dakota is R+10, so losing Absaroka would have made South Dakota about two percentage points more Democratic.
That’s not a ton. South Carolina is an R+8 state. Montana is an R+7. Both are solidly red states at the presidential level. (Georgia, at R+6, is the bluest state right now with two Republican senators.)
But small shifts can make the difference in close elections.
For example, in 2010, Kristi Noem beat Stephanie Herseth Sandlin by around 7,000 votes. But in Alternate South Dakota, without Noem’s Black Hills electoral strongholds, Herseth Sandlin narrowly wins re-election by 6,700 votes — a near inversion of the actual result. (Another potential boost for Herseth Sandlin: if Custer County were in a different state, independent B. Thomas Marking wouldn’t have been a candidate in the race.)
And Tom Daschle would have broken the Curse of Karl Mundt in 2004 if western South Dakota had gone to play in Absaroka. In real life, Thune beat Daschle by 4,500 votes. Alternate South Dakota would have voted for Daschle by a 9,300-vote margin.
(Big caveat: this is a scenario in which one assumes all other factors remain the same. In fact, a South Dakota without its western portion would have different politics. Different issues would be dominant. Candidates might take different positions, responding to different pressures from their constituents. Campaigning patterns would unfold differently.)
This only goes so far. For example, 2010 Democratic gubernatorial candidate Scott Heidepriem can draw little consolation from this counter-factual. In real life Dennis Daugaard won by 23 points and 73,000 votes. Alternate South Dakota would have voted for Daugaard by the only slightly less overwhelming total of 21 points and 52,000 votes.
What to take away? Geography matters. South Dakota is Republican through and through, and would remain so even if the most Republican part of the state were sliced off. But the slight shift toward the political center could have had big impacts in the state’s recent close elections.
Miscellaneous things I am pondering:
- What would the smaller South Dakota’s nickname be? Still the Sunshine State? Or something different?
- If the tourist hordes heading to the Black Hills were heading to another state, do you think Alternate South Dakota would have put tollbooths up on I-90?
- Would Pierre still be the capital? The physical investment in government infrastructure would be expensive to duplicate. But while Pierre is geographically central to South Dakota and has major population centers to its west in the Black Hills, in Alternate South Dakota there’s very little to the west of Pierre.
- In January, a Wyoming sportswriter took a look at a similar question: what would the high school sports conferences look like in Absaroka? If you like sports, give it a read.
- For more information on Absaroka and other attempts for parts of a state to secede into a new state, check out Andrew Shears’ project, “The 124 United States That Could’ve Been.” Here’s his map:
In “Competition and the Efficiency of Bureaucracies,” Gary Becker writes:
Bureaucracies are large complex hierarchical organizations governed… by formal rules rather than discretionary choices. This apparent rigidity in the decision-making process does not necessarily make bureaucracies “inefficient” because they may have advantages of scale and scope that offset their disadvantages of inflexibility and remote decision-making.
This struck me as a good, quick summary of why bureaucracies have drawbacks — and why they can be the best way to do things even with those drawbacks.
A similar thought, coincidentally, popped up in a presentation about the evolution of board games, sent to me the other day by a friend. Games journalist Quintin Smith, giving a talk about all the ways board games have evolved, started talking about the wargame “A Few Acres of Snow.” The discussion starts at 19:53 in the presentation.
This is a sickeningly well-designed game. This is just beautiful. It’s a wargame about the French and English fighting for control of their Canadian colonies, which sounds like whatever it sounds like. It uses deck-building to simulate the logistics of running a war in a foreign country.
Okay, I’m not selling this.
The point is, you have your deck, and your deck represents soldiers, the Indians you’ve recruited, the priests, the home support, the boats. More importantly, it contains cards for every piece of territory you control. And the territory cards are relatively useless, which means the more you spread yourself, the more land you spread yourself over, the less control you have.
Every hand of cards you draw is a story, because you need soldiers, and then your deck, which is basically your subordinates, says, “We don’t have any soldiers, not now.” “We need boats!” “No boats, they’re all somewhere else.”
And you just can’t do this! The amazing thing is, it’s a war game, but really, you’re fighting your own logistical battles. And it’s amazingly tense. Because if your deck would do what you wanted it to for just one turn, you could hit Montreal and you could take it and you could end the game. But it never gives you that.
And the coolest thing about this is, there’s actually a sort of administration card. As a general, you can say, “This is a mess. We need administration.” And the administration card, when it comes up in your hand, lets you remove cards from your deck permanently — with the twist that there’s no way of getting rid of the administration card. So if you build an administration, there’s no way to remove it. It’s like you’re permanently deciding, “We need more desks! We need people sort of running the war for me.” And then that starts getting in your way as well. (Emphasis added)
The way in which clever game design can replicate real-world experiences in ways beyond just moving pieces on a board (for another example, see my post on the supply-and-demand mechanics in the board game “Power Grid”) continually impresses me. The entire structure of an entertaining card game ends up replicating the insights of academic experts into the strengths and drawbacks of the bureaucracies that are inevitable in modern life.
(This post has been edited.)
I didn’t believe it at first when I met Southerners who told me how they were routinely dismissed as unintelligent by Northerners the minute a drawl came out of their mouths — and mocked and infantilized for the same. I had never had that reaction myself, and never heard anyone talking about it.
But these Southerners — some of whom are trying to lose their Southern accents to avoid this situation — aren’t imagining things or being over-sensitive. Multiple studies have looked at identical passages read by people in Southern and “standard” accents and found that listeners rate the person reading in a Southern accent as less intelligent, less wealthy and less educated than the same passage read by a non-Southerner.
From a study by Taylor Phillips, a student at Stanford University (and a Kentuckian studying in California):
Southern condition participants rated intelligence on average 3.2 (SD=1.36), while Standard condition participants rated intelligence on average […]. When asking participants explicitly to rank intelligence, the Southern voices received an average rating of 3.05 (SD=1.43), while Standard voices received an average rating of 5.25 (SD=1.16). The average difference between Southern and Standard voices within participants’ intelligence ratings was -1.6 (SD=1.12; Southern minus Standard). For the explicit intelligence measure, this average difference increased to -2.2 (SD=1.18). This suggests that Southern accent does trigger differences in social perception of intelligence, and that these differences are both strong and in the direction of the stereotype.
A similar result from a dissertation by Hayley Heaton, a doctoral student at Emory University in Georgia:
The analyses revealed that when the speaker was talking with a standard accent, he or she was rated significantly more intelligent (F (1,60) = 4.14, p = 0.05), more arrogant (F (1,60) = 5.47, p = 0.02), smarter (F (1,60) = 4.49, p = 0.04), better educated (F (1,60) = 5.02, p = 0.03), and as having better English (F (1,60) = 12.90, p < 0.01) than Southern-accented speakers, regardless of passage type.
What I find most interesting about this is that so many other prejudices about groups of people — or at least negative prejudices — have an air of social unacceptability. But making fun of Southerners as dumb hicks seems to be fair game.
My best hypothesis (unsupported by any data I can find) is that prejudice against Southerners remains acceptable because unlike many other group stereotypes, it’s not tied to any particular racial or ethnic group. It’s taboo today to make fun of someone for their race or ethnicity, which makes stereotypes about people from diverse Northern cities a minefield. (These stereotypes, which do exist, are often good-natured or embraced by their subjects, not imposed by outsiders.) The same would potentially apply to other regions that also aren’t dominated by a single racial or ethnic group, though none come immediately to mind.
As with any analysis of stereotypes, it’s important to be cognizant of confounding effects — perhaps a region is seen as less intelligent because its education system is poorer? But even if a stereotype were on average true of a group, that wouldn’t justify treating individual members of the group as if they conform to it.
Have other people experienced similar judgments based on region or regional accent? Why do you think these remain acceptable in contemporary America when so many other prejudices are not?
(TV) used to be the sort of thing that you watched casually week to week; you weren’t supposed to get deeply invested in the emotional lives of the characters, and the shows were designed to keep that involvement to a bare minimum. You were drawn by the actors’ charisma or good looks, but you weren’t supposed to worry about their inner lives, which were mostly nonexistent. It was the fans who read deeper meanings into the shows, and through fan fiction and essays they provided the emotional resonances that the TV shows were not intended to evoke. Doctor Who is a great example of a show that went full circle through the cycle of fandom; many of the writers and showrunners, as well as the actors, were great fans of the program when they were kids, and many of them worked on semi-official tie-in novels or radio plays while it was in hibernation. By reviving the program, they effectively recreated it in their fannish image; the characters are now capable of expressing the thoughts and emotions that could only be inferred in the original version.
The just-aired Christmas special, by the way, was merely okay — some very good elements, and lots of flaws, some in the episode itself and others planted by failures earlier in the series.
But the Doctor Who 50th anniversary special last month was among the show’s best episodes.
Several weeks ago, while discussing the oncoming winter with my Southern-raised girlfriend, we reached an impasse over what exactly constituted weather cold enough to get alarmed about. Coming from Louisiana, she insisted that anything even in the 40s Fahrenheit was frigid, weather to cause people to stay indoors, bundled up in front of the fireplace. Having grown up in bitter Chicago winters myself, I said you can’t start calling weather “cold” until the temperature at least falls into the 30s — and that even then, extreme cold doesn’t start until the thermometer falls to the single digits.
But clearly our perspectives were entirely subjective. The only way out of this situation, for any good rationally minded person, is to get more data.
So I went to my Facebook page and posted the following query:
Above what temperature would you generally consider the weather to be “hot,” as opposed to merely “warm”? Below what temperature would you generally consider the weather to be “cold,” as opposed to merely “cool”? (For context, please also provide the part of the country/world you grew up in.)
Twenty different people responded: nine men (counting me) and 11 women. Here’s what I found:
- The warmest temperature anyone considered cold was 62, though that may be an outlier — that respondent gave a range of only 11 degrees between cold and hot, much less than the average. Next up was 55 degrees, from a southerner.
- The lowest threshold for cold anyone gave was a mere 11 degrees, from someone raised on a farm on the central South Dakota prairie.
- The coldest temperature anyone considered hot, aside from that same outlier (who said 73), was 85 degrees, while the highest threshold for the onset of true heat was 97.
- One person commented, “I think that you’ll find that the survey results will show that women get colder at a much warmer temperature than men.” And, in fact, he was right. The median female respondent said coldness began at 45 degrees, while the median male said coldness didn’t begin until 32 degrees. (Means told a similar story.) This wasn’t a function of a sample including a lot of females from warmer climes — the median latitude was about the same for both genders.
- But there was no difference when it came to when hotness began. Both men and women had a median hotness temperature of 90 degrees.
- Indeed, there was remarkable agreement about what constitutes heat. Setting aside the outlier, the range of hotness answers varied by only 12 degrees. The range of coldness answers varied by 45 degrees.
- Where people grew up, unsurprisingly, mattered. Using a little bit of judgment for people who had moved around (I defaulted to the town people listed as their hometown on their Facebook page), I plotted a latitude for each person. The southern half of the latitudes (a dividing line running right through the southern part of the Chicago area) said cold began at a median of 42.5 degrees. The northern half said 33.5 degrees. (There was only a 2.5-degree difference on heat — the southern half said 92.5, while the northern half said 90.)
- The key difference, as shown on the below chart, was that while some northerners can stand the cold, no one from the south could (minus one person who split his time as a kid between Indonesia and Alaska — he’s plotted as Indonesia and is a clear outlier, but he’s clearly someone who experienced both extremes). (Note that this is actually a chart of the absolute value of latitude, because the southern-hemisphere latitude of Jakarta looked weird, and distance from the equator is the more important factor.)
- The heat differences, again, are less dramatic:
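The group comparisons above boil down to taking medians over subsets of responses. As a sketch with made-up data (the post doesn’t publish the raw survey responses, so these numbers are purely illustrative):

```python
import statistics

# Hypothetical stand-ins for the survey responses:
# (gender, cold threshold in degrees Fahrenheit).
responses = [
    ("M", 32), ("M", 30), ("M", 35), ("M", 11),
    ("F", 45), ("F", 50), ("F", 40), ("F", 62),
]

def median_cold_threshold(data, gender):
    """Median cold threshold among respondents of one gender."""
    return statistics.median(t for g, t in data if g == gender)

print(median_cold_threshold(responses, "M"))  # 31.0
print(median_cold_threshold(responses, "F"))  # 47.5
```

The same median-over-a-subset approach works for splitting respondents by latitude instead of gender.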
This study didn’t actually end up proving anything or resolving my debate with my girlfriend. (For one thing, I’d prefer to have a sample size of several thousand points, not just a score.) But I had fun doing it, which is really the point of [social] science.
Interestingly, in our conversations, my girlfriend and I have agreed that the extremes aren’t actually where people disagree. That is, when it’s 102, everyone agrees it’s really hot, even if some people are more bothered by it than others. The same when the weather hits single digits — everyone agrees it’s really, uncomfortably cold. The conflicts arise in the middle ground — whether it’s warm enough to open the windows, or cold enough to require a comforter on top of bed sheets.