So let’s say you want to give the moon an atmosphere. I’ve got you covered.
Physics professor Gregory Benford wrote a piece for Slate about how one could theoretically slam ice comets into the moon as step one of a plan to make the moon habitable for human life. It’s likely only a few of you read it — it got about 1,600 shares on Facebook and 250 tweets, which is pretty good but even with a decent multiplier effect is far short of ubiquitous. But it’s the sort of thing I tend to stumble across, sipping from the firehose of information online through dozens of RSS feeds, Twitter and friends who also like gathering interesting or bizarre links.
I’ve been in the habit of sharing interesting stuff like that with a few friends, or posting the best on my Facebook page or Twitter. Now I’m trying something new: culling the best stuff I come across into regular miniature newsletters. I call it “Internet Flotsam” and you can subscribe below to get an assortment of interesting links in your inbox on a mostly-daily basis. I try to mix it up — some history and politics, some science and statistics, some pop culture and sports.
Here are a few excerpts from my recent letters, to give you a sense of whether this is the sort of thing you’re interested in:
- Having kids, keeping roommates: Two couples — one expecting kids — went in together on a joint 30-year mortgage. They’ll share the house and housework and raise their families together. The experiment is just starting, but all four people hope the experience will “curb feelings of isolation” and preserve the social living environment they loved while having roommates, even after the point (kids) when couples make their family their social focus.
- Twelve Wrong Men: “12 Angry Men” is the quintessential jury drama, an idealistic look at how one man’s dogged goodness helped persuade his fellow jurors to set aside their biases and save an innocent man from an unjust conviction. Except, Mike D’Angelo argues, the defendant in “12 Angry Men” was almost certainly guilty.
- Ten years of sentences: Ever wish you could just make yourself put everything aside, sit down, and get some serious reading done? Daniel Genis found himself in just that situation and read 1,046 books in 10 years. (That’s about one every 3.5 days.) His secret? He was serving a 10-year sentence for armed robbery. ‘At first, Genis resisted “Ulysses,” but his father kept bringing it. “I argued that he wouldn’t have the willpower to get through it once he became a free man,” Alexander Genis told me.’
- The LeBron James of baseball: LeBron James is a beast at basketball, someone who can single-handedly turn a terrible team into a good one just by setting foot on the court. What would it mean for someone to be as dominant at baseball as James is at basketball? Jeff Sullivan tried to find out, and the answer is that he would be about two to four times better than any player in history — the best hitter ever, who’s also the best fielder ever, and maybe who has to also be the best pitcher ever. Why? Basketball has five players on the court who play both offense and defense. Baseball has nine (or 10) at a time, and one of them does most of the defensive work — but that player only plays every five games. Each player in baseball is simply less important to their team than an NBA star.
- Lifehacking: “How to make epic pancakes with your Japanese rice cooker.” I haven’t tried this yet — and don’t even own a rice cooker — but this looks amazing enough I may do both. (Confession: the best part for me at first glance is the minimal cleanup.)
- Terra nullius, terra meam: A Virginia man has founded his own African kingdom as a way to shame all the other dads who invented excuses when their daughters wanted to become princesses. Surprisingly, he may have firmer legal footing here than you’d expect, because of a centuries-old piece of international law (and a unique territorial dispute between Egypt and Sudan). It’s not going to work, of course, but it COULD work. (Also, I apologize for the Google Translated Latin.)
If you do subscribe, please give me feedback. This is only valuable for me if people are enjoying the links, so tell me what kind of stuff you like and don’t like to see in these letters, and what time of day is best for you to get them.
Alexis Madrigal at the Atlantic terms new media ventures like FiveThirtyEight, Vox, the Upshot and others “method journalism,” in that they’re primarily focused on how they report the news, rather than what news they report:
In a world where traditional beats may not make sense, where almost all marginal traffic growth comes from Facebook, where subscription revenue is a rumor, where business concerns demand breadth because they want scale… a big part of the industry’s response this year has been to create sites that become known by how they cover something rather than what.
FiveThirtyEight’s method is using data and statistics to cover the news. Vox is about explaining the news. Circa is about prioritizing news for viewing on mobile phones. The Upshot is about “plain-spoken, analytical journalism.”
As a reporter myself interested in exploring new ways to gather and produce journalism, I’ve been following these new ventures avidly. And yesterday, completely unintentionally, I conducted an unscientific experiment on my work blog into how readers respond to these various new approaches to journalism.
In the morning, I posted a FiveThirtyEight-style data analysis (in this case, I was literally inspired by an article FiveThirtyEight founder Nate Silver had written). After hearing an argument about why a gubernatorial candidate had lost his election, I cross-referenced Census data and election results to figure out whether dislike of “carpetbaggers” had swung serious votes:
If dislike of people without deep roots in the state contributed to Lowe’s defeat, then you’d expect him to do worse in counties where there were fewer transplants — who presumably don’t place the same value on deep local roots, since they lack them themselves…
… In this case, the correlation between Lowe’s support and the rate of out-of-state residents is essentially zero. You can see it for yourself: there’s no pattern there. Lowe did well in some counties with lots of transplants, and in some counties with few transplants. Among counties with similar levels of out-of-staters, Lowe’s vote share varied by as much as 50 points.
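(The cross-referencing itself is simple enough to sketch in a few lines of Python. The file and column names below are invented for illustration, not the actual dataset; the idea is just to merge the two tables on county and compute a correlation.)

```python
# A minimal sketch of the cross-referencing described above.
# The file names and column names are hypothetical.
import pandas as pd

votes = pd.read_csv("governor_results_by_county.csv")    # columns: county, lowe_share
census = pd.read_csv("census_birthplace_by_county.csv")  # columns: county, pct_born_out_of_state

merged = votes.merge(census, on="county")

# Pearson correlation between Lowe's vote share and the share of transplants.
# A value near zero, as in the post, means no linear relationship.
print(merged["lowe_share"].corr(merged["pct_born_out_of_state"]))
```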
A few hours later, I posted a Vox-style article, explaining something that bewildered more than one person I know: why the closure of a short stretch of Interstate 29 caused authorities to announce a 455-mile detour.
Why would authorities tell motorists to go through Des Moines when they could drive on back roads through Iowa or Nebraska to avoid the flooding, at a much lower cost in added time and miles?
After I snarked a little bit on Twitter, a spokesperson with the state Emergency Operations Center called me up to explain.
“Federal law requires that the interstate system in all of America must be connected,” said Jonathan Harms, a spokesman with the Emergency Operations Center. “The DOT and the Federal Highway Administration needs to have some sort of route that collects all the interstates in America.”
Officials are, therefore, obligated by law to present an official detour that stays on the Interstate system. But they also can offer other, more direct detours. Like this one… that’s only an extra 20 miles and half an hour.
I had fun with both posts. But they got very different responses from the public.
The FiveThirtyEight-style post provoked a few intelligent responses by other close observers of South Dakota politics, but generally didn’t attract a wide audience. As of right now, it’s been shared on Facebook seven times — at least two of which were by me.
The Vox-style post didn’t really spark any discussion. (There had been some Twitter discussion about the detour before my post on why the long detour was proposed, but nothing once I explained the reason for the seemingly absurd detour.)
But lots of people read and liked it. As of now, it’s been shared on Facebook a full 344 times.
As I mentioned above, two posts are in no way a scientific study of how people respond to data-heavy analysis vs. explanatory journalism. FiveThirtyEight has had plenty of posts go viral, while Vox has written some wonky analysis to go along with its usual explainers aimed at making the news understandable for ordinary people. But it’s not surprising to me that my unscientific experiment yesterday showed a lot more people are interested in having things explained to them in a straightforward manner than in deep dives into the numbers.
Common sense is always the easiest way to avoid actually arguing the matter at hand. “It’s just a matter of common sense. It’s just so obvious.” Which works with people who already agree with you. But politics is largely the process of convincing people who don’t always agree with you.
This is a powerful statement even without its context, and I’m tempted to leave it at that. As a political journalist, I frequently hear appeals to “common sense” from politicians and ordinary citizens alike. It’s a time-honored tradition dating back to Thomas Paine. But appeals to common sense are also a way of shutting down debate, dismissing potential objections as nonsensical or elitist or both. The common-sensical argument may very well be the best argument, but it’s not the best argument because it seems intuitively correct. That’s a logical fallacy.
(None of this is to suggest that common sense doesn’t have value. It does — but it’s wanting as a logical argument. Similarly, criticism of common sense doesn’t imply the inverse: that something has value because it is complicated or counter-intuitive.)
Common sense is also, as Levin went on to say, “anti-political” because politics in a democratic form of government are based on the assumption that there are legitimate differences of opinion.
“If you say that if somebody doesn’t agree with you they have no common sense and they need not be listened to, it’s a great way to avoid the difficult work that is politics in a diverse country — or even in a town,” Levin said. “If only those who agree with you show up, and only those who show up matter, why do the hard work of convincing others?”
The particular context was Levin analyzing attempts to create “citizen courts” that could indict and judge people — often officials — for offenses against individual liberty. (This post is not to opine on the validity of these citizen courts or the broader “sovereign citizen” movement, which has frequently clashed with regular officials subscribing to mainstream legal theories.)
I wrote two articles about citizen courts in South Dakota:
- Citizens boards aim to rein in ‘corrupt’ officials (Argus Leader, Dec. 15, 2013)
- Activists trying to bring citizen grand juries to state, but are they legal? (Argus Leader, Dec. 20, 2013)
This week’s episode of HBO’s “Game of Thrones” included a brief moment in which one character discusses the value of honor in a fight. In so doing, it recalls one of the show’s more iconic moments from an earlier episode — and complicates that scene’s apparent message. Spoilers, as well as gruesome images and a brief discussion of rape, after the jump:
When most people think of Louisiana, they think of New Orleans, its most prominent city. But that might irk a typical Louisianan, who’s likely to live well outside the Big Easy. Only 350,000 people call New Orleans home, less than 8 percent of the state’s population and not many more than live in the capital, Baton Rouge.
In contrast, my current state of South Dakota evokes images of endless fields of prairie and farmland. But almost 20 percent of the Mount Rushmore State’s population lives in the fast-growing city of Sioux Falls, just a percentage point behind Chicago’s share of Illinois.
Other states aren’t associated with a single city at all. Missouri is divided between St. Louis and Kansas City. The two biggest cities in Minnesota are so entwined they’re usually referred to by a joint name. And Alabama has not two queen cities but four: Birmingham, Montgomery, Mobile and Huntsville all have around the same population, and each is around twice as populated as any other city in the state.
These facts aren’t just idle curiosities. They reflect important principles in math, statistics and demography, laws of nature most of us follow unawares — or break just as obliviously.
Whether densely urban or mostly rural, any state or country will tend to have a small number of very large cities and a large number of small towns. It’s a principle called the “power law,” and it shows up in a lot more than just cities. In languages, a few words like “the” are used constantly, while huge numbers of words are hardly ever used at all. Similarly, in books, music and movies, a few hits get consumed by everyone, while the so-called “long tail” of more obscure works has just a few customers each. Scholars have even discovered that war casualties follow a power law — there are only a small number of bloodbaths like World War II, but large numbers of little skirmishes and insurgencies that don’t even crack the front pages.
If you graph something that follows a power law, it will have an L-shape, with a spike at one side that drops sharply, then gently declines in a long tail:
But if you graph it with a logarithmic scale on both axes — such as 1, 10, 100, 1,000, 10,000, etc., where values rise exponentially with every tick — a power law distribution will instead resemble a straight line:
(Not everything follows the power law — for example, average heights for men or women follow a “normal” distribution, with a lot of people near the average and smaller numbers on both sides. Both the power law and the normal distribution are examples of different ways numbers can cluster.)
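If you want to see the two shapes for yourself, here’s a small synthetic demonstration in Python. The numbers are random stand-ins, not real city data; plotted on ordinary axes they form the spike-and-long-tail shape, and on log-log axes they fall roughly along a straight line.

```python
# Synthetic illustration of the shapes described above -- not real city data.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)

# Power-law-ish "sizes": a few huge values, many small ones.
sizes = np.sort((rng.pareto(a=1.0, size=1000) + 1) * 1000)[::-1]
ranks = np.arange(1, len(sizes) + 1)

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(9, 4))
ax1.plot(ranks, sizes)    # ordinary axes: a spike that collapses into a long tail
ax1.set_title("Linear scale")
ax2.loglog(ranks, sizes)  # both axes logarithmic: roughly a straight line
ax2.set_title("Log-log scale")
plt.tight_layout()
plt.show()
```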
But here’s the catch: the power law can be broken, even for things where it normally applies. There are other forces at work besides this tendency of some occurrences to involve a few big things and lots of little things. For example, while city size generally follows a power law, some countries have one city that’s disproportionately larger than the rest. A classic example is Paris, France’s cultural, political, economic and demographic center of gravity. If France followed the power law, Paris would still be the biggest city, but Marseilles or Lyons would be relatively larger and more powerful, a rival center of power and influence in the country instead of being on the country’s periphery. (Compare France to Italy, where Rome is big and important, but so is the business capital of Milan.)
Cities like Paris are called “primate cities” because they dominate their countries. The term, as coined by Mark Jefferson in 1939, just refers to cities that are “at least twice as large as the next largest city and more than twice as significant,” though its application can be somewhat subjective.
Most of the research on primate cities has happened at the national level, but the same logic can apply at a regional or sub-national level. New York City, the U.S.’s largest, isn’t a primate city for the whole country (despite the opinions of its residents). But no one can deny the dominance NYC exerts over its home state, 40 percent of whose population resides there. And that’s not even counting the influence of the Big Apple in parts of New Jersey and Connecticut.
I used data on city populations in each state to look at the role of each state’s largest city and how it relates to both the second-largest city and the rest of the state.
This ranges from New York, where NYC is 32 times larger than second-banana Buffalo, to Alabama, where the 212,000 people in Birmingham only slightly outnumber the 206,000 in Montgomery. As a percentage of the population, New York City and Anchorage, Alaska, both have more than 40 percent of their states’ populations, while Columbia, South Carolina, and Charleston, West Virginia, have less than 3 percent of theirs despite being the largest cities in their states.
Top five cities with the most people compared to the second-largest city in their state:
Bottom five cities by share of state’s population:
- Charleston, 2.75% of West Virginia
- Columbia, 2.79% of South Carolina
- Newark, 3.13% of New Jersey
- Bridgeport, 4.08% of Connecticut
- Jacksonville, 4.33% of Florida
Bottom five cities by size relative to the second-largest city in their state:
- Birmingham, Alabama, 1.03 times larger than Montgomery
- Charleston, West Virginia, 1.04 times larger than Huntington
- Columbia, South Carolina, 1.05 times larger than Charleston
- Cheyenne, Wyoming, 1.06 times larger than Casper
- Memphis, Tennessee, 1.07 times larger than Nashville
Get the rest of the lists here.
I looked only at official city size, recognizing the shortcomings of this approach. The true strength of many urban centers doesn’t stop at the legal boundary but encompasses the great masses of residents and businesses in surrounding suburbs. In this sense, looking at metropolitan areas would be a better bet. But many metropolitan areas, as defined by the U.S. Census, spill over into other states, creating too much messiness for a state-focused analysis.
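For anyone who wants to replicate the two measures behind those lists, here’s a rough sketch of the calculation in Python. It assumes a table of official city populations and a table of state totals; the file and column names are invented for illustration.

```python
# Sketch of the two measures: largest-to-second-largest ratio and the largest
# city's share of its state's population. File/column names are hypothetical.
import pandas as pd

cities = pd.read_csv("city_populations.csv")   # columns: state, city, population
states = pd.read_csv("state_populations.csv")  # columns: state, state_population

def primacy_stats(group):
    top_two = group.nlargest(2, "population")
    return pd.Series({
        "largest_city": top_two["city"].iloc[0],
        "largest_population": top_two["population"].iloc[0],
        # e.g. roughly 32 for New York City vs. second-banana Buffalo
        "ratio_to_second": top_two["population"].iloc[0] / top_two["population"].iloc[1],
    })

summary = cities.groupby("state").apply(primacy_stats).reset_index()
summary = summary.merge(states, on="state")
summary["share_of_state"] = summary["largest_population"] / summary["state_population"]

print(summary.sort_values("ratio_to_second", ascending=False).head())  # most dominant
print(summary.sort_values("share_of_state").head())                    # least dominant
```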
But being a primate city — or not — is about more than just beating #2. It means being the truly dominant center of a state or country. In a normal power-law situation, the second-largest city will have half the population of the largest, while the third-largest will have one-third the people, and the fourth one-quarter, descending in inverse proportion to their rank.
So I pulled lists of the population of every city in each state from Wikipedia, and graphed them on a logarithmic scale. Remember that if something follows a power law, when graphed on a logarithmic scale it will appear to be a straight line.
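Here’s a sketch of how one of those per-state graphs could be drawn, using the same hypothetical city-population table as above. (Rank-size plots like these are conventionally drawn with both axes logarithmic.)

```python
# Rank-size plot for a single state (hypothetical data file, as above).
import pandas as pd
import matplotlib.pyplot as plt

cities = pd.read_csv("city_populations.csv")  # columns: state, city, population

state = "Missouri"
pops = cities.loc[cities["state"] == state, "population"].sort_values(ascending=False)
ranks = range(1, len(pops) + 1)

plt.loglog(ranks, pops.values, marker="o", linestyle="none")
plt.xlabel("Rank of city within state")
plt.ylabel("Population")
plt.title(f"{state}: a straight-ish line suggests a power law")
plt.show()
```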
That’s what we see looking at Missouri:
In contrast, Illinois displays a clear example of a primate city, with Chicago’s population far above the trend line evidenced by the rest of the state.
Other states don’t clearly fall into one category or another. Some have several big cities distinct from the general trend. Some have two cities near the top that also happen to be very near each other — both New Jersey and Minnesota reflect this situation. (Though New Jersey is probably better understood as being pulled between two different cities, neither one in its borders: New York City and Philadelphia.) In other cases, the two cities are far apart: Sioux Falls and Rapid City are on opposite sides of the state, as are Seattle and Spokane, and Philadelphia and Pittsburgh. In those cases, distance could mitigate the “primate city” effect, leaving the distant second city more as a distinct provincial capital than as runner-up.
The right end of each graph isn’t as important as the left — in many cases the linear trend breaks down on the right among the many smaller cities, but research has shown smaller towns don’t follow the power law distribution as consistently as cities over 10,000 people do. Different states also had wildly varying sample sizes, ranging from hundreds of cities to fewer than two dozen. Also, the y-axis of each graph varies, with the top of the y-axis representing the population of the state’s largest city, whether that’s eight million or 42,000.
Today is the beginning of Daylight Saving Time, which many people love to hate. Studies have shown there are both health and economic costs to Daylight Saving Time, and no one enjoys the beginning of DST, when we lose an hour. (I’m actually kind of partial to the end, when we gain an hour.) And I am told that any discomfort someone like myself feels from clock-changes is nothing compared to parents of small children, who are less able to regulate their own body clock according to artificial factors like a clock change.
But here’s the thing. All these downsides to Daylight Saving Time have nothing to do with whether the sun sets at 6 p.m. or 7 p.m. They’re about the fact that we change, in a single day, from one time to another. When it’s December, I think people are actually pretty content that the sun isn’t rising at 8:30 a.m. And I definitely appreciate it being bright late into the evening in the peak of the summer. It just really stinks for a few days each spring and fall to have to reconfigure one’s internal clock.
Daylight Saving Time is illustrative of a broader principle: In many cases, when we complain about changes, what really bothers us is not the new normal, but the transition to get to the new normal. Put another way: sometimes it’s not the result that’s painful, it’s the change itself.
Take, for example, a family that’s earning $100,000 per year. Then, suddenly, something changes and they’re earning just $70,000 per year. There’s nothing wrong about earning $70,000 per year. Lots of families earn that much or less and are still comfortable and happy. But the change to a $70,000 salary from a much higher one can be painful. (Think about this example when listening to a lot of political discussion about changes to benefits and tax rates. This theory of the Painful Change explains why people will react so strongly to the proposal that their tax rate or government benefit change to a new, less generous, level that seems to a dispassionate observer to be perfectly reasonable.)
I think of the Painful Change maxim, too, when reading commentary and debate about climate change. If a region’s climate becomes hotter and drier, that’s bad news for all the living things (humans included) who currently live there. It’s not necessarily bad news for life itself, which in the long term will adapt to the new normal, possibly with new species or new behaviors from old species. But it can be catastrophic for everything that had adapted to the old way. Life thrives in the climate of St. Louis and life thrives in the climate of Minneapolis, but if Minneapolis’ climate changes to be like St. Louis’, it’s not going to be pleasant for things already living in Minneapolis.
Don’t take this idea of the Painful Change to diminish the significance of this transitional agony. I’m not making a “Who Moved My Cheese” argument that we should just suck it up and accept negative change because the new situation is all that matters — though in many cases, graceful adaptation to change is exactly what’s called for. My point is that we should conceptually distinguish between journey and destination. Sometimes we have to endure painful changes to get to good results. (I’d put Daylight Saving Time in that category.) Sometimes painful changes lead to painful results. Similarly, pleasant changes can lead to good or bad situations. And sometimes the magnitude of the change outweighs the magnitude of the result — while it can be worth it to endure a terrible change to get to a much better place, it’s not worth enduring a terrible change for a trivial improvement in one’s situation.
The comfort of the transition doesn’t necessarily tell us anything about where we end up, and we should recognize that when we make decisions — or before we start complaining about turning our clocks forward in the spring.
A month ago, a map went viral showing the (allegedly) most popular television shows set in each state. South Dakota got “Deadwood.” Washington got “Frasier.” Maine, “Murder, She Wrote.” Take a look at it here:
The map was produced by Business Insider, and at their site you can find justifications for the rankings.
It’s the sort of project designed as much to provoke arguments as to settle them. And what interests me more than the map itself is one of those arguments I got into on a friend’s Facebook wall.
After I pointed out that (with one exception) the map-makers had excluded reality shows, someone I didn’t know got his dander up:
Wow I don’t think there is much else on TV anymore that is not a reality show. Seams like they are on par to put real actors out of business because they can pay these hicks peanuts compared to accomplished actors. Seams like a case of too many channels. I say we cut back to like 10 channels and our IQ would go up 40 points.
Oh and to prove my point 75% of the shows listed in the pictures are pre-1990′s and don’t exist anymore. So its more nostalgia then actually whats the “most popular”. And now its about impossible to find shows set in any state other than California, Texas, or New York. Even though ironically most of them are produced and filmed in British Columbia.
This included one major factual claim — that 75 percent of the shows were “pre-1990′s,” and from a quick glance over the map, it didn’t seem to be correct. So I opened up a spreadsheet.
After an hour or so of hand-entering data about TV shows that I later discovered Business Insider had already gathered, I had my result — and it proved my intuition right. Many, or even most, of the shows were new:
In fact, more than half had premiered in the 1990s or later. More than a third had come in this millennium. There were more shows from the 2000s than from the 1950s, 1960s and 1970s combined.
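The tally itself is trivial once the premiere years are entered. Here’s a sketch of the bucketing, with a few made-up years standing in for the real list:

```python
# Bucket premiere years into decades. The years here are placeholders,
# not the actual list from the map.
from collections import Counter

premiere_years = [1955, 1962, 1978, 1984, 1993, 1999, 2004, 2008, 2011]

decades = Counter(year // 10 * 10 for year in premiere_years)
total = len(premiere_years)

for decade in sorted(decades):
    print(f"{decade}s: {decades[decade]} shows ({decades[decade] / total:.0%})")

print(f"Pre-1990: {sum(n for d, n in decades.items() if d < 1990) / total:.0%}")
```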
But the spreadsheet was more interesting than that. So even though my interlocutor fell silent at this point, I picked up his side of the discussion and imagined why the current television landscape might seem a vast wasteland. There’s one obvious answer: if you don’t have cable. Particularly, if you don’t have premium cable.
Because there is fantastic television being produced today, but a lot of it isn’t on ABC, NBC and CBS. To see great shows on the list like “Breaking Bad,” “Justified” and “Deadwood,” you needed to be watching AMC, FX or HBO. And if you aren’t — as I wasn’t, growing up in a broadcast-only household — it might seem like TV is nothing more than laugh-track comedies and singing competitions.
Of course, cable is a relatively recent phenomenon compared to broadcast TV:
But my erstwhile opponent did have one good point. There are a LOT of reality shows on TV, particularly compared to yesteryear.
It’s just that there are also a lot of non-reality shows on TV. Because there are more shows on TV, period.
That’s been the biggest impact of the cable revolution. There are more players producing TV programming than there were in the days of three or four or five networks. Much of it is awful. Some of it is fantastic. The challenge is finding the wheat in the chaff — but then, that’s the fundamental challenge of modern life, a world suffused with more options and information than any one person could possibly consume.
(Now, this is a flawed dataset. A better picture would come from a ranking of the best TV shows in history, not one that limits states like California and New York to just one show set there, while forcing obscure choices onto the list for states like North Dakota and New Hampshire. But it’s still an instructive exercise.)
For those curious, you can view the full spreadsheet here. Below is a list of all the networks and the number of shows each has among the top 50.
News that some rural Colorado counties are trying to secede from their increasingly urban and liberal state has revived talk of a historical curiosity: the attempt, during the Great Depression, to create a new state out of parts of northern Wyoming, western South Dakota and southern Montana.
The name of the state, which would have been America’s 49th, was proposed to be Absaroka.
Here’s what it would have looked like:
I’m not concerned here with discussing the wisdom of secession, or the practicalities thereof. What got me curious today was a simpler question: what would South Dakota’s politics be like if these counties, some of the most reliably Republican in the state, weren’t part of the South Dakota electorate?
An Absaroka-less South Dakota would be more Democratic than the current Mount Rushmore state — but only to a degree.
One quick shorthand method for calculating the partisan lean of a state is the Cook Partisan Voting Index. Basically it looks at shares of the presidential vote to calculate how much more Democratic or Republican a state is than the country as a whole.
Real South Dakota (RSD), for example, has a PVI of “R+10,” meaning it’s 10 percentage points more Republican than the country. California is D+9, meaning it’s 9 percentage points more Democratic than the country. Virginia is dead even, meaning its partisan lean exactly matches the country.
Fortunately, Absaroka would have split along county boundaries, so it’s relatively easy to calculate the PVI for Alternate South Dakota. In the 2008 presidential election, John McCain would have won 52.4 percent of the two-party vote (he actually got 54.2 percent of the two-party vote). In the 2012 presidential election, Mitt Romney would have won 57.5 percent (he really won 59.2 percent). Comparing that average, 55 percent Republican, to the national Republican share of 47.1 percent, Alternate South Dakota ends up as an R+7.8.
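Spelled out in code, the arithmetic is just a couple of lines (the shares below are the two-party figures cited above):

```python
# The PVI arithmetic from the paragraph above.
alt_sd_2008 = 0.524   # McCain's two-party share in Alternate South Dakota
alt_sd_2012 = 0.575   # Romney's two-party share in Alternate South Dakota
national    = 0.471   # national Republican two-party share, averaged over 2008 and 2012

alt_sd_avg = (alt_sd_2008 + alt_sd_2012) / 2   # about 0.55
pvi = (alt_sd_avg - national) * 100            # positive = more Republican than the nation

print(f"Alternate South Dakota: R+{pvi:.1f}")  # roughly R+7.8
```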
In real life, South Dakota is R+10, so losing Absaroka would have made South Dakota about two percentage points more Democratic.
That’s not a ton. South Carolina is an R+8 state. Montana is an R+7. Both are solidly red states at the presidential level. (Georgia, at R+6, is the bluest state right now with two Republican senators.)
But small shifts can make the difference in close elections.
For example, in 2010, Kristi Noem beat Stephanie Herseth Sandlin by around 7,000 votes. But in Alternate South Dakota, without Noem’s Black Hills electoral strongholds, Herseth Sandlin narrowly wins re-election by 6,700 votes — a near inversion of the actual result. (Another potential boost for Herseth Sandlin: if Custer County were in a different state, independent B. Thomas Marking wouldn’t have been a candidate in the race.)
And Tom Daschle would have broken the Curse of Karl Mundt in 2004 if western South Dakota had gone to play in Absaroka. In real life, Thune beat Daschle by 4,500 votes. Alternate South Dakota would have voted for Daschle by a 9,300-vote margin.
(Big caveat: this is a scenario in which one assumes all other factors remain the same. In fact, a South Dakota without its western portion would have different politics. Different issues would be dominant. Candidates might take different positions, responding to different pressures from their constituents. Campaigning patterns would unfold differently.)
This only goes so far. For example, 2010 Democratic gubernatorial candidate Scott Heidepriem can draw little consolation from this counter-factual. In real life Dennis Daugaard won by 23 points and 73,000 votes. Alternate South Dakota would have voted for Daugaard by the only slightly less overwhelming total of 21 points and 52,000 votes.
What to take away? Geography matters. South Dakota is Republican through and through, and would remain so even if the most Republican part of the state were sliced off. But the slight shift toward the political center could have had big impacts in the state’s recent close elections.
Miscellaneous things I am pondering:
- What would the smaller South Dakota’s nickname be? Still the Sunshine State? Or something different?
- If the tourist hordes heading to the Black Hills were heading to another state, do you think Alternate South Dakota would have put tollbooths up on I-90?
- Would Pierre still be the capital? The physical investment in government infrastructure would be expensive to duplicate. But while Pierre is geographically central to South Dakota and has major population centers to its west in the Black Hills, in Alternate South Dakota there’s very little to the west of Pierre.
- In January, a Wyoming sportswriter took a look at a similar question: what would the high school sports conferences look like in Absaroka? If you like sports, give it a read.
- For more information on Absaroka and other attempts for parts of a state to secede into a new state, check out Andrew Shears’ project, “The 124 United States That Could’ve Been.” Here’s his map:
In “Competition and the Efficiency of Bureaucracies,” Gary Becker writes:
Bureaucracies are large complex hierarchical organizations governed… by formal rules rather than discretionary choices. This apparent rigidity in the decision-making process does not necessarily make bureaucracies “inefficient” because they may have advantages of scale and scope that offset their disadvantages of inflexibility and remote decision-making.
This struck me as a good, quick summary of why bureaucracies have drawbacks — and why they can be the best way to do things even with those drawbacks.
A similar thought, coincidentally, popped up in a presentation about the evolution of board games, sent to me the other day by a friend. Games journalist Quintin Smith, giving a talk about all the ways board games have evolved, started talking about the wargame “A Few Acres of Snow.” The discussion starts at 19:53 in the presentation.
This is a sickeningly well-designed game. This is just beautiful. It’s a wargame about the French and English fighting for control of their Canadian colonies, which sounds like whatever it sounds like. It uses deck-building to simulate the logistics of running a war in a foreign country.
Okay, I’m not selling this.
The point is, you have your deck, and your deck represents soldiers, the Indians you’ve recruited, the priests, the home support, the boats. More importantly, it contains cards for every piece of territory you control. And the territory cards are relatively useless, which means the more you spread yourself, the more land you spread yourself over, the less control you have.
Every hand of cards you draw is a story, because you need soldiers, and then your deck, which is basically your subordinates, says, “We don’t have any soldiers, not now.” “We need boats!” “No boats, they’re all somewhere else.”
And you just can’t do this! The amazing thing is, it’s a war game, but really, you’re fighting your own logistical battles. And it’s amazingly tense. Because if your deck would do what you wanted it to for just one turn, you could hit Montreal and you could take it and you could end the game. But it never gives you that.
And the coolest thing about this is, there’s actually a sort of administration card. As a general, you can say, “This is a mess. We need administration.” And the administration card, when it comes up in your hand, lets you remove cards from your deck permanently — with the twist that there’s no way of getting rid of the administration card. So if you build an administration, there’s no way to remove it. It’s like you’re permanently deciding, “We need more desks! We need people sort of running the war for me.” And then that starts getting in your way as well. (Emphasis added)
The way in which clever game design can replicate real-world experiences in ways beyond just moving pieces on a board (for another example, see my post on the supply-and-demand mechanics in the board game “Power Grid”) continually impresses me. The entire structure of an entertaining card game ends up replicating the insights of academic experts into the strengths and drawbacks of the bureaucracies that are inevitable in modern life.
(This post has been edited.)
I didn’t believe it at first when I met Southerners who told me how they were routinely dismissed as unintelligent by Northerners the minute a drawl came out of their mouths — and mocked and infantilized for the same. I had never had that reaction myself, and never heard anyone talking about it.
But these Southerners — some of whom are trying to lose their Southern accents to avoid this situation — aren’t imagining things or being over-sensitive. Multiple studies have had identical passages read in Southern and “standard” accents and found that listeners rate the Southern-accented reader as less intelligent, less wealthy and less educated than the person reading the same passage in a non-Southern accent.
From a study by Taylor Phillips, a student at Stanford University (and a Kentuckian studying in California):
Southern condition participants rated intelligence on average 3.2 (SD=1.36), while Standard condition participants rated intelligence on average […]. When participants were explicitly asked to rank intelligence, the Southern voices received an average rating of 3.05 (SD=1.43), while Standard voices received an average rating of 5.25 (SD=1.16). The average difference between Southern and Standard voices within participants’ intelligence ratings was -1.6 (SD=1.12; Southern minus Standard). For the explicit intelligence measure, this average difference increased to -2.2 (SD=1.18). This suggests that Southern accent does trigger differences in social perception of intelligence, and that these differences are both strong and in the direction of the stereotype.
A similar result from a dissertation by Hayley Heaton, a doctoral student at Emory University in Georgia:
The analyses revealed that when the speaker was talking with a standard accent, he or she was rated significantly more intelligent (F (1,60) = 4.14, p = 0.05), more arrogant (F (1,60) = 5.47, p = 0.02), smarter (F (1,60) = 4.49, p = 0.04), better educated (F (1,60) = 5.02, p = 0.03), and as having better English (F (1,60) = 12.90, p < 0.01) than Southern-accented speakers, regardless of passage type.
What I find most interesting about this is that so many other prejudices about groups of people — or at least negative prejudices — have an air of social unacceptability. But making fun of Southerners as dumb hicks seems to be fair game.
My best hypothesis (unsupported by any data I can find) is that prejudice against Southerners remains acceptable because unlike many other group stereotypes, it’s not tied to any particular racial or ethnic group. It’s taboo today to make fun of someone for their race or ethnicity, which makes stereotypes about people from diverse Northern cities a minefield. (These stereotypes, which do exist, are often good-natured or embraced by their subjects, not imposed by outsiders.) The same would potentially apply to other regions that also aren’t dominated by a single racial or ethnic group, though none come immediately to mind.
As with any analysis of stereotypes, it’s important to be mindful of confounding factors — perhaps a region is seen as less intelligent because its education system is poorer? But even if a stereotype were on average true of a group, that wouldn’t justify treating individual members of that group as if they conform to it.
Have other people experienced similar judgments based on region or regional accent? Why do you think these remain acceptable in contemporary America when so many other prejudices are not?
(TV) used to be the sort of thing that you watched casually week to week; you weren’t supposed to get deeply invested in the emotional lives of the characters, and the shows were designed to keep that involvement to a bare minimum. You were drawn by the actors’ charisma or good looks, but you weren’t supposed to worry about their inner lives, which were mostly nonexistent. It was the fans who read deeper meanings into the shows, and through fan fiction and essays they provided the emotional resonances that the TV shows were not intended to evoke. Doctor Who is a great example of a show that went full circle through the cycle of fandom; many of the writers and showrunners, as well as the actors, were great fans of the program when they were kids, and many of them worked on semi-official tie-in novels or radio plays while it was in hibernation. By reviving the program, they effectively recreated it in their fannish image; the characters are now capable of expressing the thoughts and emotions that could only be inferred in the original version.
The just-aired Christmas special, by the way, was merely okay — some very good elements, and lots of flaws, some in the episode itself and others planted by failures earlier in the series.
But the Doctor Who 50th anniversary special last month was among the show’s best episodes.
Several weeks ago, while discussing the oncoming winter with my Southern-raised girlfriend, we reached an impasse over what exactly constituted weather cold enough to get alarmed about. Coming from Louisiana, she insisted that anything even in the 40s Fahrenheit was frigid, weather to make people stay indoors, bundled up in front of the fireplace. Having grown up with bitter Chicago winters, I said you can’t start calling weather “cold” until it at least falls into the 30s — and that even then, extreme cold doesn’t start until the thermometer falls to the single digits.
But clearly our perspectives were entirely subjective. The only way out of this situation, for any good rationally minded person, is to get more data.
So I went to my Facebook page and posted the following query:
Above what temperature would you generally consider the weather to be “hot,” as opposed to merely “warm”? Below what temperature would you generally consider the weather to be “cold,” as opposed to merely “cool”? (For context, please also provide the part of the country/world you grew up in.)
Twenty different people responded: nine men (counting me) and 11 women. Here’s what I found:
- The warmest temperature anyone considered cold was 62, though that may be an outlier — that respondent gave a range of only 11 degrees between cold and hot, much less than the average. Next up was 55 degrees, from a southerner.
- The lowest threshold anyone set for “cold” was a mere 11 degrees, from someone raised on a farm on the central South Dakota prairie.
- The lowest temperature anyone considered hot, aside from that same outlier (who said 73), was 85 degrees, while the highest threshold for the onset of true heat was 97.
- One person commented, “I think that you’ll find that the survey results will show that women get colder at a much warmer temperature than men.” And, in fact, he was right. The median female respondent said coldness began at 45 degrees, while the median male said coldness didn’t begin until 32 degrees. (Means told a similar story.) This wasn’t a function of a sample including a lot of females from warmer climes — the median latitude was about the same for both genders.
- But there was no difference when it came to when hotness began. Both men and women had a median hotness temperature of 90 degrees.
- Indeed, there was remarkable agreement about what constitutes heat. Setting aside the outlier, the range of hotness answers varied by only 12 degrees. The range of coldness answers varied by 45 degrees.
- Where people grew up, unsurprisingly, mattered. Using a little bit of judgement for people who had moved around (I defaulted to the town people listed as their hometown on their Facebook page), I assigned a latitude to each person. The southern half of the latitudes (a dividing line right through the southern part of the Chicago area) said cold began at a median of 42.5 degrees. The northern half said 33.5 degrees. (There was only a 2.5 degree difference on heat — the southern half said 92.5, while the northern half said 90.)
- The key difference, as shown on the chart below, was that while some northerners can’t stand the cold either, no one from the south (minus one person who split his time as a kid between Indonesia and Alaska — he’s plotted as Indonesia and is a clear outlier, but clearly is someone who experienced both extremes) could. (Note that this is actually a chart of the absolute value of latitude, because the southern hemisphere latitude of Jakarta looked weird, and distance from the equator is the more important factor.)
- The heat differences, again, are less dramatic:
This study didn’t actually end up proving anything or resolving my debate with my girlfriend. (For one thing, I’d prefer to have a sample size of several thousand points, not just a score.) But I had fun doing it, which is really the point of [social] science.
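For anyone who wants to run a version of this tally themselves, here’s a rough sketch in Python. The responses below are invented placeholders, not the actual Facebook replies.

```python
# Median "cold" thresholds split by gender and by hometown latitude.
# The data here is made up for illustration.
import pandas as pd

responses = pd.DataFrame({
    "gender":     ["F", "M", "F", "M", "F", "M"],
    "latitude":   [30.5, 41.9, 32.4, 44.1, 29.9, 43.5],  # absolute latitude of hometown
    "cold_below": [55, 32, 45, 20, 50, 11],
    "hot_above":  [90, 92, 88, 95, 85, 97],
})

# The gender split discussed above.
print(responses.groupby("gender")["cold_below"].median())

# Split respondents into a southern and a northern half by hometown latitude.
responses["half"] = pd.qcut(responses["latitude"], 2, labels=["southern", "northern"])
print(responses.groupby("half")[["cold_below", "hot_above"]].median())
```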
Interestingly, in our conversations, my girlfriend and I have agreed that the extremes aren’t actually where people disagree. That is, when it’s 102, everyone agrees it’s really hot, even if some people are more bothered by it than others. The same when the weather hits single digits — everyone agrees it’s really, uncomfortably cold. The conflicts arise in the middle ground — whether it’s warm enough to open the windows, or cold enough to require a comforter on top of bed sheets.
Making small-talk at a friend’s wedding in Waco, Texas, after talking about my life in South Dakota, I was more than once asked the same question: “So when did your flight come in?”
It didn’t, I’d reply. I drove the 950 miles down to Texas. And things were just getting started.
Every few years I like to hop in the car and put some mileage on it, seeing as many places as possible on a moderately circuitous route between home and some distant point. The road trip is, for those with more time than money (but a decent amount of each), the ideal way to travel. Flying is good to see a single destination, but driving lets you see things all along the way, too.
So four years ago a friend and I drove to Arizona in March, seeing a half-dozen Spring Training baseball games along with the Grand Canyon and various sites in between. Two years ago I went solo, visiting a friend in Denver, a volcano in New Mexico, a canyon in West Texas and relatives in San Antonio. Last month I retraced some of that — nearly 750 miles were duplicated, the north-south swing from South Dakota to Texas. But after attending the central Texas wedding that was the primary purpose of the trip, I veered off into new territory.
Also new this year: I wasn’t alone. When spending the better part of two weeks driving, it helps to have someone to share the wheel with. Fortunately, coming along with me for most of the ride was my girlfriend, Allison, my partner-in-banter for hours of driving, my guest at the wedding and my host for a surprise visit to her family’s home in northern Louisiana:
But that doesn’t come until a bit later.
The most important thing to know about a new board game is what role chance has in the play. To pick extreme examples, children’s classic “Candy Land” is entirely luck — you can’t be good or bad at Candy Land, you just draw randomly shuffled cards and do what they say. Chess, on the other hand, is pure strategy and no luck — both sides are perfectly balanced and there are no random elements.
Many of the best games have an element of both — a heavy role for strategy, so players’ abilities are tested, but some role for luck as an equalizer, to help keep less-experienced players in the game to the end. Games can be fun with lots of one and a little bit of the other, but generally speaking, I prefer an emphasis on strategy over luck — controlling your own destiny is more interesting to me than depending on the whims of a dice roll.
All that is prelude to saying that German import “Power Grid” (translated from the original, fantastic German name “Funkenschlag”) is my very favorite board game right now. It’s not purely strategy — there’s a deck of partially shuffled cards, for example — but generally speaking, what happens in the game is almost entirely the result of players’ choices. But the brilliance of Power Grid is that it didn’t get rid of chance at the expense of game balance. To the contrary, several elegant (if intricate) mechanisms subtly penalize players in the lead and boost those trailing. The result is a gripping game where every action (or deliberate inaction!) has consequences, where a low-key initial game builds to a high-pressure finish. If you like games that force you to think, strategize, and weigh difficult choices, you’ll love Power Grid.
Power Grid puts the players in the role of competing power magnates, trying to expand their company to dominate the electrical industry of the United States, or Germany (or other countries and regions sold separately). The core of the game involves three different steps repeated each turn:
First, players buy power plants, bidding against each other in an auction. At the start of the game, power plants are relatively inefficient — requiring a lot of fuel to power just a city or two. Consequently, they’re cheaper, costing just a few bucks of the game’s currency to buy — unless other players fix their eyes on the same plant and raise your bids. These bidding wars can drive the cost way up from its opening offer, and sometimes that’s even worth it. There are a few different kinds of power plants, each of which takes a different kind of fuel — coal, oil, garbage or uranium, plus some “green” plants with no required inputs.
This matters because of the second step: players buy fuel from the market to power their plants. This is done in turn order, not via auction — but like the auctions, the law of supply and demand is on full display. There’s a limited supply of each resource, and the more players buy a resource, the higher the price gets. Don’t let the talk of economics scare you — this is clearly indicated on the board, not something involving finicky math. So even though coal starts out as cheap and abundant, and uranium is several times more expensive, if all your rivals have coal plants they could soon find it scarce and pricey, while you have cheap uranium all to yourself.
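If that pricing mechanic sounds abstract, here’s a toy model of the idea in code. The slot counts and prices are invented for illustration; the actual board’s layout differs by resource.

```python
# Toy model of the resource market: cheap units sell first, so the more a
# resource gets bought, the more the next buyer pays. Numbers are illustrative.
coal_market = []
for price in range(1, 9):        # price slots 1 through 8...
    coal_market += [price] * 3   # ...with room for three units of coal each

def buy(market, units):
    """Take the cheapest `units` off the market and return the total cost."""
    cost = sum(market[:units])
    del market[:units]
    return cost

print(buy(coal_market, 4))   # early buyer pays little: 1+1+1+2 = 5
print(buy(coal_market, 4))   # the next buyer already pays more: 2+2+3+3 = 10
```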
Finally, players build infrastructure to cities — so they can sell the electricity they generate to customers and earn money. This costs money for each city — plus extra money for overland connections between two cities. The cities of New England are cheap to reach, while the vast expanses west of the Mississippi require much more spending. On the other hand, you might not have as much competition there because of the price, letting you keep expanding as others find their reach stymied.
After all that, players burn off their resources to power as many cities as they can. The more cities you power, the more money you get — but the more money it cost you to get there. And you can never, ever, rest on your laurels — every other turn at the minimum, if not every turn, you’ll be emptying your wallet on plants, resources and cities. Sometimes doing little or nothing for a turn can be the right move, to avoid overpaying for something, or to husband your money for the next turn when a much better power plant will become available. But the competition is fierce and laggards will pay the price.
Another plus is that the game doesn’t involve direct conflict between players. There’s no combat or attacks, no destroying other players’ hard work. But unlike some hobby games which can seem more like everyone playing their own solo game at the table, it does involve player interaction — and indeed makes it integral to the game.
The way the game incorporates competition and supply and demand is its most elegant aspect. But key to the game’s success is the system it puts in place to ensure balance. This is primarily done through artificially manipulating turn order, so players who are doing better are the last ones to buy resources and build into cities — meaning they’ll pay more and find their routes blocked. Leaders also are the first ones to bid on power plants, which hurts them because better power plants tend to become available later in the auctions. (Veteran players talk about the concept of “leading from behind” — intentionally keeping your income low to benefit from this system even as you position yourself for a late surge to the front. Of course, the fact that veterans can game this mechanic is partially a downside, in that it doesn’t help new players as much as you might think.)
These mechanics, combined with a few others, are the source of one of the game’s primary downsides: it has a lot of little complicated elements that can be too much for some people — especially if no one at the table has played before to help teach and run things. Calculating the changes in turn order, figuring out how many resources to add each turn and handling all the intricacies of the auctions can seem overwhelming. Plus, many of these rules are artificial, without any benefit in theme, and don’t flow intuitively from the rest of the game.
Even setting all that aside, the end game can involve quite a bit of math as players try to stretch their bank accounts for the final push. For me, this is a thrill (though I like to play with a pen and paper so I can jot down the various possibilities as I wait for my turn), but I can see how it would be a chore for people who like more casual games.
Power Grid isn’t for everyone. It’s involved, stressful and moderately complicated. People who like more casual games, or games with more of a random element, probably wouldn’t have fun with Power Grid. But for people who thrive on competition and strategy, it’s nigh perfect.
The game can incorporate anywhere from two to six players, though I’m told it’s best with four to six. (I’ve only ever played it with the larger groups.) Games take about 90 minutes to two hours. It involves both small pieces and math, so probably isn’t suitable for all but the most precocious children.
You can buy Power Grid at a local hobby store or on Amazon. Alternately, I own it and will gladly play it with you. Apologies in advance for beating you.
A question raised just now at work: if something can be preemptive, why can’t it just be emptive?
It’s a somewhat obscure example of a linguistic phenomenon that pops up periodically. Somewhat more famous, perhaps, is the question of why we can be “overwhelmed” and “underwhelmed” but are never just “whelmed.”
What happens is that a word gradually loses its independent meaning. In Old or Middle English, you have a word like “to whelm,” which means “overcome, as with emotions or perceptual stimuli.” That word gets a modifier, like “over.” Then, over the centuries, people gradually start using the compound form more and stop using the original root, until today, “whelm” is basically meaningless without a modifier.
The same thing happened with preemption. Originally, “emption” was a real word, in the late 15th Century, a noun meaning “buying.” Emption meant you were buying something; preemption, about a century later, meant you were buying something before someone else. Over the years, it got generalized to mean to do anything before somebody else — chiefly some sort of blow or strike. Meanwhile, “emption” fell out of the language.
So if a preemptive attack is to attack before someone else, an emptive attack would be just an attack, without reference to relative chronology. In other words, it’d be a pretty meaningless term. In this case, then, there’s a good reason why we don’t use “emption” in the modern sense of “preemption.”
Can you think of more good examples?
- Another example is disgruntled/gruntled. We no longer say someone is “gruntled,” from “gruntle” which originally meant “to grumble” or to “grunt.” “Dis” is an intensifier. So someone who is disgruntled grumbles a lot. But only the compound form survived. (Via Larry Kurtz)
- Couth/uncouth. Our word “uncouth,” meaning “lacking good manners or refinement,” derives from the Old English uncuð and originally meant “unknown,” from cuð, the past participle of cunnan, “to know.” In the 16th Century or so it got its modern meaning. To the degree we say “couth” any more, it’s a back-formation from “uncouth.” (Via Pinedale Roundup.)
This post has been updated with the term “bound morpheme” and one or more examples.