Madness & Truth

Writer David H. Montgomery's thoughts on life and culture

Primates of the states


When most people think of Louisiana, they think of New Orleans, its most prominent city. But that might irk a typical Louisianan, who’s likely to live well outside the Big Easy. Only 350,000 people call New Orleans home, less than 8 percent of the state and not much more than the capital Baton Rouge.

In contrast, my current state of South Dakota evokes images of endless fields of prairie and farmland. But almost 20 percent of the Mount Rushmore State’s population lives in the fast-growing city of Sioux Falls, just a percentage point less than Chicago’s share of Illinois.

Other states aren’t associated with a single city at all. Missouri is divided between St. Louis and Kansas City. The two biggest cities in Minnesota are so entwined they’re usually referred to by a joint name. And Alabama has not two queen cities but four: Birmingham, Montgomery, Mobile and Huntsville all have roughly the same population, and each is around twice as populous as the state’s next-largest city.

These facts aren’t just idle curiosities. They reflect important principles in math, statistics and demography, laws of nature most of us follow unawares — or break just as obliviously.

Whether densely urban or largely rural, any state or country will tend to have a small number of very large cities and a large number of small towns. It’s a principle called the “power law,” and it shows up in a lot more than just cities. In languages, a few words like “the” are used constantly, while huge numbers of words are hardly ever used at all. Similarly, in books, music and movies, a few hits get consumed by everyone, while the so-called “long tail” of more obscure works draws just a few customers each. Scholars have even discovered that war casualties follow a power law — there’s only a small number of bloodbaths like World War II, but large numbers of little skirmishes and insurgencies that don’t even crack the front pages.

If you graph something that follows a power law, it will have an L-shape, with a spike at one side that drops sharply, then gently declines in a long tail:

But if you graph it with both axes on a logarithmic scale — such as 1, 10, 100, 1,000, 10,000, etc., where values rise exponentially with every tick — a power law distribution will instead resemble a straight line:

Graph from M. E. J. Newman. 2005. “Power laws, Pareto distributions and Zipf’s law.” Contemporary Physics, 46, 323-51.
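To see that straight-line behavior for yourself, here’s a minimal sketch in Python (using made-up, Zipf-style city populations rather than any real Census data) that plots the same rank-size pattern on ordinary axes and on log-log axes:

```python
import matplotlib.pyplot as plt

# Hypothetical "cities": the largest has 1,000,000 people, and the city at
# rank k has roughly 1,000,000 / k people (a simple Zipf-style power law).
ranks = list(range(1, 201))
populations = [1_000_000 / k for k in ranks]

fig, (ax_linear, ax_log) = plt.subplots(1, 2, figsize=(10, 4))

# On ordinary axes, the distribution has the L-shape described above.
ax_linear.plot(ranks, populations)
ax_linear.set_title("Linear axes: L-shaped curve")
ax_linear.set_xlabel("Rank")
ax_linear.set_ylabel("Population")

# With both axes logarithmic, the same data falls on a straight line.
ax_log.plot(ranks, populations)
ax_log.set_xscale("log")
ax_log.set_yscale("log")
ax_log.set_title("Log-log axes: straight line")
ax_log.set_xlabel("Rank (log scale)")
ax_log.set_ylabel("Population (log scale)")

plt.tight_layout()
plt.show()
```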

(Not everything follows the power law — for example, heights for men or women follow a “normal” distribution, with a lot of people near the average and smaller numbers on both sides. Both the power law and the normal distribution are examples of different ways numbers can cluster.)

But here’s the catch: the power law can be broken, even for things where it normally applies. There are other forces at work besides this tendency of some occurrences to involve a few big things and lots of little things. For example, while city size generally follows a power law, some countries have one city that’s disproportionately larger than the rest. A classic example is Paris, France’s cultural, political, economic and demographic center of gravity. If France followed the power law, Paris would still be the biggest city, but Marseilles or Lyons would be relatively larger and more powerful, a rival center of power and influence in the country instead of being on the country’s periphery. (Compare France to Italy, where Rome is big and important, but so is the business capital of Milan.)

Cities like Paris are called “primate cities” because they dominate their countries. The term, as coined by Mark Jefferson in 1939, just refers to cities that are “at least twice as large as the next largest city and more than twice as significant,” though its application can be somewhat subjective.

Most of the research on primate cities has happened at the national level, but the same logic can apply at a regional or sub-national level. New York City, the U.S.’s largest, isn’t a primate city for the whole country (despite the opinions of its residents). But no one can deny the dominance NYC exerts over its home state, 40 percent of whose population resides there. And that’s not even counting the influence of the Big Apple in parts of New Jersey and Connecticut.

I used data on city populations in each state to look at the role of each state’s largest city and how it relates to both the second-largest city and the rest of the state.

The ratio of largest city to second-largest ranges from New York, where NYC is 32 times larger than second-banana Buffalo, to Alabama, where the 212,000 people in Birmingham only slightly outnumber the 206,000 in Montgomery. As a share of population, New York City and Anchorage, Alaska, each hold more than 40 percent of their states’ residents, while Columbia, South Carolina, and Charleston, West Virginia, have less than 3 percent of theirs despite being the largest cities in their states.
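For anyone curious how those two measures are computed, here’s a rough sketch of the arithmetic, using a handful of invented city populations rather than the real Census figures:

```python
# Invented figures for illustration only -- not real Census data.
state_population = 5_000_000
city_populations = {
    "Alpha City": 900_000,   # largest city in the state
    "Beta Town": 150_000,    # second-largest city
    "Gamma Falls": 80_000,
}

ranked = sorted(city_populations.values(), reverse=True)
largest, second = ranked[0], ranked[1]

share_of_state = largest / state_population  # what share of the state lives there
primacy_ratio = largest / second             # how it compares to the runner-up

print(f"Largest city holds {share_of_state:.2%} of the state's population")
print(f"Largest city is {primacy_ratio:.2f} times the size of the second-largest")

# Mark Jefferson's rough size test: a primate city is at least twice
# as large as the next-largest city.
print("Meets Jefferson's size test:", primacy_ratio >= 2)
```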

Top five cities by share of state’s population:

  1. New York City, 42.95% of New York
  2. Anchorage, 40.82% of Alaska
  3. Honolulu, 28.06% of Hawaii
  4. Albuquerque, 26.63% of New Mexico
  5. Omaha, 22.72% of Nebraska

Top five cities with the most people compared to the second-largest city in their state:

  1. New York, New York, 32.41 times larger than Buffalo
  2. Chicago, Illinois, 13.58 times larger than Aurora
  3. Anchorage, Alaska, 9.28 times larger than Juneau
  4. Honolulu, Hawaii, 9.03 times larger than Hilo
  5. Baltimore, Maryland, 6.24 times larger than Columbia

Bottom five cities by share of state’s population:

  1. Charleston, 2.75% of West Virginia
  2. Columbia, 2.79% of South Carolina
  3. Newark, 3.13% of New Jersey
  4. Bridgeport, 4.08% of Connecticut
  5. Jacksonville, 4.33% of Florida

Bottom five cities with the most people compared to the second-largest city in their state:

  1. Birmingham, Alabama, 1.03 times larger than Montgomery
  2. Charleston, West Virginia, 1.04 times larger than Huntington
  3. Columbia, South Carolina, 1.05 times larger than Charleston
  4. Cheyenne, Wyoming, 1.06 times larger than Casper
  5. Memphis, Tennessee, 1.07 times larger than Nashville

Get the rest of the lists here.

I looked only at official city size, recognizing the shortcomings of this approach. The true strength of many urban centers doesn’t stop at the legal boundary but encompasses the great masses of residents and businesses in surrounding suburbs. In this sense, looking at metropolitan areas would be a better bet. But many metropolitan areas, as defined by the U.S. Census, spill over into other states, creating too much messiness for a state-focused analysis.

But being a primate city — or not — is about more than just beating #2. It means being the truly dominant center of a state or country. In a normal power-law situation, the second-largest city will have half the population of the largest, while the third-largest will have one-third the people, and the fourth one-quarter, descending in inverse proportion to their rank.

So I pulled lists of the population of every city in each state from Wikipedia, and graphed them on a logarithmic scale. Remember that if something follows a power law, when graphed on a logarithmic scale it will appear to be a straight line.
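For what it’s worth, that graphing step can be reduced to a few lines of code. A minimal sketch, with a short hypothetical population list standing in for the Wikipedia tables:

```python
import math

# Hypothetical city populations for one state, sorted largest first
# (these happen to follow a clean rank-size pattern).
populations = [700_000, 350_000, 233_000, 175_000, 140_000, 117_000, 100_000]
ranks = range(1, len(populations) + 1)

# Under a clean rank-size power law, log(population) is a straight line in
# log(rank) with a slope near -1. Estimate that slope with a least-squares fit.
xs = [math.log(r) for r in ranks]
ys = [math.log(p) for p in populations]
n = len(xs)
mean_x, mean_y = sum(xs) / n, sum(ys) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / sum(
    (x - mean_x) ** 2 for x in xs
)

print(f"Estimated rank-size slope: {slope:.2f} (about -1 for a clean power law)")
```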

That’s what we see looking at Missouri:

Graphs created by David H. Montgomery.

In contrast, Illinois displays a clear example of a primate city, with Chicago’s population far above the trend line evidenced by the rest of the state.


Other states don’t clearly fall into one category or another. Some have several big cities distinct from the general trend. Some have two cities near the top who also happen to be very near each other — both New Jersey and Minnesota reflect this situation. (Though New Jersey is probably better understood as being pulled between two different cities, neither one in its borders: New York City and Philadelphia.) In other cases, the two cities are far apart: Sioux Falls and Rapid City are on opposite sides of the state, as are Seattle and Spokane, and Philadelphia and Pittsburgh. In those cases, distance could mitigate the “primate city” effect, leaving the distant second city more as a distinct provincial capital than as runner-up.

The right end of each graph isn’t as important as the left — in many cases the linear trend breaks down on the right due to the large number of smaller cities, and research has shown smaller towns don’t follow the power law distribution as consistently as cities over 10,000 people do. Different states also had wildly varying sample sizes, ranging from hundreds of cities to fewer than two dozen. Also, the y-axis of each graph varies, with the top of the y-axis representing the population of the state’s largest city, whether that’s eight million or 42,000.

Graphs created by David H. Montgomery.

Finally, here’s another way to visualize the importance of various states’ largest cities. This map marks the approximate location of each state’s biggest city with a shaded circle sized to reflect what percentage of the state’s population lives in that city: the circle covers the same share of the state’s pixels on this particular map as the city’s share of the state’s population. The map can be a little misleading — the same percentage in a bigger state will take up more area, so while the New York and Texas circles are about the same absolute size, New York’s represents 42.95 percent of the state’s pixels, while Texas’ occupies just 8.29 percent of its own. More or less — I had to select the pixels by hand, so in most cases there’s a little error in one direction or another.

Map created by David H. Montgomery.
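In code, that sizing rule boils down to something like this minimal sketch (the pixel count and the population share below are invented for illustration, not taken from the actual map):

```python
import math

# Invented inputs: the state's area on the map, in pixels, and the share of
# the state's population living in its largest city.
state_pixels = 24_000
city_share = 0.20  # 20 percent

# The circle should cover the same share of the state's pixels as the city's
# share of the state's population.
circle_area = city_share * state_pixels
radius = math.sqrt(circle_area / math.pi)

print(f"Circle area: {circle_area:.0f} pixels; radius: {radius:.1f} pixels")
```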

 

Written by David Montgomery

March 31st, 2014 at 1:04 pm

Posted in Maps, My work

When the pain’s in the change


Today is the beginning of Daylight Saving Time, which many people love to hate. Studies have shown there are both health and economic costs to Daylight Saving Time, and no one enjoys the beginning of DST, when we lose an hour. (I’m actually kind of partial to the end, when we gain an hour.) And I am told that any discomfort someone like myself feels from clock changes is nothing compared to what parents of small children go through, since young kids are less able to adjust their body clocks to an artificial shift like a time change.

But here’s the thing. All these downsides to Daylight Saving Time have nothing to do with whether the sun sets at 6 p.m. or 7 p.m. They’re about the fact that we change, in a single day, from one time to another. When it’s December, I think people are actually pretty content that the sun isn’t rising at 8:30 a.m. And I definitely appreciate it being bright late into the evening in the peak of the summer. It just really stinks for a few days each spring and fall to have to reconfigure one’s internal clock.

Sioux Falls solar day, from weatherspark.com.

Daylight Saving Time is illustrative of a broader principle: In many cases, when we complain about changes, what really bothers us is not the new normal, but the transition to get to the new normal. Put another way: sometimes it’s not the result that’s painful, it’s the change itself.

Take, for example, a family that’s earning $100,000 per year. Then, suddenly, something changes and they’re earning just $70,000 per year. There’s nothing wrong with earning $70,000 per year. Lots of families earn that much or less and are still comfortable and happy. But the change to a $70,000 salary from a much higher one can be painful. (Think about this example when listening to a lot of political discussion about changes to benefits and tax rates. This theory of the Painful Change explains why people will react so strongly to the proposal that their tax rate or government benefit change to a new, less generous, level that seems to a dispassionate observer to be perfectly reasonable.)

I think of the Painful Change maxim, too, when reading commentary and debate about climate change. If a region’s climate becomes hotter and drier, that’s bad news for all the living things (humans included) who currently live there. It’s not necessarily bad news for life itself, which in the long term will adapt to the new normal, possibly with new species or new behaviors from old species. But it can be catastrophic for everything that had adapted to the old way. Life thrives in the climate of St. Louis and life thrives in the climate of Minneapolis, but if Minneapolis’ climate changes to be like St. Louis’, it’s not going to be pleasant for things already living in Minneapolis.

Don’t take this idea of the Painful Change as diminishing the significance of that transitional agony. I’m not making a “Who Moved My Cheese” argument that we should just suck it up and accept negative change because the new situation is all that matters — though in many cases, graceful adaptation to change is exactly what’s called for. My point is that we should conceptually distinguish between journey and destination. Sometimes we have to endure painful changes to get to good results. (I’d put Daylight Saving Time in that category.) Sometimes painful changes lead to painful results. Similarly, pleasant changes can lead to good or bad situations. And sometimes the magnitude of the change outweighs the magnitude of the result — while it can be worth it to endure a terrible change to get to a much better place, it’s not worth enduring a terrible change for a trivial improvement in one’s situation.

The comfort of the transition doesn’t necessarily tell us anything about where we end up, and we should recognize that when we make decisions — or before we start complaining about turning our clocks ahead in the spring.

Written by David Montgomery

March 9th, 2014 at 9:49 am

Posted in Philosophy

The golden age of TV: now


A month ago, a map went viral showing the (allegedly) most popular television shows set in each state. South Dakota got “Deadwood.” Washington got “Frasier.” Maine, “Murder, She Wrote.” Take a look at it here:

The map was produced by Business Insider, and at their site you can find justifications for the rankings.

It’s the sort of project designed as much to provoke arguments as to settle them. And what interests me more than the map itself is one of those arguments I got into on a friend’s Facebook wall.

After I pointed out that (with one exception) the map-makers had excluded reality shows, someone I didn’t know got his dander up:

Wow I don’t think there is much else on TV anymore that is not a reality show. Seams like they are on par to put real actors out of business because they can pay these hicks peanuts compared to accomplished actors. Seams like a case of too many channels. I say we cut back to like 10 channels and our IQ would go up 40 points.

Oh and to prove my point 75% of the shows listed in the pictures are pre-1990′s and don’t exist anymore. So its more nostalgia then actually whats the “most popular”. And now its about impossible to find shows set in any state other than California, Texas, or New York. Even though ironically most of them are produced and filmed in British Columbia.

This included one major factual claim — that 75 percent of the shows were “pre-1990’s,” and from a quick glance over the map, it didn’t seem to be correct. So I opened up a spreadsheet.

After an hour or so of hand-entering data about TV shows that I later discovered Business Insider had already gathered, I had my result — and it proved my intuition right. Many, or even most, of the shows were new:

Decade of premiere   Shows   Percent
1950s                2       4%
1960s                4       8%
1970s                5       10%
1980s                10      20%
1990s                11      22%
2000s                13      26%
2010s                5       10%

In fact, more than half had premiered in the 1990s or later. More than a third had come in this millennium. There were more shows from the 2000s than from the 1950s, 1960s and 1970s combined.
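The tally itself is easy to reproduce. A minimal sketch, assuming a list of premiere years (the years below are placeholders, not the actual data from the Business Insider map):

```python
from collections import Counter

# Placeholder premiere years standing in for the 50 shows on the map.
premiere_years = [1959, 1964, 1972, 1984, 1989, 1993, 1999, 2004, 2008, 2011]

decades = Counter(f"{(year // 10) * 10}s" for year in premiere_years)
total = sum(decades.values())

for decade in sorted(decades):
    count = decades[decade]
    print(f"{decade}: {count} shows ({count / total:.0%})")
```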

But the spreadsheet was more interesting than that. So even though my interlocutor fell silent at this point, I picked up his side of the discussion and imagined why the current television landscape might seem a vast wasteland. There’s one obvious answer: if you don’t have cable. Particularly, if you don’t have premium cable.

Because there is fantastic television being produced today, but a lot of it isn’t on ABC, NBC and CBS. To see great shows on the list like “Breaking Bad,” “Justified” and “Deadwood,” you needed to be watching AMC, FX or HBO. And if you aren’t — like I wasn’t, growing up in a broadcast-only household — it might seem like TV is nothing more than laugh-track comedies and singing competitions.

Of course, cable is a relatively recent phenomenon compared to broadcast TV:

 

You can see the trend.

But my erstwhile opponent did have one good point. There are a LOT of reality shows on TV, particularly compared to yesteryear.

It’s just that there are also a lot of non-reality shows on TV. Because there are more shows on TV, period.

That’s been the biggest impact of the cable revolution. There are more players producing TV programming than there were in the days of three or four or five networks. Much of it is awful. Some of it is fantastic. The challenge is separating the wheat from the chaff — but then, that’s the fundamental challenge of modern life, a world suffused with more options and information than any one person could possibly consume.

(Now, this is a flawed dataset. A better picture would come from a ranking of the best TV shows in history, not one that limits states like California and New York to just one show set there, while forcing obscure choices onto the list for states like North Dakota and New Hampshire. But it’s still an instructive exercise.)

For those curious, you can view the full spreadsheet here. Below is a list of all the networks and the number of shows each had in the top 50.

Network Shows
NBC 13
CBS 11
ABC 7
HBO 5
AMC 2
Fox 2
Lifetime 2
Comedy Central 1
CTV 1
FX 1
History 1
Sci-Fi 1
Showtime 1
TNT 1
WB/CW 1

Written by David Montgomery

January 23rd, 2014 at 9:19 am

Posted in Television

The Absaroka difference


Originally published on Argus Leader Media’s Political Smokeout blog on Oct. 18, 2013. Updated and reposted here.

News that some rural Colorado counties are trying to secede from their increasingly urban and liberal state has revived talk of a historical curiosity: the attempt, during the Great Depression, to create a new state out of parts of northern Wyoming, western South Dakota and southern Montana.

The name of the state, which would have been America’s 49th, was proposed to be Absaroka.

Here’s what it would have looked like:

I’m not concerned here with discussing the wisdom of secession, or the practicalities thereof. What got me curious today was a simpler question: what would South Dakota’s politics be like if these counties, some of the most reliably Republican in the state, weren’t part of the South Dakota electorate?

An Absaroka-less South Dakota would be more Democratic than the current Mount Rushmore state — but only to a degree.

One quick shorthand method for calculating the partisan lean of a state is the Cook Partisan Voting Index. Basically it looks at shares of the presidential vote to calculate how much more Democratic or Republican a state is than the country as a whole.

Real South Dakota (RSD), for example, has a PVI of “R+10,” meaning it’s 10 percentage points more Republican than the country. California is D+9, meaning it’s 9 percentage points more Democratic than the country. Virginia is dead even, meaning its partisan lean exactly matches the country.

Fortunately, Absaroka would have split along county boundaries, so it’s relatively easy to calculate the PVI for Alternate South Dakota. In the 2008 presidential election, John McCain would have won 52.4 percent of the two-party vote (he actually got 54.2 percent of the two-party vote). In the 2012 presidential election, Mitt Romney would have won 57.5 percent (he really won 59.2 percent). Comparing that average, 55 percent Republican, to the national Republican share of 47.1 percent, Alternate South Dakota ends up as an R+7.8.
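The arithmetic behind that estimate is simple enough to spell out. A minimal sketch, using the two-party shares quoted above (those figures are the ones given in this post; everything else is illustrative):

```python
# Two-party Republican shares for the hypothetical Alternate South Dakota,
# as quoted above, plus the national share used as the baseline.
alt_sd_2008 = 52.4   # McCain's two-party share without the Absaroka counties
alt_sd_2012 = 57.5   # Romney's two-party share without the Absaroka counties
national_gop = 47.1  # national two-party Republican share

state_average = (alt_sd_2008 + alt_sd_2012) / 2
pvi = state_average - national_gop

print(f"Alternate South Dakota average: {state_average:.2f}% Republican")
print(f"Cook-style lean: R+{pvi:.2f}")  # roughly the R+7.8 cited above
```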

In real life, South Dakota is R+10, so losing Absaroka would have made South Dakota about two percentage points more Democratic.

That’s not a ton. South Carolina is an R+8 state. Montana is an R+7. Both are solidly red states at the presidential level. (Georgia, at R+6, is the bluest state right now with two Republican senators.)

But small shifts can make the difference in close elections.

For example, in 2010, Kristi Noem beat Stephanie Herseth Sandlin by around 7,000 votes. But in Alternate South Dakota, without Noem’s Black Hills electoral strongholds, Herseth Sandlin narrowly wins re-election by 6,700 votes — a near inversion of the actual result. (Another potential boost for Herseth Sandlin: if Custer County were in a different state, independent B. Thomas Marking wouldn’t have been a candidate in the race.)

And Tom Daschle would have broken the Curse of Karl Mundt in 2004 if western South Dakota had gone to play in Absaroka. In real life, Thune beat Daschle by 4,500 votes. Alternate South Dakota would have voted for Daschle by a 9,300-vote margin.
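Reworking those margins is just subtraction: take the statewide totals and remove the votes cast in the counties that would have gone to Absaroka. A minimal sketch with placeholder numbers (not the actual county-level returns):

```python
# Placeholder vote totals -- the real exercise used county-level returns.
statewide = {"Noem": 153_000, "Herseth Sandlin": 146_000}
absaroka_counties = {"Noem": 42_000, "Herseth Sandlin": 28_300}

alternate = {
    candidate: statewide[candidate] - absaroka_counties[candidate]
    for candidate in statewide
}
margin = alternate["Noem"] - alternate["Herseth Sandlin"]

print(alternate)
print(f"Margin: {margin:+,} for Noem (a negative number means Herseth Sandlin wins)")
```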

(Big caveat: this is a scenario in which one assumes all other factors remain the same. In fact, a South Dakota without its western portion would have different politics. Different issues would be dominant. Candidates might take different positions, responding to different pressures from their constituents. Campaigning patterns would unfold differently.)

This only goes so far. For example, 2010 Democratic gubernatorial candidate Scott Heidepriem can draw little consolation from this counter-factual. In real life Dennis Daugaard won by 23 points and 73,000 votes. Alternate South Dakota would have voted for Daugaard by the only slightly less overwhelming total of 21 points and 52,000 votes.

What to take away? Geography matters. South Dakota is Republican through and through, and would remain so even if the most Republican part of the state were sliced off. But the slight shift toward the political center could have had big impacts in the state’s recent close elections.

Miscellaneous things I am pondering:

  • What would the smaller South Dakota’s nickname be? Still the Sunshine State? Or something different?
  • If the tourist hordes heading to the Black Hills were heading to another state, do you think Alternate South Dakota would have put tollbooths up on I-90?
  • Would Pierre still be the capital? The physical investment in government infrastructure would be expensive to duplicate. But while Pierre is geographically central to South Dakota and has major population centers to its west in the Black Hills, in Alternate South Dakota there’s very little to the west of Pierre.
  • In January, a Wyoming sportswriter took a look at a similar question: what would the high school sports conferences look like in Absaroka? If you like sports, give it a read.
  • For more information on Absaroka and other attempts for parts of a state to secede into a new state, check out Andrew Shears’ project, “The 124 United States That Could’ve Been.” Here’s his map:

Written by David Montgomery

January 21st, 2014 at 12:44 pm

Posted in History, Maps, My work

Interesting words, part 4


A continuing series. From “Competition and the Efficiency of Bureaucracies,” by Gary Becker.

Bureaucracies are large complex hierarchical organizations governed… by formal rules rather than discretionary choices. This apparent rigidity in the decision-making process does not necessarily make bureaucracies “inefficient” because they may have advantages of scale and scope that offset their disadvantages of inflexibility and remote decision-making.

Okay, so “interesting” may be relative in this particular case. But everyone has to deal with bureaucracies and rigidity at some point in their life. This is a good, quick summary of why bureaucracies have drawbacks — and why they can be the best way to do things even with those drawbacks.

A similar thought, coincidentally, popped up in a presentation about the evolution of board games, sent to me the other day by a friend. Games journalist Quintin Smith, giving a talk about all the ways board games have evolved, started talking about the wargame “A Few Acres of Snow.” The discussion starts at 19:53 in the presentation.

This is a sickeningly well-designed game. This is just beautiful. It’s a wargame about the French and English fighting for control of their Canadian colonies, which sounds like whatever it sounds like. It uses deck-building to simulate the logistics of running a war in a foreign country.

Okay, I’m not selling this.

The point is, you have your deck, and your deck represents soldiers, the Indians you’ve recruited, the priests, the home support, the boats. More importantly, it contains cards for every piece of territory you control. And the territory cards are relatively useless, which means the more you spread yourself, the more land you spread yourself over, the less control you have.

Every hand of cards you draw is a story, because you need soldiers, and then your deck, which is basically your subordinates, says, “We don’t have any soldiers, not now.” “We need boats!” “No boats, they’re all somewhere else.”

And you just can’t do this! The amazing thing is, it’s a war game, but really, you’re fighting your own logistical battles. And it’s amazingly tense. Because if your deck would do what you wanted it to for just one turn, you could hit Montreal and you could take it and you could end the game. But it never gives you that.

And the coolest thing about this is, there’s actually a sort of administration card. As a general, you can say, “This is a mess. We need administration.” And the administration card, when it comes up in your hand, lets you remove cards from your deck permanently — with the twist that there’s no way of getting rid of the administration card. So if you build an administration, there’s no way to remove it. It’s like you’re permanently deciding, “We need more desks! We need people sort of running the war for me.” And then that starts getting in your way as well. (Emphasis added)

The way in which clever game design can replicate real-world experiences in ways beyond just moving pieces on a board (for another example, see my post on the supply-and-demand mechanics in the board game “Power Grid”) continually impresses me. The entire structure of an entertaining card game ends up replicating the insights of academic experts into the strengths and drawbacks of the bureaucracies that are inevitable in modern life.

Written by David Montgomery

January 20th, 2014 at 10:19 am

Posted in Games, Quotes

The socially acceptable prejudice?


I didn’t believe it at first when I met Southerners who told me how they were routinely dismissed as unintelligent by Northerners the minute a drawl came out of their mouths — and mocked and infantilized for the same. I had never had that reaction myself, and never heard anyone talking about it.

But these Southerners — some of whom are trying to lose their Southern accents to avoid this situation — aren’t imagining things or being over-sensitive. Multiple studies have had identical passages read aloud in Southern and “standard” accents and found that listeners rate a person reading in a Southern accent as less intelligent, less wealthy and less educated than a person reading the same passage in a standard accent.

Chart from "Put Your Money Where Your Mouth Is: The Effects of Southern vs. Standard Accent on Perceptions of Speakers," by Taylor Phillips

Chart from “Put Your Money Where Your Mouth Is: The Effects of Southern vs. Standard Accent on Perceptions of Speakers,” by Taylor Phillips

From a study by Taylor Phillips, a student at Stanford University (and a Kentuckian studying in California):

Southern condition participants rated intelligence on average 3.2 (SD=1.36), while Standard condition participants rated intelligence on average […]. [When we asked] participants explicitly to rank intelligence, the Southern voices received an average rating of 3.05 (SD=1.43), while Standard voices received an average rating of 5.25 (SD=1.16). The average difference between Southern and Standard voices within participants’ intelligence ratings was -1.6 (SD=1.12; Southern minus Standard). For the explicit intelligence measure, this average difference increased to -2.2 (SD=1.18). This suggests that Southern accent does trigger differences in social perception of intelligence, and that these differences are both strong and in the direction of the stereotype.

A similar result from a dissertation by Hayley Heaton, a doctoral student at Emory University in Georgia:

The analyses revealed that when the speaker was talking with a standard accent, he or she was rated significantly more intelligent (F (1,60) = 4.14, p = 0.05), more arrogant (F (1,60) = 5.47, p = 0.02), smarter (F (1,60) = 4.49, p = 0.04), better educated (F (1,60) = 5.02, p = 0.03), and as having better English (F (1,60) = 12.90, p < 0.01) than Southern-accented speakers, regardless of passage type.

What I find most interesting about this is that so many other prejudices about groups of people — or at least negative prejudices — have an air of social unacceptability. But making fun of Southerners as dumb hicks seems to be fair game.

My best hypothesis (unsupported by any data I can find) is that prejudice against Southerners remains acceptable because unlike many other group stereotypes, it’s not tied to any particular racial or ethnic group. It’s taboo today to make fun of someone for their race or ethnicity, which makes stereotypes about people from diverse Northern cities a minefield. (These stereotypes, which do exist, are often good-natured or embraced by their subjects, not imposed by outsiders.) The same would potentially apply to other regions that also aren’t dominated by a single racial or ethnic group, though none come immediately to mind.

As with any analysis of stereotypes, it’s important to be cognizant of confounding factors — perhaps a region is seen as less intelligent because the education system there is poorer? But even if a stereotype were on average true of a group, that wouldn’t justify treating individual members of that group as if they conform to that stereotype.

Have other people experienced similar judgements based on region or regional accent? Why do you think these are acceptable when so many other prejudices are not in contemporary America?

Written by David Montgomery

January 16th, 2014 at 1:50 pm

Posted in Culture

Interesting words, part 3


A continuing series.

(TV) used to be the sort of thing that you watched casually week to week; you weren’t supposed to get deeply invested in the emotional lives of the characters, and the shows were designed to keep that involvement to a bare minimum. You were drawn by the actors’ charisma or good looks, but you weren’t supposed to worry about their inner lives, which were mostly nonexistent. It was the fans who read deeper meanings into the shows, and through fan fiction and essays they provided the emotional resonances that the TV shows were not intended to evoke. Doctor Who is a great example of a show that went full circle through the cycle of fandom; many of the writers and showrunners, as well as the actors, were great fans of the program when they were kids, and many of them worked on semi-official tie-in novels or radio plays while it was in hibernation. By reviving the program, they effectively recreated it in their fannish image; the characters are now capable of expressing the thoughts and emotions that could only be inferred in the original version.

From “Doctor Who: Old Vs. New,” by “Lightninglouie,” at io9’s “Observation Deck”.

The just-aired Christmas special, by the way, was merely okay — some very good elements, and lots of flaws, some in the episode itself and others planted by failures earlier in the series.

But the Doctor Who 50th anniversary special last month was among the show’s best episodes.

Written by David Montgomery

December 26th, 2013 at 2:10 pm

Posted in Quotes, Television

Hot or not


Several weeks ago, while discussing the oncoming winter with my Southern-raised girlfriend, we reached an impasse over what exactly constituted weather cold enough to get alarmed about. Coming from Louisiana, she insisted that anything even in the 40s Fahrenheit was frigid, weather to cause people to stay indoors, bundled up in front of the fireplace. I, having grown up in bitter Chicago winters, said you can’t start calling weather “cold” until it at least falls into the 30s — and that even then, extreme cold doesn’t start until the thermometer falls to the single digits.

But clearly our perspectives were entirely subjective. The only way out of this situation, for any good rationally minded person, is to get more data.

So I went to my Facebook page and posted the following query:

Above what temperature would you generally consider the weather to be “hot,” as opposed to merely “warm”? Below what temperature would you generally consider the weather to be “cold,” as opposed to merely “cool”? (For context, please also provide the part of the country/world you grew up in.)

Twenty different people responded: nine men (counting me) and 11 women. Here’s what I found:

  • The warmest temperature anyone considered cold was 62, though that may be an outlier — that respondent gave a range of only 11 degrees between cold and hot, much less than the average. Next up was 55 degrees, from a southerner.
  • The coldest temperature that anyone considers not cold was a mere 11 degrees, from someone raised on a farm on the central South Dakota prairie.
  • The coldest temperature anyone considered hot, aside from that same outlier (who said 73), was 85 degrees, while the highest threshold for the onset of true heat was 97.
  • One person commented, “I think that you’ll find that the survey results will show that women get colder at a much warmer temperature than men.” And, in fact, he was right. The median female respondent said coldness began at 45 degrees, while the median male said coldness didn’t begin until 32 degrees. (Means told a similar story.) This wasn’t a function of a sample including a lot of females from warmer climes — the median latitude was about the same for both genders.
  • But there was no difference when it came to when hotness began. Both men and women had a median hotness temperature of 90 degrees.
  • Indeed, there was remarkable agreement about what constitutes heat. Setting aside the outlier, the range of hotness answers varied by only 12 degrees. The range of coldness answers varied by 45 degrees.
  • Where people grew up, unsurprisingly, mattered. Using a little bit of judgement for people who had moved around (I defaulted to the town people listed as their hometown on their Facebook page), I plotted a latitude for each person. The southern half of the latitudes (a dividing line right through the southern part of the Chicago area) said cold began at a median of 42.5 degrees. The northern half said 33.5 degrees. (There was only a 2.5 degree difference on heat — the southern half said 92.5, while the northern half said 90.) A rough code sketch of this north-south split appears after the charts below.
  • The key difference, as shown on the below chart, was that while some northerners can’t stand the cold, no one from the south (minus one person who split his time as a kid between Indonesia and Alaska — he’s plotted as Indonesia and is a clear outlier, but clearly is someone who experienced both extremes) could. (Note that this is actually a chart of the absolute value of latitude, because the southern hemisphere latitude of Jakarta looked weird, and distance from the equator is the more important factor.)

[Chart: coldness thresholds plotted against respondents’ distance from the equator]

  • The heat differences, again, are less dramatic:

[Chart: heat thresholds plotted against respondents’ distance from the equator]
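Here’s a rough sketch of the kind of north-south grouping described in the bullets above, with a handful of invented responses standing in for the actual Facebook replies:

```python
from statistics import median

# Invented responses: (hometown latitude in degrees, cold threshold in deg F).
responses = [
    (44.9, 30), (41.8, 33), (46.7, 11),   # more northerly hometowns
    (32.5, 45), (29.9, 55), (35.1, 42),   # more southerly hometowns
]

# Split at the median distance from the equator, then compare the
# median cold thresholds of the two halves.
cutoff = median(abs(lat) for lat, _ in responses)
north = [cold for lat, cold in responses if abs(lat) >= cutoff]
south = [cold for lat, cold in responses if abs(lat) < cutoff]

print(f"Northern half's median cold threshold: {median(north)} degrees")
print(f"Southern half's median cold threshold: {median(south)} degrees")
```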

This study didn’t actually end up proving anything or resolving my debate with my girlfriend. (For one thing, I’d prefer to have a sample size of several thousand points, not just a score.) But I had fun doing it, which is really the point of [social] science.

Interestingly, in our conversations, my girlfriend and I have agreed that the extremes aren’t actually where people disagree. That is, when it’s 102, everyone agrees it’s really hot, even if some people are more bothered by it than others. The same when the weather hits single digits — everyone agrees it’s really, uncomfortably cold. The conflicts arise in the middle ground — whether it’s warm enough to open the windows, or cold enough to require a comforter on top of bed sheets.

Written by David Montgomery

December 9th, 2013 at 11:00 am

Posted in Miscellany

Real road-tripping: Southern swing


Making small-talk at a friend’s wedding in Waco, Texas, after talking about my life in South Dakota, I was more than once asked the same question: “So when did your flight come in?”

It didn’t, I’d reply. I drove the 950 miles down to Texas. And things were just getting started.

Every few years I like to hop in the car and put some mileage on it, seeing as many places as possible on a moderately circuitous route between home and some distant point. The road trip is, for those with more time than money (but a decent amount of each), the ideal way to travel. Flying is good for seeing a single destination, but driving lets you see things all along the way, too.

So four years ago a friend and I drove to Arizona in March, seeing a half-dozen Spring Training baseball games along with the Grand Canyon and various sites in between. Two years ago I went solo, visiting a friend in Denver, a volcano in New Mexico, a canyon in West Texas and relatives in San Antonio. Last month I retraced some of that — nearly 750 miles were duplicated, the north-south swing from South Dakota to Texas. But after attending the central Texas wedding that was the primary purpose of the trip, I veered off into new territory.

Also new this year: I wasn’t alone. When spending the better part of two weeks driving, it helps to have someone to share the wheel with. Fortunately, coming along with me for most of the ride was my girlfriend, Allison, my partner-in-banter for hours of driving, my guest at the wedding and my host for a surprise visit to her family’s home in northern Louisiana:

Allison and I in front of her library in West Monroe, La.

But that doesn’t come until a bit later.

Day 1: Sioux Falls, SD, to Omaha, NE (186 miles, 3 hours)

I started out with an evening drive after getting off work down to Omaha, where my college roommate Ian (and 2009 road-trip companion) put me up for a night. I crashed on his futon after getting roundly schooled in a series of Mario Kart heats.

Day 2: Omaha to Oklahoma City, OK (456 miles, 7 hours)

After a greasy-spoon diner breakfast in Omaha, the trip began in earnest. The morning drive to Kansas City was uneventful except very near the end, when I came over a hill to find that a truck had apparently just flipped around a too-tight curve (speed limit down to 55 mph from 70!). Traffic was at a standstill, just starting to pile up. Given where I came to a halt, I decided to engage in a bit of judicious lawbreaking and drove the wrong way up an onramp to detour around the blockage.

Lunch involved Kansas City BBQ, which was pretty good even though barbecue is not my favorite cuisine. Then it was off for the long, flat, quiet drive across Kansas and Oklahoma to OKC.

Day 3: Oklahoma City to Waco, TX (288 miles, 5 hours)

The actual drive to Waco, the site of my friend Abby’s wedding, is supposed to be around four hours. But that best-case scenario doesn’t reckon with Dallas traffic and road construction, a giant headache even without driving through the center of the city. (It was even worse northbound.) Still, we got there in plenty of time for the opening festivities of the wedding weekend: the rehearsal dinner at a ranch outside of the city.

The Oklahoma City National Memorial.

All that came after an unplanned visit to the Oklahoma City bombing memorial, which happened to be across the street from where I parked for breakfast. It was impressive and somber, and I’m glad I stopped.

I post that photo here mostly because somehow, I didn’t take any photos at the rehearsal dinner! You can view some pictures from the wedding photographer here, though neither Allison nor I are in any of them.

Anyway, it was a fun time in pleasant surroundings. I met a bunch of college classmates again for the first time in six years, played soccer out on the lawn with Abby’s new stepson, and got just inebriated enough to make a toast. (It was short and non-embarrassing, I think.) There were buses back to the hotel, though not buses with good ventilation, and I surprised everyone by managing to fall asleep in the uncomfortable heat.

Day 4: Wedding in Waco

Again, my inner photographer appears to have not yet showed up for this vacation, for which I apologize. Most of the day was pretty quiet — sleeping in, seeing a bit of Waco, a trip to the drug store to get cough medicine for Allison’s unfortunately timed respiratory infection, then getting ready for the wedding. You’ll have to take my word for it that we looked snazzy, because you know.

The newly married Abby and Sam dance the Hora, or traditional Jewish wedding chair dance.

The ceremony itself was Jewish, a mixture of traditional rituals and modern sensibilities, helpfully narrated in both English and Hebrew by the rabbi. (Unfortunately the prime seats I snagged ended up being not so prime when it turned out the wedding party stood in between us and the couple.) The dinner was very fancy, and tasty, though the careful seating arrangements were a bit blunted by the sheer volume in the room that made conversation difficult with anyone but one’s immediate neighbors.

To the certain shock of anyone who has ever known me, I even danced a little bit once the music came on. I’m sure I’d have never ventured onto the floor without a better-coordinated girlfriend present, but I mostly didn’t regret it. (Note to acquaintances I will run into at future social gatherings: this is not a precedent.)

Day 5: Waco to Todd Mission, TX, to Huntsville, TX (179 miles, 3 hours)

It was the middle of November, but southern Texas didn’t get the memo. (Back home, Sioux Falls did, with the mercury falling to around 0 during the week I was away.) This Sunday was easily in the 80s and sunny, which was good, because we weren’t trapped in the car. Instead, Allison and I met my aunt, uncle and cousin north of Houston for a visit to the Texas Renaissance Festival.

Allison, Bob, Thomas and Kelly in the stands for a joust at the Texas Renaissance Festival.

Renaissance festivals, or at least this particular one, are basically a cross between state fairs and Comic-Con. There’s a lot of overpriced food and merchandise, except everything has a vaguely medieval/fantasy veneer to it — along with a healthy dose of science fiction, steampunk, and things with only the most tenuous connection to genre fiction at all.

They’re also pretty fun, and this time, I remembered to break out the camera.

Knife-juggling.

Birds of prey.

Cyrano.

Weapons demonstrations

Elephant ride!

Allison, myself, Bob and Thomas

It was definitely good to get out of the car, into the sun, and see some people wandering around with outlandish costumes and outlandishly fake accents for a day. We ate unhealthy food, drank overpriced alcohol (mead!), saw performances of carillons, bagpipes and insults, and marveled at all the merchandise that would have been all too tempting had we merely been millionaires. And did I mention I rode an elephant? (It’s not particularly medieval, but why pass it up? Sure, it was $8 for about 60 seconds, but if you’re not prepared to waste some money, don’t go to the fair.) Had I gone to the fair when I was a fantasy-loving 12-year-old, I might have exploded with excitement. My slightly more sober adult self merely enjoyed himself a great deal.

On top of all of this, there were good times with friends and family:

 

Day 6: Huntsville, TX, to West Monroe, LA (349 miles, five hours)

After the excitement and cacophony of the fair, Monday was much quieter — but in its own way, more intense. We drove through east Texas and Louisiana, bound for West Monroe — home of Allison’s parents, who I was about to meet for the first time. Oh, and they didn’t know we were coming.

Gulp.

But everything ended up going pretty well. We announced our surprise visit via phone early in the day’s drive, so everyone was ready when we pulled up at her family’s home, which they had fully remodeled a decade ago. Everyone was very polite as we chatted, then went out to dinner for some traditional Louisiana food. (I felt bad I couldn’t finish, though it was probably my own fault for eating appetizers.) The all-important meeting went off without any issue, something I suspect was a big relief to all parties involved.

Day 7: West Monroe, LA, to Memphis, TN (256 miles, five hours)

On the way out of town, Allison and I stopped by the only reason most people have heard of West Monroe — the home base of the Duck Commander duck calls made by the Robertson family, stars of the smash reality show Duck Dynasty.

I tried to muster the appropriate level of excitement.

The trip involved five hours of driving, but it ended up taking a little bit longer because I found something much more interesting: the Vicksburg National Military Park, memorializing Ulysses S. Grant’s siege of Vicksburg that helped sever the South in two and ensure Union victory in the Civil War.

The actual park was somewhat overgrown with trees, which took away from some of the splendor. But the rolling terrain was suitably impressive:

 


That’s a curved panorama of one of the Confederate redoubts during the siege, which was stormed unsuccessfully by Union troops up a narrow trench during one of Grant’s failed attempts to seize the city by force. He finally settled down and starved the rebels out.

The most notable thing about the Vicksburg battlefield is the monuments dotting it, built by the siege’s veterans decades after the war. For whatever reason, the Illinois contingent paid for by far the grandest memorial, a huge, echoey pavilion. (Or perhaps it’s no surprise, given that Illinois was home to both Grant and President Abraham Lincoln.) Another memorial was a huge spire so tall I had to use panorama mode on my phone to get it all in one shot. Unfortunately, my unsteady hands meant the ramrod-straight spire appeared to go all wobbly halfway up, so I’ll withhold the photo. Those same veterans who erected the memorials also put up signs around the battlefield. The plaques, red for Confederates and blue for Union, are laid out along each side’s lines and contain information about the units who fought there and descriptions of notable actions.

Also impressive were the remains of a Union gunship that was sunk in the Yazoo River during the Vicksburg campaign and re-floated and restored a century later. You could walk inside the ruins of the U.S.S. Cairo and tour a museum explaining its significance and displaying plenty of artifacts from its crew.

Sadly, it turns out that seeing Vicksburg properly requires a lot of time — more time than we had, driving as we were to meet Allison’s cousin and her family for dinner in Memphis. So after two hours, we cut our trip short before getting a chance to explore the heart of the Confederate lines.

No matter, though — an after-dark arrival in Memphis brought a chance to sample that town’s own barbecue options, followed by some quality time with a one-year-old and then much-needed sleep.

Day 8: Memphis to Huntsville, AL (215 miles, 3.5 hours)

Wednesday was a lazy day — sleeping in and not much driving. Before leaving Memphis we went downtown to Beale Street, the heart of the city’s blues culture. Then, with just an hour or two to spare, we went to the city’s Rock N Soul Museum for an abbreviated tour of music history.

It wasn’t the most exciting museum, but it was well done. A complimentary audio tour, included with admission, offered lots of music samples and narration, giving capsule biographies of many stars of ’50s, ’60s and ’70s music and placing the music in the cultural context of the racial tensions of the time. There were also plenty of artifacts, though none that really jumped out at me as I was looking for iconic photos to take.

Well, okay, there was one thing that was sort of attention-grabbing, alongside things like Ike Turner‘s piano: the flamboyant costume of professional wrestler Sputnik Monroe, a Memphis native who was an advocate for desegregation:


We didn’t get a chance to fully take in the museum, since we were on a timer. Eventually we set out across northern Mississippi and Alabama, bound for Huntsville, Alabama (not to be confused with our Day 5 stop in Huntsville, Texas). There, I met for the first time in person a decade-long pen-pal, Cliff:

I’m the guy on the left.

Outside the restaurant, in downtown Huntsville, Allison and I cleverly disposed of the leftover chips we had taken home in a doggie bag a few days earlier and never eaten, though the way the little devils swarmed I was nervous for a second. (I needn’t have been. Allison has a black belt.)


Day 9: Huntsville to Atlanta, GA (220 miles, 3.5 hours)

Huntsville’s primary claim to fame is that it’s home to NASA’s Marshall Space Flight Center. And while we couldn’t go inside, we could tour the U.S. Space and Rocket Center museum, which was top-notch for a pair of overgrown children like Allison and me. Lots of rockets and space vehicles on display (some authentic, others replicas), along with interactive demonstrations and — for some reason — a traveling display about Leonardo da Vinci.

In fact, the museum was so much fun we spent altogether too much time there, and were nearly late for our evening engagement several hours away in Atlanta. But before things came to that, we had fun doing things like Mars-themed rock-climbing:

Allison is the much better climber.

Life-size space shuttle replicas:


Really tall phallic symbols rockets:


Moon rock!


And various other things to pose in front of:


Disappointed we couldn’t linger, we hit the road for Atlanta and our date in the evening: the Atlanta Symphony Orchestra’s concert featuring Shostakovich’s Fifth Symphony.

Though the evening ended up being something of a fiasco as we were under fierce time constraints and at one point running in full dress clothes through downtown Atlanta to get to our restaurant on time, I did enjoy being able to dress up and attend a classy night out:


Day 10: Relaxing in Atlanta and a night at the theater

After the tumult of the night before, Friday was spent at a more leisurely pace. We slept in, and strolled instead of ran through Atlanta to get lunch. That night ended up an unqualified success, as we went to a fine performance of King Lear at Atlanta’s Shakespeare Tavern. Their setup is something of the dinner theater model: the audience is seated at tables, where they can get dinner before the show, and drinks and dessert throughout.

Post-show, despite the late hour and the early day we had ahead of us on Saturday, we couldn’t resist proceeding from the Shakespeare Tavern to the Marlow’s Tavern a mile away for a nightcap.


Day 11: Atlanta to St. Louis, MO (555 miles, 8 hours)

At an ungodly hour of the pre-dawn morning, we stumbled groggily out of our motel for our last drive together of the trip: to the Atlanta airport, where Allison was flying home to Philadelphia. She caught her flight without a hitch, and I found myself faced with about 15 hours to do eight hours of driving. (I sure wasn’t going to drive 15 hours on a half-night’s sleep.) So I set off, circuitously, first visiting Georgia’s Stone Mountain — the first attempt at carving giant men into a mountainside by Gutzon Borglum, who would later move on to a more famous work at Mount Rushmore, South Dakota.


As an artistic work, Stone Mountain is pretty impressive, and more so the closer you get:


Of course, any monument to the Confederacy is always going to be a little uncomfortable to look at. The figures of Robert E. Lee, Stonewall Jackson and Jefferson Davis aside, the informative plaques on display were rather light on the existence of slavery, the institution all three were fighting (to varying degrees) to defend.

Not that Lost Cause revisionism seemed to be at the forefront of anyone’s mind at Stone Mountain these days. In fact, as a tourist I seemed rather out of place. Everyone else at the mountain that morning was busy setting up a winter village attraction that would soon open (though with fake snow, of course, because Georgia):


The view the Union remnants on Horseshoe Ridge would have had as they made their stand.

It was another few hours before my next stop, also Civil War themed: the battlefield of Chickamauga, where in 1863 Braxton Bragg’s Confederate army routed and nearly destroyed the federal Army of the Cumberland — but for the courageous stand of General George Thomas, who rallied fleeing troops on Horseshoe Ridge and held his ground against ferocious attacks until making an orderly withdrawal under cover of darkness. (Thomas got his nickname, “The Rock of Chickamauga,” when a Union officer observing the situation wrote General William Rosecrans that “Thomas is standing like a rock.” That officer was a young Brigadier General James Garfield, the future president.)

My favorite part of the battle, though, was the performance of the “Lightning Brigade,” among the most technologically advanced units on the battlefield. Col. John T. Wilder equipped his men — with their own personal funds, until the Army, embarrassed, paid up — with the new Spencer repeating rifles, which could fire up to 20 rounds per minute, compared to three or four from traditional single-shot weapons. This gave Wilder’s men a huge advantage and let them face off against many times their number. As mounted infantry, they could also move around the battlefield with rapidity but still hold a line. After a blunder left the Union army fleeing in a rout, the Lightning Brigade not only held its own but was preparing to counterattack and turn the advancing Confederate flank until Assistant Secretary of War Charles Dana got in the way and demanded to be immediately escorted to safety.

The battlefield was a driving loop with a cell phone-based audio tour. I was in a bit of a rush but still enjoyed seeing firsthand a battle I had read about just a few months prior at the New York Times’ excellent “Disunion” series, which chronicles the Civil War in real time 150 years to the day after the war’s events.

From Chickamauga I traveled up through Tennessee, Kentucky, and southern Illinois to just west of St. Louis, where, exhausted, I finally stopped for the night.

Day 12: St. Louis to Sioux Falls (611 miles, 8.5 hours)

The final day’s travel was quiet, if tiring — west to Kansas City, then north past Omaha to Sioux Falls. I took a straight shot with no detours, no tourism and no photos. After nearly two weeks on the road, and the last two days with heavy solo driving, I just wanted to be home.

That whole 12-day endeavor meant more than just checking a few more states off my to-visit list. It was also both fun and refreshing, a good break from work. Most of all, it was a great time with a swell lady. Even someone as introverted as I am can admit that some things are just more fun to do with someone else.

All told I put around 3,300 miles on my car through 14 states, five of which I had never visited before:


[Map: “Southern Swing,” my trip route on Roadtrippers]

Written by David Montgomery

December 6th, 2013 at 11:00 am

Posted in Travel

Board game review: “Power Grid”

without comments

The most important thing to know about a new board game is what role chance has in the play. To pick extreme examples, children’s classic “Candy Land” is entirely luck — you can’t be good or bad at Candy Land; you just draw randomly shuffled cards and do what they say. Chess, on the other hand, is pure strategy and no luck — both sides are perfectly balanced and there are no random elements.

Many of the best games have an element of both — a heavy role for strategy, so players’ abilities are tested, but some role for luck as an equalizer, to help keep less-experienced players in the game to the end. Games can have lots of one and a little bit of the other, but generally speaking, I prefer an emphasis on strategy over luck — controlling your own destiny is more interesting to me than depending on the whims of a dice roll.

All that is prelude to saying that German import “Power Grid” (translated from its fantastic original German name, “Funkenschlag”) is my very favorite board game right now. It’s not purely strategic — there’s a deck of partially shuffled cards, for example — but generally speaking, what happens in the game is almost entirely the result of players’ choices. But the brilliance of Power Grid is that getting rid of chance didn’t come at the expense of game balance. To the contrary, several elegant (if intricate) mechanisms subtly penalize players in the lead and boost those trailing. The result is a gripping game where every action (or deliberate inaction!) has consequences, where a low-key initial game builds to a high-pressure finish. If you like games that force you to think, strategize and weigh difficult choices, you’ll love Power Grid.

Power Grid puts the players in the role of competing power magnates, trying to expand their companies to dominate the electrical industry of the United States or Germany (other countries and regions are sold separately). The core of the game involves three different steps repeated each turn:

[Photo: Four of Power Grid’s power plants, showing their base cost, how much fuel each takes and how many cities they can power.]

First, players buy power plants, bidding against each other in an auction. At the start of the game, power plants are relatively inefficient — requiring a lot of fuel to power just a city or two. Consequently, they’re cheaper, costing just a few bucks of the game’s currency to buy — unless other players fix their eyes on the same plant and raise your bids. These bidding wars can drive the cost way up from the opening offer, and sometimes that’s even worth it. There are a few different kinds of power plants, each of which takes a different kind of fuel — coal, oil, garbage or uranium — plus some “green” plants that need no fuel at all.

This matters because of the second step: players buy fuel from the market to power their plants. This is done in turn order, not via auction — but like the auctions, the law of supply and demand is on full display. There’s a limited supply of each resource, and the more players buy a resource, the higher the price gets. Don’t let the talk of economics scare you — this is clearly indicated on the board, not something involving finicky math. So even though coal starts out as cheap and abundant, and uranium is several times more expensive, if all your rivals have coal plants they could soon find it scarce and pricey, while you have cheap uranium all to yourself.

[Photo: Power Grid’s resource track, showing the coal (brown), oil (black), garbage (yellow) and uranium (red) available for purchase and how much each costs. The more of a resource gets purchased, the more each unit is worth.]
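
For the programmatically inclined, here is a minimal Python sketch of how that kind of market behaves. The ResourceMarket class, its prices and its quantities are my own simplification for illustration, not the actual price ladder printed on the Power Grid board; the point is just that the cheap units go first, so heavy demand makes every additional unit cost more.

# A toy model of a Power Grid-style resource market (simplified; not the
# real board layout). Cheapest units sell first, so the more of a resource
# that gets bought, the more the next unit costs.
class ResourceMarket:
    def __init__(self, units_per_price=3, max_price=8):
        # e.g. three units available at $1, three at $2, ... up to $8
        self.supply = {price: units_per_price for price in range(1, max_price + 1)}

    def buy(self, quantity):
        """Buy `quantity` units at the cheapest available prices; return the total cost."""
        cost = 0
        for price in sorted(self.supply):
            while quantity > 0 and self.supply[price] > 0:
                self.supply[price] -= 1
                cost += price
                quantity -= 1
        if quantity > 0:
            raise ValueError("not enough of this resource left on the market")
        return cost

coal = ResourceMarket()
print(coal.buy(4))   # 1+1+1+2 = 5: the fourth unit already costs more
print(coal.buy(4))   # 2+2+3+3 = 10: heavy demand has driven prices up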

Finally, players build infrastructure to cities — so they can sell the electricity they generate to customers and earn money. This costs money for each city, plus extra for the overland connections between cities. The cities of New England are cheap to reach, while the vast expanses west of the Mississippi require a lot more spending. On the other hand, you might not have as much competition there because of the price, letting you keep expanding as others find their reach stymied.

After all that, players burn off their resources to power as many cities as they can. The more cities you power, the more money you get — but the more money it cost you to get there. And you can never, ever rest on your laurels — every other turn at a minimum, if not every turn, you’ll be emptying your wallet on plants, resources and cities. Sometimes doing little or nothing for a turn can be the right move, to avoid overpaying for something or to husband your money for the next turn, when a much better power plant will become available. But the competition is fierce, and laggards will pay the price.

Another plus is that the game doesn’t involve direct conflict between players. There’s no combat or attacks, no destroying other players’ hard work. But unlike some hobby games which can seem more like everyone playing their own solo game at the table, it does involve player interaction — and indeed makes it integral to the game.

The way the game incorporates competition and supply and demand is its most elegant aspect. But key to the game’s success is the system it puts in place to ensure balance. This is done primarily by artificially manipulating turn order, so players who are doing better are the last ones to buy resources and build into cities — meaning they’ll pay more and find their routes blocked. Leaders are also the first ones to bid on power plants, which hurts them, because better power plants tend to become available later in the auctions. (Veteran players talk about “leading from behind” — intentionally keeping your income low to benefit from this system even as you position yourself for a late surge to the front. The fact that veterans can game the mechanic this way is partly a downside, in that it doesn’t help new players as much as you might think.)
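
To make the catch-up mechanism concrete, here is a small, hypothetical Python sketch of the idea. The ranking rule is simplified from the actual rulebook; the point is that a single leader-to-laggard ranking feeds two different orderings, with the leader forced to bid first but to shop and build last.

players = [
    {"name": "Alice", "cities": 7, "biggest_plant": 30},
    {"name": "Bob", "cities": 7, "biggest_plant": 25},
    {"name": "Carol", "cities": 5, "biggest_plant": 18},
]

# Rank players from leader to laggard: most cities connected first, with the
# size of their biggest power plant as a (simplified) tie-breaker.
ranking = sorted(players, key=lambda p: (p["cities"], p["biggest_plant"]), reverse=True)

auction_order = [p["name"] for p in ranking]             # the leader must bid first
shopping_order = [p["name"] for p in reversed(ranking)]  # the leader buys fuel and builds last

print(auction_order)    # ['Alice', 'Bob', 'Carol']
print(shopping_order)   # ['Carol', 'Bob', 'Alice']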

These mechanics, combined with a few others, are also one of the game’s primary downsides: it has a lot of little complicated elements that can be too much for some people — especially if no one at the table has played before to help teach and run things. Calculating the changes in turn order, figuring out how many resources to add each turn and handling all the intricacies of the auctions can seem overwhelming. And many of these rules are artificial, with no grounding in the theme, so they don’t flow intuitively from the rest of the game.

Even setting all that aside, the end game can involve quite a bit of math as players try to stretch their bank accounts for the final push. For me, this is a thrill (though I like to play with a pen and paper so I can jot down the various possibilities as I wait for my turn), but I can see how it would be a chore for people who like more casual games.

Power Grid isn’t for everyone. It’s involved, stressful and moderately complicated. People who like more casual games, or games with more of a random element, probably wouldn’t have fun with Power Grid. But for people who thrive on competition and strategy, it’s nigh perfect.

The game can accommodate anywhere from two to six players, though I’m told it’s best with four to six. (I’ve only ever played it with the larger groups.) Games take about 90 minutes to two hours. It involves both small pieces and math, so it probably isn’t suitable for any but the most precocious children.

You can buy Power Grid at a local hobby store or on Amazon. Alternately, I own it and will gladly play it with you. Apologies in advance for beating you.

Written by David Montgomery

October 22nd, 2013 at 10:30 am

Posted in Games

Emptivity

without comments

A question raised just now at work: if something can be preemptive, why can’t it just be emptive?

It’s a somewhat obscure example of a linguistic phenomenon that pops up periodically. Somewhat more famous, perhaps, is the question of why we can be “overwhelmed” and “underwhelmed” but are never just “whelmed.”

What happens is that a word stops standing on its own. In Old or Middle English, you have a word like “to whelm,” which means “overcome, as with emotions or perceptual stimuli.” That word gets a modifier, like “over.” Then, over the centuries, people gradually start using the compound form more and stop using the original root, until today “whelm” is basically meaningless without a modifier.

The same thing happened with preemption. “Emption” was a real word in the late 15th Century, a noun meaning “buying.” Emption meant you were buying something; preemption, which appeared about a century later, meant you were buying something before someone else could. Over the years, it was generalized to mean doing anything before somebody else — chiefly delivering some sort of blow or strike. Meanwhile, “emption” fell out of the language.

So if a preemptive attack is an attack made before someone else can attack, an emptive attack would be just an attack, with no reference to relative chronology. In other words, it’d be a pretty meaningless term. In this case, then, there’s a good reason why we don’t use “emption” in the modern sense of “preemption.”

The term for this is a “bound morpheme”: a morpheme (or linguistic element) that doesn’t stand on its own, but appears only when bound to another morpheme. Thanks to Jon Stutte for digging this up!

Can you think of more good examples?

UPDATED EXAMPLES:

  • Another example is disgruntled/gruntled. We no longer say someone is “gruntled,” from “gruntle,” which originally meant “to grumble” or “to grunt.” “Dis” is an intensifier, so someone who is disgruntled grumbles a lot. But only the compound form survived. (Via Larry Kurtz)
  • Couth/uncouth. Our word “uncouth,” meaning “lacking good manners or refinement,” derives from the Old English uncuð and originally meant “unknown,” from cuð, the past participle of cunnan, “to know.” In the 16th Century or so it got its modern meaning. To the degree we say “couth” any more, it’s a back-formation from “uncouth.” (Via Pinedale Roundup.)

This post has been updated with the term “bound morpheme” and one or more examples.

Written by David Montgomery

September 9th, 2013 at 4:23 pm

Posted in Language

Script-doctoring the Star Wars prequels

without comments

Hating on the Star Wars prequels is a favorite pastime among those of a geekier persuasion. They have their moments, but they’re also heavily uneven, tediously paced and largely lacking in resonance. But what if the prequels were good — REALLY good? That’s the question asked by the filmmaker behind Belated Media, whose name does not appear to be anywhere on his YouTube, Facebook or Tumblr pages. This nameless video auteur proceeds to answer his own question by sketching out changes to the prequel scripts that actually seem like they’d produce pretty awesome movies.

He’s done episodes 1 and 2 so far, and hopefully won’t wait another year to finish a third installment. Among the changes: turning Darth Maul into a recurring villain, giving Obi Wan Kenobi an arc involving his need for revenge for Qui Gon’s death, introducing Anakin as a teenager instead of a child, drastically simplifying the plotlines, setting much of the action on Alderaan (so its later destruction means more to us), introducing Bail Organa earlier, focusing on the relationship between Anakin and Obi Wan instead of Anakin and Padme, and removing Yoda’s (admittedly cool) lightsaber fighting.

And Jar Jar’s gone, too, but I hope that was assumed.

The two videos are a bit long but fun to watch. If you like Star Wars, give them a look.

Episode I:

Episode II:

See also: the even longer but equally entertaining takedowns of the prequels by Red Letter Media (I, II and III), and the proper order in which the six current Star Wars movies should be watched with the versions of the prequels we’re stuck with.

Written by David Montgomery

August 15th, 2013 at 11:00 am

Posted in Film

When inefficiency is praised

with 5 comments

When most people talk about “efficiency,” they talk about it in one of two ways. For some people, it’s an unabashed good thing, a goal in and of itself. For others, it may be a good thing, but one often invoked as an excuse for bad things — firing employees, replacing traditional tasks with soulless machines, and the like.

But I’ve been struck lately by several cases where people have defended inefficiency as a virtue. It’s not just that inefficiency has side benefits, but that the direct impact of the lack of efficiency is itself praiseworthy. Usually this happens where the activity in question is frowned upon — but perhaps not so much as to warrant an outright ban.

For example, take alcohol laws in the United States. Under the “three-tier” system, the alcohol industry is divided into producers (breweries and distilleries), wholesalers and retailers. The producers, in other words, not only can’t sell directly to the public, but they can’t sell directly to stores that sell to the public, either.

This is doubtless inefficient. It “adds to the price of the drink at every step,” it “produces patently ludicrous scenarios,” it harms small producers who are unable to find distribution, and it has “fleeced customers for decades,” according to author Kevin R. Kosar.

It’s also exactly the point.

“By deliberately hindering economies of scale and protecting middlemen in the booze business, America’s system of regulation was designed to be willfully inefficient, thereby making the cost of producing, distributing, and retailing alcohol higher than it would otherwise be and checking the political power of the industry,” writes journalist Tim Heffernan.

The inefficiency is deliberate, Heffernan writes, precisely because of the pernicious qualities of alcohol when abused. An outright ban didn’t work, so Americans settled for making the system inefficient, raising the costs and putting an artificial brake on something seen as undesirable.

In contrast, Britain has none of these restrictions — and because of that, Heffernan argues, a much worse national drinking problem.

Whether you agree with Kosar or Heffernan, it’s a striking argument to consider.

It’s the same case with a current issue of the day: drones.

Many people are very upset with the U.S. government for using drones to spy on people and to kill people. And yet, something about that seems odd. Why are people so upset that DRONES are doing the spying or the killing? Some human is the one giving the orders or operating the controls. Why aren’t people complaining about the killing or the spying instead of the method by which that killing or spying is being done?

The key difference between using a drone to fire a missile at a terrorist compound and using, say, a manned aircraft is that the drone is more efficient. You can conduct the operation without putting an expensive human being at risk, and you don’t need to get those human beings to the site in question; they can operate the unmanned vehicle from across the world. By using drones, the costs of killing or spying go way down, thus allowing more of it. (Plus, decades of techno-phobic science fiction have made people a little leery of drones, which doesn’t hurt the public case, but among the people who are really worked up about the subject, efficiency is the real issue.)

Some people might want to ban the government from killing people or spying on people; others just want to limit it. But both can agree that making the frowned-upon activity more difficult, by circumscribing the latest efficient technology, is a good first step.

Are there other examples where inefficiency is praised as a self-evident good?

Written by David Montgomery

August 8th, 2013 at 11:00 am

Posted in Culture

The paradox of red state Democratic success

without comments

(This post has been adapted from work previously done for the Argus Leader and its Political Smokeout blog by David Montgomery.)

Many Democrats in Republican-dominated states like South Dakota derive what little political pleasure they have from the party’s national victories. While electing a Democratic governor in South Dakota can seem nigh-impossible at times, Democrats like Barack Obama and Bill Clinton regularly take control of the national reins of power thanks to more liberal voters in other states.

But in South Dakota, at least, history suggests Democratic national victories are actually devastating for the party’s hopes of winning local elections.

In election results dating back more than 60 years, Democrats have held around 10 extra seats in the South Dakota Legislature, on average, in years when a Republican occupies the White House, compared to when a Democrat is running things in Washington, D.C.

In those years when a Republican is president, moreover, Democrats have tended to gain an average of around three or four seats per election.

But when a Democrat such as Lyndon Johnson, Jimmy Carter or Clinton has been president, South Dakota’s Democrats lose an average of four seats in each election.
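
For the curious, the arithmetic behind those averages is simple enough to sketch in a few lines of Python. This assumes a spreadsheet like the one linked at the bottom of this post, with hypothetical column names for the election year, the number of Democratic seats won and the party of the sitting president at the time of the vote.

import pandas as pd

# Hypothetical file and column names; the real data is linked at the end of this post.
df = pd.read_csv("sd_legislature_seats.csv")   # columns assumed: year, dem_seats, pres_party
df = df.sort_values("year")

# Seats gained or lost compared with the previous election
df["seat_change"] = df["dem_seats"].diff()

# Average Democratic seats held, and average gain or loss, by the president's party
print(df.groupby("pres_party")["dem_seats"].mean())
print(df.groupby("pres_party")["seat_change"].mean())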

This trend is striking when viewed graphically. The following chart shows the percentage of seats in the South Dakota Legislature occupied by Democrats, and is colored by the party affiliation of the president in the year of each election. (So 1976 is red even though Jimmy Carter won that year because Gerald Ford was president at the time of the vote.)

Democratic control of the SD Legislature

The pattern is unmistakable — with a few outliers (notably the 1964 Lyndon Johnson landslide), Democrats steadily gain seats in the GOP-run years and fall backwards otherwise.

These trends don’t prove that Democratic presidents are causing Republican victories in South Dakota, merely that the two tend to happen at the same time.

But Jon Schaff, a political science professor at Northern State University in Aberdeen, said he can see how Democratic presidents might make it tough for local Democrats in conservative states such as South Dakota.

“I could see that when a Democrat is in office, doing things that maybe a majority of the state doesn’t like, the … unpopularity of the national Democratic Party gets attached to local politicians,” Schaff said.

Another explanation for the trend could be the effect of Democratic presidents on Republican voters.

“At times when the national Democratic Party is in ascendancy, Republicans in the state become more partisan,” he said.

The only times Democrats have taken control of one or both houses of the Legislature — the Senate in 1958, 1972, 1974 and 1992, with a deadlocked House in ‘72 — have all been elections with a GOP president. Similarly, the only five times in this period that a Democrat has been elected governor were during Republican presidencies.

That’s not to say there aren’t plenty of other possible explanations. Democratic gains in the 1970s under presidents Richard Nixon and Gerald Ford could reflect the national anti-GOP backlash after the Watergate scandal. Their gains in the 1980s under President Ronald Reagan might be connected to Reagan’s unpopular farm policy rather than anything more fundamental about his party affiliation.

One political watcher, former South Dakota Republican Party chairman Joel Rosenthal, said the effect could simply be a coincidence. The real factor, he suggested, was that some Republican governors have been more vigorous than others at energizing and organizing the state GOP.

“It could have a lot more to do with Bill Janklow being on the ticket or being governor, or Dennis Daugaard,” Rosenthal said, pointing to two governors who have served at times when Republicans did well in the Legislature.

Only three governors were in office for both Democratic and Republican presidential administrations: Janklow (under Carter, Reagan, Clinton and Bush II), Mike Rounds (under Bush II and Obama), and Sigurd Anderson (under Harry Truman and Dwight Eisenhower). The sample size there is too small to draw any strong conclusions, but Janklow under Democrats saw Democrats lose an average of just under three seats per election, while Janklow under Republicans saw Democrats gain an average of a quarter seat. Rounds’ one election with a Democratic president was a Republican landslide in which Democrats lost more than 13 percent of their seats, while his elections under Bush saw Democrats pick up an average of several seats each cycle. Anderson’s election under Truman saw the Democrats nearly wiped out in the state, while once Eisenhower took office they rebounded vigorously.

Shortly before the 2012 election, the then-chairman of the South Dakota Democratic Party dismissed the correlation.

“I think there’s probably a number of factors outside who’s in the White House at play in those elections,” said Ben Nesselhuf, chairman of the South Dakota Democratic Party until July 2013. “By and large, South Dakotans have a pretty good understanding of what their Legislature’s about and who they’re sending there, and the difference between the national parties and the state parties.”

But after the voters in November 2012 returned the same number of Democratic lawmakers despite a vigorous SDDP campaign, Nesselhuf privately pointed to this analysis as a reason why Democrats didn’t regain some of the seats they had lost in the 2010 Republican landslide. Some commentators, including this writer, had expected Democrats to gain back some seats under the evocatively named theory of the “dead cat bounce” — a party that suffers a landslide defeat will lose seats they normally win, and thus will be well-positioned to retake them once the landslide is over. That didn’t happen, and Nesselhuf suggested it was because South Dakota Democrats were struggling against the massive unpopularity in South Dakota of a Democratic president.

Rep. Bernie Hunhoff, the Democratic leader in the state House of Representatives, said Democrats tend to get “none of the benefits” of having a Democratic president, since the national party rarely invests significant resources in South Dakota.

“Both parties write off the state,” Hunhoff said. “And yet we get whatever negatives there might be, because everyone likes to blame the party in charge. That was certainly true under Clinton and Obama.”

View the data used in this analysis here.

Written by David Montgomery

July 15th, 2013 at 10:00 am

Posted in History,My work

My research on Egypt

without comments

I’ve been watching the political tumult in the Middle East with interest since before the “Arab Spring” first broke out in late 2010. After writing a research paper on the political economy of the United Arab Emirates for one college class, I decided to keep my research focused on the Arab world for my next class, on the “Diffusion of Democracy,” in the spring of 2008. I wrote a case study of three different Arab countries in different situations — oil-rich monarchy Kuwait, impoverished monarchy Jordan, and massive then-dictatorship Egypt.

While Jordan and Kuwait have seen little political change since then, Egypt has been in near-continuous uproar. And I’m still proud that much of my analysis proved prescient. The full essay is here or embedded below, but here are some choice excerpts.

By way of quick background, I analyzed the countries from two perspectives: structural and process. Structural analysis looks at what socioeconomic characteristics correlate with democracy, then compares undemocratic countries against them to predict which are likely to become democratic. I used two primary datasets. One was assembled by Ronald Inglehart and Christian Welzel, who took the massive World Values Survey and placed every country on two axes based on its citizens’ responses — one between “traditional” and “secular/rational” values associated with industrialization, and one between “survival” and “self-expression” values associated with the rise of the consumer-based economy. Inglehart finds that self-expression values correlate well with support for democracy. Here’s a fascinating map showing how various countries fall; you can see right away that it is not optimistic about the Muslim world:

World Values Survey culture map

Secondly, the Egyptian political scientist Moataz Fattah conducted a survey of literate Arabs throughout the Muslim world to try to gauge support for democracy. (The literate-only dataset is a significant limitation that probably serves to overestimate democratic values.) Fattah divided the Islamic population into four general camps — “traditionalist Islamists” who reject democracy as contrary to Islam, “modernist Islamists” who want a democracy compatible with Islam, “autocratic statists” who are secular supporters of dictatorship, and “liberal pluralists,” secular supporters of democracy. In general, where the modernists and the pluralists outnumber the traditionalists and the statists, support for democracy is strong.

But while structural analysis has a pretty good track record of predicting democratization in the long run, it doesn’t have much to say about when those transitions are likely to occur. That’s the emphasis of a process-oriented approach, which takes a qualitative look at the structure of a regime and how it is likely to bend or break under popular democratic pressure. This approach also looks at which groups in society might be able to exert effective pressure to demand democratic reforms.

In Inglehart’s model, Egypt scored a -1.57 on the traditional-rational axis and a -0.4 on the survival-expression axis. That’s not good — only a handful of countries that far to the left on the survival-expression axis are democratic, and none are paragons. Still, Inglehart did find a substantial constituency for democracy in the Egyptian survey — 30 percent of respondents had self-expression values, comparable to Venezuela, Peru, South Korea and Portugal. Egypt’s Freedom House scores (under the Mubarak regime) were actually less free than Inglehart’s model would predict. My conclusions from this model:

This suggests that Egypt is due for at least some formal democratization over the long term because its government is out of tune with the beliefs of its people. It would not take very much expansion of self-expression values to lead to a major change in this modernization model: “any society in which more than half the population emphasizes self-expression values scores at least 90 percent of the maximum score on liberal democracy.” On Inglehart and Welzel’s measure of “effective” democracy (this is to say, elite corruption), Egypt is a poor performer, ranking among the most corrupt countries in the world. Even if Egypt were to acquire formal democratic institutions, Inglehart and Welzel do not predict any rapid improvements in effective democracy. (Emphasis added)

In light of what happened three years after this paper, I think this paragraph holds up very well.

Fattah’s model suggested “Egyptians have among the strongest preferences for democratic institutions in the Muslim world (higher even than Muslims living in the United States) and also show high support for ‘democratic values.’” Only three percent of literate Egyptians were “traditionalists” (undoubtedly the actual figure is higher) and seven percent statists, compared to a huge 63 percent modernist Islamists and 27 percent secular pluralists. In contrast, the rates of traditionalists in other Arab countries were 26 percent in Syria, 25 percent in Algeria and 46 percent in Saudi Arabia. But the dominance of modernist Islamists over secular pluralists in the Egyptian literate population — only about 70 percent of Egyptians can read, so the pluralists are surely even more outnumbered than this suggests — doesn’t bode well for people hoping for a Western-style liberal democracy there.

From a transition studies approach, I suggested the Egyptian military might not prove loyal to Mubarak if push came to shove. Though Mubarak had lavished resources on the military, he hadn’t taken certain steps to ensure its loyalty (such as that taken by Kuwait, stocking the military’s upper echelons with family members of the leader):

On the other hand, where some countries have bound the military to the regime through patrimonialism—placing relatives and key supporters who have a stake in the regime’s survival in key posts—Egypt’s military is “highly institutionalized” and capable of acting independently. Herb and others agree with Bellin that “where the coercive apparatus is institutionalized, the security elite have a sense of corporate identity separate from the state. They have a distinct mission and identity and career path. Officers can imagine separation from the state.” O’Donnell and Schmitter find that where the military is professionalized and independent, “the only route to political democracy is a pacific and negotiated one.” It is conceivable (despite the close ties between regime and army) that the military could switch to the opposition or remain neutral during a transition.

In fact, when millions of protesters packed Tahrir Square and other Egyptian streets, the military refused Mubarak’s orders to crack down and ultimately removed him from power. Mohammad Morsi, Mubarak’s democratically elected successor, proved no better at bringing the army to heel and suffered the same fate.

Of course, in order for the military to respond to democratic pressures, there had to be those pressures. I examined two key sectors of Egyptian society that might prove capable of organizing politically to oppose Mubarak (or run a government after his ouster): the secular political parties, and the Muslim Brotherhood.

The former are “largely a sorry bunch” who had been intentionally emasculated by Mubarak:

Saad Eddin Ibrahim, a liberal Egyptian academic and dissident, notes with dismay (in a chapter written from a prison where he was serving a seven-year sentence for opposition activities) that “Egypt’s [secular] democracy advocates are the weakest of the three salient actors at present,” along with the regime and Islamists, because instead of “viewing them as an ally against extremism, the state has repeatedly repressed democracy advocates.” Unless things change drastically, the liberal opposition looks to be an ineffective advocate for democracy.

The Muslim Brotherhood, though banned under Mubarak, were nonetheless popular and organized. Even while officially illegal, members managed to dominate aspects of Egyptian society, often by acting as nominal independents. But the “biggest question with the Brotherhood is not whether they are strong enough to be a credible opposition organization—if they are not, then no one is, and all indications are that they are—but what kind of opposition group they would be.”

On the one hand, the Brotherhood has, since renouncing violence, consistently endorsed democratic principles. Brotherhood rhetoric uses terms like “democracy,” “liberty” and “freedom” “freely and repeatedly,” and the Brotherhood “consistently dismiss the argument that Islam and democracy are incompatible.”

On the other hand, the Brotherhood’s talk of democracy can sound suspicious to secular liberals. Brotherhood democracy is Islamic democracy based on sharia law. “Western critics,” notes Sana Abed-Kotob, “are fearful that the Brethren are using elections as a tactic to gain power and subsequently do away with the democracy that gave them their voice.” … Even granting the Brotherhood the best of intentions, a Brotherhood-led democracy will probably contain many objectionable elements to secular liberals. But for democracy advocates in Egypt, firm military support for the regime means that the Brotherhood is the only effective opposition group. On the Brotherhood’s good intentions may ride the prospects for Egyptian democracy. (Emphasis added)

As it turned out, the Brotherhood was not intimately involved in Mubarak’s overthrow, and neither were the secular political parties. His downfall came instead from a massive, unorganized popular uprising that I did not expect, spurred by an international popular movement. But when examining what came next, as various interest groups tried to organize to control Egyptian democracy, the above paragraphs are still useful. The Muslim Brotherhood, once it made the decision to contest elections, was clearly the dominant group, winning both a large majority in the parliament and the presidency. And Morsi’s rule did indeed “contain many objectionable elements to secular liberals,” which, combined with his mismanagement of the economy and the disloyalty of the military, contributed to his downfall.

Even if Egypt’s liberals are able to organize effectively, they’ll still be outnumbered. Democratic institutions in Egypt are likely to return Islamist governments. But Egypt does have a solid core of around a third of the population who support democratic values, which is non-negligible and could shape the country’s political future for years to come.

Written by David Montgomery

July 7th, 2013 at 12:38 pm

Posted in Culture,My work