November 30, 2013

Declining meritocracy in an age of greater status striving

There's a discussion at Steve Sailer's about the rise and fall of meritocratic tests in hiring, riffing on this article in The Atlantic. Test-based hiring was rare around the turn of the 20th C., peaked at mid-century, and fell out of favor sometime during the '70s. Most of the article is about how Big Data may help revive and re-vamp test-based hiring, but for now it's more of a nerd obsession than a widespread business practice.

By the way, what was the prevailing way of hiring around the peak of intra-elite competition and inequality, in the early 1900s? For workers, it was ruthlessness in a melee:

Near the turn of the 20th century, one manufacturer in Philadelphia made hiring decisions by having its foremen stand in front of the factory and toss apples into the surrounding scrum of job-seekers. Those quick enough to catch the apples and strong enough to keep them were put to work.

I'm sure they would have made great Black Friday shoppers. And for managers?

In those same times, a different (and less bloody) Darwinian process governed the selection of executives. Whole industries were being consolidated by rising giants like U.S. Steel, DuPont, and GM. Weak competitors were simply steamrolled, but the stronger ones were bought up, and their founders typically were offered high-level jobs within the behemoth. The approach worked pretty well. As Peter Cappelli, a professor at the Wharton School, has written, “Nothing in the science of prediction and selection beats observing actual performance in an equivalent role.”

Buy-outs by behemoths, where the former leaders are absorbed into lesser roles within the newer, bigger trust -- sound familiar? If people's thirst for ever greater status and wealth is insatiable, this Borg-like assimilation is inevitable.

But resistance was not futile. During the 1920s, the elites agreed to rein in their pyramid-climbing, for the stability of the society, after the explosive climate of World War I and its aftermath. Not long after, inequality began steadily falling. What new norms were adopted by the middle of the century?

By the 1950s, it was not unusual for companies to spend days with young applicants for professional jobs, conducting a battery of tests, all with an eye toward corner-office potential. “P&G picks its executive crop right out of college,” BusinessWeek noted in 1950, in the unmistakable patter of an age besotted with technocratic possibility. IQ tests, math tests, vocabulary tests, professional-aptitude tests, vocational-interest questionnaires, Rorschach tests, a host of other personality assessments, and even medical exams (who, after all, would want to hire a man who might die before the company’s investment in him was fully realized?)—all were used regularly by large companies in their quest to make the right hire.

During the Great Compression of roughly 1920 to 1980, ruthlessly jockeying for status was taboo. That belonged to the Gilded Age with its robber barons, courtesans, and other professional strivers. Now you were supposed to be more content with what you had, and not step on someone else's skull just so you could own a second car.

Perhaps the rise of meritocratic testing was a way the elites found of dampening the internecine status-striving that had just blown the country up in the wake of World War I. "Quit your complaining -- the test says you belong in this range, and that's where you go. Don't bother trying to act like a courtier."

Objective tests have a natural ceiling, beyond which no amount of resume-padding, networking, and butt-kissing will alter your destiny. That ceiling contains elite status jockeying.

Here's a reminder of what the Great Compression business culture was like at the executive level, from Fortune magazine in 1955. You get a good feel for mid-century cocooning and isolation, but you also can't help but notice how self-effacing and reining-it-in the elites were compared to the robber barons or our neo-robber barons today.

During the late 1960s for elites, and a little later for everyone else, the restraints of the Great Compression came undone, probably because folks had forgotten, or had never known about to begin with, the soaring inequality and social-political instability of the period running from the Civil War through the Gilded Age and culminating after WWI.

Peter Turchin, whose basic "structural-demographic" framework I'm borrowing here, has a series of posts on the topic of how and when elite competition and over-production of elites began. Straightforward measures like enrollments (per capita) at law schools, business schools, and medical schools, not to mention the higher ed bubble in general, all point to the 1970s as a transition era. By about 1980, the break with the mid-century restraint was complete.

What then became of hiring based on objective tests?

Remarkably, this regime, so widespread in corporate America at mid-century, had almost disappeared by 1990. “I think an HR person from the late 1970s would be stunned to see how casually companies hire now,” Peter Cappelli told me—the days of testing replaced by a handful of ad hoc interviews, with the questions dreamed up on the fly. Many factors explain the change, he said, and then he ticked off a number of them: Increased job-switching has made it less important and less economical for companies to test so thoroughly. A heightened focus on short-term financial results has led to deep cuts in corporate functions that bear fruit only in the long term.

Over the past several decades, companies haven't wanted to invest in testing out potential hires who are so footloose about where they work (see the rest of the article for numerical data, or consult your own experience). All that costly scientific testing wouldn't pay off until it discovered and hooked a good fit for the long term. Instead, new hires always have their eyes peeled for the next job, always on the move to find an angle on reaching one rung higher on the status ladder. Not content with a decent job at a good company.

With the return of unbounded status striving, hiring is more and more driven by "playing the game" factors, as though you were a latter-day courtier:

Perhaps the most widespread bias in hiring today cannot even be detected with the eye. In a recent survey of some 500 hiring managers, undertaken by the Corporate Executive Board, a research firm, 74 percent reported that their most recent hire had a personality “similar to mine.” Lauren Rivera, a sociologist at Northwestern, spent parts of the three years from 2006 to 2008 interviewing professionals from elite investment banks, consultancies, and law firms about how they recruited, interviewed, and evaluated candidates, and concluded that among the most important factors driving their hiring recommendations were—wait for it—shared leisure interests. “The best way I could describe it,” one attorney told her, “is like if you were going on a date. You kind of know when there’s a match.” Asked to choose the most-promising candidates from a sheaf of fake résumés Rivera had prepared, a manager at one particularly buttoned-down investment bank told her, “I’d have to pick Blake and Sarah. With his lacrosse and her squash, they’d really get along [with the people] on the trading floor.” Lacking “reliable predictors of future performance,” Rivera writes, “assessors purposefully used their own experiences as models of merit.” Former college athletes “typically prized participation in varsity sports above all other types of involvement.” People who’d majored in engineering gave engineers a leg up, believing they were better prepared.

There's another reason, well discussed by conservatives, for the decline of objective testing: namely the fact that whites tend to do better than blacks:

The Civil Rights Act of 1964, which exposed companies to legal liability for discriminatory hiring practices, has made HR departments wary of any broadly applied and clearly scored test that might later be shown to be systematically biased. Instead, companies came to favor the more informal qualitative hiring practices that are still largely in place today.

I'm starting to think the whole "disparate impact" thing is a rationalization to do away with merit-based hiring, in an era of greater status striving. The charge of "b-b-but, it's racist against blacks!" is just a shameless smokescreen that the strivers know will sell well with the target audience.

For example, my impression is that other countries without many Non-Asian Minorities share our disdain for objective testing, etc. Disparate impact makes it even harder to do objective testing in America, but I'm convinced that it's a second-order thing. The main source is the discontent of elite strivers who want to be able to dazzle the HR rep with their fifty pounds of filler and connections.

And the military continues to use IQ testing. The usual argument is that their jobs are too important to let diversity worship fuck them all up. But you could say that about all kinds of industries, and in fact the military isn't exactly beating back the Indians, Nazis, or Commies these days. I think it makes better sense to see it as part of the military's wish to keep status-striving from getting too out of control -- that could get violent and destabilizing to the political order. Income inequality within the military must be far less than in the private sector, since the top-ranking generals still don't make as much as hedge fund managers.

Aside from the noxious atmosphere produced by the striving mode of thinking and behaving, it neuters the ability of people to just do their damn jobs well and get paid decently for it. The current regime does not reward competency but rather resume-padding and networking.

Like, isn't it striking how much stuff they invented between 1920 and 1980, on a per-decade basis? The '80s weren't so bad, but then that was just the beginning of the trend toward where we are today, politically and economically. And the last 20-odd years? Jack. All we're better at these days is how to socialize costs and privatize benefits, and how to best dress it up in fashionable ideology -- see the housing bubble, bank bailouts, etc. Neo-con corporate cocksucking mates with liberal diversity worship -- imagine the beautiful offspring they'll create!

At any rate, there's another sign to watch for, to anticipate the unwinding of this spiral of intra-elite competition -- a renewed sincere interest in objective meritocratic testing.

November 27, 2013

Elite concern about broken homes: Changes over time

We've seen the historical patterns for both divorce rates and rates of children growing up in single parent households. How does elite concern for these social ills respond, or not, to their changes over time?

The following graphs all show how common a certain phrase is within Google's digitized library of books (Ngram), using the American English texts up through 2008, and beginning in either 1900 or 1950, depending on when the phrase is about to take off. The texts include both fiction and non-fiction, so divorce could be part of the background in a novel, a social scientific investigation, or a journalistic overview. Whatever the source, they all spring from an elite concern, although some are targeted toward a popular audience, and others toward their fellow elite members.
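
(For anyone who wants to rebuild these curves from data rather than from the Ngram Viewer screen, here's a minimal sketch in Python. It assumes, purely for illustration, that you've already pulled the phrase's per-year match counts and the corpus's per-year totals into two CSV files; the file names and column names below are made up, not the official Ngram export layout.)

    import pandas as pd
    import matplotlib.pyplot as plt

    # Hypothetical inputs: per-year counts for the phrase, and total counts
    # for the American English corpus (names and columns are assumptions).
    phrase = pd.read_csv("broken_homes_counts.csv")      # columns: year, match_count
    totals = pd.read_csv("american_english_totals.csv")  # columns: year, total_count

    df = phrase.merge(totals, on="year")
    df = df[(df["year"] >= 1900) & (df["year"] <= 2008)]

    # Relative frequency, like the y-axis of the Ngram Viewer.
    df["freq"] = df["match_count"] / df["total_count"]

    df.plot(x="year", y="freq", legend=False, title='"broken homes" in American English')
    plt.ylabel("share of all n-grams")
    plt.show()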

The main thing that they show is when the elite of the society actually gave a damn about a huge social problem -- when they felt more like stewards of the less fortunate, and not shoulder-shruggers promoting a "dog eat dog" morality.

First, "divorce":


There's a secular rise in concern, reflecting the secular rise in the problem itself. However, there are clear cycles around that, and they reflect the crime rate (delayed by roughly 5 years). There's a peak in the late '30s (after the 1933 peak in the homicide rate), a low point in the early '60s (after the 1958 minimum in homicide), and another peak around '93 or '94 (after the 1992 peak in homicide), falling since (along with homicide).

As I've detailed for several years now, a rising-crime atmosphere makes people notice that things seem to be going wrong in the world, that the powers that be are too incompetent, impotent, or corrupt to fix things, and hence that we have to work together to help ourselves through this mess. A falling-crime atmosphere does the reverse: problems seem to be getting better and better, so the powers that be must have suddenly become omniscient, omnipotent, and omnibenevolent, and hence we don't need to care much about helping one another through life's troubles anymore.

Divorce is just one example of that broader pattern.

How about during the time period of skyrocketing rates of children of divorce? Remember that the early '60s birth cohorts showed the first steep rise of being a child of divorce or of a never-married single parent by age 16, meaning the divorce probably struck during the early or mid 1970s. And unlike the divorce rate (which turned down after 1980), the rate of children in single parent households has only grown or steadied since then.

There are several relevant examples, but they rise only until the early-to-mid 1990s, and have fallen dramatically since, whereas they ought to have continued to rise or at least plateau. These cases all show an interaction between the rate of the problem itself and how likely the elites were to pay attention to it. That is, they paid attention as they should have during the rising-crime period, but have tuned out for no good reason during the falling-crime period. This echoes the result from "divorce," only it's now about children of divorce.

"Children of divorce":


"Dysfunctional families":


"Single mothers":


"Single parent household":


How, if at all, did elites show concern when there was less and less of a problem to address? That appears to have been mostly during the Great Compression of circa 1920 to 1970. Rates of being raised in single parent households were lowest for those born in the late Silent Generation and the early half of the Baby Boomers. As you can see above, none of the phrases rose then. However, they had a different phrase for the phenomenon...

"Broken homes":


That shows an almost perfect match with the cycle in status one-upsmanship, inequality, over-production of elites, etc., that Peter Turchin has been describing and explaining. It was the heyday of liberal welfare policies, liberal social science (always with policy implications), and of elites generally seeing their role in society as stewards.

The metaphor "broken home" serves to shame selfish parents away from what is obviously a social evil, but one that the parents might otherwise be able to rationalize away if no shame were bearing down on them. It packs a punch unlike "dysfunctional family," "children of divorce," or "single parent household." For all we know, the phrase began at the grassroots level to shame divorcing parents in people's own communities, and the elites just took the phrase and ran with it in the national media and literature. Whoever started it, though, the point remains about how powerful the stigma was in doing its job.

Did elites tie it in with their broader plan to contain or defuse the societal instability that had reached a fever pitch in the late 1910s and early '20s? They sure did. I browsed through the NYT archives, and "broken homes" was typically used to describe a background cause for another phrase we haven't heard in forever -- "juvenile delinquency." A life of crime and vice will only widen the gap between rich and poor (and lead to greater instability too), so the elite stewards not only tried to keep the top from striving ever upwards, but also to keep the bottom from sinking themselves into the depths.

Nothing could be further from the mood of today or of the Gilded Age, when you can do whatever you want as long as it doesn't break the law. Who cares if it ruins communities -- that's not against the law, is it? So the robber barons and elite strivers can indulge as much as they please in their wasteful contests of conspicuous consumption, while the poor are given a free pass to indulge in the "sporting" culture of booze, gambling, and brothels. The re-emergence of "chav" culture in England, and whatever it's called here in America, show how far we've returned to the norms of the Gilded Age.

I was struck by the tenor of those NYT articles from the 1950s, where a broken home was often described as a "tragedy" or "tragic," and moreover as "preventable" -- meaning if you, the parents, get divorced after we warn you, you're guilty of causing a preventable tragedy. They placed no blame at all on the children, even ones who had turned to a life of urban crime. They were instead "victims of broken homes." And "juvenile delinquent" sounds pretty anodyne compared to the harsh condemnation of the parents' behavior above.

Nowadays, we sort of blame the parents, but only for not acting like proper helicopter parents and constantly monitoring their kids' behavior. We don't condemn them for raising their kids without both parents around, though -- they get a pass for that. We're much more severe in judging the kid -- send that little punk to the electric chair! Or if he didn't murder somebody but only got caught buying drugs, still, lock him up!

I'm getting into a separate post now about attitudes toward crime and punishment, rather than divorce, so I'll stop there and save that for another time.

November 26, 2013

Loss of connectedness among children of divorce -- Time to look beyond individualistic traits

In the comments of the post below, there's a discussion of behavior genetics. This research program tries to figure out how the way we've turned out has been shaped by genetic influences, a variety of environmental influences, and the interactions between them. As far as I know, the traits they study are all from the field of "individual differences" or "differential psychology" -- the things that make you, you and me, me. Personality, intelligence, attitudes, beliefs, habits, and so on.

Necessarily, they cannot study the things that make us, us and them, them. "We" do not share a genome, and "we" are not raised in the same house by the same parents; and ditto for "they" who do not share an entire genome or a common set of parents, which would contrast with our own. Now we're getting into sociology, anthropology, and so on.

There's nothing wrong with the behavior genetics approach, but it misses all of those group-defined traits that give us our cultural identity and our sense of belonging to something larger than ourselves. The things that give greater meaning to our lives, beyond noticing how smart, attractive, or extraverted we may be. So, we ought to supplement behavior genetics with group-ier social sciences to get the full picture of how we develop and turn out the way we do.

One of the major weaknesses of using behavior genetics to study the effects of divorce on children is that the main thing that marriage and family are about is group-level stuff. It's interesting to study how marital strife does and does not affect a child's individual characteristics -- mood, behavior problems, drug use, and so on -- but that's not where we should look for the effects of divorce.

Fundamentally, divorce represents a permanent rupture in the social connectedness among nuclear family members, not only between husband and wife but critically between the children and the absent parent ("non-custodial," "non-residential"). Your parents are not getting back together. And if you were raised by a single parent, without the other having been there for long, your parent is not ever going to bring the absent parent into the home to give you both parents.

Divorce typically results in a change of residence, since one parent (the custodial parent) will not be able to afford payments and maintenance of a house that had been a joint investment of two parents, while also raising the kids. Thus divorce also robs children of the "sense of place" of the home that they had attached themselves to so far during development. Houses, yards, sidewalks, blocks, and neighborhoods are not interchangeable, any more than parenting adults are to the kid. A stepmother is not the child's true mother, and the new place of residence will feel more alien -- more "not meant for us" -- than the marital home.

Moving one level up, the change in location will usually sever the children's social and cultural ties to the group of kids their age, whether at school or in the neighborhood broadly. They have to start all over again with new kids, and again -- kids are not interchangeable. Children of divorce have permanently lost their old friends who they'd grown close to, and now must do the best they can with a different set of kids -- often without being able to get as close to them as they are to one another, since the others have grown up together, while the child of divorce is late to the party.

Finally, not only have they lost the connection to their home, but also to their community environment in general. The neighbors' houses, the parks, the libraries, the malls, and the churches will all be different. So will the secret hiding places, the writings carved into the sidewalk, and all of those places where some meaningful experience took place. All those places, anchoring all those experiences, give you a sense of belonging to a particular, special world -- and you must now leave all of them behind for the rest of your formative years.

Whether children are resilient is not the question. Of course nature has programmed us to adapt to changing circumstances -- to roll with the punches. But again, resilience, adjustment, etc., are individual traits: however well the child of divorce adjusts to his new set of parents, new home, new social circle, and new community, he has lost much of his sense of belonging to a particular nested set of groups, and he won't ever get those back. (And to make things worse, resilience is only a tendency back toward normality, not a full recovery.)

We can ignore the predictable glib rejoinders about how any set of parents and peers will do, how a house is nothing more than a memory-free building on a featureless lot of dirt, and how all the streets and parks and malls in this country are all so similar that the kids will never be able to tell the difference. Of course they're not -- particular pieces of social and cultural life are not fungible stuff like dollars and cents.

Divorce is not the only way that children may be uprooted and transplanted willy-nilly into an alien place, but it sure is one of the more common and most reliable ways. And no matter how much the autistics may wish to insist on the fungibility of home, peer group, and community, none will say so for the parents. The children used to have both parents, now they only have one; if a new one moves in, everybody is always aware of their alien status within the family, and behave accordingly. The vastly higher rates of child abuse coming from step-parents rather than parents provides a dark example of what happens when we play down how particular the social-cultural arrangement should be once the ball gets rolling.

Little disruptions here and there are inevitable, and our minds are designed to handle that in the way that we can handle taking a hard fall or getting slugged in the gut. But ripping the children right out from their existing network of support and belonging is far too severe for them to bounce right back from. And treating the world as though it worked that simply is the height of arrogance and heartlessness on the part of the adults.

November 25, 2013

Children of divorce, an unseen historical trend: Is it related to status-striving?

Typically when you look up information on divorce rates over time, you'll see divorces per capita and divorces per marriage, within a given year. Graphing them produces the following:


No matter which rate you choose, there's been a secular increase since the mid-19th C. Apart from that, there's an apparent burst from the early '20s through the mid-'40s, a dramatic recovery through the late '50s, another surge during the '60s and '70s, and another dramatic decline since then.

This cycling allows for a half-full or half-empty debate about the state of things recently and into the future. "Sure, that surge during the '60s and '70s was bad, but it's gotten much better since. Maybe it'll surge again, but who can say?"

No such debate can take place, though, when it comes to a more important statistic -- how likely is a child to grow up without both biological parents? All those stats you read about divorces, marriages, risk of divorce for a given marriage cohort, etc., ignore the main worry that the public has about the topic of divorce -- namely, children of divorce.

The General Social Survey asks a question about who you were living with at age 16. I'll look only at whites to make the point stronger. In the graph below, I've recoded responses into living with both parents (red), living with mother with or without a stepfather (blue), living with father with or without a stepmother (green), and other responses (yellow). I've grouped people into 4-year birth cohorts starting with 1900, which you probably won't be able to read without clicking to make the image larger.
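
(For anyone who wants to reproduce this from a GSS extract, here's a minimal sketch in Python of the recode and the 4-year cohort binning. The file name is hypothetical, and the family16 response labels are placeholders -- check them against the GSS codebook rather than trusting this as-is.)

    import pandas as pd

    gss = pd.read_csv("gss_extract.csv")  # hypothetical export with family16, cohort, race

    # Whites only, to make the point stronger.
    gss = gss[gss["race"] == "white"]

    # Collapse the detailed family16 responses into four buckets. The label
    # strings are assumptions about a labeled extract -- verify against the codebook.
    def recode(family16):
        if family16 == "mother and father":
            return "both parents"
        if family16 in ("mother", "mother and stepfather"):
            return "mother (w/ or w/o stepfather)"
        if family16 in ("father", "father and stepmother"):
            return "father (w/ or w/o stepmother)"
        return "other"

    gss["household"] = gss["family16"].map(recode)

    # 4-year birth cohorts starting with 1900.
    gss = gss[gss["cohort"] >= 1900]
    gss["cohort4"] = 1900 + 4 * ((gss["cohort"] - 1900) // 4)

    # Percent of each cohort in each household type.
    shares = (gss.groupby("cohort4")["household"]
                 .value_counts(normalize=True)
                 .unstack(fill_value=0) * 100)
    print(shares.round(1))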


There's a tiny rise in children growing up in intact families, going from those born around 1900 to those born around 1950, but it only moves from 77% to 80% -- basically flat at a high level for that entire period. And remember, the question asks about their lives at age 16, so 16 year-olds were still highly likely to be living in intact families into the early '70s (the '56-'59 cohort is barely distinguishable from '40s births).

Then there's a sharp drop with those born in the early '60s, who would've been 16 in the late '70s, and it keeps dropping after that, although it may have stalled out at the bottom with Millennials. (Ignore the last cohort, though, since the sample size is tiny.)

So, unlike the picture of divorce rates, which showed a dramatic reversal around 1980 and lasting through today, rates of children of divorce have only gotten worse since those born in the early '60s.

When people argue about whether the divorce problem is getting better or worse, they would seem to be talking past each other. The people who point to the first graph are talking primarily about the fragility vs. stability of the marriage bond, whereas the folks who respond with, I don't care what that graph says, it's getting way worse, probably have in mind the disruption to family cohesiveness.

I put myself in the latter camp -- marriage is important, but kids growing up in intact families matters even more. Marriage vs. divorce is about whether two egocentric adults, the husband and wife, can reconcile their differences and stay together. Once kids come into the picture, the egocentric adults are no longer simply husband and wife but father and mother. They have even more other people to take into consideration, and these new ones are defenseless and impressionable to boot. Time to stop thinking only of your own goals.

What's behind these changes? I'm tempted to point to the status-striving / inequality cycle. If some problem started to get bad by the mid-to-late 1970s, and has only gotten worse since, that's a likely culprit. (It's clearly not the cocooning / crime cycle.) We should then see rates of intact families rise during the Great Compression of roughly 1920 to 1970. Or, going from births around 1900 to around 1955. It's there, but very slight. The pattern is a little stronger by adding the red and blue bars together -- that's growing up with at least your mother (both parents, mother alone, or mother and stepfather).

You'd think rising inequality would make mothers and fathers stick together more for the benefit of their children, if they were worried about their offspring doing well in the future. And falling inequality should relax that, with one parent saying, "Meh, it's hard not to earn a decent standard of living these days -- they'll manage."

Therefore, it must have to do with the status-striving that produces inequality to begin with (a la Peter Turchin). One parent starts to feel like they could do better for themselves by splitting off from the other parent, regardless of who gets the kids. This constant monitoring of your prospects for trading up smells a lot more like status-striving. Or, it may not have to do with wanting to leave in order to re-marry or date someone with greater economic resources than your current spouse, but with the drive toward conspicuous consumption and leisure -- both far easier when you aren't held back by the old ball and chain, nor those darn kids who need your time, effort, and money. Whatever the causes, they all seem to stem from status-striving rather than inequality.

Historical research even farther back would be tough because divorce was a much more serious economic disruption before the welfare state. You might look at how common orphans and abandoned children were, though. Those could always reflect parents who just didn't have enough money to raise them -- but then that's always relative to how much the parents feel they need to spend on themselves. "I just wouldn't have enough to buy diapers every month and still shop for groceries at Whole Foods," or whatever the equivalent was in the Gilded Age. And then again, maybe the children were abandoned because one of their parents was too obsessed with status-striving to rear their own offspring.

Our vague impression of Victorian England and the Gilded Age in America is full of children who don't live with both mommy and daddy --  all those orphaned and abandoned characters from Dickens, and Tom Sawyer and Huck Finn. But some numerical data would be nice to confirm that impression.

I'm becoming more convinced that conservatives who apologize or cheerlead for the Victorian era don't have a clue what kind of society they're defending, except for the cynical rats among the elite who want another Gilded Age where they're on top of a steep pyramid. Below are a few signs that the Gilded Age didn't care nearly as much as the Fifties did about keeping families together. The scenes are of New York in the 1880s, from How the Other Half Lives by Jacob Riis.




GSS variables: family16, cohort, race

November 24, 2013

Abandoned building chic during the heyday of latchkey children

Our neo-Pre-Raphaelite culture is fascinated by ruins that have been reclaimed by nature. See, among many others, this photo essay on The Ruins of Detroit. The prevailing atmosphere is one of desolation and isolation, finality and futility. The human race is gone, and it's never coming back. Nature itself is not lush, thriving, colorful, or expressive -- but a cold and impersonal crust blindly accreting and advancing to block out all traces of human existence.

Before this era of bleak chic, abandoned buildings were considered cool not because of their wasteland atmosphere, but because they opened up possibilities for sanctuary, a public place that could be reclaimed by real-life people rather than impersonal nature -- folks whose home life wasn't always secure and stable, i.e. the generation of latchkey children and children of divorce.

Odd as it may seem at first, abandoned buildings were thus a sign of the hopefulness, resilience, and can-do spirit of those who were given the label of "troubled teenagers," "disadvantaged youth," etc. They were places meant to be lived in, taken care of, and enjoyed -- not visited by gawking, clinical tourists who wouldn't touch or move or improve anything for fear of disturbing authentic decay.

With adolescents as the target audience, squatting chic appeared most commonly in music videos. Here's a notable exception, aimed at children and in movie form: the musty attic above the school where Bastian connects with characters from another world in The NeverEnding Story:


The atmosphere in music videos wasn't always so Gothic, though. The life of squatters is shown as lighthearted and fun-loving, despite their surroundings. Here's the video for "Catch My Fall" by Billy Idol, where the tone is playful in contrast to his earlier video for "Dancing with Myself," which leans more toward alienation:



The video for "Nothing at All" by Heart was filmed in the Bradbury Building, a location used in Blade Runner to establish the grungy Gothic mood of the coming dystopia:


In the Heart video, the lighting is still Gothic, but the surface is a little glossier, and the characters are upbeat and happy-go-lucky -- like, "Well, we've got this empty space all to ourselves, might as well make the best of it and have fun together."



Similar scenes from real life can be seen in the documentary Streetwise (see my review here), where one of the teenage runaways is squatting in an apartment in an abandoned high-rise, along with his somewhat older hippie housemate. There's a fun shot of the 13 year-old dude roller-skating around the empty, dingy hallways that makes you chuckle at how resilient the kid is, more than feel sorry for the crappy living conditions that he's making do with (which you also feel, of course).

The goal here isn't to catalog every piece of popular culture where fun-loving folks reclaim such a place and make it their own sanctuary, although in the comments I might include other examples that come to mind later. And I figure you all will know of other examples already. The main point to take away is that in a time of social connectedness, even abandoned youth and abandoned buildings could be transformed into a halfway wholesome ecology, whereas in cocooning times, total isolation reigns.

November 23, 2013

Re-capping the 20th anniversary of the JFK assassination

Twenty years to the hour after President John F. Kennedy was gunned down, 1,500 people gathered Tuesday for a memorial service a block away from the assassination site...

"This city has never blamed the city of Washington for the death of Abraham Lincoln so it is unfair to blame the city of Dallas for this criminal act," declared former U.S. Sen. Ralph Yarborough, who rode in the motorcade with Kennedy two decades ago...

"The future did not die here -- it never dies; it goes on. Here died one spirit," added U.S. Rep. Henry B. Gonzalez of San Antonio, who also rode in the motorcade...

Shortly before the ceremony began, a column of vans and cars taking part in a motorcade sponsored by "The Texas Coalition for Freedom" circled the block where the Kennedy Memorial is located, displaying signs supporting President Reagan and opposing communism. The group later rallied at the memorial.

Another group of protesters silently stood to the side during the service, holding a banner protesting U.S. arms in El Salvador.

From "Dallas Remembers Assassination" (AP, 11 / 22 / 1983).

That was another world, wasn't it? I can't imagine today seeing a caravan of right-wingers trolling a major anniversary of JFK's assassination, nor a group of left-wingers using it to protest militarism. Folks back then had a healthy lack of respect for celebrity worship and the sanctification of high authority.

Steve Sailer's been covering the coverage of the 50th anniversary in the mainstream media, most of which pushes the theme of "Dallas, a city of hate in need of redemption, has it atoned enough to be forgiven of its sins?" They gloss over the name of the assassin, Lee Harvey Oswald, and the fact that he was a Marxist wacko who tried to defect to the Soviet Union, that he was not from Dallas, and so on.

E.g., this article from the NYT that mentions Oswald's name just twice, and only includes half of a sentence about his background and motives: "...even though the killer was a Marxist outsider." Sooo back to how Dallas was like literally the most racist and bigoted city in, seriously, the entire country, I'm not even kidding.

Not like there weren't any retards back in the '80s who tried to frame the story as though the political climate of Dallas had assassinated the president, but I was surprised to read how much truth broke through in the NYT. By '83, political naivete was considered an embarrassment, no longer enshrined as "idealism," and the public was way too savvy to swallow a bunch of over-stretched apologies on behalf of a Supreme Leader who didn't even get to accomplish much in his less than three years in office.

The Left was still hanging around, but remember from this post, they were populist, not power-seekers and corporate cocksuckers. That began to change during the '90s -- with respect to Camelot hagiography, especially after the 1991 movie JFK. This shift in the popular view so disturbed Noam Chomsky, a mainstay of radical politics, that in 1993 he wrote a short book detailing how Kennedy was not about to end the War in Vietnam or otherwise save us from the turbulence of the later '60s, and that there was no grand conspiracy to assassinate him or cover it up.

Well, you can imagine what the gist of the 1983 NYT article was -- "Dallas, Dallas, You can't hide, We charge you with regicide!" So let's take a look instead at the parts that you couldn't sneak past the thought police in our Millennial era. For ease of reading, I'll put my comments in brackets within a single long block quote. It mentions Oswald's name six times.

From "Dallas Still Wondering: Did It Help Pull Trigger?" (NYT, 11 / 22 / 1983):

Many demurred at the time, among them Price Daniel, a former Governor. He maintained that Oswald ''spent more time in Russia than in Texas'' and ''was not a product of Dallas, having lived there less than two months, a far shorter time than in New York, New Orleans, San Diego, Moscow and Minsk.''

[Details of the killer's background and motives – something that might give us insight. But detective work is so boring when you can just fantasize instead.]

Most horrific, the minister said, was ''the cheer that came from the crowd across from the City Hall when word came that Oswald had been murdered in the basement of the police station.''

[If there was a crowd cheering the death of Oswald, and no such cheering when Kennedy was shot... what do we conclude about how much Oswald's act resonated with the people of Dallas?]

The moral indictment of Dallas as an accessory to assassination strikes some here as strange, especially as there was scant mention of civic culpability in 1968 when other dreams were slain: those attending Senator Robert F. Kennedy in Los Angeles and the Rev. Dr. Martin Luther King Jr. in Memphis.

[Entirely obvious back in '83, mind-boggling today.]

Mr. Greene, like most others here, still sees the situation that evolved in the 1960's as a failure of the city's leadership, not as any fundamental flaw in its people.

''The real crime,'' he said in an interview this week, ''was in the leadership. They should have announced that this town was not run by these kooks. But the leadership didn't make things plain until just before the assassination, right after the Stevenson episode.''

''The police told everybody before the motorcade, 'You're not going come down here and embarrass us.' And they didn't. Ninety-nine percent of the people along the motorcade route were deliriously supportive.

[So, wait, in a city of hate directed toward Kennedy and all he stood for, there was no counter-crowd to heckle, jeer, and throw shit at his motorcade? I saw worse heckling during George W. Bush's first inauguration, where the “stole the election” thing was still in the air, and his car sped by with the tinted windows rolled up for a good stretch of the route.]

''Then the totally unexpected happened. It wasn't a right-wing kook, it was a left-wing kook, a publicly pronounced left-wing kook. It was a sudden paradox.

[Oswald's leftist background isn't glossed over as one of those “oh yeah, btw” kind of factoids. In '83, the writer drew attention to the fact that everybody in 1963 who thought they had it right, had it wrong, completely backwards in fact. Take-home message: show some humility, liberal witch-hunters.]

Indeed, the reception the handsome young President and his elegant wife received that day was tumultuous, so much so that Nellie Connally, the wife of Gov. John B. Connally, leaned over and exulted, ''No one can say Dallas doesn't love and respect you, Mr. President.''

''You sure can't,'' the President replied...

[Reminder: Kennedy was welcomed in Dallas as though he were one of the Beatles, to use a slight anachronism. Did the City of Hate forget to set their alarm clocks that day?]

Although critical, Mr. Marcus [of the Neiman-Marcus stores] and others here who will discuss the city's shortcomings of the 1960's nonetheless view a blanket condemnation as unfair.

''A lot of journalists and book writers and others hit Dallas with their stories already written,'' Mr. Greene said. ''At the time, I was generally considered a liberal editor in Dallas, but I found myself defending Dallas against pointless charges, against things the city wasn't guilty of.

''We had a lot of semipolitical kooks who had a lot of lung power but not much real power. They depended on their actions' substituting for their numbers.''

[The NYT reporter of 1983, who hit Dallas with his story already written, was nevertheless willing to allow that quote in, to open up the possibility that maybe his close-mindedness is not a desirable trait for a reporter, and that he's repeating the mistakes of outside reporters from 20 years earlier. Today's writers don't allow the other side to call the writer's pre-fab fantasies “pointless charges.”]

The Washington Post ran a similar article at the time on the theme of, Should we burn Dallas at the stake or forgive its sins? Forgive -- thumbs up. Burn -- thumbs down. ("20 Years Later, Dallas Remembers -- Minus Its Shame and Guilt," WP, 11 / 22 / 1983.)

It too allowed some inconvenient truths to get past the censors, though not as much as in the NYT (the WP is more liberal). But in a sign of the levity and black humor of the Eighties, particularly on topics of Dire Consequences To Liberals, the reporter closes the article with this quote:

"For years and years, the first thing people said when they heard you were from Dallas was, 'Oh, you're the ones who killed Kennedy,' " said Shannon Wynne, a native who owns a string of nightspots. "Now they want to know who killed J.R." Down here, that's progress.

November 22, 2013

Does one clique rule the whole school these days?

Before, I looked at how there seem to be very few distinct and cohesive cultural groups among young people nowadays, as compared to the United Nations of crowds and scenes back in the '80s.

Now, your group membership is mostly based on similarity in tastes and appearance, like attracting like. In the '80s, the group you wanted to join didn't care so much about whether you were the identical twin of an existing group member in tastes and appearance, but whether you were going to pay your dues, get the back of whoever needed it, and contribute to the fun atmosphere of group activities.

Membership was more "costly," hence an honest signal of your loyalty to the team. Whereas now, you don't need to pitch in anything since like attracts like -- once they can tell you have similar tastes and appearance, they let you get close, not because you've expressed any interest in joining and contributing. Groups were thus more cohesive back then, and are more superficial and prone to dissolving now (once you start digging a different type of music or sporting a different haircut, you're no longer like us).

Something I didn't touch on before, but that is no less striking a difference between then and now, is how hierarchical and hegemonic the relationship among cliques is these days. With so few distinct groups to throw their weight around, those that do exist have a corner on the fitting-in market. In some schools, they may be the only game in town -- you're either in that one group, or you're part of the atomized majority. Crucially, not an anti-establishment majority that is taking on the monopoly clique, but just a great wide sea of atomized bitter individuals.

Since groups today are mainly defined by tastes and appearance, this monopoly clique will be the ones who are the best looking and the most extraverted (compared to other Millennials, at any rate). So then, there's the popular crowd and everybody else, who would like to move up and fit in.

My strong impression is that crowds were not very hierarchically ranked in the '80s -- there was no ladder where you might start out in low-ranking group A, then climb your way up into B, and, if you were lucky, reach the peak with C. The goal was not status striving, but fitting in -- it didn't particularly matter which group, as long as you found one. The stoners were perfectly happy in their group, the metalheads in theirs, the jocks and preps and nerds in theirs. That suggests that they were all of roughly equal ranking (with some dominance, say of jocks over nerds), each could stick up for itself in a potential confrontation with any of the others, and changing membership was more of a lateral than a vertical movement.

One of those groups was the good-looking and extraverted clique, but folks in the other cliques saw them as just another clique. Perhaps they wanted to bone the girls in that clique, but they didn't bow down to them. And girls in other cliques may have wanted to be their friends, but only the truly insecure wanted to abandon their current group in order to "trade up."

Being good-looking and having good people skills allowed this group to socialize easily with any of the other groups, but they couldn't apply much peer pressure or cultural influence to out-group members. They didn't wield much power, and were not a gouging monopoly. If anything, they were often the target of potshots by the other groups -- not sustained campaigns to cut them down, but more like regular reminders that they weren't royalty, and not to get too big for their breeches.

I think this explains why Generation X chafes the most at the two prevailing modes of inter-group relations over the past 20 years -- multiculturalism, which forces groups to interact with and worship one another, rather than leaving groups to do their own thing; and authority worship, where us peons are supposed to drop everything we're interested in and get on board with whatever the fashionable people are liking -- Apple products, Instagram, Panera Bread, etc. Their formative experiences with social organization lead them to expect and demand the opposite.

And it also explains why Millennials are so gung-ho about these prevailing modes -- forced interaction between groups is like, "Omigosh, finally a group has been created for me to belong to, and I get to interact with other groups too!" And fashion/authority worship is all too familiar from their middle and high school days -- you either do what the popular crowd does, or you remain part of the atomized masses. The bustling pluralism of the '80s would feel disturbing -- like, how do you even know which group to join? Why can't someone else just create a group for you, like parents setting up a regular group for play dates?

People think that the form of social organization in primary and secondary school is of no great importance, but if it leaves enduring effects on the students' mindset and behavior, as they internalize one set of norms vs. another, then it's nothing to brush off. And these two types of organization stem from a more sheltered vs. a less supervised environment while they're growing up, as I detailed in the earlier post. (Sheltered = hegemonic monopoly group, unsupervised = organic pluralism.)

These ideas were in the back of my mind from watching teen movies like Fast Times at Ridgemont High, The Breakfast Club, and Heathers. But I've started to read more of that memoir on promiscuity that I talked about last week, Loose Girl, which is set mostly in the '80s. Maybe it's the non-fiction nature of the narrative, or maybe all the extra detail that comes from a book-length treatment, but it really struck me how much flatter the social group hierarchies used to be back then, allowing for more fluid movement between groups. Fitting in and falling out was not primarily about status striving and downward mobility.

November 20, 2013

Speculations on the coming race riots

For the last several years, the media have been ignoring the racially motivated attacks by blacks against whites. But now that Jews are being targeted in Brooklyn, it's time to take the matter seriously (e.g., this article).

Clueless conservatives might try to find a silver lining here:

"Well, at least some good is coming from the never-ending paranoia and obsession that Jews have about anti-Semitism -- it gives the mainstream media a politically correct veneer to finally get out the message about dark thugs knocking out random whities."

Guess again -- the fact that only Jewish victims get press is a symptom of the underlying disease, whereby white victims of black crime get no sympathy, and the race of the attackers is scrupulously kept secret by the media. Only if it's another minority group do we hear anything, and even then it has to be a verbally combative minority group.

Although violent crime has been falling for the past 20 years, there has been a rise in a certain qualitative type of violence, namely where the victim is a stand-in for the entire group that they come from, and the attacker is bitter at and lashing out against that group -- any representative of the group will serve as a target for their anger. It's part of the broader trend toward polarization and political instability; see Peter Turchin's series of posts under the heading of "Indiscriminate Mass Murder as a Form of Political Violence" in this review post.

He's talking more about spree shootings, but blacks punching out randomly chosen whites is an equally good example of the general pattern.

Based on the rise-and-fall pattern of such violence over the past 200 years, he predicts that a peak or spike is headed our way sometime during the 2020s. (Looking historically at homicide rates, I predict a rise in violent crime around the same time. The two cycles are separate, so sometimes they'll be in phase and sometimes out of phase.)

Blacks are therefore already getting a running start, antagonizing whites as long as they feel they can get away with a slap on the wrist. Tempting fate like that will make it hard to feel sorry for them when whites eventually start fighting back, perhaps igniting a wave of lynchings and race riots not seen since the explosion of the late 1910s and early '20s.

Aside from predicting what time things will get out of control, can we also predict what geographical regions will be hit hardest? Definitely -- look at where all of these "knockout" games are taking place. They're mostly in a band from the Midwest eastward through the Bos-Wash corridor. Not in the Deep South, not in the Mountain West, and as far as I know not yet on the West Coast (though it wouldn't be hard for them to spread there).

Those are the areas that have the strongest roots in the culture of settled farmers, primarily the culture of law. Some punk knocks out a perfect stranger in a hate crime -- how do you respond there? Feel sorry for the legacy of slavery, the culture of poverty, and the absence of parents that led this wayward black to indiscriminate assault. Send him to juvie for a while, then let him out when he promises that he's really sorry.

Blacks aren't stupid, they know they can get away with so much more disrespect and confrontation in those places. Those were also the main places in the 1960s and '70s that were shaken by race riots (i.e. by blacks).

In an earlier post, I showed that what keeps blacks in line is not the Saxon-Scandinavians and their culture of law (which only encourages blacks to fuck around with strangers), but hell-raisin' Celts and their culture of honor. When you're surrounded by hawk-nosed hillbillies who aren't afraid to go to jail in order to defend the community, it makes you think twice about starting shit with strangers in the first place.

And after the whole thing blows over, the namby-pambys will say, "Welp, guess that's the occasional price we have to pay in order to enjoy a civilized law-abiding culture, and not end up like those poor hicks in Kentucky and West Virginia." Right, there's too much atomization in the legalistic areas for anyone to feel an altruistic drive to break the silly little law if it means keeping or restoring harmony in the broader community.

As Turchin's writings on these topics show, political / indiscriminate violence is linked to rising inequality. So for the long term, it's better to keep inequality from rising. But that is mostly a function of a change in the mindset and behavior of the elites, not of those at the grassroots. The elites are the ones over-producing themselves, socializing costs while privatizing benefits, and locked in internecine status contests. What can the hicks in Kentucky do about that?

In the short term, the correct response is not to wring our hands about "addressing the root causes that lie in economic inequality," but to strike back at those who are spreading disorder. Reducing inequality, while desirable and helpful over the long term, won't be achieved overnight. Meanwhile the legacy of slavery is at your door -- tonight -- trying to break in and "rightfully take back what the white man stole from my ancestors."

I wouldn't stress the importance of being armed in the unlikely event of self-defense either. It's too late by the time you face that choice. The hordes are already roaming your neighborhood at that point, and you're shivering alone in your cocoon. You want to make sure they don't even think about setting foot in your community. That requires social organization with your fellow community members, and sending clear warning symbols that would-be hate criminals will be overwhelmed by a bloodthirsty mob before they can escape back to wherever they came from.

It's the same thing that keeps skinheads from going into the Bronx if they felt like messing around with blacks. "Shit, if those honky-ass skinheads came in here, we'd woop they ass all the way back to the suburbs."

Sometimes you have to fight fire with fire.

November 19, 2013

In the digital age, will we have memorable pictures fit for an album?

I remember as a kid looking through boxes and albums of pictures when the scenes captured on film were scarcely in "the past." Say, the summer vacation, birthday party, or Christmas morning from a few years back. Whenever my family meets up where those pictures are being kept, we still look over them. Aside from those being better times -- mostly the '80s, and when my mom and dad were dating and got married in the '70s -- they are just more enjoyable to look at, out of context.

You can't say that about the gigabytes of digital pictures that you have. You hardly ever browse through them in the short term after taking them, and you aren't ever going to huddle around the laptop with your family and click through image files. Aside from those coming from less eventful times, they just look flat, harsh, and drained-out.

Here is one of the first results that came up for "family picture 2009" on Google Images, and sadly it is not unrepresentative of what our pictures look like these days, albeit in exaggerated form to make the points clear. (Click to enlarge.)


Most notably, the light that should be shimmering on the surface of the lake in the background is blown out, near pure white. Worse, it's bleeding into the white clothing and blond hair of the portrait subjects -- you can't make out the contours of the girl's head who's wearing the blue shirt. This harsh area must take up about one-quarter of the entire picture, too. Nothing like a bigass spotlight shining in your face to make you want to continue looking at the scene.

There's not much range in the brightness levels either; a wider range would have given it a more striking presentation. The lighting is more or less the same intensity, aside from that overexposed lake. Where are the dramatic shadows to counterbalance all that brightness?

The colors don't really jump out at you either. They're not Matrix-level bland, but they aren't vivid.

And everything looks realistically smooth, without the kind of grain that you see on projected film when you go to the movies. The atmosphere looks totally ordinary, hence will not tickle your senses or stick in your mind.

Now for a picture that belongs in a photo album. Here's one of the first Google Image results for "girls 1985" --


What a world of difference! The whites look pretty bright in the sunlight, but they're nowhere close to being blown out. If you click the larger version, you can make out the details of the folds in the fabric of the tank tops, which would be wiped out into harsh, flat white space if this were shot in digital. And look at how much shadow there is at the same time -- not just around where they're standing, but also the darker cast of the trees way in the back, and the "inner" areas of the nearby trees that aren't at the outer edge catching sunlight. The lighting spans a wider range here, which gives the scene a theatrical look.

Colors are lush, warm, and saturated -- welcoming -- and they don't look like something you could just pull up on MS Paint. It looks like there's a red or pinkish cast to the whole picture, too -- probably by accident, but what the hell, it makes it look special.

And there's enough grain to make the atmosphere somewhat hazy and dreamy. It just looks like a memory. Contrast this with the "you are there" photorealism of the first picture, which looks like it was shot for a news item in the local paper (who's going to remember that?).

I didn't say the second picture was going to be hanging in a museum -- that's an irrelevant standard. It's the kind of picture that we'd all be happy to keep in an album, and enjoy pulling out and huddling around to refresh our memories.

In general, our film pictures look like a stylized portrayal of a reality that we're familiar with, while our digital pictures are more uncanny the more you look at them -- like strange robots coated with passable human skin, changelings of the people and places we thought we had recognized.

I plan to do a longer, mostly image-filled post on film vs. digital photos, as they really exist in our albums and on our hard drives, not hypothetically. In the meantime, here's some further reading on how the two media differ in ways that matter to real-life viewers:

120 Studio has a bunch of informative and lively essay-ramblings, such as this one on the incredible range of brightness that film can capture compared to digital, which easily blows out bright areas. It sounds analogous to the dynamic range in volume that's been lost during the "loudness wars" in the recording industry. He has two broader reviews of the divide here and here.

Here is an even briefer but information-packed guide to the differences from eBay.

And here is a photographer who experimented with both media using the same subjects and environments. The film shots don't look quite as lush and slightly surreal as the '80s girls track team above, showing that film has changed quite a bit since then. But the differences with digital are still there.

November 18, 2013

Millennials don't knock

When students drop by the office for help, whether during scheduled office hours or not, I've noticed that they almost never knock.

Our door swings shut, so it's not as though it's open and I'm expecting a courtesy knock. I just hear the handle turn and then there they are, walking right in. And usually they don't make much small talk first -- just launching right into "So I don't know what to do on #5 here..."

Everyone born before the Millennials always knocks, whether they're other grad students, tenured professors, or secretaries. We haven't had too many early Millennial grad students come by, so the lack of knocking may belong more to those born around 1992 and after. As bratty and socially awkward as the early ones are, the later ones are noticeably worse.

Where did they learn this habit? Or rather, fail to learn the normal habit? They certainly never entered the teachers' lounge at school before college, and most of them have never worked, let alone barged into a manager's office. And they generally didn't sleep over at each other's houses when young, or even hang out much at their "friends'" houses as teenagers.

So they must've picked this habit up at home, more or less the only place where they've ever had to interact with others socially, and certainly the only such environment with closed doors. But then why didn't they at least learn to knock there?

Do helicopter parents maintain an entirely open-door home environment? That would help them constantly monitor their kids, and allow the kids to come complaining no matter when. Floor plans are a lot more open these days than in the '80s, typified by the new "great room" in the McMansions of the past 20 years. Or do they have closed doors, but just give their socially clueless kids a pass whenever they interrupt someone else's activity unannounced, rather than prepare them for the real world?

Beats me. Whatever the cause, though, it's yet another telling sign of how unsocialized this generation is. Each of these things is small in itself, but pile them all together and you understand why people have been complaining about how entitled the Millennials act, both in the workplace and at school.

November 16, 2013

Diet quality and inequality

In the post on the return of old diseases and pests with rising inequality, someone in the comments asked about the obesity epidemic, which also began around the time that inequality began rising -- the late '70s and early '80s.

That definitely is one of the most striking class divisions nowadays -- poorer people are fat, while wealthier people are thin. It certainly wasn't like that in the '50s when the classes were much closer to each other in material terms.

Rather than look at obesity, which is affected by several factors, I'll go straight to the diet itself. We also have data going back much further on food availability than on obesity. Here is a post reviewing historical data on the availability (per capita) of different food types in America from 1910 to 2007 (the source is the USDA). We infer the amount consumed from the amount available; changes in availability track changes in consumption.
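If you want to eyeball that relationship yourself, here's a minimal sketch in Python, assuming you've hand-typed the per-capita availability figures and an inequality series (say, the top 1% income share) into two CSV files -- the file and column names are hypothetical:

    # Minimal sketch: line up red meat availability with an inequality series.
    # Assumes two hand-compiled CSVs (hypothetical file and column names):
    #   food_availability.csv -- year, red_meat_lbs, dairy_lbs, poultry_lbs, ...
    #   inequality.csv        -- year, top1_income_share
    import pandas as pd
    import matplotlib.pyplot as plt

    food = pd.read_csv("food_availability.csv")
    ineq = pd.read_csv("inequality.csv")
    both = food.merge(ineq, on="year")

    fig, ax1 = plt.subplots()
    ax1.plot(both["year"], both["red_meat_lbs"], color="firebrick")
    ax1.set_ylabel("Red meat availability (lbs per capita)")

    ax2 = ax1.twinx()  # second y-axis so the two series share one timeline
    ax2.plot(both["year"], both["top1_income_share"], color="gray")
    ax2.set_ylabel("Top 1% income share")

    ax1.set_xlabel("Year")
    plt.title("Red meat availability vs. inequality, 1910-2007")
    plt.show()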

If we want a single measure of nutritional well-being, red meat is the one to look at. Peter Turchin, in his review of the longer history of inequality cycles, also uses meat consumption as an indicator of how well-off folks were. Here's how it changed over the last 100 years:


It does show a steady decline during the recent widening of inequality, whereas it had been rising throughout the mid-century as inequality was narrowing. Meat consumption had fallen earlier as inequality was rising toward its peak circa 1920. That earlier fall lasts a little longer than expected, not turning back up until the late '30s, noticeably after inequality had begun to narrow. (Some but not all of that could be related to the Great Depression.) Overall, though, the fit is pretty close.

Since the top levels of the income pyramid had enough to buy red meat, the large swings up and down reflect the dietary changes of the middle and bottom levels. Meat is not a far-out-there luxury item that only sells more when there are more millionaires. Its sales depend on how well the average family is doing. Is the son going to have steak several nights a week, or will he be nuking a bowl of Ramen noodles and washing it down with a Mountain Dew?

I won't re-post all the graphs here, but going through the rest of them...

- Dairy shows a positive link with inequality, as though middle and lower-income folks are substituting dairy for red meat when times are harder for them. That's what cattle-herding people do around the world -- if you come into a windfall, feast on meat, and otherwise drink milk.

- Eggs don't seem to be related one way or the other.

- Poultry and fish were flat until around 1940 and began increasing steadily afterward. However, this did not make up for the decline in red meat over the past 40 years. Around 1970, there was about 145 lbs of red meat, 35 lbs of poultry, and 10 lbs of fish, or 190 lbs total for animal meat. By 2007, there was closer to 115 lbs of red meat, 75 lbs of poultry, and 15 lbs of fish, or 205 lbs total.

That may sound better, but remember that most of the increase is due to a soaring amount of poultry -- i.e., 99% fat-free boneless skinless chicken breasts -- and less coming from beef (see the tally sketched after this list). Lean chicken won't provide much in the way of essential fatty acids, or allow your body to absorb the fat-soluble vitamins (A, D, E, and K). Fat also has more than twice the calories per gram as protein, so switching from red meat to chicken reduces caloric intake. Chicken also has lower concentrations of vitamins (like B-12) and minerals (like iron) compared to beef. In all, the shift away from red meat and toward white meat has been for the worse.

- Fruit and vegetable consumption has gone up since 1970, but you don't get much out of those foods.

- Grain and potato consumption looks directly related to inequality as well, rising since about 1970, and falling during the Great Compression. However, that decline began at least as early as 1910, when there was still a good 10 years left in the rising-inequality period. Still, a very close fit. As with fruits and vegetables, you don't get too much out of these foods, other than sheer calories and a mega-dose of glucose, and all the metabolic problems that brings.
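Here's the rough tally mentioned in the poultry-and-fish item, as a minimal sketch. The per-capita pounds come from the figures above; the calorie numbers are round, illustrative values rather than exact USDA data:

    # Back-of-the-envelope tally of the meat figures above.
    # Per-capita pounds come from the list; the kcal-per-100g values are round,
    # illustrative figures (roughly 80/20 ground beef vs. cooked skinless breast).
    meat_1970 = {"red_meat": 145, "poultry": 35, "fish": 10}
    meat_2007 = {"red_meat": 115, "poultry": 75, "fish": 15}

    print(sum(meat_1970.values()), "lbs of animal flesh per capita, circa 1970")  # 190
    print(sum(meat_2007.values()), "lbs per capita, circa 2007")                  # 205

    # Fat runs ~9 kcal/g vs. ~4 kcal/g for protein, so the fattier cut packs
    # far more energy per pound than the lean one.
    kcal_per_100g = {"ground_beef_80_20": 254, "chicken_breast": 165}
    grams_per_lb = 453.6
    for cut, kcal in kcal_per_100g.items():
        print(cut, round(kcal * grams_per_lb / 100), "kcal per lb")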

Taking all of that into account, it does look like diet quality takes a turn for the worse (i.e., toward vegetarianism) when inequality is widening, and becomes more wholesome when inequality begins narrowing. As always there are other factors that help to explain why those food availability graphs look the way they do -- but it's striking how much you can capture just by pointing to the rising or falling trend in inequality.

Wide variation in basic physical appearance is one of the most alienating forces -- "those people just don't look like us" -- and sheer size is hard to miss or dismiss. You may not feel the alienation from seeing luxury shoppers pass by homeless panhandlers unless you live in a city, but everyone is surrounded by disparities in weight, so it's hard to ignore how far apart the classes are drifting. As Robert Putnam's work showed in a multi-ethnic context, wider diversity means a weaker sense of community.

That point is either under-appreciated, ignored, or outright denied by mainstream conservatives -- that inequality leads to a breakdown in community, through a similar process as ethnic diversity weakening communal bonds. But that's for another post.

We should ask how deliberate the elites have been in fattening up the middle and lower classes of our society. They were the ones who pushed this meat-phobic, fat-phobic, low-cal, grain-fruit-veggie dogma on everyone else, right as inequality began rising. Sure, the elites themselves drink their own organic kool-aid, but they still eat a lot of red meat, cheese, eggs, and so on. Whereas the lower classes take the anti-meat message seriously, and load up on processed carbohydrates instead.

There's likely a class difference in preferring meat over starch-and-sugar, but even allowing for that weakness among poorer people, the elites should then be even more careful about preaching a carbo-holic diet and demonizing meat. Wittingly or not, this boneheaded vegetarian crusade has become a tool of class warfare.

And part of the increasing junkiness of lower-class diets reflects a drop-out mentality when they perceive the deck to be stacked against them as inequality widens. Fuck it -- why try and be healthy when nothing matters at the end of the day, when we're going to be even worse off next year? Maybe to try and keep up appearances, to keep ourselves presentable to others? Screw that, too, if our superiors are going to act antagonistic and snotty toward us in the first place. I don't care who sees me looking fat and disheveled in public, when none of them give a damn about me anyway.

I don't really agree with or resonate with those sentiments, but it's not hard to see where they come from, and how they would abate if we all ate better, instead of income inequality polarizing the distribution of fat across the classes.

November 15, 2013

Codependency as a sign of anti-cocooning

While browsing around the stacks, I saw a spine labeled Loose Girl: A Memoir of Promiscuity. I thought, Oh great, another one of those intentionally outrageous titles trying to over-hype how sexually active girls are these days, when they'd rather diddle their phone than flirt with boys.

But then "1980s" caught my attention on the first page, referring to how common it was then for adolescents to have divorced parents. Oh OK, this is going to be a look back at a time when sluts and nymphos could still be found stalking the high school hallways. I read the first chapter of about 10 pages and flipped through parts of the rest.

The takeaway is that her giving herself over to an endless series of interchangeable guys, whether she truly desired them or not, and rarely enjoying a moment of pause when she wasn't pursuing or hooking up with someone, came from her need for attention, her fear of being alone, and her wanting to be taken care of emotionally by someone more effectual, in order to alleviate an ever-present, gnawing anxiety about being left to fend for herself in an uncaring world. She is a textbook case of codependency.

That's not the kind of personality we'd like to see in our daughters, and such behavior would worry us if she were our friend. Still, these extreme cases are useful for inferring how approaching vs. cocooning the general population is. The farther we are toward the approaching and self-denying side, the higher the fraction who will exceed the threshold of codependence. The farther toward the cocooning and self-focused side, the lower the fraction.

Looking at how this fraction grows or shrinks therefore tells us which way the entire population is moving. In times when the fraction is higher, it doesn't mean the average person is way off in the codependent extreme -- that's a hysterical exaggeration of the same type that says Thank God we're out of those highly homicidal 1980s, as though we all fought off murderers every time we stepped out the door. Rather, it means that the average person was simply more approaching toward others, and more willing to set aside their personal wants in order to satisfy those of others.
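To make that threshold logic concrete, here's a minimal sketch assuming the trait is normally distributed and the codependency cutoff sits a couple of standard deviations out -- the numbers are purely illustrative:

    # Threshold illustration: nudge the population mean of an "approaching vs.
    # cocooning" trait and watch the share past a fixed codependency cutoff.
    # All numbers are purely illustrative.
    from scipy.stats import norm

    threshold = 2.0  # hypothetical cutoff, in standard-deviation units
    for mean_shift in (-0.5, 0.0, 0.5):
        share = 1 - norm.cdf(threshold, loc=mean_shift, scale=1.0)
        print(f"mean shifted by {mean_shift:+.1f}: {share:.1%} exceed the threshold")

A small shift in the average (from -0.5 to +0.5 standard deviations) moves the share past the cutoff from well under 1% to nearly 7%, which is why the visibility of extreme cases tracks where the whole population sits.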

Unfortunately, public surveys don't ask these kinds of questions, let alone over a long time span. However, let's take a look at how common the term "codependency" has been in Google's digitized library of books (Ngram):


You can't tell because of the scale, but there are sporadic hits in the 1960s and '70s, before the term soars during the '80s and early '90s. When you un-smooth the curve, the peak year is 1992. And it's come crashing down over the last 20 years. The same pattern shows up no matter what variation on the term you choose.
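If you want to replicate the un-smoothing, here's a minimal sketch assuming you've exported the Ngram series for "codependency" to a CSV with hypothetical file and column names:

    # Find the peak year of "codependency" in an exported Ngram series.
    # Assumes a hand-exported CSV with columns: year, frequency (hypothetical names).
    import pandas as pd

    ngram = pd.read_csv("codependency_ngram.csv")

    raw_peak = ngram.loc[ngram["frequency"].idxmax(), "year"]
    print("Peak year, unsmoothed:", raw_peak)

    # The Ngram Viewer's smoothing=3 averages each year with its 3 neighbors on
    # either side, i.e. a 7-year centered moving average.
    ngram["smoothed"] = ngram["frequency"].rolling(window=7, center=True).mean()
    print("Peak year, smoothed:", ngram.loc[ngram["smoothed"].idxmax(), "year"])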

That fits with the publication of the most popular books on the topic. If you google "codepen..." it will suggest the title of the mega-selling self-help book, Codependent No More: How to Stop Controlling Others and Start Caring for Yourself. Two others by the author on the same topic appear in the suggestions at Amazon. All were published between 1986 and 1990, foreshadowing the reversal of the phenomenon starting in the mid-'90s.

What has replaced the needy, clingy codependent since then? Their opposite: the dismissive avoidant type. In the framework of attachment theory, the clingy type has a view of others that is positive -- others are effectual and deserving of support -- and a view of self that is negative -- ineffectual and unlovable. The dismissive type is their inversion -- others are incapable and worthless, while I'm so awesome at what I do, and you'd have to be delusional not to love me.

We would like everybody to be within the healthy, normal range of personality and behavior, but in real life those traits are distributed along a bell curve, so we're always going to have some extreme cases with us. The open question is -- what type of extreme will they be? I don't know about you, but I'd rather go to school, work, and bed with someone who was needy and self-effacing than with a dismissive egomaniac.

November 14, 2013

The sex trade and inequality, 1920 to present

In earlier posts on the historical cycles of perversion, I looked at how popular culture revealed the audience's preferences. See here for a list of posts on the unwholesome mid-century. Briefly, in times of cocooning people show a lurid interest in sexuality, and in outgoing periods, a jocular attitude.

I'm most interested in what made people tick in different time periods, so I look at the demand side. The supply side is complicated by technological progress, which moves independently of the social-cultural zeitgeist. And in any case, I don't find the production or business side of pop culture as fascinating as what some phenomenon reveals about everyday people.

But for some phenomena, you might have to take into account what factors motivate the people on the supply side. Not so much "what attitudes do business owners and managers have?" but more like what would lead the actual workers to take part in the business that creates the product. Those choices could also reflect the cocooning / outgoing trend, but decisions about economic life will probably respond more to the status-seeking / status-ignoring trend.

Turchin's work on the dynamics of inequality puts status seeking, over-production of aspiring elites, and intra-elite competition behind inequality (an effect of those divisive processes). See this post of his on the topic, using lawyers as a case study. I'll refer to status-seeking or rising-inequality depending on which aspect seems to be more important.

Something I could never explain using just the cocooning vs. outgoing framework is why the mid-century sex culture, while just as lurid as ours is, nevertheless did not specialize in pornographic photography or movies, lapdance clubs, 1-900 phone sex services, and so on. It was more lurid comic books and pulp novels, voyeuristic girlie shows (with no contact or proximity), and the incredibly rare stag film (hardcore). *

The state of their technology allowed for all of our forms of commercialized vice, yet they didn't make nearly as much use of it (they didn't even take a stab at inventing phone sex). So the answer must lie more with economic motives of the workers themselves.

During the Great Compression of circa 1920 to 1970, young women showed a declining interest in climbing up the status ladder as fast as possible and in financing the acquisition and display of luxury by earning quick-and-easy money. Their increasingly modest tastes made them feel more comfortable with a slow-but-steady typing job. Sure, a stint as a pornographic actress could have raked in a lot more dough than that, but what would she do with all that money? She wasn't going to stain her feminine honor just to add unnecessarily to her bank account.

This shows the importance of looking at both supply and demand. The demand was there, but in those days it just wasn't true that "everybody has their price" -- otherwise that demand would have been met.

Beginning in the '70s, and becoming more and more visible over the '80s, women shifted their priorities, becoming more fashion and status-conscious. Their interest in fashion was more fun-loving in the '70s and '80s, and not so hostile like it became in the '90s and 21st century, when the separate shift toward cocooning made them less interested in getting along with others on an interpersonal level.

And sure enough, that's when hardcore pornographic movies began to grow, continuing upward right through the present. That steady rise since the '70s/'80s is unlike the changes in interpersonal sexual behavior, where folks were more and more connected during the '70s and '80s, but segregated themselves from the opposite sex during the past 20 years and don't get it on as frequently as they used to. You see the same steady rise in lapdance clubs, and virtual outlets like phone sex, nude photos, and webcam girls.

The only area that shows a decline is prostitution, where arrest rates fell by over 40% from 1990 to 2010. I interpret that as a substitution effect from all those other outlets. They're all ways to get relief through transactions rather than interactions, and the men who pay for those things weren't so concerned with making it with a real-life partner that they'd risk disease, arrest, fines, etc.

In periods of rising inequality, the bottom chunk of society feels like the deck is stacked against them. So lower-status men, sensing the widening divide, might reason that it's not worth even bothering with the real-life interaction game, where the available women are already taken by high-status men, or have effectively taken themselves off the market by waiting around for a taken high-status man to become newly available. **

Those lower-status guys still have a libido, though, so they'll drive up the demand for commercialized sex. As long as he has enough cash or credit, his lack of status won't count against him. This effect on males comes more from inequality, while the effect on female sex workers comes more from the status-seeking trend closely related to it. Giving schlubs a lap dance, or shooting a porn scene once a week, is a quick and easy way to finance your conspicuous consumption and student loan debt (the higher ed bubble being another consequence of status-seeking).

All this predicts that we ought to find a burgeoning economy of commercialized sex during the Gilded Age and Progressive Era, as inequality rose toward its peak in the mid-1910s. Would it surprise you to learn that prostitution was widespread during the Victorian era? Perhaps a more helpful name for the period would be the Victorian-Dickensian era, to remind folks of the heartlessness and the growing disconnect between the upper and lower layers of society.

I planned to review the history before 1920 in this post, but it's already gotten long enough, so I'll save that for a follow-up. It really deserves its own post, too, since all of it will be unfamiliar to today's readers.

* Judging from ubiquitous ads in comic books, teenage boys were eager to buy X-ray specs and pocket spy telescopes in order to see girls naked, or to learn the art of hypnosis in order to get girls to sleepwalk into their beds and get it on against their will. You didn't see lurid stuff like that in the teenage culture of the 1980s, because boys were more socially connected to girls, went out on dates, and got what they wanted.

** Is economic inequality reflected in dating-and-mating inequality? Hunter-gatherers don't have harem organization, while concubines and other forms of polygyny are often found among more stratified groups like large-scale agriculturalists.

November 10, 2013

The return of old diseases and pests with rising inequality

When the rich get richer and the poor get poorer, you'll see all sorts of new phenomena that reflect the rise in the ranks of very wealthy people, particularly if the phenomenon allows the person to engage in intra-elite status competition (a correlate, or even a cause of the inequality).

Like that show on MTV about which wealthy family could spend the most, and in the most original way, on the daughter's 16th birthday bash. As you might expect, the spoiled daughter was never grateful and often cursed the parents out or cried about how they had ruined her party. But making their daughter happy was never the point -- it was to step up the competition against the heads of other wealthy families.

How about at the low end? There may not be an upper bound to how much wealth you can own, but it would seem that you can't get any poorer than $0. Even if you count debt, most lenders only let you get so far into negative wealth before they cut you off. Can it go lower still? Yes, if we look more broadly at quality of life rather than only income or wealth.

Health is the most obvious place to look first. Now the lower bound on "poor" goes down to being dead. Before you hit the lowest low, though, you can develop all manner of sicknesses. Or contract them -- repeatedly contracting a disease will make your life miserable for a lot longer and more intensely than the slow, barely perceptible build-up toward a debilitating disease of old age.

Let's start with an example from infectious person-to-person diseases. Despite all that antibacterial soap and hand sanitizers, influenza isn't any less common than it was 15 years ago. That quick reality check tells us to be wary of technological explanations for trends in disease.

To make the point more forcefully, we need a case with data that go back a long time. Say, whooping cough (pertussis). Here is the CDC's page on trends in the disease, which contains the following chart:


Those three acronyms with arrows pointing down refer to the introduction of different vaccines, the first coming out in the late 1940s. Right away we see that they have absolutely nothing to do with the dynamics of the disease they're supposed to protect against. Incidence rises until a peak around the mid-1930s. The steady decline begins a good 10 years before the vaccine is even introduced, and its introduction does not accelerate the decline (if anything, the decline decelerates after that point, if you want to make anything of the introduction at all). Without any vaccine at all, the incidence had already dropped by roughly half as of the late '40s. Since a low point around 1980, the incidence has exploded by a factor of 50 -- despite not one but two new vaccines introduced during that time.

That graph does not track the state of medical technology; it tracks inequality. Why? Beats me, but it does. It's not access to health care either -- the graph shows cases, not dire outcomes like mortality. And again, the main preventative measure that health care could offer doesn't explain the rises and falls anyway. For crowd diseases, I lean toward blaming increasingly crappy living conditions -- not just in the private household, but the wider crappiness of the block and the neighborhood that the bottom chunk lives in. It's a return to the dilapidated tenements of the early 20th C., which were made into much nicer places during the Great Compression of 1920-1970.

You can fill in "increasingly lax regulation" for all these examples, to explain how the conditions were allowed to worsen. In general, welfare state policies and regulation go together with falling inequality.

Speaking of ramshackle apartments, whatever happened to that children's line about "Good night, sleep tight, don't let the bed bugs bite"? Well, the suckers are back. I checked Wikipedia's page on the epidemiology of bed bugs, and poked around the JAMA source. Bed bug infestation shows roughly the same dynamics as whooping cough, even though these are macroparasites instead of contagious bacteria. They began decreasing sometime in the 1930s, bottomed out sometime around 1980, and have been increasing since then. I never noticed any problem with them growing up, but exponential growth looks slow when it starts out -- the part where it takes off like a rocket happens a little later, during the 2000s in the case of bed bugs. By then, several major cities were freaking out.

Not knowing anything about the ecology of bed bugs, I'd point again to crappy living conditions as the breeding ground, although they may certainly spread out from there to better maintained places nearby.

Immigration also plays a huge role here, since if foreigners carrying bed bugs not only drop by but stay put in our country, there's little we can do. Immigration is another correlate of inequality (see the Peter Turchin article linked in the post below). After it was choked off during the 1920s, bed bugs began to disappear. As it began rising in the '70s, it wouldn't be long until bed bugs made a comeback.

Does immigration relate to whooping cough in the same way, being brought here and maintained by a steady influx of pertussis-stricken foreigners? Probably, though someone (not me) should look into that in more detail.

Finally, there are food-borne illnesses. We all remember the deplorable sanitary conditions detailed in Upton Sinclair's muckraking novel The Jungle from the Progressive era. I don't have any data from back then or the middle of the century, but given how little hysteria there seems to have been by the 1950s, I presume that the incidence of food-borne illnesses had fallen quite a bit.

What's the picture like since then? Unfortunately the CDC's data only go back to the late '90s. Several diseases are down, while Vibrio is way up, but the main story is Salmonella -- by far the leading food-borne illness. It shows no change one way or the other since the late '90s. Eurosurveillance, however, has some data on England and Wales going back to the late '70s, and ending in the late '90s. Aside from Shigella, they all show overall increases, including the two leading offenders, Salmonella and Campylobacter.

My hunch is that the causes today are what they were in the days of The Jungle -- crappy sanitary standards among food producers who cater to the bottom chunk (or those at the top too, for all I know). That's pretty easy to get away with, because most people who get sick won't report it, and won't know which of the various things they ate made them sick. Only if you let it really get out of hand will people complain, will the FDA or USDA figure out it was you, and will your product get recalled.

These three examples span a range of causes of disease -- bacterial crowd disease, macroparasitic blood-sucker, and bacterial food-borne diseases. Yet they all show pretty similar dynamics, matching up with inequality and bearing little resemblance to the state of the art in medicine. They are all symptoms of the deterioration of living standards as basic government oversight falls by the wayside for such areas (again, primarily the bottom chunk is no longer being looked out for). They also reflect the divisiveness that accompanies over-production of elites and intra-elite competition. No time to look out for the little guy -- let him fend for himself.

For our neo-Dickensian society, the simplest policy improvement is to shut off immigration, including legal immigration. Why not? They carried that out in the '20s and enjoyed a half-century of falling inequality and narrowing of political polarization. Of course other changes in the popular and elite mindset took place, but we ought to begin with the policy that is the simplest to think up and to implement.

Conservatives (or the handful of populist liberals who are still hanging around) should not get side-tracked by legal vs. illegal immigration. Both increase the supply of labor, and thereby decrease its price, i.e. wages. And it'll be asymmetric -- loads more competitors for bottom-chunk jobs, and few or none for top 0.00001% jobs.

Ditto for worsening the public health burden -- whether they came here legally or not, if they stay put, they're yet another potential spreader of whooping cough, another harborer of bed bugs, another food maker with shoddy standards. And they come from places with higher rates of those plagues than we already have to deal with here. Unlike their effect on wages, though, immigrants could very well infect the elites with their diseases, however indirectly, and albeit to a lesser extent than they would infect native bottom-chunk folks who live on the same block.

Final thought: I wonder if the 1918 Spanish Flu pandemic had something to do with the turbo-charged levels of globalization and open borders that reached their peak around that time. After the borders were slammed shut in the '20s, you didn't see a pandemic on that level in the Western world during the Great Compression. There was that flu "pandemic" in 2009, but that will probably go down as just the first in an escalating series of major outbreaks. We aren't right at the peak of widening inequality yet, so our return of the Spanish Flu hasn't materialized just yet. The future promises to be an interesting time.