August 31, 2013

When did white people stop listening to rock music?

What I mean is, when did black music overtake white music among white audiences themselves? The purpose here is to clarify the shape that white music tastes took after Baby Boomers and even some Gen X-ers stopped paying attention to popular culture. I've read or heard claims to the effect that white people these days have their own music, and blacks their own.

The most recent was the comments at this post at Steve Sailer's about the Ramones, punk, who it was that removed black influences from white music, and what's happened since. The end result being whites with really white-sounding rock music (indie), and blacks with really black-sounding rap music (gangsta / braggadocio).

It's easy to understand why the folks in these discussions tuned out of pop music during the '90s, early 2000s, late 2000s, and this decade -- it's all been downhill since the peak in the '80s. They probably tuned back in for a little bit during the mid-2000s, when there was a brief revival of styles from the late '70s and '80s. But that would leave a mistaken impression that those songs were the dominant ones of the time, or that they'd been around for a while before and have stuck around since.

That post-punk / new wave revival only left one hit among the top 20 on the Billboard Year-End singles charts -- "Mr. Brightside" by the Killers in 2005. The rest of the mid-2000s is Rihanna, Beyonce, Usher, Nickelback, Evanescence, and other boring 21st-century-sounding junk. That sound wasn't there at all in the '90s, and had already burned out by 2008.

So then -- when did white people switch over to mostly black music? The year-end charts for 1992 show rock music still doing well. In the top 20, we find two power ballads -- the stereotypical '80s sound -- in "To Be with You" and "November Rain". Heartland rock, made famous by John Cougar Mellencamp also in the '80s, is hanging on with "Life Is a Highway". Michael Jackson's representing pop rock with "Black or White". And alternative rock scores its highest hit ever, very early, with "Under the Bridge".

All of a sudden in 1993, rock disappears completely from the top 20. Everything is now R&B, rap, reggae, and new jack swing (a kind of mix between R&B and rap). The sole exception is a pop ballad from the Aladdin soundtrack, "A Whole New World". Things hardly let up in 1994, when "Wild Night" is the only rock song to crack the top 20. In '95, there's one pure rock song ("Always"), along with a Latin rock song ("Have You Ever Really Loved a Woman?") and the biggest blues rock hit since rock 'n' roll began -- "Run-Around" by Blues Traveler. The Gin Blossoms barely broke into the top 20 in '96 with the watered-down "Till I Hear It From You".

It's not until 1997 that whiny-dorky rock comes into its own, with "MMMBop" by Hanson and "Semi-Charmed Life" by Third Eye Blind both entering the top 20. That's about where it's stayed ever since -- a couple of dull rock songs at best, and mostly R&B, rap, and pop. And it's not uncommon for the rock songs to have a heavy rap, reggae, or other black influence -- especially in the late '90s and early 2000s with that whole rap-rock fusion phenomenon. Just outside the top 20 in 1999, Everlast scored another blues rock hit with "What It's Like". (To underscore the black influence on the white man's music, the album's title is Whitey Ford Sings the Blues.)

In hindsight, the mid-'90s hegemony of R&B, rap, and new jack swing seems like an overshoot of the "ditch the white music" impulse that whites themselves began to feel. That time in general was a great big divorce period from the New Wave Age and the Eighties in particular. When you're going through a divorce, you exaggerate how much you hate the other side, just to ensure a permanent break with no hesitation. Once you've been separated for a while, the hostilities die down somewhat, and you can at last occasionally speak to each other, though with the upper-hand party never forgetting to remind the lower-hand party that they're never going to be accepted back. Such was the fate of rock music over the past 20 years.

If this painfully awkward trip down memory lane serves to correct the Boomer perception of white kids listening to uber-white music these days (aside from a handful of SWPL indie nerds), it should also correct the misperception that Millennials and perhaps even some Gen Y folks have about this always being the state of affairs. You guys have no idea how culturally assertive and cohesive white America was back in the '80s. And it lasted right up through '92, even if that year represents a grinding-to-a-halt more than a final victory lap.

Let's end with a reminder of how healthy, bouncy, and CATCHY rock music still was in '92:



August 30, 2013

Reluctance to read new authors, '80s vs. the 2000s

This NYT article discusses the smashing success of the new book The Cuckoo's Calling -- which only started selling like mad once it was revealed that the author was none other than J.K. Rowling, founder of the Harry Potter colony.

Several sources are quoted to the effect that a book needs instant brand recognition these days in order for it to make it big time. They all avoid pointing their finger at today's conformist, risk-averse audiences, and treat it like some new constraint on the industry that came from who knows where, but that we've all just got to adapt to.

You see the same thing when music, movie, and TV people talk about the same phenomenon in their own industries. No one wants to see the big picture because blaming the consumer would be bad for business. Still, you'd think at least the critics would point this out -- that folks today are so disturbed by novelty that they won't take a chance on anything unfamiliar. It's not that they're afraid. More like suspicious, not trusting... but also OCD. "Wait, there's something new here -- that messes up the old inventory!" It's like being a picky eater -- OCD to the max.

We're supposed to live in a fascinating new era where so many options are at our fingertips, and everyone is dipping into this and then dipping into that -- a crazy world where no one commits to anything. In reality, people these days rigidly adhere to what is familiar, forever. The utter lack of novelty and variety, the never-ending reign of blandness, is a world apart from cultural life in the good old days.

To put some hard numbers on this, let's stick for now with book publishing (although the same approach will work for pop music or movies, and I'll get around to those later). We want a statistic that will capture how dominant the top authors are over time, vs. how easy it is for new authors to break into the big leagues. So we'll look at the Publishers Weekly lists of best-sellers by year. Dominance, hegemony, etc. -- these all refer to a certain author appearing time and again, or having multiple hit books in a single year.

Therefore, we'll look at a consecutive 5-year period, and take the top 10 best-sellers in each year. Then ask: how many distinct authors are represented among these 50 best-sellers? Or even better, how many authors had just 1 hit, 2 hits, 3 hits, and so on? Then summarize that distribution with something like the average or median number of hit books by an author of that time period.
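
The tally described above is simple enough to sketch in code. Here's a minimal Python version; the author names are made-up placeholders standing in for the real Publishers Weekly data, not actual figures from any period:

```python
from collections import Counter
from statistics import mean, median

# Hypothetical stand-in for the real data: one author name per top-10 slot
# across a 5-year window, i.e. 50 slots in all.
bestsellers = (
    ["Steel"] * 7 + ["King"] * 5 + ["Clancy"] * 3 + ["Michener"] * 2
    + ["one-hit author %d" % i for i in range(33)]
)
assert len(bestsellers) == 50

hits_per_author = Counter(bestsellers)            # author -> number of hit books
distribution = Counter(hits_per_author.values())  # hits -> number of authors with that many

print("distinct authors:", len(hits_per_author))
print("mean hits per author:", round(mean(hits_per_author.values()), 1))
print("median hits per author:", median(hits_per_author.values()))
print("hits -> authors:", dict(distribution))
```

Swap in the real (year, author) slots and the same four lines give the head count, the mean and median hits per author, and the full distribution.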

Ideally, I'd do this for every 5-year period in the data source. But this is just a pilot study to quickly prove a point (how much things have changed since the '80s). Let's contrast the period of 1984-1988 to 2004-2008. Only 20 years apart, not even a full generation. Mature industries like book publishing couldn't change very much in so little time, right?

Well, in the period centered on 1986, there were 28 distinct authors who wrote the top 50 books, while in the period centered on 2006, there were only 22. The average author in the '80s period could have expected to have 1.8 hit books, while his counterpart in the 2000s period would have had 2.3 hit books. Even the median, which is much more resistant to change than the average, increased from 2 books to 3 books per best-selling author. In the '80s period, only 2 authors had more than 3 hit books; by the 2000s period, it had risen to 6 authors who were that dominant.
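
Those two averages fall straight out of the head counts: with 50 slots split among the distinct authors, mean hits per author is just 50 divided by the number of authors. A quick sanity check:

```python
slots = 50  # top 10 best-sellers per year, over 5 years

for period, distinct_authors in [("1984-1988", 28), ("2004-2008", 22)]:
    print(period, "->", round(slots / distinct_authors, 1), "hit books per author")
```

That reproduces the 1.8 and 2.3 figures above.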

The maximum was, however, the same in both periods -- 7 hit books (Danielle Steel claimed the prize in the '80s, James Patterson in the 2000s). The mode (or most common) number of hit books in both periods was also the same -- 1 book. As in other domains where eminence has meaning, the distribution is geometric -- most have only 1 hit, fewer have 2, fewer still have 3, and so on, until there's only a small elite who have lots of hits.

So, indeed readers have become more reluctant to take a chance on new authors since the 1980s. It would probably look even worse if I chose the late 2000s/early 2010s as the end-point. You'd see the same thing if you looked at how many artists are responsible for the top 10 songs across a 5-year period from the '80s vs. the 2000s. And the same for how many movie stars were in the top 10 movies at the box office. There are so few fresh faces these days.

The fact that these trends span so many different industries means that we can dismiss any explanation that is idiosyncratic to an industry. My take is that this is just one piece of the larger pattern of the decline in novelty-seeking since the early-mid 1990s. That has a physical side, where people don't go out joyriding, dancing (with another person), having casual sex, or hanging out in public places anymore. But even private consumption shows the pattern, where anything unfamiliar is either suspicious or throws the existing order outta whack. I can't tell whether mistrust or OCD is the dominant factor here. Whatever the case, it goes to show that cocooning is not only a spatial phenomenon.

Finally, I suspect today's pattern would also show up during the mid-century, another period widely regarded as one of cultural stultification. The approach used here would let us figure out whether that stereotype is deserved, or whether the Fifties just gets a bum rap. I plan to look into this in the future, for all three major industries (books, music, movies) that have some kind of "best-selling" lists. Stay tuned.

August 28, 2013

Jews and the European civilizational fault-line

The last installment in this series looked at how the great civilizational fault-line in Europe affected the spread of Christianity. Basically, on the more hilly/mountainous/pastoralist side, it spread early and eagerly, while it took far longer to spread across the farmland of the Great European Plain, and was more violently resisted.

I'll get to the Reformation and the fault-line in a future post. For now it's worth taking a look at how the fault-line affected the migrations and persecutions of the Jews. Most of the big migrations of European peoples ended with the Germanic and Slavic movements of the Middle Ages. The Jews were never that huge in numbers, but they do provide a more recent case of large-scale migrations, allowing us to see if the influence of the fault-line holds even later.

Jews had settled northwestern Europe by the Middle Ages, becoming specialized in white collar professional work, generally of a financial nature (such as money-lending and tax farming). I'll be talking about the Ashkenazi Jews of northern Europe since that's who we think of when we talk about Jews in European history.

There were both push and pull factors involved in their migrations toward the end of the Middle Ages and into more modern times. The biggest push you can get is expulsion from the country. Here are some maps showing when the Jews were expelled from various parts of Europe, and where they went, up through modern times (the red-shaded map is of the Pale of Settlement):




Out of all those arrows, there are basically two movements -- one toward Poland, and another toward the Ottoman Empire. The latter are the Sephardic Jews, and not who we're looking at. (A small number of Sephardim also remained in Iberia, went to Italy, or settled in Amsterdam.) The bulk of the Jewish population, the Ashkenazim, were kicked out of one place after another, always moving a bit more toward the east and toward the north. Unless they were already very far east and somewhat south, like in the Ukraine, in which case they moved west and up north.

Why do the arrows tend to stop, or at least get a lot shorter, once they got into the European Plain? Why didn't they converge on Italy instead of Poland? Something about the ruling elite on the hilly/mountainous side of the fault-line made them willing to kick out the Jews, or at least to see little point in keeping them around, while the agrarian elites of the Plain were less hostile. In fact, the rulers of Poland and Lithuania welcomed them as an administrator middleman class.

I'm not sure the difference is so much between the elites in the west and east, but more among the common people whom they ruled over. Like it or not, that's what the rulers have to respond to, lest they wind up on the wrong end of a revolt or regicide. Celto-Germanic people are more willing to let it be known if some outside group is not welcome. They give them unfriendly looks, or plain old chase them out of town, dust their hands off, and consider the job done. We don't forget a face, so don't bother trying to sneak back in.

The pastoralist way of life affords considerable mobility, just in case you felt like chasing after somebody for a while, since your wealth is movable -- it walks on four legs. You can also leave your wealth alone by itself for a while as you spend the day making outsiders feel unwelcome. You do have to worry about raiders and predators if you leave your livestock unattended, but you don't have to worry about whether it can tend to itself otherwise -- the sheep are perfectly capable of grazing on pasture without you there. Just hand them over to one of your kinsmen to watch for the day.

Balto-Slavic people favor bottling up their emotions and opinions instead, and generally not being confrontational in personal affairs. Agricultural life is all about keeping your nose to the grindstone, so there's no time to make waves. Your land and crops need laborious tending to most of the day, day-in and day-out. If you wanted to set those chores aside and physically leave your plot of land to go chase off some unwelcome group -- someone else might be squatting on or taking over your land when you get back after a week, or your plants may be suffering from neglect because you couldn't bring them along with you, nor find someone to "plant-sit" for you because they would have their own plot of crops to laboriously slave over.

You need to hover over crops in a way that you do not need to for livestock, because you're not spoon-feeding the animals, or washing them or what have you, on an individual basis. You're mostly standing guard to ensure the safety of the collective flock. And one shepherd could stand guard for 1, 10, or 100 sheep. The more one-on-one tasks, such as milking, are relegated to women and boys. The farmer's need to hover severely limits his ability to do as he pleases and pursue other projects should the fancy strike.

In short, then, I think the rulers of hilly/mountain country sensed that having a foreign ethnic group administer the peasants as a middleman group would have provoked their rowdiness. Can't even collect the taxes yourself, or send one of your own to do it, eh? You had to send some foreign group who feels nothing but contempt for us to suck our blood on your behalf, did you? Sensing the potential for rowdiness among their commoners, the Celto-Germanic rulers figured it was better to administer their realm by themselves, as much as possible.

The Balto-Slavic rulers figured, meh, these hardscrabble peasants wouldn't upset their monotonous lives just to chase off the Jewish tax farmers. That's good for our security, since the peasants won't associate losing money with us, but with the Jews. They'll either bear the humiliation of having a cold foreign group take their money, or they might not even mind who takes it -- we ought to just keep our nose to the grindstone, don't cause a scene. Somebody's always going to be ripping you off, does it really matter if it's Jews or Poles? No time to worry about that -- back to sowing seeds.

These two opposite ways of dealing with their personal frustrations toward Jewish money-lenders and tax farmers led to drastic differences in how they sought to deal with The Jewish Problem as a group. On the Celto-Germanic side of the fault-line, they treated Jews like an acute infection -- a pesky group who needed to be sneezed out, and that would be that. Don't obsess about them afterward. There was very little virulent, bitter, enduring anti-Semitism in hilly/mountain country, and no feeling that the Jews needed to be hounded even when they'd left your territory, let alone tortured or killed en masse. Just kick them out and don't let them back in -- we know who you are and what you look like, and we'll be ready for you if you try to come back. Sounds pretty simple, doesn't it? Like a healthy, normally functioning immune system.

Well, that assumes that you have a certain comfort with standing your ground, which pure farmers do not. The phrase "long-suffering" comes to mind. For them, the feeling slowly but steadily builds that the Jews are a bloodsucking foreign race, and if we could only get rid of them, we'd finally be able to breathe free. But we're tied to the land, and they aren't -- they're white collar professionals and can roam to whatever group of nobles will pay for their services. So what's to keep them from just coming back to suck our blood again even if we did manage to chase them out of town in the first place? I guess we'll have to really hit them hard so they don't ever feel like coming back. The longer they bottle it up, the more cataclysmic the explosion when it finally gives.

Now the immune system has switched over from run-of-the-mill expulsions and beatings, to the more aggressive all-guns-a-blazin' kind of immune response -- the kind that could go haywire and cause septic shock if it's not careful.

And sure enough, the Plain of Europe was the main stage for the history of pogroms. Sporadic mass violence against Jews was not unattested on the hilly/mountain side of Europe, especially in the Rhineland, where a sizable Jewish population had remained uncharacteristically west/south of the Plain. But it was more like an occasional chasing-out than a momentous rising-up-against. In the Plain, it was more bitter, long-lasting, and took the character of an apocalyptic mass revolt against their oppressors, more or less from the 17th century through the 20th. Knock them off their pedestal for good.

The greatest enthusiasm for the Holocaust was not surprisingly throughout the Plain, including the Baltic and north/east Germany / Prussia. Western and southern Germans were not very eager for the Nazis to rise to power, and were the main base of opposition -- another reflection of the great fault-line. Here's a map showing the percentage of each region's Jewish population that was killed in the Holocaust:


So far these are all influences of the fault-line on the host populations in their treatment of Jews. What about the inclinations of the Jews themselves? Again, when they were kicked out of England and Germany, why didn't they head gradually toward Italy and the Balkans? Why didn't they find Serbia as attractive as Poland? Both are "Slavic" nations, after all. But one is mountainous and pastoralist, while the other is on the Plain and agrarian.

If they sought to continue their unusual levels of wealth as financial middlemen and administrators, they had better seek out a large, sprawling, rigidly hierarchical economy. And you tend not to find those in hilly/mountain country. It's more likely to be located in the large-scale farming systems of the fertile Plain.

Also, middlemen minorities would prize protection from mob violence and retaliation as a job perk from their employers. That assumes more of an elite monopoly on violence, and that degree of concentration of power tends to be found in more hierarchical agricultural societies. Central authorities are more impotent in hilly/mountain areas where the masses herd livestock and are more mobile during the year.

Just imagine how Jews would feel settling backwoods Kentucky as tax farmers. Now you know why they flock to massive-scale, hierarchical cities like New York instead. There, they'll have a leader with concentrated power, whether Gentile like Giuliani or one of their own like Bloomberg, who can more or less suppress group-against-group feuding, even if it takes a while, as with suppressing the mafia and later black gang warfare. Opportunistic, individual crime may still occur, but that's a risk everywhere -- what the middleman minority is really sweating about is that an entire group might band together as a mob and advance on their community in its entirety.

These various push-and-pull factors, operating on both the host and Jewish societies, seem to be most easily explainable as consequences of the fault-line between hilly/mountain country and the Great Plain in Europe. Lots of ink has been spilled over the topic of European Jewish history, but shockingly little of it has drawn attention to the differences between (agro-)pastoralism and sedentary agrarian farming that have conspired to lead history on the course that it has taken.

August 27, 2013

Chinese morality: "Trafficker Gouges Out Boy's Eyes," "Teenager Sells Kidney To Buy iPad"

Reports here and here. Like I said before, East Asians don't feel disgust very strongly, and that removes one of the most powerful checks on sick and immoral behavior.

This isn't serial killer stuff, where the criminal is obviously fucked in the head. It's just some enterprising middleman who's simply doing his best for the greater good by raising the supply for a high-demand commodity. If that commodity happens to be human corneas, and if that means he has to gouge out the eyes of a 6 year-old child after drugging him out, leaving him for dead afterward, well, y'know...

It take tough person to survive in China! No eye build strong character! In fact, YOU pay ME for make your kid so strong!

Re-reading the report, it turns out the criminal was probably a woman. Newsflash / reminder to all you chink-loving HBD nerds out there and your fantasies about sweet, feminine Asian women (LOL).

As for the kid who sold his own kidney, notice the hard-luck story about how "The teen was from Anhui, one of China's poorest provinces, where inhabitants frequently leave to find work and a better life elsewhere." Poor baby, had to suffer the disgrace of not being able to diddle an iPhone and an iPad in public. But, y'know...

Apple very prestigious in China! No have Apple toy, very shameful! Dishonor family spirit! I make sacrifice to please ancestor -- unlike selfish American!

The notion that we have anything to learn from this irreparably rotten race is one of the most bizarre mass delusions in human history. China has always been a miserable shit-hole, and will stay that way. The only relationship we ought to have with it is one of quarantine.

August 26, 2013

Alarm clocks also switched to dark text on light background

Speaking of how computer screens now usually look like dark text and pictures against a light background, alarm clocks have made the same switch. They tend to look more like this these days:


Just like the dark-on-light scheme for computer screens, this looks a little too calming, soothing, and tranquilizing -- not how we ought to feel when trying to stir awake. People in today's Prozac nation must not feel like getting out of bed in the morning.

Plus you can't see it in the dark, i.e. when you need it most, before and after going to sleep for the night. They usually have a button that will temporarily light it up, but it only lasts five seconds and you must be nearby to activate it. I remember when you could tell time from across the room during the night and early morning, with no room lights on.

We need something more stimulating, like the bright text against black background that we saw on computer screens from the '80s. Vintage alarm clocks to the rescue. As with the old computer screens, notice again how futuristic and technological these old clocks look.


Nothing like a little neon in the morning to open your eyes up. Just enough of a jolt to get you out of bed. They come in the same range of colors that the old computer text came in -- red, amber, green, and blue.

Even the non-digital clocks with the flipping numbers had a white-on-black scheme, when they could just as easily have done black-on-white. Here are Marty McFly's clock from Back to the Future, and the one that kept waking up Phil Connors in Groundhog Day:


Alarm clocks from the '70s and '80s made it even easier to wake up ready for your day because they often came with a radio, and you could have your favorite station switch on instead of those annoying clanging and beeping sounds from before. Of course, that was back when the average pop song could put a little pep in your step. No point in waking up to Fergie or John Mayer -- might as well go back to sleep. Who knew that turning on the radio would become the same as hitting the snooze bar?

Product designers got ever more creative and ambitious during the '80s. Here is an alarm clock / radio / cassette player / telephone by Soundesign:


Looks cooler than a Zen minimalist iPhone, without trying hard to whore for visual attention. It's got an interesting two-tone contrast, and a couple points of bright blue and orange, with red light from the clock. Neat and eye-catching without looking self-aware as a design object, like you'd find today at Target.

Dark-on-light computer screens are making us less productive than we could be, and dark-on-light alarm clocks don't exactly make our eyes feel energized first thing in the morning. When you need something more engaging and stimulating, it looks like light-on-dark is the winner yet again. It's nice to find this in a totally different domain of life from typing in front of a computer screen, since it shows how general the result appears to be.

Fortunately for those who want to change back to the more effective way of being woken up, mainstream retailers still stock the light-on-dark types. Walmart's website shows that of the top 20 best sellers for digital alarm clocks, half are dark-on-light and half are light-on-dark. People may have changed their use of computers from productivity to goofing around, shifting all color schemes to dark-on-light. But alarm clocks still need to wake you up in the morning, so the more stimulating light-on-dark kind is still easy to find.

August 25, 2013

Dark-on-light color schemes making computer users less productive and creative

I was checking out some pictures of old computers, and started to recall how common it was for them to have a dark background and a light-colored foreground, usually text characters but also images. This is what comes to mind when I think of the first computers we learned how to use at school (click any image for not-so-squinty resolution):


Sure looks more futuristic and computer-y than what we have these days. The dark background and brightly colored light in the shape of text and images calls to mind neon lights at night. Just more stimulating and engaging to look at, while the light background with dark text / images feels more soothing and tranquilizing -- like you feel when reading a book, browsing the newspaper, etc. More leisurely than purposeful, more receptive than active, and so on.

I wonder if this difference between stimulating vs. tranquilizing color schemes matters for how well we use computers. We ought to check people whose livelihoods depend on using them, such as coders. Here is a poll confirming my hunch (not being much of a coder myself) that they overwhelmingly prefer a dark background and light-colored text. We don't need to know why, just that they do -- something about it works better.

Back in the '80s and even into the '90s somewhat, ordinary people used computers to be productive and creative, whether at work or at home. They've since changed gears, treating computers more like toy gadgets to soothe their boredom during leisure time, or distract them at work (and in public) while still at least keeping their brain online.

And sure enough, in the good old days it wasn't only the command line interface that featured a dark background and light text, but the major productivity programs of the day. They could have chosen to use a lighter background and darker text similar to today's scheme, but uniformly adopted the opposite combination. Below you see what the most widely used spreadsheet and word processing software packages looked like for the user. They are Lotus 1-2-3 and VisiCalc, and WordPerfect and Microsoft Word.


As people have come to use computers more for goofing off, the color scheme has been turned inside-out. The more I think about it, the more it does feel like the average computer experience puts your mind in a more passive and tranquilized mode, like when you're reading the newspaper for leisure.

But some people are still getting paid to use these things for productive work. Aside from all-purpose coders, there are the finance whiz kids (or wannabes, at any rate) who use the Bloomberg terminals:


Pretty '80s looking, right down to the two-tone keyboard design, which also features a variety of bright-colored keys. If you're a quant big-shot, you need to stay awake and visually engaged. The pure desaturated Zen minimalism of the typical Apple set-up would put them to sleep.

I tried reconfiguring Word or WordPad to display a black background, but to no avail. The best you can do is get a single row to show black, not even the entire writing area, let alone the margins (which cannot be darkened). So don't blame me if my posting seems beneath my abilities -- it's this damned black-on-white composer that I have to use.

August 24, 2013

Independent record labels thrived in the 1980s

A while ago, this post drew attention to a fact that many who study Hollywood quantitatively have known for a while, but that has gone completely unnoticed among non-specialist audiences -- that the mid-1980s saw the peak of the independent film studios, 180 degrees away from the hegemony of the majors during the mid-century and again during the Millennial era.

Was there something similar in recorded music? There must have been. Pop culture had a real homespun quality to it in the '80s. Consumers were more willing to take a chance on unknowns, and the producers were more willing to give rookies the benefit of the doubt. There was no "brand loyalty" like you see to Apple these days, or IBM in the mid-century. Folks just weren't hung up on silly status contests like signaling which corporation you got your stuff from. As long as it was enjoyable, who cares where it came from or how much it cost to make, right?

It turns out that judging whether a record label is major or not is a lot harder than for the movies. I'm going by this history of record labels, and what it considers the big ones around the time that the songs were released. I chose the year 1986 because that was the single most fortunate year for independent film. Here is a list of the Billboard top 100 singles for the year, of which I'll study the top 20 to save time and space.

Sure enough, songs produced by independent labels were chart-toppers back then. Arista, Island, and Virgin seem to have been the record label counterparts of Orion, Carolco, and Touchstone in the movie industry. Nobody cared if there wasn't immediate "brand recognition" -- whether for the label or for the artist. For example, Mr. Mister were more or less unknowns when their album Welcome to the Real World came out, yet that didn't stop them from enjoying two spots in the year-end top 20.

That album was produced by RCA Records well after their heyday as a major, and before they were eventually folded into the mega-label that would become BMG. Plus they chose a pretty dopey name for a band, though again listeners were too carefree to worry much about weird band names back in the '80s (like Orchestral Manoeuvres in the Dark).

Below are lists of songs by major labels (7) and independent labels (12). The one uncertain case is Prince's song "Kiss," recorded for his vanity label Paisley Park, which was partly funded by Warner Bros. -- I'm not sure how heavily funded it was by them, how much say they had, etc., so who knows exactly where it goes.

Majors

"That's What Friends Are For" (Warner Bros.)
"I Miss You" (MCA)
"On My Own" (MCA)
"Party All the Time" (Columbia)
"Glory of Love" (Warner Bros.)
"West End Girls" (EMI Music / Parlophone)
"Never" (Capitol)

Independents

"Say You, Say Me" (Motown)
"Broken Wings" (RCA Records)
"How Will I Know" (Arista)
"Burning Heart" (Scotti Bros)
"Kyrie" (RCA Records)
"Addicted to Love" (Island)
"Greatest Love of All" (Arista)
"Secret Lovers" (A&M)
"Friends and Lovers" (USA Carrere)
"There'll Be Sad Songs (To Make You Cry)" (Jive)
"Alive & Kicking" (Virgin)
"Higher Love" (Island)

Uncertain

"Kiss" (Paisley Park)

I suspected that the indies would have done well, but not that they'd have nearly twice as many top 20 hits for the year as the majors. If you look at the major label songs, on the whole they're definitely not as cool and fresh-sounding as the indie ones either -- Dionne Warwick and Eddie Murphy vs. Whitney Houston and Steve Winwood. Get real.

As with independent movie studios, the independent record labels bit the dust as the '90s began. They're almost all now subsidiaries of the Big Three -- Universal, Sony, and Warner. Indeed, turning to the year-end singles for 2012, we find only two from indie labels: "Somebody That I Used To Know" (Eleven) and "Set Fire to the Rain" (XL). All the others are controlled in one way or another by the Big Three.

The brief history of record labels linked to earlier says that indie labels began gobbling up market share when rock 'n' roll first blew up in the late '50s. My take on that is that the mid-century was generally a period of domination by the majors, closer to today's climate than to the free-wheeling Eighties. The rock music explosion was one of the first pivotal shifts away from that whole zeitgeist of hive behavior and mass society.

When people come out of their cocoons, they don't demand the security of mega-corporations regulating society's affairs. They can accomplish a lot of that on a more local level, and in a more face-to-face fashion. Big business is still there, but the enthusiasm for it has deflated and it's kept on a shorter leash than in cocooning times, when people outsource social regulation to corporate efficiency experts rather than have to interact with others themselves.

Critics often use the term "indie" to refer to a style rather than an economic organizational stance. It's the style of music or movies that defiant or independent-minded artists are supposed to make, in the critic's view. But in real life, it turns out the opposite -- truly free-spirited folks like Whitney Houston, Steve Winwood, and the makers of Platoon and RoboCop would rather engage the audience and give them something catchy, enjoyable, or memorable to hold onto. Not emotionally muted, whispery-mumbly songs that have no musical motifs or instrumental solos. Borrring...

August 23, 2013

YouTube vs. MTV -- targeting vs. browsing

Here is an NYT article about the rising role of YouTube videos in driving the sales of albums and singles.

The message you're supposed to take away is that YouTube is removing the "gatekeepers" or middlemen like MTV program directors, and letting listeners directly decide what to watch and listen to. But it sounds like sales still rely mostly on what radio gatekeepers play:

A wildly popular video on YouTube, besides generating an additional stream of revenue for labels from ads that precede videos, often persuades radio programmers to champion a song, which in turn spurs sales of albums and singles, music executives say.

Listeners have to get their initial exposure from somebody, and the ordinary person is not the ultimate prime mover -- they haven't listened to hundreds or thousands of new songs, or watched their videos. Most people wouldn't want to bother with that -- let somebody filter through all that and give me a rough list of things I might like, and then I'll check those out. Probably I want a screener with similar tastes to mine.

YouTube is not a screener, but a virtual warehouse. You go there and search for a video that you've already been tipped off to by a screener such as some internet celebrity whose recommendations you tend to check out, on the basis of having similar tastes. Then you might recommend it in turn to your Facebook feed, your Twitter feed, your whatever feed.

So, YouTube is to MTV (and radio) what Amazon is to Barnes & Noble. A brick-and-mortar store has a condensed inventory compared to Amazon, and it features recommendations all over the store. It more heavily promotes books that have been favorably reviewed by outside critics. And you navigate it in more of a browsing way, rather than going there to buy a particular target book.

At Amazon, the selection is too vast to look through on your own. You go there when you have a specific book in mind. They make some attempt to promote books that have been favorably reviewed by outside critics, but only half-heartedly. They know that you've come for a specific book -- yet a book that you were tipped off to by some outside source, not one you found yourself (unless you work in publishing).

Like Amazon, YouTube has too many videos to browse through. Anything that uses a search engine is unbrowsable -- it is for targeting a specific item among a staggeringly vast selection, and at most wandering in a small, fixed radius around the initial target (you searched for X, so you might also like Y). You go to YouTube to look for a video that you've already been tipped off to by a cultural gatekeeper (they're still there, only online). And you don't get exposed to much that's far away from your initial tastes.

MTV, back when it used to show music on television, was more like Barnes & Noble. They did not "force feed" you anything -- if you didn't like the video, you changed the channel, simple as that. The selection they chose was based on what the screeners thought would resonate with the audience. Screeners came both from within the channel (like B&N's staff picks) and from outside (like book critics in major newspapers).

Most importantly, you were able to browse a wide selection and discover songs you never would've found if you had only used the channel in an on-demand fashion. MTV, like B&N, had to choose its selection based on catering to a wide audience. There was no search engine, so the results were not micro-tailored to any individual's wants. You were a dance fan, so you tuned in for Madonna, that video from Flashdance, and that video from Footloose, but there are only so many of those they can play before it becomes a dance-only channel and alienates most viewers. (And imagine if B&N only stocked books in the gay vampire genre.) So you also saw videos that introduced you to heavy metal, R&B, heartland rock, new wave, synthpop, college radio, and so on.

Sure, you didn't keep the channel on as often if it was showing music from a style that you don't care for, but it's not that hard to just leave it on and give it a chance every now and then.

Even my dad, who has always been stuck in the mid-1960s for music, got turned on to "Bette Davis Eyes" in the early '80s, probably from MTV as well as radio airplay, and regularly played a tape of Kim Carnes' greatest hits along with the Stones, the Beatles, the Byrds, Creedence, and his other favorites. That never could have happened if we only had search engines to demand that they play specific songs that we already knew about and liked.

The NYT article also goes over the stylistic trends in videos now compared to at other times, but that's a topic for another post. The main point is that these tiresome, recurring gushes about Eliminating the Middleman are always bullshit. The middleman is there because you don't have the time or willingness to sift through everything yourself. Cultural gatekeepers will always be with us, and the shift from analog to digital media has not changed that fact. What the defining qualities of the gatekeepers are -- that's more important than whether they're there or not (and a topic for another post, since it appears to reflect the zeitgeist rather than the medium).

What has changed is the browsing vs. searching method of consumption. Analog is harder to search, digital much easier. Each method has its own benefits and costs, but in our revenge-of-the-nerds culture of fanboy geek-outs, people only emphasize the costs of analog and the benefits of digital.

Here's to songs that you would never have imagined would suit you perfectly, if you'd only stuck to the bespoke, custom tailoring of cultural search engines.



August 22, 2013

The spread of Christianity and the great civilizational fault line in Europe

An earlier post took the main split in Europe to be between the more sedentary farmer regions and the more pastoralist (livestock herding) or agro-pastoralist regions. The focus was on personality or character traits, and how those influence larger-scale social and political organization. A follow-up post looked at how these differences have influenced the history of cultural achievement in Europe.

Subsistence mode is strongly influenced by geography, in this case between the fertile European Plain, where you can thrive by planting crops, and the more hilly and mountainous region to the south and west of it, where you can't plant too much but where pastoralism can flourish. Here's a topographical map showing the Plain, which begins as a somewhat thin strip on the northern edge of the continent in the west and central parts, and then fans out to the south also as you move eastward into northern Germany and the Baltic area (click to enlarge):


I think of that division between the Plain and hill/mountain country as the great civilizational fault line in Europe.

Herders and farmers lead fundamentally different, in many ways opposite, ways of life, and have generally been in an off-and-on state of war against each other ever since pastoralism emerged as a splinter movement from agriculture. In short, pastoralists have mobility and a greater taste for risk on their side, so they make good raiders; farmers can use their sedentary ways to their advantage by building fortifications or other huddling-together defenses, and advancing slowly but surely through putting down new settlements.

This clash of civilizations flares up all over the world and throughout history -- the Steppe cowboys of Central Asia vs. the farmers of the East Asian plains, the desert nomads from Arabia vs. the agrarian civilization along the Nile, the Han Chinese rice farmers vs. the Tibetan yak herders, the Tutsi pastoralists vs. the Hutu farmers in Rwanda, etc.

Less remarked on is the European incarnation of this clash. Nisbett & Cohen's book, Culture of Honor: The Psychology of Violence in the South, applied the basic approach to the United States, contrasting the Celtic pastoralist groups of the hilly and mountainous parts of the South (i.e., excluding low-lying plantations) against the more Anglo farmer groups of the agricultural and industrial North.

But in the European context, it has attracted surprisingly sparse mainstream attention. I haven't previously read about the connections I'm going to draw in this ongoing series (and I read a lot), but perhaps the sources are obscure, not in English, and/or out of academic fashion. My goal is more to get the ideas out there than to claim originality (though I still hope they're original).

Let's move on from personality and social organization to cultural institutions like religion. Here is a map showing the Christianization of Europe from early to late adopters:


Not a bad match with a topographical map, is it? (Remember that Scandinavians live in the low-elevation, farmer-friendly areas in the very south, not in the great mountains of Norway.) Christianity began among pastoralists in hilly/mountain country in the Near East, and quickly spread to other peoples who followed a similar subsistence mode around the Mediterranean.

The interesting part is its spread north of the Mediterranean European coastal areas. It did not simply move northward, or merely take longer to reach the more northern latitudes. Rather, it hit the most pastoralist regions first, including Ireland and its off-shoots in Scotland, which are well north of -- and across a sea from -- such continental areas as what was recently called Czechoslovakia. Moreover, it didn't simply move in a straight line from the continent -- it didn't catch on in England at first and then Ireland and Scotland, but the other way around. The latter countries sent missionaries to convert the English. That's because England's lands are more farmer-friendly, while pastoralism has found greater success in Wales, Scotland, and Ireland.

The areas that took the longest to Christianize are in the Plain off to the northeast -- the Finns, the Baltic peoples, and the purer, more northeastern Slavic peoples (i.e., the ones we usually think of as prototypically Slavic, not the hawk-nosed feuding pastoralists of the mountainous Balkan countries, who happen to speak Slavic languages).

The greatest test case is the family of Germanic language speakers. Some were very early adopters -- for example, the Goths who invaded Rome had already been Christians for over a century. The Visigoths went on to rule Spain, at first as the Arian Christians they had started out as, but converting to Catholic Christianity by the end of the 6th century. The Lombards, the Burgundians, and the Alamanni, who settled much of what is now northern/central Italy, southern/eastern France, southern Germany, Austria, and Switzerland, were other early adopters. Most important were the early-adopting Franks, who went on to control France and the Rhineland.

Suddenly when you get into the more Plain-influenced regions of Germany in the north, people not only take longer to adopt Christianity, but they bitterly resist conversion until they're militarily defeated by a Christian army. The result of the Christian, Frankish king Charlemagne's victory in the Saxon Wars was a shift -- unwanted by the people -- from Germanic paganism to Christianity in northern Germany. Their Anglo-Saxon relatives in farmer-friendly England also took a little longer to convert, as mentioned before.

The Scandinavians are really part of the broader pattern of reluctance to join the Christian world that you also see among the Baltic and (purer) Slavic peoples nearby, none of whom converted until centuries into the 2nd millennium. And unlike the eager Germanic speakers from the southern/western part of the continent, the Scandinavians don't appear to have been very enthusiastic during the initial conversion process. Maybe they were thinking, "If you can't beat 'em, join 'em."

In the first post laying out the basic ideas, I drew attention to blond hair as a salient marker for Peoples of the Plain, and whatever personality traits it correlates with. * Without even understanding the reasons why, it is probably no accident that the blondest part of Europe was also the last hold-out against conversion. Lithuanian nobles did not convert until nearly 1400, and the common people even later after that.

I'm going to leave this at drawing a connection rather than trying to explain it. For one thing, I haven't given enough thought to why farmers wouldn't dig Christianity, while (agro-)pastoralists would feel a more natural affinity for it. And for another, it would be wiser to wait until I'm through with the rest of Christianity's history in Europe, to be better able to tie the whole story together.

As you may have noticed, the split among early and late adopters of Christianity looks suspiciously familiar if you know your maps of the Reformation and the Thirty Years War. Or of higher vs. lower levels of religiosity even today. But there are some interesting wrinkles to be pointed out as well, so I'll save that for another post. For now, take note of how deeply rooted these cultural schisms are.

* Pleiotropic effects are common when natural selection makes a quick, large change, as when animals are domesticated. The target of selection is some personality or behavioral trait, but the gene that influences it also influences some random aspect of appearance -- like floppy ears on dogs, or piebald coat patterns on cows.

August 20, 2013

Children not enculturated to know their nursery rhymes anymore

Some thoughts in the previous post, about contemporary children's lack of familiarity with their folk's culture, reminded me of some baby pictures I found at home recently, showing me engrossed in a book of nursery rhymes. This is Philadelphia, around 1982 or '83, when I'm about 2 years old:


The left page shows the drawing for "The Bunch of Blue Ribbons" ("Oh, dear, what can the matter be?"), and the right page shows "Hot Cross Buns" at the top, and "Bobby Shaftoe" at the bottom. Here is an online copy of the book, including the pictures that must be in the public domain by now. The original book came out in 1916, during the "golden age of illustration" of the Edwardian and Jazz Age periods.

BTW, in the early '80s Philadelphia was only 50-55% white, and 40-45% black. Yet somehow kids still learned their European nursery rhymes. Pointing to unfavorable demographics these days is usually a rationalization for not teaching children the culture of their people. "Whaddaya gonna do in this majority-minority society?" Racial demographics make little difference; it's the zeitgeist that determines whether adults want to enculturate the young generation in a certain way or not.

I actually bought that Real Mother Goose book for my nephew last year, without even realizing that it had been the one I'd grown up with. Something about it stood out from the other nursery rhyme books. It must have been an unconscious memory of the drawings, which I probably studied more than the text since I couldn't really read at that point.

Unfortunately my nephew didn't really dig it, and his mother shows little interest in passing along folk culture to her kids. She's not atypical either -- if you watch any of the many reality shows that capture family life, parents don't teach nursery rhymes or folk songs to their children, and children never sing them spontaneously. You also don't see folk games being passed on, like thumb wrestling or freeze tag among boys, or hopscotch and jump rope among girls (let alone the rhymes that go with it).

My mother used to record herself and us kids with a microphone and cassette recorder, and send the tapes to her relatives or my dad if his ship was out at sea. There's the usual stuff about what I'd done that day at preschool, what was for lunch, etc., but there was also a part where she'd begin a nursery rhyme and have me finish it, or sing along to a tune that they'd taught us in Sunday school, like "I've Got the Joy Joy Joy Joy."

She says I had most of that Real Mother Goose book memorized. Looking over all the entries, some of them have stuck more than others, but there were a lot that I wouldn't have recalled off the top of my head, but came right back when I started them. Verse makes it easier to remember, using rhyme, meter, alliteration, and other verbal ornamentation.

Parents in the Millennial era sense that those stylistic techniques make for a more group-oriented culture -- everyone can follow the beat, memorize the lyrics, and feel the same pleasure when words rhyme or show alliteration. Prose narratives do not allow the listeners or readers to synchronize themselves in the same way; what they share is the propositional content (who did what to whom, when and how), the distilled lesson or message, and emotional reactions -- but not in a synchronized way.

Verse also lends itself to dancing, which is a kinesthetic form of synchronizing individuals into a superorganic group. In particular, circle dancing -- "Here we go 'round the mulberry bush, the mulberry bush, the mulberry bush..." "Ring around the rosies, A pocket full of posies..." There's no leader/follower dynamic, and it's not individual or pair dancing that you see in courtship displays / lekking. It's just meant to glue the group together and make them forget themselves.

In periods of cocooning and suspicion of all others outside the nuclear family, it would disturb helicopter parents to see their child synchronizing himself with another child, let alone a group. It's one step closer to crowd or mob influence, and there goes all your hard work to shape him precisely the way that you want him to turn out.

When parents are not in smothering mother mode, they figure it's a fool's game to try to micro-sculpt their child, and that it's not only inevitable but important for him to become influenced by peers and belong to a larger folk culture above, beyond, and outside the nuclear family. So they're not only happy but eager to teach their kids such songs, dances, and games, so that they can hit the cultural ground running when they go off to school.

Then there's the subject matter -- some of it is pure nonsense, which children find funny, but you also get your first exposure to the topics of hardship, death, violence, love, courtship, and marriage. You're born sensing that these themes are important and need to come up in your development at some point. If they aren't, and your parents only let you watch Nickelodeon and the Disney Channel, you come away with an understanding that such things are not to be talked about, and that bringing them up is awkward.

Awkward...

The rapid disappearance of all of these things over the past 20 years is part of the broader trend toward the Asianization of Indo-European culture. You don't see them teaching their children folk songs, dances, and games, and encouraging them to join in such activities with the other children. Learning to read and write only serves the purpose of studying, doing well in school, and earning more money as an adult. If they want relief, they're allowed their little distractions in isolation from others (these days, TV, internet, and video games -- back in the mid-century, radio, TV, and comic books).

Their joyless, fragmented, nose-to-the-grindstone culture is not fit for us. It might work for Baltic, Slavic, and Nordic groups, but not for Celtic, Germanic, and Italic groups. Blonde Europeans may be enjoying today's revenge-of-the-nerds culture, but it leads to alienation for everyone else.

August 19, 2013

Helicopter parents crushed by the costs of their own suspiciousness... uh I mean of finding "quality child care centers"

Here is an NYT article in a series about rising inequality in America. The gist is that the wealth gap between men and women is increasing due to mothers choosing to spend so much of their incomes on child care.

I guess the male-female gap is a form of inequality, though the article gives the impression that mothers of all social classes are behaving this way, so it wouldn't appear to widen what we normally think of as inequality. In fact, since the higher-class mothers are spending more than it costs to send a kid to college for the same length of time, while poor mothers are not (and are getting government subsidies to boot), you'd figure this might actually reduce wealth inequality between classes.

In any case, the author does not bother looking into why it's so expensive these days to send your kids to a childcare center. She and all the women interviewed come from Generation X, so they have direct personal memories of how affordable it used to be when their mothers were mothering them. Or not-so-mothering them, as the case may be -- back in the '70s and '80s, your mom dropped you off at the nearest daycare center.

It was run by 30- and 40-something women who needed no special training or certification. They went simply on female maternal instinct and on the common sense of the time, which recognized that children have their own internal developmental guidance systems that allow them to adapt to and master their environments without constant direction by grown-ups, as though they were merely a fleet of remote-control cars. All the workers were required to have was a reliable work ethic, but that's not so hard to come by.

Education in daycare centers was minimal since kids will absorb whatever they need to when they begin grade school, and it doesn't take very long to teach them the basics like the alphabet, arithmetic, and so on. The focus was more on enculturation -- learning the childhood songs of your folk, the games they play, and what kinds of tales they tell -- as well as on socialization through peer interactions. All of that is incredibly cheap -- for socialization, just throw the kids together and let them run loose, and make enculturation from adults interesting by reading them stories in the right kind of voice, playing melodic music to sing along to, and so on.

Parents in those days did no research and relied on no academic studies to ground their choices. They just responded to their intuition, which told them that it's not something that they, the children, or the daycare workers need to obsess over in order for the kids to grow up normal. And sure enough, we grew up into socialized and enculturated adults.

In contrast, kids these days grow up socially awkward and ignorant of the tales, songs, dances, jokes, and games of their people's culture. Even worse, they're not inventing their own to replace the old ones, but rather aging without a strong sense of cultural belonging at all. Including in all-white areas, where there is zero competition from alien cultures. *

What makes the costs of child care so astronomical these days is the profound paranoia that parents have toward the Outside World, i.e. every individual or group outside of the nuclear family.

If only their distrust were focused on certain individuals or certain groups, then they could just avoid them and go with those they trust. But their suspicion is so broad-brush that only one in a million people will be trustworthy enough to put their anxieties to rest (...or at least to rest enough to keep them out of a constant state of panic).

And if only a tiny handful of parents were so paranoid, they wouldn't face much competition over the supernannies.

But econ 101 says that soaring demand for an increasingly minuscule resource will send prices through the roof. The simple solution is for mothers to get a hold of themselves, drop their paranoia, and send their kids to the kind of daycare centers that they themselves got dropped off at, and that led to their own healthy and normal development. Suddenly the supply of trusted caregivers expands to include any 30- or 40-something woman who hasn't been to prison and can hold down a steady job.

The same goes for their individual caregivers like babysitters. If nobody trusts anybody, then mainstream babysitters will vanish -- "I'm not going to trust some perfect stranger, a 16 year-old girl no less, with the care of my special little snowflakes." Well then have fun shelling out 50% of your income to hire a supernanny, you dingbat old broad.

Back in the good old Reagan years, any middle or high school girl who was minimally responsible could get a job watching the neighborhood kids. Not because sixth-grade girls used to take advanced child care classes as part of their schooling, but simply because people were more willing to trust strangers, to give young workers the benefit of the doubt when it came to their lack of work experience, and in general to just take a chance on people from your community.

The same goes for trusting other mothers to look after your children. That went without question in the '80s -- innocent until proven guilty. That's why we spent so much time at our friends' houses, during the day and at sleepovers. That's an everyday form of someone else's mother looking after you. Now, even if you do eventually set up a mothering "co-op," you spend two years vetting the potential allomothers. They're assumed to be ruinous influences on your child until they can clear themselves of the charges.

Again, this costly and stressful state of affairs is entirely caused by the mothers' own paranoia about the Outside World. So they are in no position to complain, let alone to demand that taxpayers begin subsidizing child care centers -- particularly when those centers will cost a fortune -- more than college -- from only being staffed by a thin supply of the supernanny elite.

You'd think this would be the obvious conclusion from the Gen X author who grew up just fine herself without enrolling in Harvard for Half-pints. But no:

The most radical solution of all is the most obvious: we need high-quality, universal, subsidized day care. And we should not be ashamed to ask for it.

Lord knows that mothers in the Millennial era are shameless about their helicopter parenting. And in typical nuclear-family-centric fashion, they don't care about demanding that the rest of society pick up the tab for their little dears attending pointless, frivolous "high-quality" daycare. Next up: mandatory Priuses for new mothers, mandatory child safety seats by IKEA, and mandatory MacBook Junior laptops for early educational enrichment.

Society, are you still sure you want to prize mothering as our most valuable source of moral guidance and community cohesion? Flush all of this amoral, neo-Dr. Spock bullshit back down the mid-century toilet where it belongs. Get a grip, as young people used to say. Strange how teenagers in the '80s had more common sense than parents do today -- sadly, often the same individuals.

* Folks who live in heavily diverse areas should spend some time in Vermont, West Virginia, Utah, or wherever, and see just how awkward and culture-less the young people are there too. Just because they aren't getting picked on by ghetto blacks at school doesn't mean they have the sense of belonging that young people used to when you were growing up. When you were largely unsupervised and trusted by adults to make your own decisions, learn from your mistakes, and grow stronger bones from taking the occasional hard fall.

Social and cultural membership does require a basic feeling of group security, but that's just the bare minimum. When parents keep their children isolated from one another, the more important parts of membership, beyond security, fail to thrive.

How college kids adjust to non-nuclear life in cocooning times

This article from CollegeView seems pretty representative of how both students and parents think about the switch from life at home to college, especially their social adjustment.

BTW, isn't it striking how in agreement kids and parents are these days? And not only that, but that they both agree the kid is going to have as little of an independent life as possible? You'd think that would provoke rebellion in the kid, but not anymore.

At any rate, the moving-off-to-college stage gives us even clearer insight into the nature of nuclear family dynamics in a period of such profound cocooning. I mean, you'd think the parents would do everything they could to keep their kids under the same roof as always, and the general public wouldn't get to see the parent-child relationship, unlike if it were more out in the open at college.

Then again, both parents and kids probably expect -- and want -- the kid to move back home after college, perhaps only for a few years, though perhaps until they're 30. So, going off to college is more like an extended summer camp. Sure, your little robot might possibly get re-programmed by peer influences, but they'll be back soon enough for de-re-programming. Plus you'll be supervising and micro-managing every hour of their day from afar via cell phone.

While they're there, though, how are kids supposed to adapt to a social environment lacking direct parental supervision, when that's all they've experienced growing up under helicopter parents?

Listen to how few social skills they assume when giving the following advice:

To help make adjusting to dorm life easier, we recommend contacting your new roommate(s) prior to leaving for your first semester. This will allow you to get to know each other before even meeting, and will also give you the chance to discover each other’s interests, likes, dislikes and more. By becoming acquainted with your roommate(s) prior to arrival, it will help you to understand what is important to this new person and provide for a more communal dorm life.

Uh, or the both of you could just show up and get to know each other in real life and in real time -- without months of planning and negotiating beforehand. But remember, these people have rarely or never interacted with their peers in an informal, equal setting where give-and-take is the norm. Lacking the ability to hit the ground running, socially, they have to Make Preparations.

I see meetings like this every now and again in Starbucks during the weeks leading up to the start of the school year. They don't feel like the way college kids used to strike up a conversation as part of the getting-to-know-you process. It's more like an interview -- albeit a friendly, two-way, lower-pressure one -- between potential business partners. And meeting in a Starbucks only adds to the business-meeting atmosphere.

The more transactional and contractual nature of their first interactions betrays the deep lack of trust among young people these days. They might as well communicate through their agents or representatives.

Once you arrive to college as a freshman, our advice is to also have a discussion with your roommate(s) in order to establish preferences. One such preference is whether or not your roommate(s) prefer to go to bed early or are night owls. Another recommended preference to establish is the level of noise which can be tolerated. It is important to create a level of comfort between you and your roommate(s), and by determining each other’s personal needs, you will be more likely to avoid conflicts and have an enjoyable college dorm life.

Why does any of that need to be spelled out at all? Didn't these kids internalize these basic truths over the course of years of sleepovers at their friends' houses? Nope, they rarely got to sleep over or host a sleepover, so they're wholly unfamiliar with having to compromise with another person in a sharing-the-room situation. And not only for sleep-related matters, but in general adapting your behavior to the norms of your host house -- how and when and what do you eat for breakfast, can you run indoors, etc.?

Since these ways have not become internalized, they need to be explicitly articulated by experts, and discussed overtly by the parties involved.

You'd be surprised how inflexible college kids are these days in living around or with others. But when your only experience has been getting your way and living with a single set of norms (that of your nuclear unit), you get irritated and bitchy when living with a stranger disrupts your routine. They're not doing it the right way!

The best way to meet new people during the early part of your dorm life is to get involved on campus. Participating in available campus programs provides an easy way to meet new faces and to see what college life is all about. Typically, most campuses will have kick-off activities for incoming students to take part in. You can also utilize your school’s web site as well as bulletin boards around campus to aid you in your search for appealing clubs and organizations.

Nothing wrong with joining clubs, but you should be comfortable meeting people outside of a more formal, organizational setting too. That's my impression of how friendships work in high school these days -- your friends are the other kids with you in the computer programming club, on the cheerleading squad, on the soccer team, etc. It's more fragmented, with little interaction between the social islands.

You should be able to just attend a party, join a group on the main green area, or what have you, and make new contacts informally. There are only so many other people who are going to share your narrow range of interests and obsessions. After that, time to branch out.

Grown-ups had that too back in the good old days -- it was called the "dinner party" instead of a mere "party," but the function was the same. Or the night club, another great place to meet and mingle with new people. In the '80s, adults didn't need more formal social settings like the fraternal organizations of the mid-century because they were more comfortable socializing informally. (And the vast welfare state had removed most of the material / economic reasons for joining a fraternal organization back in the '50s.)

As with college kids, so with adults -- only so many others are going to share your interest in vintage cigarette case collecting, so you'll have to meet people in settings that draw from your peer group or generation in general. The death of night life is a sad consequence of the cocooning drive.

But back to social life at college:

Your resident assistant will help to create a great community, and by simply leaving your door open and participating in hall programs, you will be able to make friends quickly during your college dorm life.

Another strong impression I have is of the R.A. these days acting as a more active force, a surrogate "play date" supervisor for their wards, rather than just someone you occasionally turn to when you don't know what to do, because they've been there and done that.

Again, when all these kids have known is one form of play date or another, it would be too radical of a change to just let them run their own social lives. So the R.A. has become more of a camp counselor for still-developing children, whereas even 10-15 years ago they were more like hands-off guidance counselors who had their own life.

Leaving the door open... it's so weird how quickly this has changed from when I was a freshman in the fall of 1999. Judging from what I've seen around college dorms, and of the undergrad Millennials I've been housemates with over the years, it seems like leaving the door open is no longer an invitation to come in and interrupt me, since I'm not doing anything important. It's no longer a notice that I'm available and up for whatever.

Now, even though the kid has their door open, they're glued as always to one of their various devices -- TV, video games, internet, or texting. Those are activities that you tend to get sucked into for a while, and find hard to unplug from once you start. You get caught on a treadmill. So if someone thought to enter your room, they wouldn't really feel invited -- there you are, ignoring them while you scroll through texts, keep checking Facebook for new comments / likes / etc., level up your video game character, or whatever else.

Have you tried having a real interaction face-to-face, in real life with a Millennial? Or have you at least seen them in a public place like a coffee shop, where they're closed off from the outside world by their laptop, phone, headphones, and maybe sunglasses (indoors) for good measure? They have the same always-distracted behavior even when there's a group of friends together in one of their rooms. They wait politely for the main offender to get off their laptop, but until then they might as well check their own Facebook on their phone, and the group stays locked in digital distraction.

So what's the point of leaving the door open, then? It's like they're pretending to be social. They want others to think of them as well adjusted, without having to actually "put out" socially. They're attention whores. Look at me through my open door, but don't plan on me being up for anything -- not gonna lie, I'm actually kind of busy right now. Busy "accomplishing nothing," as they'd say. Kids are such loathsome posers today, trying to front about being busy, busy, busy, when they have no social life, no job, and no commitment to schoolwork.

Like, if your answer to everyone is going to be, "I'm kind of busy now" or "I don't know..." -- then might as well keep your door closed. But, that would prevent the visual (only) recognition that they seek, so it has to stay open. Everyone's trying to appear as conspicuously unavailable as possible.

It really was not like that when I was in college. Sure, things weren't as free-flowing as they were in the '80s. (And I do remember that, even though I was in elementary school, because my babysitter once took me and my brothers to a dorm at Ohio State at night, when everybody had their doors open and were pouring in and out of each other's rooms, playing music, smoking and drinking in the common areas, and generally not minding the boundaries.)

Still, we were pretty restless when we got to college, and at least wanted to take part in a good old bull session, and preferably to Go Somewhere. Only one guy in the whole dorm had brought a TV, and no one had a video game system either. We did have desktop computers with internet access, but we would only surf the web to kill time until someone showed up, or until we got bored and left in search of something to do at a campus hang-out spot. If the door was open, and you found someone on the internet, they would totally look up in relief and hear your pitch for what to do, or get lost in a conversation if you just wanted to shoot the bull.

They would not ignore you while they managed four different IM conversations (that period's version of texting). The only thing I recall occasionally getting in the way was the computer game Snood and the file-sharing program Napster. Those things seemed to demand more attention and distract your would-be activity or conversation partner. But they weren't a problem for most people, most of the time.

It's rare in 2013 to see kids with the door open while they're sitting or lying down reading a book or magazine. Those are easy to interrupt. Unless you're deeply engrossed in a book (in which case you'd probably have the door shut), it's surprisingly effortless to lift your eyes away from the page, unlike unplugging them from a screen.

Sitting around listening to music -- another activity that's gone up in a puff of smoke. And one that was easy to walk in on.

I even remember some people taking a nap or resting with the door open. Like if they didn't really need the sleep, and just wanted to doze off a little until something fun presented itself, why not sack out on the bed or couch for a little while? "Knock knock... Uh, are you taking a nap?" "Nah, not really. You wanna go hit the dining hall?" "Yeah, just came to see if you were ready." "I'm ready, just resting my eyes. Gimme a second and we'll go."

Although we were advertising our availability by keeping the door open (rather than posing like today), I still wonder if that wasn't a sign of the first move toward cocooning on campus. Seems like we probably spent more time in our dorm rooms than we should have -- even if we made up for it somewhat by inviting others to interrupt us. If you're just waiting for a little while, then no big deal. But if the entire hall is sitting in their rooms with their doors open for hours and hours, then it hardly leads to socializing. It's more hive-like.

Just think: if you spent most of your time at classes, eating / drinking, and otherwise being out-and-about, you wouldn't be spending so much time in your dorm room, door open or not.

Somebody who went to college in the '70s or '80s, please chime in about how common it was to hang out in your room with the door open.

Back to adjusting to your roommate:

It is also a good idea to contact your roommate(s) to clarify which items he or she will be contributing to the room. This will help to avoid bringing duplicate items as well as verify which items you will have access to during your college dorm life.

Not only did my freshman roommate and I not get to know each other before we showed up, we didn't contact each other to clarify who would be bringing what stuff. If neither of us brought a fan, I guess we'll take a trip to the Salvation Army and go halves. If we both brought a broom, well, I guess one will remain unused in the closet or storage. How hard is this to figure out on the spot?

Again the relationship is so contractual. OK, so you'll be bringing A, B, and C, and I'll be bringing X, Y, and Z -- deal? I'll bet that on the rare occasions when these kids went to a sleepover or a summer camp, it was like that too, whereas we never had to make a checklist when we slept over at our friends' houses.

On their list of items to make sure you bring, they have a compact fridge and microwave. This is part of the more hardcore level of cocooning these days, since nobody had those things when I was at college. We had a standard fridge and microwave (and stove top, I think) in the communal kitchen areas, about one per hallway.

Now kids and parents alike don't want the children to wander into an unsupervised common space, so why not play it safe (and pay big bucks) by giving each one their own fridge and microwave for their room. That will also dissuade them from wanting to go eat in the communal dining hall, where there would be too many potential peer influences and excitement. The more appealing you can make their room, the less they'll ever want to leave it, a theme I covered in an earlier post about today's obligatory dorm room makeovers.

And notice the list does not include first aid. I can't believe how much junk people hoard these days, yet they don't have basics like hydrogen peroxide, rubbing alcohol, and band-aids.

Finally, under the heading of Dorm Life Safety:

Share your schedule—Be sure to share your class/activities schedule with your family and friends to create a type of “buddy system” and allow your peers to be aware of your intended whereabouts.

Yeah, so your helicopter parents know when they can begin the daily barrage of "where are you?" "where are you going?" "who's going to be there?" "why didn't you respond when I called earlier?" "respond in 5 minutes or you're getting cut off," etc. I'm not sure how widespread this "buddy system" thing is, but definitely nobody did that when I was in college.

Do grown-ups publish their daily schedules, down to 15-minute blocks, to their family and friends? No, because it would be paranoid on their part, and disrespectfully paternalistic for anyone else to suggest that they do. "Buddy system" -- like college kids are four years old. Then again, when they've grown up so socially isolated, and thereby wind up stunted / retarded in young adulthood, parents start to worry about things like this.

They don't feel the pangs of guilt, like "Shit, I really messed up my kid's social skills and independence by sheltering them their whole life up till now. And now they can't get through the day without making a buddy system of their whereabouts... But, meh, sucks to be in my kid's generation, I guess. Glad it's not me. They'll end up all right somehow."

Taking in all the pieces, I'd say that parents and kids are treating the stay at college more like an extended, off-and-on summer camp. They could have let their kids play with their peers and sleep over under their friends' roofs, learning social adaptability, when they were little -- but then kids can't grow up too early, like they used to when we were young.

Yet somehow we turned out OK, while our kids are awkward and inept... must be some kind of mysterious societal change, something we don't have any control over. And now back to hounding my kid about why they didn't respond when I texted them 10 times this afternoon.

August 17, 2013

Celto-Germanic accomplishment, with speculations on the Mediterranean and Middle East

The matter of national character is under-appreciated in America, where we think in larger racial terms of whites vs. blacks, whites vs. the Indians, and so on. Likewise, the defenders of a group called "Dead White European Males" contrast their heroes with contempo multi-culti women and queers. These broad-brush conceptions obscure much of the true evolution of our culture, namely the "who" and "where" in the European context.

In Human Accomplishment, Charles Murray quantifies the eminence of people he calls "significant figures" in the arts and sciences, for both Europe and Asia. This is measured by their share in subject-specific encyclopedias across a variety of languages. Michelangelo and Shakespeare receive lots of space in any language's encyclopedias on art and literature, so they rank very highly. More marginal figures may not make the final cut.

Taking stock of all the significant figures from Europe after the Middle Ages, where did they come from? I couldn't find a reproduction of his map on page 297, but here is the description:

If we ignore national borders and instead create the most compact polygon (in terms of land area) that encloses 80 percent of the places where the significant figures grew up, it forms the shape in the figure below [not shown here], with borders defined by Naples, Marseilles, the western border of Dorset County in England, a point a few miles above Glasgow, the northern tip of Denmark, and a point a few miles east of the city that used to be Breslau in German Silesia (now Wroclaw in Poland).

[...] All of the Netherlands is still in... Parts of Britain, France, Germany, and Italy are still in. Russia is out. Or you can think of it another way: 80 percent of all the European significant figures can be enclosed in an area that does not include Russia, Sweden, Norway, Finland, Spain, Portugal, the Balkans, Poland, Hungary, East and West Prussia, Ireland, Wales, most of Scotland, the lower quarter of Italy, and about a third of France.

Zooming in even closer, we uncover the following map:


The colored regions in the European core (light and dark blue together) account for the origins...of fully 50 percent of the total European significant figures. Just the five regions colored dark blue -- Ile de France, Southeast England, Tuscany, Belgium, and the Netherlands -- account for 26 percent of the European total. The other 24 percent come from (in order of their contribution) Bavaria, Venetia, Southwest England, Switzerland, Lowland Scotland, Lower Saxony, Saxony, Baden-Wurttemberg, Northeast Austria, the Italian Papal States, and Brandenburg.

What makes this broad area unique is its blend of peoples who come from different but complementary national character backgrounds. They combine Celtic curiosity and playfulness with Germanic diligence and orderliness. Or in the terms of evolution by natural selection, blind variation and selective retention.

Too given to imagination and rambunctious behavior, and you can't commit to the process of whittling, crafting, and putting the finishing touches on your Big Idea. Hence the more predominantly Celtic areas do not join what Murray calls the "European core" of accomplishment. On the other hand, too obsessed with structure and nose-to-the-grindstone workaholism, and you can't let your mind wander into strange new territory to discover your Big Idea in the first place. Hence the more purely Germanic areas fall outside of the core.

In the farther eastern parts of Europe, the Balto-Slavic people appear to lack either trait in high proportion. On the whole, they are incurious as well as listless, not unlike the norm in, say, China. The gloomy, going-nowhere vodka-drinkers, and the numbed-out opium den dwellers and slot-machine yankers. Adaptation to sedentary agriculture has selected for similar ways of life in both the plains of Northeastern Europe and East Asia. The lack of accomplishment in those areas, too, is striking -- especially considering their political integration, social organization, and intelligence levels higher than the Western European average.

Celtic people are more adapted to pastoralism, hence the playfulness and rambunctiousness seen among herder societies all over the world. Germanic people were also pastoralists, though they did begin around the Northern European Plain, and so might have also been selected for a certain degree of orderliness and industriousness if they dipped into agriculture as well as livestock herding, not to mention any such traits that they may have picked up from their Balto-Slavic neighbors (culturally or genetically), who are more exclusively sedentary crop-planters.

As for the southern regions of Europe, my completely uninvestigated hunch is that they don't show the healthy mix of pastoralist and agriculturalist personalities after the Mediterranean was eclipsed as the breadbasket, and settled farming shifted more toward the European Plain up north, leaving a more heavily pastoralist area more interested in the culture of honor. The Mediterranean's heyday seems to have been earlier, after farming had established itself, but when herding animals was still being practiced as well, as in the Greco-Roman world.

Speculating even more, I suspect that a similar dynamic accounts for the central role of Persia in the Islamic Golden Age. Naive Westerners assume that Islamic = Arab or at least Semitic, but so many of the towering figures were from Indo-European Persia. When you interact with their descendants today, Persians appear to have a familiar mix of playfulness, rambunctiousness, diligence, and orderliness. The Lebanese feel that way too.

In contrast, the Palestinians are more like the Irish of the Near East. (Purer pastoralist types, even more remote from settled civilizations, would be the tribes of Afghanistan, who are the local versions of the Balkan tribes.) I have no idea who the Middle Eastern counterparts of the rigid, martinet pure-Germanic types are. But the Fertile Crescent must have produced tribes similar to the farmers of the European Plain, especially ones like the Germanic tribes who began locally as pastoralists.

Certain modern peoples would then represent a blend of those two opposite ways of life -- agriculture and pastoralism -- and enjoy an outsized advantage when it comes to cultural production. Egypt for sure, the descendants of the agrarian Nile civilization and the desert nomads from Arabia who over-ran it. The Lebanese and Persians probably come from some fusion of Fertile Crescent farmers and pastoralist / sea-faring nomads (culturally at least, whether genetically or not).

This is another post in an off-and-on series that urges fans of "human biodiversity" studies to take a closer look at the variety out there in the world, now and historically, and not to talk simplistically about the white or European race, Islamic culture, etc. And more importantly, to broaden their interests beyond intelligence and focus also on personality, subsistence mode, and so on. How else are we supposed to explain, for example, the state of affairs in China vs. Japan, or Russia vs. Germany? Or East Asia vs. Europe? There's way more going on than IQ.