April 30, 2015

Cocooning still continuing through new General Social Survey data

Now that the 2014 data for the General Social Survey have been released, we can see if recent social trends are continuing or reversing. I'll be focusing on those that relate to the cocooning-and-crime cycle, which plays out on the local and interpersonal level, where we typically only have impressions rather than hard data.

Here is an earlier post that lays out the dynamics of crime and cocooning behavior. Briefly, when people come out of their shells and let their guard down, it makes them more vulnerable to manipulation or predation by criminals and con artists. An outgoing social mood leads to rising crime rates. As crime rates climb higher and higher, people begin to worry more about who they can trust. Rising crime erodes trust.

Ultimately they figure it's not worth the risk to be socially open around strangers and begin to close themselves off. That leaves slim pickings for criminals, so cocooning causes falling crime rates. With the environment becoming so safe, people reassess how necessary it is to cocoon themselves from seemingly non-existent danger. Eventually, low crime rates lead people to re-emerge from their cocoons, which begins the cycle all over again.

Violent and property crime rates have been falling since a peak around 1992. They fell dramatically during the '90s, looked like they would bottom out during the 2000s, but have continued a steady descent over the past five or so years.

That should have been enabled by the continuation of the cocooning trend, and indeed the new GSS data show no reversal in any of the key signals of people closing themselves off from others.

The main psychological trait here is trust, and it continues to fall. The GSS asks if other people can generally be trusted, or if you can't be too careful. A high was reached in the late '80s, with around 40-45% of Americans trusting strangers. After a decline, it appeared to hold steady at 32% from 2006 onward. In the 2014 survey, though, it took an extra dip down to only 30%.

This withdrawal of trust cuts across every demographic group, so I'm not controlling for any of them. Race, sex, age, education, class, marital status, region, size of local population, political orientation -- everyone is noticeably less trusting of strangers than they were 25 years ago.

One of the most dramatic drops I noticed was among young people. The only age group that is about as trusting as it used to be is 60-somethings. Every other group shows the decline, and the drop is steeper the younger the group. Among people aged 18-24, the share trusting others dropped from 35% to 14% between the early '90s and 2014. But even among 40-somethings, trust levels fell from 48% to 28% during the same period (about the same absolute drop, but a smaller relative one given the higher starting point).

I interpret that as younger people being more susceptible to cocooning because not trusting strangers and wanting to just play by yourself is a natural part of immaturity. Young people being as socially open as they were back in the '80s was more of a radical departure from what you'd expect based on their age, so it snapped back harder once cocooning set in (regression toward the mean).

You may be thinking, "Well, there's still at least 30% of Americans who trust others -- they're a minority, but it isn't like they're non-existent. And the maximum was only 40-45% before. How big of a change can that be?"

The difference is that trust is not part of isolated, individual behavior -- it relates to interactions among pairs of individuals, or larger groups still. Pick two people at random, throw them together, and see if both of them are trusting. If so, they can sustain a getting-to-know-you interaction. If only one is trusting, the interaction will sputter out. If neither one is trusting, it won't even be initiated.

The chance that two randomly chosen people are both trusting is the square of the trusting fraction of the population. (Pick one, pick another, and multiply the two probabilities.) Squaring a fraction makes it much smaller, so looking only at the trust level among individuals understates how fragmented society has become.

In a world where 45% are trusting, the chance that any two strangers who run into each other will both be trusting is 20%. In a world where only 30% are trusting, those two strangers have only a 9% chance of both being trusting.

Thus, even though a trusting disposition has "only" fallen by one-third, from 45% to 30%, trust-based interactions between a pair of strangers have fallen by more than half, from 20% to 9%.

It's even worse for those youngsters. When their trust levels fall from 35% to 14%, successful interactions between a pair of strangers who run into each other fall from 12% to just 2%. Of course, folks can make small talk without having to trust each other, but I'm talking about the ability of people who haven't met before to open up and connect with each other right off the bat. It may have been difficult before, but it's nearly impossible now. They might as well be toddlers who think everyone other than mommy and daddy is dangerous, or at least not worth trusting to share their toys with.
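To make the arithmetic explicit, here is a minimal sketch in Python. It only squares the trust percentages quoted above (rounded figures from this post, not an official GSS tabulation), under the assumption that the two strangers' dispositions are independent:

```python
# Chance that two randomly chosen strangers are BOTH trusting,
# assuming the two dispositions are independent: p_pair = p ** 2.
trust_levels = {
    "all adults, late '80s":  0.45,
    "all adults, 2014":       0.30,
    "ages 18-24, early '90s": 0.35,
    "ages 18-24, 2014":       0.14,
}

for label, p in trust_levels.items():
    print(f"{label}: individual {p:.0%} -> pair {p ** 2:.0%}")

# all adults:  45% -> 20%,  30% -> 9%
# ages 18-24:  35% -> 12%,  14% -> 2%
```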

If you've wondered why you never see young people letting it all hang out and feeding off each other's energy, that's why. They simply don't trust anyone.

Going out to a bar on a somewhat frequent basis is also less common than it was back in the '80s. It's most pronounced among younger age groups, and only those who are 55 and older are more likely to go out to a bar or nightclub than their counterparts used to be. That's the Boomers refusing to age gracefully, and not really a sign of an outgoing disposition. They're there to engage in a contest of "who's still got it?" rather than to open up and have fun.

Spending an evening with a neighbor has continued to decline since its peak in the late '80s.

Both men and women continue to have less frequent sex. Doing it only once a month, or less frequently, afflicted nearly 40% of women in the early '90s, but nearly 50% in 2014. For men, infrequent sex rose from around 30% to around 40%.

Those questions establish that people are still cocooning. What about gradually realizing that the world isn't so dangerous anymore? Fear of walking around your neighborhood at night tracks the crime rate, lagging behind it by a few years (just to be safe). In 1994, 45% of Americans were afraid; by 2014 the share had continued to drop, down to 31%.

Predicting how long this period of cocooning and falling crime will last is not an exact science. The last time, it was about 25 years, from a peak of crime in 1933 to a bottom in 1958. Keeping tabs on the social mood is more important: once we see a steady rise in trust and open behavior, we can expect crime rates to start rising shortly after. So far, though, that doesn't appear to be around the corner.

GSS variables: year, trust, socbar, socommun, sexfreq, fear
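For anyone who wants to check these trends rather than take my word for it, here is a rough sketch of how the trust series could be tabulated from a GSS extract. The file name is a placeholder, survey weights are ignored, and the assumption that the trust variable is coded 1 = can trust, 2 = can't be too careful, 3 = depends should be verified against your extract's codebook:

```python
# Rough sketch: share of respondents answering "most people can be trusted,"
# by survey year, from a GSS extract saved as CSV. Ignores survey weights.
# Assumed coding (check the codebook): trust == 1 means "can trust",
# 2 means "can't be too careful", 3 means "depends".
import pandas as pd

gss = pd.read_csv("gss_extract.csv", usecols=["year", "trust"])  # placeholder file name
gss = gss.dropna(subset=["trust"])

trusting_share = (
    gss.assign(can_trust=gss["trust"].eq(1))
       .groupby("year")["can_trust"]
       .mean()
)
print(trusting_share.round(3).tail(10))  # the 2014 figure should land near the ~30% cited above
```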

April 29, 2015

Neighborhood-level diversity prevents rioting among blacks and SWPL decadence among whites

Robert Putnam's research on diversity and civic participation shows that the more diverse an area is, the less likely the individuals are to coordinate their shared interests at a larger scale. That not only affects relations among individuals from different ethnic groups -- blacks and whites won't cooperate -- but even within the same group -- whites don't cooperate with each other, and blacks don't cooperate with each other.

Could there be an upside to the failure of individuals to coordinate their collective behavior? Yes -- if their purpose were anti-social or decadent. The decay on display in Baltimore provides a great case study.

It's no surprise that the rioting and looting are taking place in neighborhoods that are nearly 100% black. Blacks are more impulsive and inclined toward both violent crime and property crime. What is unusual, however, is that the neighborhoods that are closer to 50-50 black and white are not merely afflicted by rioting to a lesser degree than the 100% black areas, but are hardly affected at all. (See the maps at the end of this post.)

What gives?

Rioting and looting are collective behaviors, however fleeting and decentralized. They do not require sustained interest and permanent institutions to carry out the collective will, but they do rely on a minimal level of in-group cohesion and trust in order to keep the snowball growing rather than flaking into pieces or turning the members against one another.

In fact, with a little more regular participation and a bit more of an "honor among thieves" relationship, coordinated crime by an organized ethnic group could sustain a gang or mafia, again provided the area belongs entirely to that ethnic group.

The mafia operated in neighborhoods that were predominantly Italian, not in those that were half-Italian and half-non-Italian. Black gangs controlled South Central L.A. back when it was all black; Mexican gangs control it now that it's all Mexican. If the neighborhood was only half-Italian or half-black, the mafia and gang problem was not simply half as bad as in the fully Italian or black areas, but could not get going in the first place. (Of course, they would have still been subject to individual-level crime, just not collectively organized crime.)

White enclaves in large cities tend not to be stricken by rioting and looting, because anti-social whites express their deviance in non-violent ways. When they coordinate their deviance at the collective level, they take over the local education system and ban peanuts from school grounds, they carve out bike lanes for non-existent bike riders while clogging the narrowed streets for actually-existing drivers, and they take over any business area that once supported a variety of normal utilitarian shops and turn them all into arenas for decadent status contests (quirky bars, quirky coffee shops, quirky doggie yoga day-spas).

Yet as in the case of black rioting, these collective insanities only infect neighborhoods that are nearly 100% white. If it's only 50-50, the hipsters and yuppies don't feel emboldened enough to organize their decadent impulses. They don't have the sense that their ethnic group totally owns the place and can do whatever they want with it, for better or worse.

Overall, diversity is corrosive to society at any scale. But there is a silver lining: it also prevents anti-social collective behavior from catching fire.

Maps of diversity and rioting in Baltimore

Here is a map of racial diversity around Baltimore. Blue dots are blacks, red dots are whites. (From a series of similar maps here.)


The core of the city is where the dots are the densest, more or less in the center of the image.

There are two stark areas that are mostly black -- West Baltimore and East Baltimore, close to the core. Farther away from the core (for example, toward the northeast), the blue dots overlap red dots, showing a more mixed area than a pure-black ghetto.

There are three main areas that are mostly white -- North Baltimore (the large wedge pointing south that separates the black areas to the west and east), and two smaller but denser enclaves that lie just south (on a tiny peninsula) and just southeast of the core (in a red ring).

Yuppie and hipster decadence is concentrated in the all-white areas, such as Hampden in the North Baltimore wedge, Downtown near the center, and Fell's Point in the red ring lying southeast of the core. To the northeast, there are still plenty of whites, but they live in more diverse neighborhoods. SWPL decadence in a "nice boring" place like Belair is not at half the level of Fell's Point, but barely there at all.

As for black decadence, here is a map of the major riots in 1968, overlaid with the riots in 2015 (which are much smaller -- though wait until 2020). They come from this article at Vocativ.


The scale on this map is more zoomed-in than the map of diversity. The major and minor riots have afflicted the all-black areas of West and East Baltimore, close to the core. There are plenty of blacks living out to the west and southwest, as well as out toward the northeast, but they find themselves in more diverse neighborhoods.

In these diverse neighborhoods, would-be rioters apparently don't feel they can trust their fellow blacks enough to carry out an afternoon and evening of looting, trashing windows, and setting cars on fire. If only they owned the whole neighborhood, "shit would git real". But with nobody trusting anybody else in a mixed area, they're going to just watch the riots on Wurl Stah and vent their aggression on Twitter.

When the neighborhood might otherwise be burning down, here's one cheer for diversity-induced atomization.

April 27, 2015

Movie trailers as serial drama (STAAAAARRRRR WAAAAAARRRSSSS)

On the last episode of "Agnostic reacts to Star Wars trailers," we learned what the new trilogy will amount to -- a cosplay fanfic sequel for Millennials.

And now that they've released the next installment of "Trailers for That New Star Wars Movie," that assessment is confirmed. You can almost see the Millennial in stormtrooper costume walking up to Harrison Ford and nervously asking for his autograph. I wonder whether that'll be relegated to a making-of sequence during the credits, or be included in the main narrative itself.

("Gee Mr. Solo, you're some legend around these parts... It sure would do me the honors if you'd, uh, do me the honor of signing my toy lightsaber!")

I still don't know what the hell the movie is going to be about, but contemporary audiences don't want any SPOILERS whatsoever.

Trailers are no longer meant to reel you in on the first viewing. They have become a serial drama form unto themselves. The first reveals a tiny bit, and leaves the audience on a cliffhanger. The next one recaps the last one (barren desert landscape, speeder bike battle, lightsabers), but reveals a little more (Vader helmet, Han and Chewie, TIE fighter pilots).

Who knows how many more episodes there will be before the series finale -- the trailer that tells you what the hell the movie is going to be about.

Not following the hype cycle of modern movies, I was unaware of the trend of trailers as soap operas (gossip about them online when the new episode comes out!). I'm even more out of touch with video games, but their hype cycle is so huge that even someone who doesn't play them anymore may know about it. First there's a hint from the developers, then a spectacle teaser during E3, then a beta version, then a playable demo, and finally two years later, the actual game.

I remember when the movie trailer was a terse stand-alone format, and when new video games were announced once they were released, not years ahead of time.

But that was back when people still had a life. Folks in outgoing times have too much of a dynamic social life to tolerate a serial format stringing them along and keeping them waiting. Soap operas were huge in the Midcentury, but were marginal by the '80s. Short film serials were popular at theaters in the Midcentury, but were also absent during the '80s, whose climate was similar to the Roaring Twenties. Only since the cocooning climate returned during the '90s did serial dramas return to mass entertainment, this time on TV.

They could have made a string of teaser trailers for movies back in the '80s, to be shown on TV commercials or in theaters, but they didn't. Those are a new development -- since when exactly, I don't know, although I have a hunch the Lord of the Rings movies had serial trailers.

Cocooners are bored out of their minds, so they crave a steady and regular fix of anything meant to wake them up. Previously, on "dissecting popular culture," we looked at entertainment as a mood stabilizer vs. experimentation, making the link to stabilizing vs. destabilizing types of drugs.

The stabilizing kinds were popular in the Midcentury and have become popular again since the dawn of Prozac circa 1990. Ward Cleaver had Kellogg's Pep and Geritol, while his grandson has Monster energy drinks and Viagra. The destabilizing kinds like LSD are meant to be taken in stand-alone sessions, as though each trip were to somewhere different.

Movie trailers have clearly joined the mood stabilizer family of entertainment. Life is boring, but don't worry, another teaser trailer for Whatever Part Four comes out next week. And don't worry, it won't contain any spoilers -- which would ruin the fix you ought to get from the next trailer after that one.

Spoilers may not answer every question about who, what, when, where, why, and how, but they do close off certain paths through which the trailer-makers could have strung you along. And now that the function of trailers is to provide a regular dose of stimulation to bored nerds, they no longer tell you what the hell the movie is going to be about.

April 26, 2015

Exotic cuisine, status-striving, and achieved vs. ascribed status

What role does the increasing popularity of foreign food play in the larger trend of status-striving over the past 30 or so years?

The usual view, which I had bought into without giving much thought to it, is that it has to do with signaling how esoteric your tastes are, and by extension how erudite you are in the foodie world. Everybody knows about "Mexican food," but do you know what the "cuisine of Oaxaca" is like?

In this view, the players in the status contest are trying to one-up each other by discovering, obsessing over, and then abandoning one exotic cuisine after another. Each cuisine goes through a fashion cycle, and the larger contest is jumping from one to another, each cuisine less obvious than the last.

And yet, after three decades of fashionable foodie-ism, Asian restaurants are still basically Chinese, Japanese, and Thai. Japanese has not fragmented into increasingly esoteric sub-cuisines -- Okinawa, Hokkaido, Tokyo vs. Osaka, etc. Thai-mania has not led to obsessions over Vietnamese, Cambodian, Burmese, Laotian, Malaysian, or other Southeast Asian food.

North and East African food is still Moroccan and Ethiopian, leaving out giants like Egypt as well as tiny places like Eritrea.

Caribbean food is still Cuban and Jamaican, leaving out dozens of smaller and more obscure islands.

South American food is still Brazilian. Central American is still Mexican, and still catch-all Mexican rather than dozens of sub-cuisines finding their own success.

Middle Eastern is still Lebanese and Persian.

"Indian" is still northern and western Indian, not Tamil, Bengali, or Nepali.

Eastern Europe is still totally avoided and unexplored, outside of the Mediterranean food of Greece, which has not led to trend-hopping into Serbian, Bulgarian, Albanian, and so on after the initial novelty of Greek wore off.

This is not to overlook the occasional exception that finds a niche audience, like Mongolian barbeque. The point is that if the goal of the contest were to burn through ever more exotic and esoteric cuisines, Thai food should have been done by the end of the '80s, and Tibetan restaurants should have enjoyed a burst of success at some point along the way. Their no-show status is even more puzzling when you look at how much the elite likes to show its sympathy for the culture of Tibet.

If it's not a case of trend-hopping, how does the foodie phenomenon tie into the status-striving climate after all?

It looks more like it ties into the switch from cultural identity as ascribed status to cultural identity as achieved status, to use some sociological concepts. When some aspect of cultural identity is ascribed, it lies beyond the individual's choice and is usually inherited from parents or community upbringing. Your parents were Baptists, so you're brought up Baptist, and you remain Baptist in adulthood. When that piece of cultural identity is achieved, it comes through a more effortful choice by the individual. For example, your parents were Catholic and raised you that way, but you convert to the Baptist church as an adult.

An earlier post explored the link between the status-striving climate and identity as achieved status, as opposed to identities as ascribed status in an anti-striving or accommodating climate.

In short, if the impulse is to climb up the status ladder, to reside wherever you need to do so, to behave however you need to, then the norms must favor identity as something that you can choose and craft to suit your needs and preferences. If the impulse is to rein in the competitive war of all against all, then the norms must make identity something that is beyond the individual's ability to mess around with, and keep people more or less where they already are.

Thus, dynamism is supported by norms of laissez-faire, with collectively destructive competition as the side effect, while stasis is supported by norms of reining-it-in, with collective harmony as the side effect (or rather the intended goal).

Food has been part of ethnic identity forever, seen most clearly in food taboos that distinguish Us from Them. Incorporating foreign food into your regular diet tells others that your cultural identity is constructed rather than handed down. That signal lets them know that you're a serious contestant in the status-striving competition. Once you've identified one another, you get to feel a status boost over those who are not eating foreign food on a regular basis. It also lets you identify who your micro-competitors are -- everyone who is into Indian food can now begin the contest over who knows the best Indian places.

The broader importance of signaling your diet of exotic food, though, seems to be telling or reminding others that they shouldn't try to regulate anything you do. In a climate of greater regulation, a white person seen eating Indian food every night would be looked at funny until they started to eat what is normal for someone of their cultural descent. In an anything-goes climate, there are few ways to more convincingly flout the norms about regulating the self on behalf of group cohesion.

Even better, it's not a very flagrant, aggressive, or offensive way to let others know not to bother trying to regulate your behavior, unlike punk-y clothing and hairstyles that are unabashedly giving society the middle finger. Indian food isn't inherently anti-social, unlike shredded clothing, tongue piercings, green mohawks, etc. It doesn't offend us at the most basic gut level, as though we saw someone eating bugs (notice that the inherently gross stuff in exotic cuisines is strongly avoided). Bug-eating is offensive no matter whether that's native to their culture or a foreign adoption.

But what's so gross about palak paneer, an Indian dish of spinach and cheese? Nothing, and it wouldn't seem so out of place in European cuisine, except for its distinctly Indian flavor. Making it a regular part of your diet is not designed to offend the norms that regulate us away from eating inherently gross things, but those that steer us toward what our culture does and away from what a foreign culture does.

Broadcasting your taste for exotic cuisine makes your message of "don't try to regulate my behavior" a bit more palatable, as it were, since it takes conscious thought, rather than a gut reflex, to construe your behavior as rule-breaking. It's one of the most pro-social ways you could go about signaling your lack of social constraints.

That probably explains why the phenomenon is biased toward the elites, who want to appear superficially polite and civilized, whereas the bird-flippers at the lower-middle-class level will just buy some obviously offensive t-shirts, chains, and piercings from Hot Topic.

April 23, 2015

"Problematic faves"

I remember when having a problematic fave meant you were into Culture Club despite the singer being a cross-dressing faggot.

"I Really Like You" by Carly Rae Jepsen

Contrary to what everyone is saying, this song doesn't sound like the '80s, but it has a refreshing emotional tone nonetheless. It isn't bratty, emo, or self-absorbed. It's basically sincere, uncomplicated, and other-pleasing.

For 20 years, female pop singers have been broadcasting how little they depend emotionally on men. Either they're scum and don't deserve attention ("No Scrubs," "We Are Never Ever Getting Back Together"), or they're fleeting conquests of empowerrrd womynnn ("Shoop," "Blank Space"). Two sides of the same slutball coin (both types ironically sung by a virgin who only "dates" fags, Taylor Swift).

The songs that are supposedly about being in a loving stable relationship don't ring true and sound forced ("I Wanna Love You Forever," "Umbrella"). Perhaps that's because there aren't any songs about the initial infatuation that establishes the couple's chemistry as a prelude to love. (Again, not talking about the shallow "I'm hot, you're hot, let's do it" songs about lust at first sight.) If we're not convinced of the organic nature of their first encounters, then songs about pop singers' steady relationships will sound staged and going-through-the-motions.

Wholesome, bouncy songs about the initial stages of courtship used to be a dime a dozen back in the '80s and early '90s -- "I Think We're Alone Now," "Shake Your Love," "I Love Your Smile" -- but there are notable differences from today's "I Really Like You".

The singers from the good old days were teenagers, who sound more believable than the nearly 30-year-old Jepsen when it comes to feeling butterflies in the stomach. They also sounded more mature back then, as though they'd been infatuated and in a relationship several times already, whereas Jepsen sounds more like a sixth-grader getting her first crush. Another case of Millennial stunting caused by helicopter parents socially sheltering them.

And of course they don't sound anything alike. The older songs are melodic, the verses are sung rather than mumbled-and-shouted, the drumbeat is more elaborate than a metronomic thud, and the instrumentation is rich rather than sparse.

It goes to show how superficial music critics are, that they lump songs together that use the same family of instruments, rather than, y'know, how it actually sounds. "Synths + drum machine = SO '80S!!!" It's more like a contempo pop song wearing an '80s costume. The video likewise dresses up as an '80s video, with Tom Hanks replacing Chevy Chase as the comedic actor who lip-syncs the lyrics while acting goofy.

Even the tone, while unlike the typical self-absorbed or self-conscious tone of today's music, isn't at an '80s level of letting your guard down. It's more like the atmosphere of the mid-to-late '50s, although I can't think of a good comparison song off the top of my head. Something less forced than the Chordettes, but not as sincere as the girl groups of the early '60s.

In general, the people looking to make the Next Big Thing should stop trying to copy the '80s and look more to the late '50s and early '60s. That was the beginning of the outgoing and rising-crime climate that would reach its culmination in the '80s. It's hard to imitate an apex, but less daunting to recreate the simple inchoate beginnings.

Once we finally do shift from a cocooning to outgoing social mood, it'll only be at the level of the last shift circa 1960. We're not going to skip straight to the end. Our mindsets, both the musicians' and the audience's, will be more aligned with those of 1960 than 1980 or 1990.

April 19, 2015

Millennial moms and dads reversing helicopter parent trend?

We covered this topic last year (here and here), but I'm starting to see hints of it in real life now.

Yesterday afternoon I stopped at a park to eat, and the picnic tables were near a playground, where about ten children were playing. If it had been just three years ago, every kid would have had a parent shadowing their tiniest moves, serving as their playmate rather than one of the other kids, and any time a child came near a stranger (whether a child or grown-up), the parent would swoop in to block the potential contamination / abduction / whatever they thought was going to happen.

I glanced over a few times out of curiosity about how ridiculous helicopter parenting has become this year. But I was surprised to only see one obvious helicopter parent among the ten kids -- one of those overly involved goofball dads who thinks his kid would rather play with a grown-up goofball than one of the other kids. Just let them play by themselves -- except this time they were!

There was a group of much older adults, probably the grandparents, and being early-mid Boomers they were hands-off just as they were when they were new parents. But where were the other hoverers and smothering mothers? One group of children looked to be semi-supervised by a teenager, but not by an adult. These were all white kids, by the way, not the Mexican kids who are allowed to go out and play by themselves. That really stood out as unusual.

Then as one mother was leading her son back to the car, he jumped up on a picnic table, walked to the other end, and leapt off. A helicopter parent wouldn't have allowed any of those actions to take place (jumping off a table = skinned knee alert), and would've flown into containment / safety landing mode right away. Not out of respect for public picnic tables in a public park, but because she'd be paranoid about her son's safety, and embarrassed that her son was making her look like a negligent parent in front of the other parents, simply by letting kids be kids.

She didn't encourage his behavior; she just went along with it, apparently thinking "boys will be boys." No parent would've thought that in this situation just a few years ago.

Aside from the teenager, these children were all about 3 to 7 years old. Their parents must be in their late 20s and early 30s, i.e. Millennials. The nonchalant mom with the up-up-and-away son didn't look old enough to be a late Gen X-er.

The small sample size here is not a problem, since there has been almost no variation in the basic parenting style for years now. Any break from uniformly 100% helicopter parenting is highly out of the ordinary.

I've heard Millennials on the internet and on TV say they're going to be less hovering when they're parents, but had yet to observe it in real life. Now that their kids are old enough to be seen on the playground, you might start to notice a change back toward the good old days of hands-off parenting from now on.

Don't expect it to jump right to the '80s kind of environment, when children went to the playground with no adults at all. It'll be more like the late '50s and early '60s, when the Dr. Spock and drive-in cocooning trends were just beginning to loosen up.

I have no delusions about how hilarious it's going to be watching the Millennials attempt to raise children. But I am still glad that the community-fragmenting trend of helicopter parenting is finally going to come to an end, and that kids around the neighborhood will once more be part of an organic connected peer group, without having to route all interaction through their parental delegates.

April 16, 2015

The no-show of Jews in dance music: A survey of disco, new wave, synthpop, and dance-pop

While Jews may dominate the business side of the music industry, their accomplishment on the creative side has been more uneven.

They have always had an outsized influence among rock groups, although typically in the angsty misfit genres, such as heavy metal, punk, and alternative (Lou Reed, Kiss, Quiet Riot, Twisted Sister, the Ramones, NOFX, Bad Religion, etc.). Despite their participation in adversarial African-derived genres like rap (the Beastie Boys) and ska (the Selecter, the Specials), as well as aloof / too-cool black genres like Midcentury jazz (Stan Getz), they scarcely took part in the more agreeable genres within black music like R&B and disco. And where mainstream pop ranges in tone from cheerful to longing, the range of Jewish crooners is less sympathetic to the listener, ranging instead from schmaltzy to complaining (Barry Manilow, Barbra Streisand, Bette Midler, Neil Sedaka, Carly Simon, etc.).

They are well represented in genres where the relationship between the performer and the audience takes the form of spectator and spectacle, but are no-shows in genres where the performer is more of a background instigator trying to work the audience members up into a participatory activity among themselves, such as dancing.

Aside from reflecting the Tribe's well known tendencies toward neurosis, these differences also show their inclination toward the verbal and psychological (whether cerebral or emotional) and away from the corporeal and kinesthetic. Dancing takes as much basic body coordination as other salt-of-the-earth pastimes like playing sports, hunting, fishing, and camping -- all activities that the mentally oriented Jews find awkward and off-putting.

You really notice the absence of Jews in cheerful, danceable pop music when you listen to an '80s compilation. I usually listen to albums by a single group, where broad patterns in the genre are not so evident. But with the much larger sample size on the compilation I was listening to the other day, I was struck by how few of the groups I'd seen on lists of Jewish cultural figures.

Pursuing that hunch, I perused several lists (such as this one and this one), and did my own search of musicians whose Wikipedia articles mention them being Jewish and being a singer or musician in the new wave, synthpop, disco, or dance-pop genres. This restricts the focus to roughly the '70s through part of the '90s, when dancing was a popular activity.

The hunch panned out, with hardly any Jews in the more dance-oriented genres, unlike their heavy influence in rock and crooner pop.

In all of disco, there was only a single Jew -- Steven Greenberg, who founded the multiracial act Lipps Inc., the one-hit wonder known for "Funkytown".

Likewise in new wave, I could only find one confirmed Jew -- Nick Feldman, the bass player and half of the core duo of Wang Chung, who had a string of hits but are best known for "Everybody Have Fun Tonight". Jon Moss, the drummer for Culture Club, was adopted by a Jewish family from a Jewish-run orphanage, but I couldn't find a source that said his birth parents were themselves Jewish. Indeed, when asked in a recent interview if the orphanage accepted goys, he replied only with, "Probably, yeah," as though he himself is unsure of his genetic background.

Nor did the gods of synthpop treat the Jews as their chosen people. There's only one, and a halfie at that -- Pete Burns from Dead or Alive, a band with a few hits but best known for the dance classic "You Spin Me Round (Like a Record)". One of the members of Army of Lovers, whose biggest hit was "Crucified" in 1991, comes from an Algerian Jewish family, but I'm talking about the Ashkenazim here.

Paula Abdul was a dance-pop star throughout the late '80s and early '90s, though she too is Sephardic on her Syrian father's side (and Ashkenazi on her mother's side). I suspect her success owes more to the part of her blood that comes from the belly-dancing world rather than the tax-farming world. Taylor Dayne, however, is fully European Jewish; her song "Tell It to My Heart" from 1987 is the beginning and end of the story of Ashkenazi dance-pop.

My search also turned up a handful of Jewish musicians listed under "new wave," but they're from the bands that were mostly playing rock, punk, and ska, with only a hint of disco, dance, and synth-rock -- the Knack, Rob Hyman and Eric Bazilian from the Hooters, Susanna Hoffs from the Bangles, and Danny Elfman from Oingo Boingo.

A tougher case to call is Blondie, whose guitarist (Chris Stein) is Jewish. They started off as a stripped-down punk and power pop band, and gradually evolved into a more eclectic style that mixed in synth-rock, reggae, disco, and rap. They were more of a bridge between the punk and new wave scenes, maybe proto-new-wave. Whatever you want to classify them as, they deserve an honorable mention in this survey.

Throughout human history, dance and music were two sides of the same coin, and only relatively recently has music become primarily passive on the audience's part, whether it's elite classical music or generic radio-friendly crap. Dancing is a group activity that bonds members together, giving music a key role in creating and maintaining a sense of community. Contemporary pop music that sets the stage for carefree dancing is an attempt to preserve those traditional roles of music.

Thus, the relative absence of Jews in dance music is part of their broader hesitation as culture-makers to create a more cohesive group-iness among their host population. (Please no retarded comments about the debt that Gentiles owe to all those schmaltzy Jewish winter-time tunes that don't have anything to do with Christmas.) They don't mind making a buck off of it as managers and record label executives, but actually creating it themselves -- too awkward, yucky, and shameful. Moving your body around in dance is fit only for the half-animal goyim, beneath what appeals to the mind of the mensch.

April 10, 2015

Are cops more likely to harm innocent whites in places of high diversity?

Like the Rodney King video of the early 1990s, the recent over-reaction of a white cop who shot an unarmed fleeing black suspect in the back in South Carolina will provoke much discussion about white cops and black victims.

Too many whites settle into the view of "Well, whatever the police have to do to keep the violent blacks at bay." But it is not realistic that a cop who is that callous toward blacks will somehow transform into a respectful servant when he's dealing with whites. The cop sees himself as pest control, and whether he has to unload his bug spray on hornets, termites, or your pet dog who didn't get out of the way like he was ordered to, makes no difference to him. All those different species of pests had it coming.

One of the key findings on ethnic diversity, from Robert Putnam's research, is that it erodes trust. The "no duh" outcome is that diversity makes people of one race lower their trust in people of a different race. But the surprising and disturbing outcome is that diversity even makes people of one race lower their trust in fellow members of their own race.

In Los Angeles, not only do whites not trust the Mexicans, they don't even trust the other whites, and remain fragmented and impotent to organize for their own collective good. It's the polar opposite from white civic participation in a homogeneous part of the country like North Dakota or Iowa.

In short, when an individual is confronted with a Tower of Babel environment, which offers no possibility of coordinating a group's interests at the collective level, he withdraws from communal life and focuses only on his nuclear family, or perhaps just himself.

I suspect there's a strong influence of this dynamic at work in the growing and unregulated police state around the country. You tend to only hear about it in places with high levels of diversity.

The apologetic white response is that white cops in such areas would prefer to stick to their preference of targeting only blacks and Mexicans, and leave the nice whites alone, but are compelled by The Powers That Be to appear less racist, and therefore go after innocent whites to "narrow the gap" and avoid harassment, firing, and shakedowns.

When you look into what white cops are up to, though, you don't see people who love their own group and hate different groups. You see people who are in a hunkering-down, under-siege mentality just like Putnam's research would predict for folks living in areas of high ethnic diversity. Only these paranoids are armed to the teeth and don't even have to let you know you're about to be raided.

Thus, the more likely reason behind white cops over-targeting white folks in highly diverse areas is not to appear to be closing the gap, avoid harassment by the federal Department of Diversity, etc. Those white cops simply don't trust their fellow white citizens.

Contrary to liberal propaganda, these types do not put "white pride" bumper stickers on their car, but ones that say, "I'm not racist -- I hate everyone equally". Again, they are not trying to avoid harassment by the anti-racism squads: they honestly perceive members of their own group as potential bugs that may need to get squashed if they act too uppity, like sleeping below the window that you lobbed a flashbang grenade through.

This is impressionistic, but I think on the right track. Unfortunately the data that could resolve these questions are not collected, let alone published -- over-reactions by police, broken down by race of cop and race of victim, and broken down by geography.

Here are a few suggestive maps, though. The first comes from the Cato Institute's effort to map out botched SWAT-style raids (see full details by using their interactive map here). The second is USA Today's index of diversity, showing the chance that two randomly chosen people will belong to different ethnic groups.



The raid map would need to be made into one showing per capita rates, but I don't think that'll make such a big difference. Also bear in mind that the pin marks look crowded and exaggerate how far north the signal goes, since only the tip at the bottom of each pin marks the actual location.

Highly homogeneous states like Ohio and Michigan are in the top 10 US states by population size, yet there are few pin marks on the raid map, and most of them are near the few hotspots of diversity in the region, like Detroit and Cleveland. Smaller but more diverse states like Colorado have more pin marks. So do similar-sized but highly diverse states like Georgia.

Leaving aside the marks that represent the killing of a cop, and focusing only on harm from cops to citizens, Ohio has 7 pin marks and Michigan just 4. Their population sizes are 11.6 million and 9.9 million, respectively. Colorado has 10 pin marks, about as many as both states combined, yet it has only about half as many people as either state alone (5.4 million). Georgia also has 10 pin marks, while being comparable in population (10.1 million). What Colorado and Georgia share, and what distinguishes them from Michigan and Ohio, is a much higher level of ethnic diversity.
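To put that on a per capita footing, here is a back-of-the-envelope sketch in Python using only the pin counts and populations cited above, followed by the formula behind a USA Today-style diversity index (the chance that two randomly chosen residents belong to different groups). The group shares in the index example are made up purely for illustration, not real census figures:

```python
# Back-of-the-envelope: botched-raid pin marks per million residents,
# using only the counts and populations cited in the text.
pins         = {"Ohio": 7, "Michigan": 4, "Colorado": 10, "Georgia": 10}
pop_millions = {"Ohio": 11.6, "Michigan": 9.9, "Colorado": 5.4, "Georgia": 10.1}

for state, count in pins.items():
    print(f"{state}: {count / pop_millions[state]:.2f} marks per million")
# Ohio 0.60, Michigan 0.40, Colorado 1.85, Georgia 0.99

# USA Today-style diversity index: chance that two randomly chosen residents
# belong to different ethnic groups, i.e. 1 minus the sum of squared group shares.
def diversity_index(shares):
    return 1 - sum(s ** 2 for s in shares)

# Made-up group shares, for illustration only.
print(diversity_index([0.85, 0.10, 0.05]))        # more homogeneous mix: roughly 0.26
print(diversity_index([0.55, 0.25, 0.15, 0.05]))  # more diverse mix: roughly 0.61
```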

Zooming into the city level, the Columbus metro area has 0 marks involving harm to citizens, whereas similarly sized metro areas that are highly diverse like Las Vegas and Orlando have 2 and 7 marks. The 90% white Pittsburgh metro area only has 1 mark involving citizens, despite being similar in size to highly diverse metro areas like Baltimore and Charlotte, both of which have 4 marks against them.

A more exhaustive list of incidents would have to be made, and a more fine-grained analysis performed, to settle the matter. But at first glance, it does appear that a higher level of ethnic diversity is linked to a greater tendency of callous over-reaction by cops.

Still, are the victims of these over-reactions white or black? Again we need better data. Sticking with the topical location of Charleston, SC, there is a pin mark on the raid map showing a lockdown-style raid of Stratford High School in 2003, with the aim of busting up drug deals. Today that school is 60% white, and so back then it would probably have been more like 65-70% white. Yet video from the school's surveillance cameras shows whites as well as blacks being treated like bugs by the pest control.

Diversity not only corrodes civic participation from citizens, it also leads to callous aggressive harassment of those citizens by the police. This problem is compounded by the difficulty of citizens organizing in highly diverse areas -- they can't coordinate an effort to de-escalate the increasingly paramilitary tactics of their own police forces.

Whites and blacks both got harassed by The Man in Stratford High School, but blacks and whites can't team up on anything, so The Man is free to continue his SWAT-style raids into the future. See also the poor labor history of the South, where whites and blacks couldn't coordinate to collectively bargain with owners and managers. Worse: whites can't even coordinate with their fellow whites and fight a one-team battle against the elites.

This ought to be the focus of the anti-diversity movement for the 21st century -- not the obvious conflicts that will erupt between different ethnic groups, but the corrosive and authoritarian effects it will have within the white group itself. Putnam's research and these various real-world phenomena show that there is no silver lining at all to diversity, not even an emboldened "Us vs. Them" mentality. Instead it results in "every man for himself," the worst possible scenario.

April 8, 2015

Big glasses babe du jour

Jan Smithers as Bailey Quarters, from WKRP in Cincinnati (circa 1980).

In more outgoing times, even the shy types wanted to connect with others, leading them to wear glasses with large inviting frames. Something like the awkward but well-meaning girl in the freshman dorm leaving her door open in the hopes that someone will drop by and interact with her.

Contrast with the narrow, beady-eyed glasses that are preferred by the mousier introverts of today (and during the last cocooning era as well, epitomized by the cat eye glasses of the '50s).

Some pictures from back when "geek chic" aimed to look welcoming rather than repellent:





April 4, 2015

Why do butch dykes copy the hair-do's of twink fags rather than men?

A popular but misguided view of homosexuality is the "opposite sex role" theory -- that gays are feminized and lesbians are masculinized.

I've shown in earlier posts that this theory fails to explain the full behavioral syndrome of gays, who are infantilized rather than feminized, and who only appear feminine in some ways because females are more neotenous (childlike).

The defining female traits of nurturing babies, keeping house, settling down, being a wet blanket, being a worry-wart, giving time to small local charities, etc., are alien to the male homosexual, who in fact behaves like a bratty girl-hating 5 year-old with a turbo-charged sex drive.

Normal men respond to gays not as though they were feminine, but as though they were an annoying and creepily over-eager toddler trying to join the big kids, one who can only be bullied away because he's too socially retarded to take a hint.

What about lesbians being masculinized? I find them harder to study because they don't stand out quite as much. But the butch dyke types sure do. Over Christmas I was standing behind a pair of lesbian parents and their utterly undisciplined children at the airport. The more feminine one had normal-looking medium length hair, and I expected the masculine one to have a man's haircut. I could tell from behind that it was short and parted, so that much checked out.

When she turned around, though, she had one of those severe sideways-pointing hair-do's with the sides and back shaved. The technical name is "undercut," although I find "gay whoosh" more descriptive.

Here are a few examples of this distinctly gay haircut on real-life gays:




And here are only a handful of many, many examples of twink haircuts worn by butch dykes:








If butch lesbians were simply masculinized, why wouldn't they look more like normal men? Why do they copy so specifically the grooming and even clothing habits of gay men, who look and act kiddie? Nothing kiddie can be masculine or macho, including that "I'm such a little stinker" smirk on the dyke at the end.

Maybe they want to look recognizably male, but only a degenerate and abnormal kind of male, to give the middle finger to straight society. And what more familiar model of abnormal male do they have ready to imitate than the faggot?

How ironic that in emphasizing the rejection of hetero patriarchy, the butch dyke winds up looking like a goofy little kid rather than the strong warrior she imagines herself to be.

April 2, 2015

Blame Jewish residents for awful foodie scene in Manhattan's Upper West Side?

A top-featured article from the NY Post reviews how bland, generic, and flavorless the restaurant scene is in the Upper West Side of Manhattan, and places the blame on the demand side: the residents themselves. Restaurants that ought to do great business flounder in the UWS, while run-of-the-mill Chinese take-out will never die. Any place that tries to do something bold is immediately watered down to appeal to dull taste buds.

You don't have to read between the lines very carefully to see who the problem is among the residents -- it's primarily the Jewish palate that the Italian-American critic is blasting.

Savvy readers may have suspected this already, given that about 1/3 of the neighborhood's residents are Jewish. But the critic can't come right out and say that in the mainstream media (for reasons that he likewise would not be able to discuss openly). He did manage to drop a rather big hint toward the end, though, while quoting another source (my emphasis):

And a new place that sticks to its guns must put up with what [Jewish restaurant manager Ed] Schoenfeld calls the “kvetch factor.”

On Christmas at RedFarm [Chinese food on Christmas], “A lady at the bar was counting people and seats to see who should get a table next.” She made a loud stink and “made my manager cry,” Schoenfeld recalls ruefully.

He asked her to leave — “I basically fired my customer,” he laughs. “You’d never see that downtown.”

Perhaps locals share lingering nostalgia for the days of Mexican beaneries and dairy cafeterias. Call it Karl Marx’s revenge on a neighborhood that prefers Gray’s Papaya to the eats that make this city the most famous dining destination in the world.

And this related hint:

Restaurants that bravely open with creative menus quickly dumb them down for proletarian tastes left over from the age when bearded “intellectuals” debated Sino-Soviet relations over refried beans, and “fine dining” struck West End Avenue sages as capitalist decadence.

Propagating and magnifying capitalist decadence is a Jewish specialty. Hence their sneering at "fine dining" is a sour-grapes defense mechanism to keep the world from noticing how sub-functional the taste centers in their brains are.

You saw something similar in their sneering at representational art, which had to be dumbed down into color field painting and the like. Or decorative motifs in buildings, which must be eliminated and exploded in the deconstructionist approach to, or rather retreat from, architecture.

This suggests that the lack of Jewish accomplishment in a domain of taste stems from a more fundamental weakness in basic perception, akin to a blind man who cannot paint. (Their low scores on tests of visual-spatial cognition have been documented and accepted for a while now.)

Why, though, do they insist on ugly art, brain-hurting buildings, and food meant for the barfbag (Mexican)? Why not just go with the flow and not make a big display out of your rejection of fine taste? It all traces back to their characteristically antagonistic stance in interpersonal relations, reflecting their genetic and cultural adaptation over the centuries to an economic niche as tax farmers, financiers, and other middleman roles.

Being upstaged by a bunch of dumb goyim is too threatening to the Jewish ego, so they turn it around, calling abominable what is delightful and seeking delight in what is abominable.

April 1, 2015

Homelessness and rootlessness out West

From a post at Movoto, here's a map of how the states rank on the size of their homeless population per capita:


The West Coast, Nevada, Hawaii, and Alaska are all in the top 10. All the Mountain states are in the top half of the nation, except for Mormon Utah. The northern Plains states are doing poorly too.

There are pockets of high homelessness back East, but not entire regions. Massachusetts, Vermont, and Maine are up there, but not so much New Hampshire, Rhode Island, or Connecticut. New York is plagued by homelessness, but the other Mid-Atlantic states aren't even in the top 20. Aside from New York, the only other centers of homelessness back East are in Florida and Georgia, and perhaps Tennessee.

Overall, though, the Deep South, Appalachia, the Midwest, and the southern Plains regions have comparatively small homeless populations.

These patterns reflect how deeply rooted the people are: residents of most places west of the Mississippi River have shallower roots than folks who still live where the original American settlers lived.

My hunch is that it's not just due to the shiftless, transient, and footloose tendencies of Frontier people, which would only apply to the professionally homeless. There are also those who are only temporarily homeless, and they ought to at least have family and friends to rely on for temporary relief, if the alternative is to live out of a car or on the street.

But where roots are shallow, people are less likely to have those connections. With less slack in the social system, a small accident is more likely to bring the whole thing down. Where roots are deeper, the norm is that "We take care of our own".

The data these maps were drawn from come from HUD, and reflect total homeless numbers. About 15% of the total homeless population falls into the "chronically homeless" group that we associate with drifters, bums, and the like. The rest are down on their luck, poor, lazy, addicted, or something else that makes them prone to occasional homeless living, without it being a full way of life.

I've downloaded the HUD data for myself, so if there's time, I'll re-do the rankings looking only at the chronically homeless, to see where the bum problem is the worst. Glancing over the numbers, it looks like that will tilt the rankings even more toward the West.
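For anyone who wants to run the same exercise, here is a rough sketch of that re-ranking. The file names and column labels are guesses about the HUD point-in-time spreadsheet and a census population table, so rename them to match the actual downloads:

```python
# Rough sketch: re-rank states by chronically homeless per 100,000 residents.
# File names and column labels ("State", "Chronically Homeless", "Population")
# are assumed, not taken from the actual HUD or census files.
import pandas as pd

pit = pd.read_csv("hud_point_in_time_2014.csv")   # HUD PIT counts by state (assumed layout)
pop = pd.read_csv("state_population_2014.csv")    # census population by state (assumed layout)

df = pit.merge(pop, on="State")
df["chronic_per_100k"] = df["Chronically Homeless"] / df["Population"] * 100_000

top10 = df.sort_values("chronic_per_100k", ascending=False).head(10)
print(top10[["State", "chronic_per_100k"]])  # expectation from the post: the West dominates
```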

The data are also broken down into smaller geographic units below the state level, so we can see which cities and metro areas are more over-run.