April 27, 2012

Are Jews uniquely non-religious?

This question has probably been asked before, and perhaps answered using the same method. But since I don't know off the top of my head, I'll just present three brief but clear results from the General Social Survey.

People who are more intelligent and more politically liberal tend to be less religious. Jews (i.e. Ashkenazi Jews, if they're American) are both, so we'd expect them to be more atheistic. And everyone knows they are. But what if we compare them to their white Gentile counterparts who are also smart and liberal?

The three graphs below show the responses of people who are white, identify as liberal in their political views (from "slightly" to "extremely" liberal), and score at least 120 on an IQ test (or, for the first graph where that didn't give a big sample size, an IQ of at least 114). That's what used to be called "college material" before everyone got into college. They are then split up by religious preference.

First, here's a graph showing how confident respondents are in the existence of god:


The Christian groups (Protestants and Catholics) are virtually identical, with just over 40% knowing god exists without a doubt, and less than 10% being atheists or agnostics. Remember that these Christians are white, college-material liberals. Not surprisingly, the "no religion" group is much more doubtful of god's existence. Jews are in between, with just over 20% being sure and nearly 30% being atheists or agnostics. Even controlling for brains and liberalism, Jews are noticeably less religious than Christians.

The next graph shows attitudes toward the Bible; the question doesn't specify whether that means the Old Testament alone or both it and the New Testament. Possible responses are: "1. The Bible is the actual word of God and is to be taken literally, word for word. 2. The Bible is the inspired word of God but not everything in it should be taken literally, word for word. 3. The Bible is an ancient book of fables, legends, history, and moral precepts recorded by men."


Again the Christian groups are identical, with about 25% saying it's legends, etc., not the inspired or actual word of god. No shock that the "no religion" folks hold that view overwhelmingly, about 85%. Again the Jews are in between, though closer to the "no religion" group, with a little over 60% holding the non-divine view.

Finally, there's how often people attend religious services (which type of services is left open and unspecified -- so, any services). The bars for less frequent attendance are lower down in each column:


Here we find a subtle but noticeable difference between the Christian groups, reflecting the greater orientation among Catholics toward church ritual, compared to Protestants. The "no religion" people hardly go at all; those who do attend presumably either aren't "out" or are accommodating the wishes of their religious friends and family. Jews once more are in between Christians and "no religion" people, closer to the latter. Those who attend services nearly every week, every week, or more than once a week make up about 30% of Protestants, 40% of Catholics, 2% of "no religion" people, and 7% of Jews.

Looking only at whites who are pretty smart and politically liberal, Jews still come out as far less religious regarding their beliefs about supernatural higher powers, their attitudes toward sacred texts, and their participation in ritual practices. If anything, they're more like those who profess no religion, making phrases like "Jewish atheism" somewhat redundant.

This uniquely Jewish tendency toward atheism must reflect some other difference between them and the goyim, not race, IQ, or liberalism. The ecology that the Ashkenazim are adapted to is white-collar financial work, often in the service of the state. So the religious lobe of their brain has evolved to take as its object of worship the technocracy and technology that can deliver them into paradise, not something superstitious like the grace of god, or something less supernatural but still transcendental like loving thy neighbor as thyself.

And of course there are differences in the religions that the Jew and the Christian are exposed to growing up. Both religions build on the Old Testament, but the New Testament has an exciting new cast of characters who illustrate moral points during dramatic narratives, whereas the Talmud can only come off to youngsters as pointless grown-up bickering about whether grandpa is allowed to clean out his earwax on the Sabbath or has to hire a shabbos goy to do it for him.

It cannot surprise us that what we call "Judaism" -- i.e., Rabbinic Judaism -- hasn't gone anywhere, while the world has been taken over by Christianity and Islam, the other successful Abrahamic religion that also stars a dramatic new figure not found in the Old Testament. What makes Judaism unappealing to would-be convert outsiders probably makes it unappealing to in-group members too. So, Jews' lack of enthusiasm for religion is understandable, apart from their low baseline as shaped by their managerial ecological niche.

All that really is left for them to get excited about is the stuff that would only appeal to insiders, and perhaps explicitly turn off the outsiders. But that shades so easily into ethnic, rather than religious, chauvinism that they'll just get excited about their ethnicity instead of their religion. And sure enough, "proud to be a Jew" refers to the accomplishments of those in their ethnic group, not the beliefs and practices of their religious group.

GSS variables used: race, polviews, wordsum, relig, god, bible, attend.
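
If you want to re-run these cuts yourself, here is a minimal sketch of the filtering and cross-tabulation, assuming the GSS extract has been loaded into a pandas DataFrame with the columns above. The wordsum cutoff of 9 as a rough stand-in for "IQ of 120 or so," the string category labels, and the file name are assumptions for illustration, not the actual GSS codes.

```python
# A minimal sketch of the cuts described above, assuming a pandas DataFrame `gss`
# with the GSS columns just listed (race, polviews, wordsum, relig, god, bible, attend).
# The wordsum >= 9 cutoff as a proxy for "IQ 120+" and the string category labels
# are assumptions for illustration -- check the GSS codebook before trusting them.
import pandas as pd

LIBERAL = ["slightly liberal", "liberal", "extremely liberal"]

def liberal_smart_whites(gss: pd.DataFrame) -> pd.DataFrame:
    """Whites who identify as liberal and have a high vocabulary score."""
    return gss[
        (gss["race"] == "white")
        & (gss["polviews"].isin(LIBERAL))
        & (gss["wordsum"] >= 9)  # assumed stand-in for roughly 120+ IQ
    ]

def breakdown_by_religion(gss: pd.DataFrame, outcome: str) -> pd.DataFrame:
    """Percentage distribution of an outcome (god, bible, or attend) within each religion."""
    subset = liberal_smart_whites(gss)
    return (pd.crosstab(subset["relig"], subset[outcome], normalize="index") * 100).round(1)

# Example usage, assuming a hypothetical extract file:
# gss = pd.read_csv("gss_extract.csv")
# print(breakdown_by_religion(gss, "god"))
```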

April 23, 2012

Cocooning home and yard design, 1: The present day

Part 2 will discuss the mid-century.

Earlier I looked at the drive-in culture of the mid-20th C. and the past 20 years as a sign of how socially avoidant people are during falling-crime times. Bustling dining rooms emptied out as customers began ordering from and eating their meal inside their car. And just sitting next to others in church was too close for comfort for those who preferred to pull into a drive-in to hear the Sunday sermon.

That was one way that people who wanted to be isolated from others still managed to get some things done outside the house. But what about while they were home? The first response of a cocooning population is to simply withdraw as much as possible from public spaces. But it soon becomes apparent that just staying home doesn't totally sever the connections to your broader community. How do people modify their houses and yards to further achieve a longed-for isolation?

Look around your neighborhood today, and you will see stockade-like privacy fences everywhere -- even if like me you live in a very white middle-class area, where there is zero threat of Comanche bands raining destruction in a surprise pre-dawn raid. The fences are high (usually at least 6 feet), totally opaque (no space between planks), and surrounding most or all of the sides and back of the house.

I rarely see them in front, probably because people don't go out front anymore -- too much exposure to the neighbors. If kids are allowed to play outside at all, it is in the back yard, and grown-ups too have retreated from the porch in the front yard to their deck in the back.

All that you need to shield your private space in front is a set of blinds, such as the ubiquitous vertical sliding kind that snap completely shut. A kind of retractable, vinyl privacy fence. Blinds now block out even smaller and less visible windows, like basement windows.


We already can sense that these are all fairly recent developments, but to get a better feel for when they took off, I searched all American newspapers in Lexis-Nexis. Privacy fences in the '80s seem to have been used mostly by celebrities who wanted to live unnoticed. Reports of their popularity start trickling in during the mid-'90s, and by 1996 the practice is talked about each year. The same goes for building decks: they have always existed, but it's not until the mid-'90s that articles portray a mania. For window treatments, too, it seemed like the mid-'90s was when blinds began to noticeably replace less opaque things like sheers and lighter curtains or drapes.

To summarize the change over the past 20 years, then, first we just stopped going so frequently to public spaces like restaurant dining rooms, arcades, dance clubs, parks, and the apotheosis of the built public environment -- the all-under-one-roof mall. All of those saw declines by the early '90s. So now we're sitting at home a lot more, but we still feel like at any moment the public could just come on over and see what we're up to, maybe even try to socialize with us. That led to privacy fences and blinds on all the windows. Having abandoned public spaces, we tried as best we could to recreate a leisure space in our back yard, focused mostly on our deck. Hanging out more in back only compounded our desire for a privacy fence.

The scan of newspaper articles rings true to me. We had a deck put in during the mid-'90s, and the privacy fence went up a couple years later (before, it was a 3-foot chain-link fence -- without vinyl strips inserted to block out the space). We must have been late to the trend of vertical blinds in front and blinds on all other windows too, which I don't recall until the early or mid-2000s.

Growing up in the '80s could not have been more different. Our neighbors to one side did have a line of tall hedge trees, but they didn't look offensive, and although they lived on a corner lot, they did not wall off either side adjacent to the streets. Nothing at all separated us from our neighbors on the other side (where the high school guy who cut our lawn lived), and our houses couldn't have been more than 30 feet apart. Nothing kept us apart from our back yard neighbors either. Their collie occasionally wandered over to our back door, and we'd let him in. Sometimes my brothers and I left our toys near or across the border between yards (wherever that was), and our neighbor in back would bring them over to my parents so they didn't get lost or broken.

It wasn't unusual for kids to have birthday parties out in the front yard either. My brothers had one with several tables, decorations, boxes, wrapping paper, and presents all over the place. I'm sure anyone walking, biking, or driving by thought it looked like a mess, but seeing some neighborhood kids enjoying themselves at a birthday party must have made up for it. We'd definitely feel self-conscious trying to stage that today for my nephew's birthday. It's so rare that I had to pinch myself last week when I walked by a house where there was a kids' birthday party out front. Probably the first one I've seen in 20 years.

I don't remember blinds at all. Maybe some thin curtains, but mostly I remember sheers -- those let plenty of light in, yet they were gauzy enough to keep outsiders from having a crystal-clear view of your home. And we never thought of putting in a deck or an extensive paved patio. The most that we did to alter the natural look was to set up some small-scale gym equipment. It was one structure no taller than 6 feet, with a slide, a swing, and a teeter-totter (back before they were banned). It had no foundation laid overtop of the ground and wasn't anchored into it either.

This relative lack of physical barriers made it easier for people to look out for each other too -- how can I keep an eye out for my neighborhood if I'm inside and walled off by blinds, or if gigantic fences keep me from seeing suspicious things going on in other people's yards? Now we're happy to just let each person look out for themselves. And no, we weren't constantly scrutinizing every last little detail of our neighbors' open spaces -- just keeping an eye out, keeping them in mind. Only a spastic shut-in would equate open spaces with neighbors continually spying on each other.

My friends' houses didn't have much in the way of palisades and trip-wire either. We roamed all over the neighborhood, and I don't remember strangers' houses being sectioned off from one another. I guess that's what made it so easy to explore the neighborhood. Sometimes you'd see low picket fences (with planks spaced apart), or low chain link fences (again without the vinyl slats), or if they were really wealthy they might have a low stone wall out front. Some people might have had a little patio out back, but I can't recall anyone who had a deck.

To end on a morbid note, I wonder if that's part of the reason why we don't bury our pets in the back yard anymore. If it's bearing a lot more of a burden as a place to entertain and escape, in the absence of going out, then it would really kill the mood to look over from the deck and see where your pets were buried. We were a little more sentimental about ours (cat people), so in our back yard off on one side you would've seen three small wooden crosses with their names written on them.

I originally planned to write just about the most recent rising and falling-crime period, but I stumbled on a landscape architecture book that has a chapter entirely on the obsession with privacy in home and yard design of the mid-century. It's eerie how contemporary the pictures look, and how familiar the tone of voice sounds in popular writings about the ideal design. That post should be up soon.

April 18, 2012

Your most guilty pleasure song?

Mine is "Toy Soldiers" by Martika:



It's a total chick song, and a pretty melodramatic one to boot. Plus she got her start on Kids Incorporated, a TV show where mostly pre-teens sang hit songs and acted out cheesy skits in between.

But that melody is just so damn catchy, and since it isn't repetitive, it should be harder to remember than the average pop song. Still I always find myself singing it the rest of the day whenever I hear it. And it's got the right amount of variety, novelty, and distinctness in the sound textures -- unlike the faggy indie, techno, etc., songs that become so self-absorbed in their exploration of texture that they neglect basic things like melody and harmony. Just an overall really well crafted pop song.

April 17, 2012

From sociable to isolated video game players

A recent GameFAQs poll asked about the social context of the respondents' video game playing -- are you usually with another person, playing physically alone but with someone else online, playing totally alone, etc. Here are the results from over 31,000 American respondents, and here are the results from over 54,000 American respondents for the same question asked in December 2009.

Only one response says that other players are physically present when you're playing, and that answer has fallen from 21% to 16% just within the past 2 1/2 years. So the cocooning trend of the past 20 years has not bottomed out yet, and it shows up in every corner of life that I've checked out -- including apparently playing video games.

Even circa 1990 you could not have found 84% of people playing video games primarily while alone.

We used to go to our friends' houses all the time, rarely for the purpose of playing Nintendo, but we'd usually do that for 30 minutes or an hour, in between throwing the football out in the street and riding our bikes over to the park or woods. This was a great way to play games you didn't own yourself -- unlike with today's online multiplayer games, where both friends need to have plunked down the $60 for the game.

Some friends had an entirely different console from the ubiquitous Nintendo, which is also an experience you can't replace with online "connections," where both friends need to own the same console. My best friend had a TurboGrafx-16, another best friend had an old Atari 2600 (yes, considered "old" even back in 1991), and another friend had a Genesis early on when no one else did. There weren't many games for it, but it did have pretty good ports of Altered Beast and Golden Axe, which you could previously only have played in arcades.

That was the other mainstay of social video game playing -- arcade cabinets. You didn't need a separate arcade room set aside in the local mall. Arcade games used to be in every bowling alley, a good number of convenience stores, pizza parlors (including the cocktail table games at Pizza Hut), even laundromats. Then when the ultra-mega-multi-plexes opened up, they had their own arcade section just off of the main lobby, before the ticket stand, so you could play without paying to see a movie.

I don't remember the video game culture of the early '80s, and wasn't alive in the late '70s. But arcade revenues hit an all-time peak in 1981 (they climbed back to another local peak in 1988 before their steady fall afterward). I can't point to many pop culture references showing kids playing video games at each other's houses, though. The advertising of the time shows family members gathered around the TV and video game system. (E.g., for the Vectrex, Atari, and Intellivision.) Then there's the beginning of Vacation where Rusty and Audrey are playing Atari together in the living room, before they use it to mess around with their father Clark's course-plotting computer program.

The social setting became much more isolated during the '90s; I'd say it was noticeable even by 1994 or '95. Arcades were already in decline, I didn't go over to friends' houses as much to play games, and I didn't feel like playing games with my brothers that much either. By the latter half of the '90s, it was just the occasional visit to my best friend's to play GoldenEye or Mario Kart 64, or staying in with my brothers to play (what else?) GoldenEye and later Perfect Dark. As with the broader society, the collapse of social networks has affected our ties with non-kin more than with close kin.

There wasn't any online technology for video games back then either, not for 99% of players. So it wasn't a substitute of online multiplayer for real-life multiplayer. People just started shutting themselves away, a trend that's only gotten worse since. But the key is not to see that as reflecting an outside tech change; it was an inside change in our social tendencies. The rise of online multiplayer is just an outlet for an already cocooning society.

April 12, 2012

Why are Parsi elites welcomed, while Jewish and Chinese elites are reviled?

Foreign ethnic elites who have a disproportionate influence in their host society's economy are called market-dominant minorities. The two best examples are the Chinese who settled southeast Asia and the Pacific Islands, and the Ashkenazi Jews who lived mostly in the Pale of Settlement in eastern Europe, and more recently in western Europe and its offshoots.

In her book World on Fire, Amy Chua looks at how the presence of market-dominant minorities can easily spark ethnic tensions, as the lower-status natives feel envy and anger toward what they come to perceive as an intrusive race of bloodsuckers. Again the Ashkenazi Jews and the Chinese provide the strongest examples -- no matter where they go, the locals usually come to view them with antipathy. Occasionally that escalates into full-blown ethnic riots, like the pogroms against Jews in eastern Europe and the series of anti-Chinese riots in Indonesia.

Explanations for the psychology underlying the native masses' hatred of ethnic elites tend to portray the envy and resentment as an inevitable consequence of the presence of market-dominant minorities. Yet there is a clear counter-example of a market-dominant minority group that has been welcomed wholeheartedly by most of the host society -- the Parsis of India, who have a disproportionate influence at the higher levels of the Indian economy.

Even though they are only one case, it is such a strong counter-example that it must make us reconsider what truly underlies the psychology of anger toward ethnic elites. The Parsis, like the Jews and the Chinese, are not a native ethnic group of the society where they have strong influence, having come from Persia into India. (While they do share some genetic and cultural heritage, it would still be like a group of Armenians settling and wielding much control over the economy in Ireland.) They also came to their high status gradually through greater intelligence and industriousness, not through force. And they have been living in their host society for hundreds of years -- plenty of time for the seeds of envy and rioting to have been sown.

And yet, there has been no history of pogroms against the Parsis. If anything, they're seen as more of a national treasure, not that Indians worship them or anything. All the ingredients for an explosion of ethnic hatred and rioting would seem to have been present for centuries, so what gives?

The general consensus among native Indians and European observers, for at least the last several hundred years, is that the Parsis are incredibly charitable, preferring to spread around their wealth. (See some representative quotes in their Wikipedia entry.) They themselves emphasize this aspect of their community in the phrase "Parsi, thy name is charity." Most importantly, they aren't only generous toward one another, but toward the masses of their host society. A 20th-century Parsi captain of industry, J.R.D. Tata, was right out of the progressive mold of Andrew Carnegie and Milton S. Hershey.

So, it looks like the primary way that they've avoided the fate of so many other market-dominant minorities is to not behave like a bunch of greedy gold-hoarders. They don't give away all of their wealth, but they do donate enough to prove their generosity. Moreover, no one sees them as doing so without any real care for others -- i.e., just being charitable to gain approval or to keep the would-be rioters content. All observers seem to agree that it's out of a sense of duty and empathy.

And it's empathy where the Ashkenazi Jews and the Chinese are lacking. I touched on this in a longer post about why they tend not to be very good social scientists. Popular stereotypes everywhere they've settled depict Jewish and Chinese people as brusque and rude, whereas the opposite stereotype prevails about the Parsis. Nor would the Parsis fail basic tests of recognizing facial emotions, as East Asians do. And unlike Jews, the equally high-IQ Parsis haven't produced scores of fruitcake intellectuals and political "thinkers," from Karl Marx to Ayn Rand, whose failures stem from nothing more than their inability to get other people.

In general, looking over this list of famous Parsis, they don't seem to produce many autistic or nerdy people. It looks more like professionals, entrepreneurs, and entertainers. (The Han Chinese have over 10,000 times as many people as the Parsis, and yet they can't produce a single Freddie Mercury.)

What was it about their niche in India that preserved their empathy, unlike other market-dominant minorities like the Chinese and Jews? Beats me, I don't know their history well enough. Something about the types of white-collar jobs they held must not have selected for having a dim and suspicious view of other people, unlike the case of Jewish tax farmers in Europe.

Their story should give us hope that it is possible for an ecological niche to select for higher average IQ, as well as for business skills, while not corroding our social nature. Sadly they do have very low birth rates, but then what brainy group these days does not?

April 8, 2012

Video games' place in the realistic vs. stylized visual zeitgeist

The usefulness of the idea of "a visual culture" is that similar trends tend to affect so many seemingly separate areas. That is why, without having seen them beforehand, you could easily group together an ad, an album cover, a book cover, and a movie poster from 1954, and ditto for 1984, with little confusion between the two groups.

One area that tends to get left out of these surveys is video games, which doesn't sink the whole approach -- it's just one not very crucial area. Still, they're worth including since they do move with the overall zeitgeist, providing stronger support for claims that "the look of 1994 was such-and-such a way." And they have come to occupy an ever larger place in the visual media that people are exposed to.

Without going into the broad pattern, which would be another much longer post, I'll just state that in falling-crime times the visual culture becomes more photorealistic, whereas during rising-crime times it becomes more stylized. Rising-crime periods were the Romantic-Gothic of ca. 1780 to 1830, the Art Nouveau and Art Deco periods of ca. 1900 to the early '30s, and the Psychedelic-New Wave periods of the '60s through the '80s. Falling-crime periods were in between: the Victorian era, the mid-century, and the past 20 years.

Why artists and their audiences respond in those ways to the rising or falling trend in rates of violence is another matter, not really worth exploring here, again to save space. The basic reason I see is that in times when the future looks less stable and predictable, people value imagination more in all areas of life, not just visual culture. Since the old ways are apparently not working as planned, people hope that outside-the-box thinking will lead to new solutions. It's not experimentation for its own sake, a useless "skill," but creative improvement that begins with trial and error.

Let's take a look then at the change in how realistic vs. stylized the typical video games have looked since they became popular in the late 1970s. I'm mostly going to link to galleries, a) to save space here, and b) to not have to make and upload my own image files.

In the beginning it was arcade games that had the greatest graphical capabilities, and hence offered their creators the most choice. Popular games from this time are all stylized, e.g. Galaga, Ms. Pac-Man, and Robotron: 2084. Even the artwork on the stand-up cabinets that housed the games was highly stylized, showing that the makers and target audience wanted that look. If they had truly wanted a realistic look, the game itself might still look somewhat stylized because of technological limitations, but the artwork on the cabinet meant to draw players close would have been photorealistic.

I'll return to the "technological limitations" point in a sec, as it turns out there were some old arcade games that were capable of showing live-action video footage, but chose to show hand-drawn animation instead.

As for more recent popular games, they are almost always as realistic-looking as the technology will allow, and the artwork on the front of the box is that way too (often not even done in a separate style, but just lifted directly from how the game looks itself). That doesn't mean they do not ever show fantasy environments or creatures, only that these too are portrayed photorealistically. For example, Grand Theft Auto IV, Call of Duty: Black Ops, and The Elder Scrolls V: Skyrim.

When did the shift begin? As late as 1991 (just before the peak in the crime rate), a smash hit like Sonic the Hedgehog was still heavily stylized, even though the technology allowed for more realistic graphics. Just around then, though, the hit games started going for digitized sprites of video-captured actors, such as Mortal Kombat. Some went for full-motion video altogether, like the "cutting-edge" failures on the Sega CD, 3DO, and computer platforms.

But a more insightful approach is to look at video games made for a much older medium that allowed live-action video to be displayed, and see how the creators and audiences chose to make use of it. That way we can be sure we're not confusing aesthetic changes for technological changes. Only one medium like this was used back through the early days of arcades -- the laserdisc.

Here is Wikipedia's list of laserdisc video games if you want to look for yourself, but I checked them all. Most of the games from the first half of the '80s are entirely animated and stylized, at least to the degree you'd see in a Disney cartoon from around that time. Don Bluth, the artist for the most popular such games, Dragon's Lair and Space Ace, was a former artist for Disney and had formed his own animation studio by then.

It would have been impossible for the animators to draw the images live while the player moved the character this way or that, took this or that action, etc. So these games were more like interactive movies, where you must perform a fixed action when prompted -- if you pull it off, the animation sequence proceeds, and if not, the relevant death scene is played. (These are the ubiquitous "quick-time events" of recent games.) You might also have a choice of where to go at certain fixed points in the narrative. By limiting the player's control, they could store all the animated sequences on the laserdisc, and jump to one or another depending on simple predictable actions the player took.
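
To make that branching structure concrete, here is a toy sketch of the kind of interactive-movie loop these games amount to; the clip names, prompts, and the two-second window are invented for illustration, not taken from any actual laserdisc game. The light-gun games discussed below work the same way, with a hit test standing in for the timed button press.

```python
# Toy sketch of the laserdisc-style branching described above: the whole "game" is a
# set of pre-recorded clips, and the only logic is jumping to the success clip or the
# death clip depending on whether the prompted action arrives in time. All clip names,
# prompts, and the 2-second window are invented for illustration.
import time

SCENES = [
    {"clip": "hallway.ld", "prompt": "LEFT",  "success": "dodge.ld", "fail": "death_pit.ld"},
    {"clip": "bridge.ld",  "prompt": "SWORD", "success": "slash.ld", "fail": "death_troll.ld"},
]

def play(clip: str) -> None:
    print(f"[playing {clip}]")

def prompted_action_succeeds(prompt: str, window: float = 2.0) -> bool:
    """True if the player types the prompted action within the time window."""
    start = time.time()
    answer = input(f"Quick! Type {prompt}: ").strip().upper()
    return answer == prompt and (time.time() - start) <= window

for scene in SCENES:
    play(scene["clip"])
    if prompted_action_succeeds(scene["prompt"]):
        play(scene["success"])   # the narrative proceeds
    else:
        play(scene["fail"])      # the relevant death scene
        print("GAME OVER -- insert coin")
        break
else:
    print("THE END")
```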

The few such early laserdisc games that weren't entirely animated only used live-action video to capture background environments (e.g., filming a sky with clouds to use as a background for a flying game). The objects involved in playing the game (e.g., the plane and any enemy planes) were still the stylized hand-drawn or computer-generated sprites that you'd see in other games from that time.

Later, during the first half of the '90s, American Laser Games released many laserdisc games, all of them showing live-action video footage, not traditional animation or even CGI. Like the earlier ones, they had to store all sequences on the disc, and jump to one or another based on simple actions the player took. Unlike the earlier ones, the player didn't just have to push a button in time -- using a light gun, they had to accurately shoot a spot on the game screen where an enemy was. If they hit him, the video sequence of that guy falling off the roof or whatever would play, and if not, some kind of failure video would play.

If you were still hanging out in arcades during the falling-off-a-cliff period of the mid-1990s, you might remember one of the Mad Dog McCree games, which featured point-of-view video footage of a vigilante who confronts a band of outlaws in the Wild West.




Since the laserdisc technology allowed for either the more photorealistic approach of live-action video or the more stylized approach of hand-drawn animation for children, we can be sure that the aesthetic changes from the earlier ones to the later ones are not confounded with technological changes. During rising-crime times, the stylized look was a no-brainer. As the crime rate began to peak and decline, the visual culture shifted into realistic mode, and it has only gone farther up through today.

April 4, 2012

The headphone craze

One of the more bizarre trends of the past 10 years is to go out in public and see so many people with headphones on or in their ears. This is not related to ownership of portable music players, since everyone had a Walkman back in the good old days but did not wear it out in public so often -- hardly at all, in fact. Rather the change is in how people use the same kind of technology during a time of greater cocooning, namely to block out other people, turning would-be public spaces into hives of private cells.

The typical buyer may spend around $30, though a booming market for $100+ headphones now accounts for 6% of units sold. Someone who plunks down $50 for a pair of Skullcandy headphones is doing so partly for the fashionableness of the brand, but they also see something very utilitarian in them, i.e. the improvement in sound quality. According to a marketing report from NPD, sound quality was an important purchasing factor for 48% of all headphone buyers, and for 76% of the buyers of $100+ headphones.

And yet the music they're listening to is almost always an mp3 (or a similarly low-quality source like a streaming YouTube clip or even satellite radio), where 80% of the source data has been compressed out of existence. Or maybe they're using their phone to watch a movie or TV show downloaded from a file-sharing service, where again more than half the data on the original DVD has been compressed away. If they were so concerned with sound quality, they would be listening to vinyl or CDs, and watching on DVD or Blu-ray. No amount of headphone engineering can restore data that was gutted before it was ever read off the storage medium.
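
To put a rough number on the "80% of the source data" claim, here's a back-of-the-envelope check, assuming standard 16-bit / 44.1 kHz stereo CD audio against a fairly generous 256 kbps mp3; the exact figure depends on the encoder settings.

```python
# Back-of-the-envelope check of the "80% compressed out" figure, assuming standard
# CD audio (16-bit, 44.1 kHz, stereo) versus a 256 kbps mp3 (a generous assumption;
# many files are 128 kbps, which would discard roughly 91% of the data).
cd_bitrate = 44_100 * 16 * 2   # bits per second of uncompressed CD audio (~1411 kbps)
mp3_bitrate = 256_000          # bits per second of the encoded mp3

discarded = 1 - mp3_bitrate / cd_bitrate
print(f"CD audio:  {cd_bitrate / 1000:.0f} kbps")
print(f"mp3:       {mp3_bitrate / 1000:.0f} kbps")
print(f"Discarded: {discarded:.0%}")   # roughly 82%, in line with the figure above
```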

Another sign that the headphone craze is not driven by a true concern for sound quality is that even when people listen to music at home, it's through their laptop's cheapo speakers or through headphones. A basic boom box or stereo system with speakers isn't very expensive, and it wouldn't have to be very sophisticated at all to beat headphones for sound quality.

What they really value is the cocoonability of wearing headphones connected to an iPod, phone, or laptop. It's not quite accurate to say that they value "portability"; what they value is specifically the ability to isolate themselves from others while in public.

You don't have to be a Luddite to feel repulsed by what's happened with consumer electronics in the past 10 years. Just 20 years ago, the norm was to listen to music from a high-quality medium like vinyl records, CDs, or FM radio, and coming out of a decent set of speakers, whether from a home stereo system, car stereo, or the set-up of a bar or nightclub. And nobody used consumer electronics to convert a public space where people are supposed to at least acknowledge each other, perhaps engage with one another, into a hive of cubicles.

On a final related note, I wonder if wearing sunglasses is more common now too. Meaning, a greater fraction of people owning them, and those who own them wearing them more frequently. Particularly within the past 10 years or so, it seems like a huge fraction of girls wear sunglasses regularly when the weather gets brighter. I don't remember seeing many girls like that at the mall in the '80s and early '90s, and I doubt that's an inaccurate memory. Being surrounded by so many people with un-seeable eyes is the kind of thing you notice.

Headphones, sunglasses, holding a phone or placing a laptop always in front of us like some shield... won't be long until everyone's wearing the scramble suits from A Scanner Darkly.

April 2, 2012

The Beatniks as the Silent Generation's form of Millennial indie faggots

During the latter half of the falling-crime mid-century, the Beat Generation and their followers emerged as antagonists of the conformist and materialist majority. Three traits capture the flavor of these efforts:

- A paralyzing self-consciousness -- obsessing over and shouting about doing your own thing, rather than just doing your own thing.

- A retreatist approach to the problem of material abundance, instead of discriminating between the good and bad sides of materialism and technological change.

- And a profound naivete about the potential dangers of blind experimentation, whether with art, sex, or drugs.

In these ways their view of the world was not very different from the mid-century mainstream's; they simply took the opposite course of action from a majority that still shared their assessment of the world.

The mainstream strongly fretted over whether what they thought, felt, and did was within prescribed boundaries, but they chose to stay inside while the Beatniks chose to stand outside. The mainstream had a simplistic view of materialism -- that it's either full steam ahead or withdraw into a pre-industrial age -- but they chose to dream about what gadgets to buy while the Beatniks abandoned material comforts. And the mainstream knew little about what might happen if they got hooked on pot, gave the go-ahead for homos to screw each other, or pushed the shock value of experimental art, but they chose to abstain from those things while the Beatniks blindly cheered them on. Same perception and appreciation of the real world, just a different course of action based on that.

How did the entire generation of Beatniks come to share these traits? Like their mainstream counterparts, they came of age mostly during the falling-crime period of 1934 to 1958. Such periods are marked by a higher level of cocooning compared to the rising-crime periods just before and just after. That obviously explains the naivete, and almost as obviously the self-consciousness -- being more cut off from society, you have a heightened awareness of yourself as an island with sharper boundaries. The black-and-white view of materialism or technological change may also stem from social isolation -- being more embedded in real social life lets you see first-hand what is good vs. bad about buying a car, working hard just to earn more money rather than for other reasons, and so on. When you're more isolated, you imagine it more simplistically as going either one way or the other, with less of a reality check from your social interactions.

To give a brief example of someone who did not succumb to these mainstream tendencies, consider Sloan Wilson, author of The Man in the Gray Flannel Suit, a sincere reflection on mid-century materialism. He was born in 1920, and so went through infancy, childhood, and even early adolescence during the (extended) Roaring Twenties, when people were out-and-about, engaging with the real world. Gregory Peck, who played the title character in the movie, was born in 1916. So they were part of the Greatest Generation, whose formative years were the rising-crime Jazz Age.

A few of the Beats were part of that generation too -- Lawrence Ferlinghetti (b. 1919), the sick fuck William Burroughs (b. 1914), and the halfway likable and conservative Jack Kerouac (b. 1922). But everyone else, including the most visible and hyped-up figure, Allen Ginsberg, was born in the second half of the 1920s and the early '30s, making them part of the more sheltered Silent Generation.

Who are their descendants today? Since the main influence on the zeitgeist is the trend in the homicide rate, we just look at who was born in the same place relative to its peak. The earlier peak year was 1933, and the most recent peak was 1992. So adding 60 years to the birth years of the Beatniks, we get people who were born in the second half of the 1980s and early '90s -- namely the Millennials.

They're in their 20s now, and we're in the same place in the homicide / zeitgeist cycle as the early 1950s. As in that time, most young people today are gadget-worshiping, cocooning conformists. Who, then, are today's Beatniks? They don't want to call themselves by any label since they, like, don't fit neatly into society's boxes, man. Meanwhile everyone can identify them ten miles away. I prefer the term "indie faggots," although the more common terms "hipster" and "hipster doofus" do connect the group back better to their mid-century ancestors.

Given the anti-materialist poses of the Beatniks, I guess I really mean the sub-group of hipsters who try to live off the grid, and not the ones who can't leave the house without their laptop and phone. The funkier-smelling hipsters are also the sub-group more likely to get into weird sexuality, whereas the check-out-my-Mac hipster chicks are as frigid as mainstream girls.

Where will this all lead in the near or medium term? Well, where did it lead before? Once the crime rate began rising during the 1960s, the zeitgeist changed direction, moving away from the '50s and toward the '80s. Because it doesn't change completely overnight, a fair amount of mid-century culture was still hanging around throughout the '60s and early '70s -- just diminishing steadily over time. Kennedy's New Frontier and Johnson's Great Society were fading holdovers from the heyday of liberalism under Roosevelt and Eisenhower. The Sears Tower was a fading holdover from the heyday of the International Style in architecture.

And the hippies were a fading holdover from the heyday of the Beatniks. They were not a harbinger of things to come -- they were still sleeping in the naive mid-century, while the rest of the population was waking up to reality in the New Wave age. They were also closest in age to the Beatniks. Of course some of the hippies were Silents themselves -- a 25-year-old in 1969 was born in 1944. The prototypical ones, though, were the oldest Boomers, born in the late '40s. They therefore grew up through childhood during falling-crime / cocooning times, although the environment flipped around when they went into adolescence.

By the time you get to mid-'50s births, you find very few hippies because they went to college in the mid-'70s or later, after the counter-culture had died off circa 1973. Think of Kevin Arnold from The Wonder Years, or perhaps your own parents. The youngest Boomers, like the guys in Duran Duran, probably weren't even in middle school during the Sixties counter-culture.

Now, the next crop of hippies are probably just being born, so it's too soon to draw comparisons between them and their ancestors. But they'll have a similar relationship with the Millennials as the hippies did with the Beats. Sounds fucked up that so many people will come to idolize Millennial indie faggots, but it happened once before. Thankfully most of them won't, and the sheltered generation will be ignored or even disobeyed on account of their cluelessness -- that happened once before too.