April 30, 2014

Careerist women eating up sham wedding of careerist beard to closeted Clooney

Here is only the latest reminder (among many) from BlindGossip detailing George Clooney's 100% homo proclivities, not like you needed to be told. Ever wonder why he has zero chemistry with women? Why he acts instead like a mischievous 7-year-old boy who only teases and keeps his distance from the girls on the playground who want to play with him? He's a neo-Cary Grant for our newly naive, neo-Fifties culture.

But don't let the truth get in the way of a good BOO-YA vindication. In this case, from status-striving women who want their man-repellant choices validated. Here is a good example from the NY Post about how a 36-year-old human rights lawyer can not only make her career her foremost priority, but also land GEORGE FRIGGIN CLOONEY as her husband... IN HER FRIGGIN THIRTIES. That proves all the haters wrong — suck it, bitch! ("Or is it not feminist if I put it that way...?")

Never mind that the wedding is a fake publicity stunt to distract the clueless fanbase from their sex symbol's penchant for boy-boffing. Then again, she may not mind if she's a lawyer, bent on desacralizing all human relationships through deceptive contracts.

Still, I doubt the typical career gal cheerleading from the peanut gallery could handle such a sham existence. They wouldn't feel the endorphin rush from knowingly serving as a faggot's beard, or from finding out about his lifestyle choices somewhere down the road, no matter how desirable they had originally found him. Their "have it all" dream is marrying their naive image of George Clooney, Straight Guy — not George Clooney, Please See Footnote Under Personal Life.

It's only one event; how large an effect could it have? I dunno, with someone that famous and in-demand, it could warp the perceptions and ruin the prospects of millions of women. But let's not be naive ourselves — it's best to stay out of the way of a stampede of rationalization once it has been unleashed within the female brain. Ain't no grabbing those bulls by the horns.

On the bright side, it'll weed out the bottom 20% or so on the cluelessness scale, and act as a loud warning signal for the remainder. She's drawn to George Clooney? — then she only wants the image, not the substance of a man. It'd be like some dude who dreams about the chance to hit on Jillian Michaels. Don't bother; move on to someone who still responds to our red-blooded sex drive.

Helicopter parents feel jealous when their kid is with another family

In a cocooning period, the social world shrinks from including genetic strangers ("peers," "the community") to only the nuclear family, or a proto-nuclear family ("a couple").

A single little nuclear family cannot provide for all of a person's social needs, or even a good chunk of them. It places unnecessary stress on each family member, who is expected to fulfill too many roles within the family, leading to cabin fever and an incestuous vibe around the house. Now you know where Norman Bates came from.

Earlier posts (such as this one) have explored the effects of the child's total social world being the family. When your parents provide most of your social interaction, you wind up brattier because you don't get as much honest feedback as you would from genetically unbiased people AKA your peers and other adults in the community.

And it undermines the parents' authority when they make the family the extent of the social sphere. You can't be hanging out with your kid one moment and then order him around the next. Friends cannot boss each other around, and authority figures do not casually hang out with their subordinates. You can pick one role or the other, and helicopter parents have chosen to abandon their authority and act as substitutes for the kid's peers.

They'd rather die than let Outside Influences undo all of their tireless parenting. That would be like leaving your carefully worked clay sculpture right out on the sidewalk before it had a chance to harden. Might as well hand it over to the dogs as a chew toy. This blank slate mindset is one obvious reason why they don't want their kids to spend any time with their peers.

But I've noticed that it goes further than that, to include jealousy. When they think about their kid having dinner at another family's house, they obsess over all of that quality time that the kid is lavishing on an outside social unit. Lousy ungrateful traitor!

Especially if he chose to go over there by himself, not as part of a parent-orchestrated "play date." The parent feels like they've been ditched by a fair-weather friend, or like a jilted lover who's been stood up.

In the good old days, parents didn't feel jealous but joyous if their kid was invited over for dinner, a movie, a round of mini-golf, a sleepover, or whatever. "Great, my kid's making friends and becoming part of the larger community!" Their worst fear was that their kid would be a social loner, headed down the path of solitary vice (drugs, heavy metal music, cult membership, suicide).

Grown-ups back then viewed other grown-ups as their social circle, and expected their kids to interact mostly with their own age-mates. Peers and the community, rather than the family, were the primary social unit, so their kid spending time at another family's home was not a loss or a fragmentation but a gain, a solidification.

Aside from breaking apart the bonds of community, helicopter parents have also injected a creepy incestuous vibe into family social life. And you know what they say about a woman scorned: that jealousy only seals the children off more tightly from the outside world.

The last time around, in the mid-to-late 1950s, the only way out for young people was to disobey their parents and hang out with each other in public against the parents' wishes. And it wasn't the end of the world.

Naturally with all those people out and about, most of them potential targets, the crime rate rose until just after folks started cocooning circa 1990. But we just have to take the good with the bad. The surest way to eliminate crime is to cut ourselves off from one another and hide away for good in private bunkers. The early '90s was as bad as crime got, and that wasn't the end of the world either.

April 22, 2014

Transcendence: A provocative character study, not a showdown between man and machine

After two prefatory posts on the wider context of responses to the movie (here and here), we can now get on with the actual review of Transcendence. This will be on the long side because I'll be exploring many of the ideas that the movie brings up, in addition to reviewing the movie itself.

There will be some plot spoilers, but they will help with the larger goal here — to reframe your expectations so that, if you decide to see the movie, you won't feel like it was a bait-and-switch, and can simply enjoy the movie for what it is. It is not an action-driven, galactic-stakes showdown between a mad scientist and the forces of humanity, but rather a human-scale character study of the central players and their motives that might push us over the brink toward a strange, untested technology and way of life.

Let's make it clear at the outset: Johnny Depp is not the star or protagonist, and was only billed that way to "open" the movie — to provide a sure thing that would draw in audiences on opening weekend (and that didn't work very well).

From the outset he is shown to be a man of inaction, who prefers to avoid the limelight and toil away on mathematical proofs that only three people in the world will ever read. When his wife ropes him into addressing an audience at a TED Talk-style fundraiser, he makes it clear that he finds it boring or beside the point to ponder the whole "how is this stuff going to be used?" side of things. He is a hardcore systematizer who only wants to understand how machines work, and how a sentient machine might work. It is pure research, not applications, that motivates him.

When he begins dying, it is not his idea to upload his consciousness to a computer, let alone to the internet. That was his wife's choice, once more, and he goes along with her plan, once more.

Thus it is Evelyn, the idealistic, starstruck, save-the-world wife, who is the film's protagonist. She is the one who prods her husband's project toward applications that will heal the world. She is the one who brings up the idea of uploading his consciousness to a computer; the one who blithely rationalizes away any objections to it (it's no different from uploading an mp3 file to your iPod); the one who forcefully pushes the plan forward, and who is the most vehement about the cyber-consciousness being "him" rather than him-plus-something-else or no-longer-him; the one who supervises and executes the plan to buy up a small town in order to build their underground headquarters and above-ground solar power array; the one who grapples with the rightness of her beliefs and the consequences of her actions; and the one who, after deciding that she has done wrong, volunteers to become infected with a computer virus so that she can pass it on to the cyber-consciousness and disable it, atoning for her sins.

And unlike her husband, Evelyn is portrayed as an emotional and ambitious creature throughout the movie. Y'know, the kind of person who makes the major choices that steer the direction of the narrative.

I was surprised and fascinated by this inversion of the standard tropes of the mad scientist and wet-blanket wife. It's not the monomaniacal mad scientist who's going to bring about the apocalypse, who's going to use it for world domination, and so on. And it's not his wife who will continually nag him away from his work and warn him against the dangers of melding man and machine. And it's not even the absentminded professor whose gizmo-obsessed short-sightedness will lead him right over the edge of the cliff and pull the rest of the world along with him. Nor will it occur as the culmination of a deliberate plan that has been in the works for some time.

Rather, a spur-of-the-moment decision will be made under pressure — either upload Will Caster's consciousness, or he dies for good in a few weeks. The scientist is just going along with what seems like the only plan that allows for his basic self-preservation, and is not doing so eagerly or as a stepping stone toward some larger self-aggrandizing goal. The person who comes up with the idea and advocates the most strongly for it will be an emotional creature with deep personal biases — she is desperate to find some way to keep her husband alive, both because she adores him as a husband and because his research holds the key to her ambition of healing the world.

Naturally, then, she will prove to be the greatest obstacle for the parties that want the cyber-consciousness shut down, who fear what it might do if left to its own whims and wielding such power. What they consider prudence would kill off not only her husband but all hope of realizing her heal-the-world ambitions.

We've seen such overly protective behavior before among female characters who have created a monster, but typically they are mothers who produce monstrous sons, yet who are still governed by Mama Bear protectiveness against the forces of good who want their sons dead. Now we get to see the other dark side of womanly devotion — covering for not just her husband, but a husband whom she has created. Undoing him would be more than unfaithful: it would be an admission that she made the wrong decisions during her creation of him.

Throughout the film, Rebecca Hall plays Evelyn sympathetically, rather than as a caricature of the devoted wife. This natural approach convinces the audience that any loving woman could find herself in her position, and makes the story all the more disturbing on reflection.

As for Evelyn's accomplice, her husband, many reviewers have complained about how flat and unemotional Depp's performance was. Like, what were they expecting for the character of an arch-computer geek — Boy George? Once it's clear that he's not a power-hungry, resentful, or malevolent mad scientist, but a recluse who just wants to understand machine consciousness (NERD!), you should not expect emotion. You ought to expect a flat delivery. Maybe they thought he should at least behave like an animated paranoid such as Ted Kaczynski, but that would be confusing him — the tunneling-away researcher — with the technophobic terrorist group that assassinates him.

The reviewers wanted someone more charismatic like Leonardo DiCaprio's character from Inception, but while that makes for greater drama, it takes away from plausibility. Nothing wrong with that if the tone is more what-if ("willing suspension of disbelief"), but when the tone is speculating on where current trends are taking us, it's better to favor what is plausible. And a charismatic computer nerd is not easy to swallow. In real life, it probably would be someone more like the nerd's emotional, ambitious, do-gooder wife who would make a snap decision to fuse man and machine, if it served her greater vision. The tunneling researcher has no grand vision — he just wants to be left alone to tinker with his ideas.

The husband's flat monotone also makes for a more interesting approach to the narrative of man transforming into machine. Like, what if he's 90% robotic already? And what if the rest of society is still about 80% robotic itself, more comfortable plugging their brains into their digital online devices than taking part in human activities? We're not exactly crossing the Rubicon anymore. Would uploading our consciousness to a computer be like a frog that is slowly boiled alive? For folks who are as flat and monotone as we are today, it just might.

Ultimately the inactive husband redeems himself by choosing to upload the virus from his wife, who in doing so is atoning for her own sins. Up until the end, though, it is not clear how much of the cyber-consciousness is the original Will Caster and how much is the computer intelligence already installed on the machine. This is another reason why Depp's flat delivery works so well — if he had been emotional as a flesh-and-blood human being, it would have been obvious that the monotone cyber-consciousness was the machine rather than him. A flat delivery in both stages leaves it more ambiguous, keeps us guessing about the cyber-thingie's true nature, and leaves us with a more disturbed feeling from the uncertainty of it, lying in the "uncanny valley."

But choosing to bring about his own downfall is presumably something that only a human consciousness would do, proving that at least some of the original person was still in there the whole time. And true to his original personality, he does not plan out the computer virus idea and set about achieving his goal. He just goes along with what he believes is the wise plan of action thought up and advocated for by his emotional wife.

As far as I know, this vision of who the players will be, and what motives will drive them, is original in the heavily colonized niche of "when man and machine first become hybrid," at least among the examples that someone who isn't obsessed with the genre would be familiar with. It is a refreshing and stimulating approach that was unfortunately disguised in the ad campaign by the typical tropes about mad scientists and societal annihilation.

Reviewers should have kept a more open mind, though, once it was clear who the protagonist was and what her motives were, within the first 15-20 minutes of the movie. "Hoodwinked by yet another ad campaign — why do we continue to believe them?" should have been their response. The campaign was just meant to draw in audiences who want more of the same junk, rather than take a chance on a totally new approach to man-meets-machine. I don't mind if a smart and original set of ideas has to sneak in through a Trojan Horse ad campaign about evil scientists, if we couldn't enjoy it at all otherwise.

April 20, 2014

Can today's reviewers remain clear-headed when a movie frustrates their hardened expectations?

In our 24-hour news stream culture, critics and audiences alike seek out information about upcoming movies, months in advance. By the time a movie is released in theaters, their expectations are so hardened that any deviation will deal them a major blow of cognitive dissonance. And rather than adjust in a humble way — "Huh, this is very different from what I was expecting, but let's go with it" — they follow the standard human programming and belittle the movie instead.

It's not just that it has failed to live up to their expectations — that happens all the time, and those expectations might not have been terribly high in the first place. It's that it has turned out to be of a different nature than they had expected, whether they were deliberately misled by the ad campaign or they were overly eager to form preconceived notions of their own, to alleviate their OCD fear of uncertainty.

When the viewers construe a movie as a bait-and-switch scam, or a glossy apple with a slimy worm inside, or a Trojan horse, they will naturally feel disgust, immediately vomit the product back up, and warn others to stay far away from it. This reaction of disgust, which pans the movie in black-and-white terms, goes far beyond how they would respond if it had merely been disappointing or not-so-good.

But, just because a movie's ad campaign and industry buzz turned out to be misleading, doesn't mean you can't still enjoy it. In fact, that's what you ought to expect — that the packaging will try to appeal to the lowest common denominator, to maximize butts in seats. If you thoughtlessly accept the packaging devised by high-priced ad agencies and Hollywood publicists, then you are a naive fool. Especially if the campaign leads you to expect something mind-blowing — you know what they say about something that seems too good to be true.

I know — shame on the advertisers for framing the movie in a different tone or genre than it actually will be. Still, get used to devious advertising, and be open to being pleasantly surprised when it goes somewhere you weren't expecting. Otherwise you'll spazz out instead of enjoying something like Man of Steel (which I reviewed, along with the spazz-fest, here).

I don't think people felt such stinging disappointment at movie releases back before everyone developed OCD and the need for micro-forecasting, and before they became so trusting of the propaganda put out by faceless bureaucracies (whether corporate or governmental).*

All worth bearing in mind when you try to use reviews as a guide for what to see, or to inform your own expectations.

This has been another prefatory post to my review, hopefully up today, of Transcendence. Each time I sit down to write it, there's another layer of culture-smog that needs to be blown out of the room first. Perhaps there is more to say about the reaction to it, and what that reveals about the state of our culture, than about the movie itself (but I'll do that too).

* These abnormalities are symptoms of cocooning syndrome. People with zero social safety net are much more easily destabilized by small perturbations to their plans — there's little slack in the system when you're the only one in it. And if you are too creeped out by other people to interact with them, including your own white middle-class neighbors, then you look to a larger-scale authority to mediate and control your relationships with others. True, you feel more like a slave, but more importantly you don't have to interact with other people — cuh-reeeepy!

April 19, 2014

In going from director of photography to director, focus on action

When a cinematographer decides to try his hand at directing, his first film should probably not involve much narrative or conceptual complexity, given how heavily visual and visceral his training and experience have been.

During the late 1980s and early '90s, Jan de Bont brought style into the summer thriller genre by contrasting shadowy settings with bright, warm lights, usually from a neon or other artificial source. This choice made the chiaroscuro effect look and feel distinctly modern, showing that striking contrasts of light and dark are not the kind of thing that you could only see by carrying a torch through a cave, or lighting candles after sunset.

Go back and see how much more fascinating these movies look compared to the typical entries in their genre: Die Hard, Black Rain, Flatliners, The Hunt for Red October, and Basic Instinct.

When he took control as director in 1994, he chose a project whose source of drama can be summed up very simply: "Pop quiz, hotshot. There's a bomb on a bus. Once the bus goes 50 miles an hour, the bomb is armed. If it drops below 50, it blows up. What do you do? What do you do?"

Although the early scenes in Speed show de Bont's look and feel (see the top shot below), style alone cannot support an entire movie, and after those initial scenes there is little emphasis on striking lighting, color, shallow focus, and shot angles, except for the very end (see the bottom shot).


However, the focus does not shift toward dialog, concept, and character development. It sticks with the visual and visceral, with cookie-cutter character types, but in a way that can sustain our interest — creating a sense of panic and menace, and throwing obstacles in the way of the protagonist's attempts to regain control of the situation. That way we feel cathartic relief when he finally succeeds in escaping from the villain's trap and rescuing the hostages.

So, was de Bont successful at directing an edge-of-your-seat thriller flick? No doubt about it: my friends and I must have returned at least a dozen times to see it over the summer of '94. If there was ever a lull or uncertainty in the day's plans — "Wanna go see Speed again?" I watched it on DVD last summer and agreed with my teenage self, a rare exception when I re-evaluate the pop culture of my adolescence.

The point here is not to survey the successes and failures of every DP who has taken his turn at directing, but to detail a single relevant example from the not-too-distant past.

The relevance for today is the release of Transcendence, the directorial debut of Wally Pfister, whose eye has given the popular thrillers of Christopher Nolan their own striking chiaroscuro cinematography. I thought it was a provocative character study of the major players who will usher us into the techno-apocalypse, while bringing up a bunch of intriguing ideas about how things might unfold if a person's consciousness were uploaded to a computer and then to the entire internet. But it was not the gripping, edge-of-your-seat thriller that the ad campaign had led us to expect.

His efforts to explore characters and concepts proved much more successful than I had expected, given his background as a DP rather than as a screenwriter (a role that better prepares one for directing). Still, while they are rich enough to carry the movie, it lacks the heart-pounding action that allowed Speed to pull in the audience almost entirely without dialog and character arcs.

Pfister took a huge risk by shifting focus away from his comfort zone and toward the narrative, and he exceeded my expectations. But I wish he had taken a safer project, driven by action, for his first tour in the director's chair, and come around to a conceptual sci-fi narrative after becoming more comfortable with the director's role. Such a difficult change of roles should not also have to unfold in unfamiliar territory.

Critics are heavily panning Transcendence, for reasons I don't get — perhaps it wasn't the nerd-gasm they were hoping for, or it came as such a downer in the wake of Her, which reassured the critic nerd that it was cool to fantasize about using your iPhone as a sex aid, and that there was even something cheerful about the whole affair. Yeah sure, it was no Videodrome, but the 19% rating at Rotten Tomatoes is childish and shameful for the reviewers. It's like scrawling UGLY BITCH on a girl's locker just because the date wasn't as orgasmic as you'd imagined it would be.

I'll put up a review sometime later today to try to correct the major misunderstandings I've read. But there is a grain of truth in what the tone-deaf critics are whining about, and I figured an introductory post was worth it to explore why the movie was not as successful as other attempts by cinematographers turned directors.

April 18, 2014

Would the needy turn down your obsolete computer from 10 years ago?

I thought about donating the old Gateway tower that I found languishing in the basement, since it still runs fine. It's running Windows XP smoothly on an 800 MHz Pentium III processor, a 20 GB hard drive, and 256 MB of RAM (and another 128 MB can be bought on eBay for a whopping $3, shipped). It has a fast CD-ROM, 3.5" floppy drive, Zip drive, 2 USB ports, and ports for modem and ethernet.

Not only does it do everything that a normal person would need, it is backwards compatible to make use of old things that may still be lying around the house, like floppy disks.

And yet such a system would be rejected by all computer donation centers, who preen for their do-gooder audience about how they're keeping bulky electronics from choking up our landfills, helping out others in the community who can't afford or otherwise don't have access to computers, and so on and so forth.

Why? Because their vision of "bridging the digital divide" means giving the needy a set-up that's within striking distance of the computers that the well-to-do use. It doesn't mean giving them something that meets their needs for free. After all, on the Gateway system from 2000, how are the poor supposed to stream pornography in HD? Or the all-important function of hardcore gaming? Giving them a system like ours would only perpetuate the inequality gap in cyber-distraction.

The first hit I got for middle to upper-middle class Montgomery County, MD, was Phoenix Computers -- "reclaim, refurbish, reuse." According to their "what donations can we use?" page, your computer will probably be rejected as obsolete if it's not running at least a 2.0 GHz Pentium 4 processor, and will instead be harvested for parts. Talk about greedy.

Even the T40 Thinkpad that I'm using to type, upload, edit, and comment on this post would get rejected — only 1.5 GHz and a variant of the Pentium III chip. Yet somehow I've pulled datasets off the internet, analyzed them in Python, drawn graphs of the results in R, and made PowerPoint talks to present them to others, carrying these on a USB flash drive. And garden-variety web-surfing, of course. But, y'know, the computer experts are right — this thing doesn't provide surround sound for when I'm watching cat videos, so it must be time to just throw this wad of junk in the trash.
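For the curious, that kind of workflow is nothing exotic. Here is a minimal sketch of it in Python; the URL and the "value" column are placeholders, not a real dataset I've used.

import csv
import urllib.request

URL = "http://example.com/data.csv"  # hypothetical dataset location

# Pull the dataset off the internet and parse it as CSV.
with urllib.request.urlopen(URL) as resp:
    rows = list(csv.DictReader(resp.read().decode("utf-8").splitlines()))

# Summarize one numeric column (assumes the file has a "value" column).
values = [float(r["value"]) for r in rows]
print(len(values), "records, mean =", sum(values) / len(values))

A few megabytes of text and some arithmetic: nothing that a 1.5 GHz machine can't chew through.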

Was it just the greedy do-gooders in Montgomery County who took such a wasteful approach toward conservation? Here is an all-purpose page with suggestions for those thinking about donating their old computers, and they are only a little bit more forgiving, explaining that only those no more than 5-10 years old are going to make the cut. But under these more generous guidelines, that Gateway that's running XP without a hitch would still get sent to the scrap yard. Not flashy enough for the discriminating tastes of 21st-century proles.

So, for all their talk about frugality and stewardship, these donation and recycling centers behave more as though they were the producers of Pimp My 'Puter for MTV. Aw yeah, son, Black Friday's coming early this year!

Zooming out to the big picture, this entitled mindset among the lower 75% of society is an overlooked cause of how fucked up our economy is becoming. In the original Gilded Age, wasteful status-striving was only an option for the one-percenters. But now that we have democratized access to debt, along with state-mediated schemes like Cash for Clunkers and Obama Phone, everybody can whine like an entitled little bitch their entire lives, and bury themselves under debt in order to play the status-striving game. Hand-me-downs are for losers, and everyone must be a winner.

Today's economic explosions will be far more severe, on account of how broadly the attitude of wastefulness has infected our society. And so-called donation programs that feed this petty sense of entitlement are only going to make things worse.

April 17, 2014

Is anyone holding onto their digital pictures?

While poking around the dank storage area of our basement back home, I found a nearly 15-year-old tower computer lying ignominiously on its side on one of the shelves. It had been sitting there since 2010 and had scarcely been used since around 2008. When I later opened it up to clean it out, there was thick dust covering all top surfaces and a good amount of the cables. It was a small miracle that, when I first tried to power it on, it took only a little coaxing.

While cleaning out lots of old files to free up some hard drive space, I came across what must have been hundreds of image files stored across a few dozen folders. This computer had been my brother's during college, and went back into general family use during the mid-to-late 2000s. So there's a good range of who and what is pictured: my brother's social circle at college, family vacations, holidays, and so on and so forth. Not to mention a good number of old photographs that had been digitally scanned.

Nothing mind-blowing, but isn't that normal for family pictures? It's not as though any single picture would've been a major loss, but family pictures aren't trying to make the individual shot excellent; they're trying to record what our experiences were.

Needless to say, if I hadn't taken an interest in restoring and preserving this dusty old thing, it and those hundreds of pictures would've gone straight into the landfill. I backed up the pictures onto a flash drive just in case the hard drive craps out, and it struck me that we had to buy a new drive for this purpose. There wasn't a master drive that we had been loading digital images onto all along.
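For anyone facing the same rescue job, here is a minimal sketch in Python of the copying step. The source and destination paths and the extension list are assumptions; point them at wherever the old photo folders live and wherever the flash drive mounts.

import shutil
from pathlib import Path

SRC = Path("C:/old_gateway_files")  # hypothetical: where the photo folders live
DST = Path("E:/photo_backup")       # hypothetical: the mounted flash drive
EXTS = {".jpg", ".jpeg", ".png", ".gif", ".bmp", ".tif", ".tiff"}

# Walk the old drive and copy every image file over, keeping the folder structure.
for f in SRC.rglob("*"):
    if f.suffix.lower() in EXTS:
        target = DST / f.relative_to(SRC)
        target.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(f, target)  # copy2 preserves the original timestamps

Preserving the timestamps matters, since they're the closest thing a digital snapshot has to a date scrawled on the back of a print.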

Perhaps other families are more OCD than ours (that would not be too hard), but I suspect that most people are not moving their old picture archives from one main "device" to another. And given how quickly the treadmill of planned obsolescence is running, they're not going to have much time to get to know the pictures that are confined to a particular device before cycling on to the next one.

Photographs are the exact opposite. In our cedar closet, we still have album after album full of film prints, some of them going back to the early 20th century. Photo albums were never housed in a larger piece of technology, let alone one that was subject to such rapid turnover. So it hasn't been hard to keep those archives separate from all the other stuff that comes and goes in a household.

And although I've written about how bland, forgettable, and crummy digital pictures look compared to film, their quick sentence to oblivion seems to have more to do with digital storage media than with digital image capture. If you took pictures with a digital camera but printed them up, they probably wound up in a photo album with the others. Whereas if you took pictures with a film camera but told the photo lab to upload scanned image files onto a flash drive instead of making prints, you'll lose them before 10 years is up.

It may seem odd that digital images are vanishing so easily — although less tangible than photographs, they are still housed on a physical machine. But those machines get more or less thrown away every several years these days, and even the ones that are donated have their hard drives automatically wiped clean before being passed along.

Forgettable images on disposable machines — how would this world ever go on without progress?

April 14, 2014

White flight and ethnic differences in cocooning

The main cause of white flight is no mystery — the higher rates of violent and property crimes among the groups that they seek further distance from. Hence, in this period of still-falling crime rates, whites have partially reversed their flight from the urban core, and have reconquered large parts of major American cities that would have been no-go zones for most folks back in the early 1990s.

This influx of affluent whites has pushed the earlier black and Hispanic groups out into the suburbs. How are white suburbanites responding, if they aren't part of the movement back into cities? They're moving further and further out into the exurbs.

But why? It's not like the earlier white flight, given how low crime is these days. Sure, it's still higher among their new neighbors, and moving further out would bring them into lower-crime areas. But it's not the huge gain that it would've been 30-40 years ago. Particularly if the newcomers are Mexicans, whose crime rates are not even at black levels to begin with.

What about the newcomers' kids fucking up the local schools? That would certainly worry the parents of school-aged children, but would probably not show up on the radar of everyone else. It should only be the former group moving out, whereas it seems like everybody is packing up.

Fleeing diversity and its fragmenting effects on trust, in search of greater similarity and cohesion? That's more like it, but why is it the resident whites who are leaving, rather than the non-white newcomers? Is the white suburbanites' homogeneity that easy to perturb toward disintegration?

You wouldn't see that in the other direction, with suburban whites invading and driving out suburban Mexicans — other than by paying more money for rents, goods, and services. The Mexicans aren't going to move out just because they sense a creeping up-tick in the level of diversity with all these white newcomers. They're going to make it known that they aren't going anywhere, and that whites aren't welcome.

And not necessarily in a violent or confrontational way. All it takes is them refusing to learn or speak English, refusing to participate in mainstream American culture, refusing to display American cultural symbols, etc. And then proudly displaying their own.

Whites are a lot more wimpy about asserting their ethnic or cultural identity. I don't mean in some cosplay Nazi style, but in something as simple as playing rock music out of their car windows or on their front porch, walking around the neighborhood, and hanging out in its public spaces — particularly in groups. That would give them a group-y physical presence that would not feel palpable if they kept to themselves indoors most of the time, occasionally going out for a jog alone or walking the dog alone.

To the outgoing go the spoils. If you look around a diverse area today, you'll see that Mexican parents are more willing to let their kids hang out on their own, and that the grown-ups are more likely to be hanging out in the public areas of a shopping center instead of pulling right up in front of their destination, darting in and out, and taking off back home. Even where they may be a numerical minority of private residents, their public physical presence can be much greater.

This line of thinking may also explain why some white suburbs haven't been so thoroughly affected as others. The introverted Scandinavian weenies in Minnesota and Wisconsin have gotten slammed a lot harder by all the black newcomers from the Chicago area over the past 20 years. They're just going to keep moving further and further out.

The working-class suburbs of Boston and New York (whose residents can't just out-bid the non-whites) have fared much better, compared to the rest of the country. You can always rely on the garrulous micks and wops, I mean the sociable Irish- and Italian-Americans, to stand their ground, not just as individuals defending their private property, but as whole groups coming together to put some pressure on unwelcome newcomers.

Public life is part of the pastoralist culture of honor that the capable defenders are adapted to, whereas retreating into the private domain of the family is more what large-scale agriculture selects for, outside of urban dwellers (where you let the culture of law deal with any problems that may arise). Naturally the Nords are going to have a tougher time driving outsiders back out, compared to the Celts.

April 13, 2014

Children of divorce, invisible to Wikipedia

I was just reading about Amy Adams on Wikipedia and found out that her parents divorced when she was 11, despite belonging to the Mormon church, which places a heavy emphasis on family togetherness (heavier than other religions).

While scrolling down to the bottom of the entry, I expected to see one of those offbeat Wikipedia category tags, like "Mormon children of divorce." Alas. But they didn't even have a generic category tag like "children of divorce," "father-absent people," etc.

They have a category for adoptees, something that most folks reading a person's biography would be interested in knowing. And something that could have shaped the way they turned out as adults. Adoption puts a happy face on the category of "unusual family structures," though. Things looked hopeless for the kid at first, but then they were rescued. Pointing out everybody who went through the opposite — things looked cheery at first, but then it all fell apart — would be a downer.

Wikipedia has all sorts of other category tags about a person's background and upbringing, from ethnicity to religion to being a military brat (like Amy Adams). That's the good kind of diversity — in the Wikipedian's mind, no ethnicity or religion or parental occupation is better than any other, so what's the harm in pointing it out? But whether both your parents were still together when you were going through adolescence... well, we don't have to air everybody's dirty laundry, do we?

And it is becoming "everybody" — see this earlier post about the still-rising rate of children growing up in broken homes.*

However, conveying how fucked-up the world is becoming, and pointing out who specifically has been hit, would go against the prime directive of the neutral point-of-view. You can read about it on a case-by-case basis, and assuming you soak up thousands of articles about living people, the pattern might strike you.

But there will be no larger picture that an abstract category tag could clue you in to at the outset. And no firm sense of there being this whole category of people out there, without a concise tag to reify them as a group. Some things were not meant to be understood, even (or especially) through The Free Encyclopedia.

* The trend for divorce is not the same, and has been declining since a peak circa 1980. But the rosier trend for divorce ignores whether or not there are any children involved, and married couples aren't pumping out babies like they used to. The divorce of a childless couple does not leave a broken home.

April 9, 2014

Planned obsolescence and conspicuous consumption

Stuff isn't built to last anymore, whether it's cruddy pressboard furniture from IKEA that will flake off into a pile of shavings within 10 to 20 years, or an iPhone whose speed has been crippled by an "upgrade" to the operating system that overwhelms it.

And yet, as the popularity of IKEA and Apple testifies, people these days not only don't mind how disposable their purchases are — they are eager to feel another endorphin rush from throwing out the old and buying up the new, like a woman who changes her hairdo every three months.

Sadly, you see the same treadmill consumerism at Walmart and Sears, where everyone wants to step up their game, upgrade their rig, etc. It's not just the elites who are effete. Working-class people today are not the honest, making-do folks from a John Cougar Mellencamp song. They act just as entitled, discontent, and in need of cheap junk from China as the rest of the social pyramid.

So, upper or lower class, Americans today don't give a shit if their stuff is unusable in five or ten years. Indeed, that's the way they like it.

Usually "planned obsolescence" is talked about as a supply-side thing, with the producers scheming to trick us into buying things that will be no good before long. But consumers notice that kind of thing, and by now that awareness is so widespread that all you have to do is say "not built to last," and everyone nods along. Notice: they do not stop buying all this shoddy crap, rather they grudgingly accept the nuisance as the small price they have to pay to enjoy the larger benefit of stepping up their game and upgrading their rig, feeling that heady rush more frequently.

This is a sorely under-appreciated source of how crappy things are today. You don't want to think of your fellow Americans as feeding it through their own self-absorbed consumer behavior, and would rather pin it all on the greedy stockholders, managers, marketers, and so on. But those guys can't sell an entire nation what it doesn't want to buy. Same with immigration — you can blame large-scale farm-owners, but what about the folks you know who use a housecleaning service, lawncare / landscaping service, or construction service that's almost certainly employing cheap illegal labor?

The Arts and Crafts movement took root in the late Victorian period, as status-striving and inequality were rising toward their peak of the WWI years. The standard story is similar to today's, where the shoddiness of stuff at the time was blamed on mass production techniques introduced by the Industrial Revolution — something on the supply side, at any rate. Given today's parallels, I'm more inclined to blame airheaded Victorian strivers for spreading the throwaway mindset. Only with such a docile consumer base could the industrialists flood the market with cheap junk.

At the other end, it's striking how sturdy and long-lasting stuff is from the nadir of status-striving and inequality during the 1960s and '70s. Especially for mature industries that can be fairly compared across long stretches of time — like furniture. Those Archie Bunker chairs, cherry dressers, and "granny square" afghan blankets are still in wide circulation at thrift stores, and always have been. The "thrift store look" from today is about the same as it was 20 years ago.

For some reason, IKEA futons and plastic rolling cabinets from the Container Store are not making their way in, and likely never will. Nobody has any use for that, and it's going straight in the trash.

April 8, 2014

Displays for digital distraction: Larger monitors, multiple monitors

Back when computers were machines for doing something productive or creative, their monitors were small and compact, whether in a business or home setting. They could have been made much larger, since the same CRT technology was used for medium and large-sized TV sets. But the larger screens were reserved for TV, which was for entertainment, and were of little value when it was time to type up a report or crunch some numbers on a spreadsheet.

Here is the original IBM PC, with an 11.5" monitor (and only about 10.5" being viewable), and the original all-in-one Macintosh, with an even smaller 9" monitor (rather prophetic copy in the ad, don'tcha think?):



How did Primitive Man ever get any work done with such crude tools, unable to read emails in HD? How could they have focused on the memo they were typing, without the word processor overwhelming their eyes on a 30" screen? Really makes you grateful for not having to live in the dark ages anymore...

Monitors stayed small for well over a decade, and did not begin to bloat out and take up most of the desk until the mid-to-late 1990s, as they came to be used more for entertainment — first for graphics-intensive video games, then for watching illegally downloaded movies and TV episodes, then for illegally downloaded porn, and then web pages in general once they became more dazzling than black Times New Roman text against a gray background.

The increasingly common widescreen aspect ratio is another sign of computers as entertainment centers.

But one monitor can only get so big — eventually you're going to have to step up your game and upgrade your rig to include two monitors. Awww yeah, son: getting interrupted by emails on one screen, while refreshing TMZ on the other. Creating PowerPoints LIKE A BAWS. Nobody can hold me down now, I'm a productivity beast!

The multi-monitor set-up is nothing more than conspicuous consumption for nerds. In fact, if two are better than one, why stop there? Hook up ten. (Brought to you by the makers of five-blade razors.) Better yet — swiveling around at the center of a 360-degree ring-o'-monitors. And hey, the sky's the limit: install a planetarium ceiling and crane your neck up toward your Display Dome. Endless workspace, endless possibilities!

Forget the fact that, in the old days of a real desktop, they did not bother extending desks out to 10 feet long in a lame attempt to maximize productivity. Having too many separate sub-areas of the desktop makes it hard to focus on the one at hand. About the only task that truly benefits from two separate areas visible at the same time is manually copying a target document onto a blank one, analogous to dubbing cassettes. Otherwise, the world churned right along — and saw greater productivity gains over time — with just one central work area on their desks.

Something similar is going on with the phenomenon of "twenty tabs open at a time," as though keeping twenty books open on a real desktop would somehow make you absorb information more efficiently. Or as though playing twenty TV channels simultaneously would make for greater entertainment. In Back to the Future II, that was presented as satire; today it has become the unremarkable reality.

Undoing or not participating in the multi-monitor trend is fairly simple. Finding a monitor under 17" is a lot tougher. Buying them new is out, and even thrift stores will not accept them, so don't bother there. Craigslist and eBay are the best places, although since most of them are CRT (better than LCD in any case) the shipping will not be cheap. Still, it's a small price to pay for something that will last forever and prevent digital distractions from taking over your desktop.

April 6, 2014

A weakening power of helicopter parenting? Another round of little Boomers?

While traveling a bit lately, I've been observing the otherwise invisible nuclear families of today, now that they have to leave their lock-down compounds to go to the airport, or leave their hotel room to grab breakfast.

It's probably too early to tell, but I'm getting a hunch about how small children these days are, or are not, going to internalize the paranoia of their helicopter parents. These are children in early elementary school or younger.

When helicopter parenting paranoia began with new births in the late '80s, there was plenty for parents to be concerned about (which doesn't excuse their over-reaction). Violent and property crime rates were nearing their peak, and for the previous several decades, it had seemed like the world would only continue on in that direction.

Hence, when the parents sealed off their nuclear family from the outside world and began ordering their kids not to do this and not to do that, there was an honest sense of concern coming through in their voice and mannerisms (however overblown this concern may have been). Moreover, this was the only message about the outside world that the parents allowed to get through to their children — primarily by shutting out all other sources of input, but also by choosing only those external sources that would provide a consistent paranoid message to their little dears. "Parental control."

These children, the Millennials, have grown up to be the most frightened and insecure generation in living memory — how else could they have turned out? Everybody who offered them input, or who their parents allowed them to observe, sent the message that the world is too scary and random to venture out into on your own. And their tone of voice was consistently frightened for your safety, not as though they were just making shit up or just trying to spoil your fun. I guess you might as well hunker down in your room and interact with others at most through virtual channels (texting, internet, online video games, etc.).

Now, what's going to happen when these people become parents? They don't have any first-hand experience with real life, let alone the dangerous, topsy-turvy, and humbling parts of it, let alone decades of such experience. When they try to pass on the message of how scary the world is, it will start to ring hollow. Kids aren't stupid, and they can tell what your tone of voice and mannerisms reveal, aside from whatever you claimed in words. At the least, they can tell when you're being sincere and honest, or when you're joking around and teasing them.

Can children also sense which grown-ups have more experience, and which are more naive? If so, they'd react to the dire warnings of their Millennial parents with, "Yeah, and how would you know, you big wuss?" Whereas if they sensed the parent was more seasoned, they'd take it to heart — "Damn, even this been-there, done-that kind of grown-up sounds scared. It must really be dangerous."

However, when the child steps over the paranoid "do not cross" line, Millennial parents sound more annoyed and upset than concerned and afraid. This may also be going on with later Gen X parents, who are more experienced, but whose memories of how tumultuous the world can be are fading more and more every year. It's harder and harder for 1992 to exert the kind of gut influence that would shake up the parents, when the world has gotten as safe, stale, and antiseptic as it has by 2014.

Thus little kids today are not going to take parental paranoia to heart like the Millennials did. It's just the mean old grown-ups trying to boss us around and spoil our fun, not looking out for our greater long-term welfare. By the time they're in high school, cocooning will have bottomed out, and they'll be able to enjoy a more unsupervised adolescence. And given how low the crime rate will be by then, they'll conclude that all their parents' warnings were either clueless and out-of-touch, or knowingly wrong and intended to shelter them from real life. See, nothing to worry about in venturing off into unsupervised places!

What was the last generation that had this naive attitude toward breaking free from parents, whom they callously dismissed as either out-of-touch or hypocrites? Yep, we're about to see the rebirth of the Baby Boomers, whose defining impulse is calling the bluff of authority figures.*

It's odd how small children these days are more annoying in public places, running around and making noise, despite their helicopter parents trying to make them behave. When the Millennials were that age, they were either not to be seen at all, or were frozen in place. Today's rugrats and ankle-biters seem more appropriate for the 1950s (see any Saturday Evening Post cover showing their frazzled parents, or a crowd of kids running around the house at a birthday party on Mad Men).

We're not into that late '50s environment yet, but you can sense things creeping up to that turning point. For now, we're still waiting for Elvis.

* Gen X has a similar but more practical and mature attitude. Breaking free from parents is good, but you do have to be cautious out there. Yes, parents are out-of-touch, but that's to be expected given what a different world they grew up in. Authority should not be blindly followed, but blithely romping around calling the bluff of every older person who offers you advice, especially if it comes from the wisdom of tradition, is likely to get you killed.

April 4, 2014

Adventures in vinyl-hunting for practical reasons, part 2

In an earlier post I confessed a major sin: I have always bought vinyl records for practical rather than aesthetic or Romantic reasons. Namely, there's a lot out there that is not easily available on CD — finding it is hard or expensive.

I dropped by a Half Price Books today to comb through their rock/pop records, and found five things I've never seen in real life on CD, for under $15 including tax. You can't go wrong for under three bucks a pop.

1. Scandal's self-titled debut EP, with "Goodbye to You." The EP itself was never released on CD. Two songs are on a Patty Smyth compilation that I have on CD, and VH1 did release all of them on a CD compilation — though given the date, I'm sure it was a victim of the loudness wars and had all the dynamic range compressed out of it. In fact, the Patty Smyth compilation sounds a bit hollow itself, and it's from the mid-'90s IIRC.

2. Missing Persons' album Spring Session M, with "Words," "Destination Unknown," and "Walking in L.A." It was released on CD, but I've never seen it in at least five years of regular combing of used record/CD stores, nor could they order it from their distributor. I'm guessing the CD is out of print, as it's going for $40-$50 on eBay.

3. A Flock of Seagulls' self-titled debut album, with "I Ran" and "Space Age Love Song." It was released on CD, but I've never seen it, and on eBay it's $10-$20 used. The LP is going for about that as well, so I got an even better deal today.

4. The Motels' album All Four One, with "Only the Lonely" and one of the coolest album covers ever. As with #3, it was released on CD, though I've never seen it, and is going for $10-$20 used on eBay.

5. After the Snow by Modern English, with "I Melt with You." On eBay there are CDs going from $5-$10, which isn't bad, but they also say that it's rare. Short supply doesn't always translate into high prices if there's little demand for it. Can that be, though? It's only got one of the most iconic, catchy, uplifting songs of all time on it! Up until now, I'd been listening to it on the CD soundtrack for Valley Girl.

Come to think of it, I don't believe I've seen any CDs by these bands in the wild, not even a greatest hits, although most of their hits can fairly easily be found on multi-group compilation discs. (I might have seen something by the Motels, but can't recall.)

They had a copy of the first two Duran Duran albums, the self-titled one and Rio. I'm sure I'll continue to like those albums more than any of the five that I bought today, but I still passed on them. I've already got them on pre-'90s CDs that sound great (not guaranteed to be in a used CD rack, but not hard to find either). I'll eventually get around to buying duplicates on vinyl just for the different sound, but there are still so many more records to hunt down of things that are damn near impossible to get on CD.

I also passed on some more pop-y choices that I probably shouldn't have, and may have to return to pick up tomorrow: Belinda Carlisle's debut album Belinda, with "Mad About You," and Jane Wiedlin's album Fur, with "Rush Hour." Two or three songs from Belinda are on a greatest hits of hers that I've already got, but I've never seen anything by Jane Wiedlin on CD, not even her one hit on a compilation.

Usually the discussion of vinyl centers on music from the '60s and '70s, before CDs were available. But you'd be surprised how much killer '80s music is hard to find outside of the record crates, at least the entire original album rather than the hits on a compilation.

If it was major enough, they probably released it on CD — perhaps following up with multiple re-issues if it's that important (like Graceland by Paul Simon). But if the band was remembered as a one-hit wonder, then probably not. The heyday of New Wave from '82 to '84 came just as CDs were being introduced, so it was far more likely to have been released on vinyl and perhaps cassette. That's the place to look for it, then.

April 3, 2014

Cosplay in the Fifties: The Davy Crockett craze

The music video for "Come As You Are" by Peter Wolf portrays exuberant everyday life in a small town during the late 1950s, and by the end the whole town gathers together in a crowd as though they'd all caught dance fever from the future.

One of the key examples of "iconic Fifties life" that the video brings out is the 10-year-old kid who's all dressed up in his Davy Crockett gear. Not because he just came from a school play about frontier life, but because that's just what boys were into at the time. In fact, the Crockett craze was part of a broader mania for all things Western during the late '50s, among both children and adults.





How similar was that phenomenon to the trend of cosplay in the Millennial era? It was a full costume of the icon — clothing, hat, and prop gun — rather than a t-shirt / backpack / lunchbox with the icon's image on it. The icon was a specific character, Davy Crockett, rather than a generic type like "frontiersman." The children were role-playing as the character, rather than dressing up that way while behaving normally. And it was a mass phenomenon mediated by the mass media, rather than a strictly grassroots sub-culture or counter-culture.

Daily life in the Mid-century was relatively unexciting, especially for children during the heyday of Dr. Spock, smothering mothers, Levittown subdivisions, and drive-ins where the customers physically cut themselves off from each other. Role-playing as a figure from the Wild West gave them an escape from the cocooning culture's insistence that they park themselves in front of the TV set rather than wander around the neighborhood, clip their fingernails and brush their teeth in the proper way, and remember to drink their Ovaltine.

At the same time, competitiveness was low and falling circa 1960, so kids back then did not make costume competitions and one-upmanship part of the Crockett craze. And unlike now, there was no pseudo-slut counterpart among girls — attention-whoring being another aspect of the competitiveness of today's culture. It was also restricted to kids who were about to go through puberty, rather than adolescents and young adults.

It seems then that cocooning is what brings this trend into being, and that high or low levels of competitiveness only shape its expression.

I don't remember anything like the Crockett craze during the nadir of cocooning in the '80s. There are pictures floating around of D&D-themed cosplay in the late '70s and '80s, but that must've been damn rare. There are only a handful of gatherings pictured, as opposed to the endless examples you can find of kids in Davy Crockett get-up or of contempo cosplayers.

At the height of Hulkamania, we might have worn a t-shirt like Hulk Hogan's, or played with ninja weapons during the Ninja Turtles craze, but we never got fully dressed up in character for role-playing, let alone dressed that way in an ordinary setting like hanging around the house. Everyday life had enough excitement back in more outgoing and rising-crime times that it wasn't necessary to pretend that you were part of a Wild West culture.

April 1, 2014

The decline of pop music parody when the songs and videos have become forgettable

VH1 Classic just played "Synchronicity II" by The Police. I'd never seen the video before, but instantly recognized it as a quotation of the iconic video for "Dancing with Myself" by Billy Idol, whose personal image Sting imitates.

Seems like it's been a long time since a well-known group made a video that quoted another video. This brief review confirms the hunch. In 2011, All Time Low made a video spoofing Katy Perry and other major stars du jour, but they were not a very popular group (never appearing on Billboard's Year-End charts). The last clear example was "All the Small Things" by Blink-182 way back in 2000, and even that was a blip, as there were no others from the '92-and-after period. (Needless to say, Chris Rock did not appear on the charts, and I don't recall seeing the video for "Champagne" back then.)

After "Synchronicity II" in '83, David Lee Roth made a more obvious and slapstick parody video for "Just a Gigolo" in '85, the same year that Phil Collins aimed for parody in the video for "Don't Lose My Number." In '91 with Genesis, he included a Michael Jackson parody in the video for "I Can't Dance." There are probably some others out there that I can't think of immediately, and that the writer of the review missed, but that's enough to establish the basic picture.

Those four acts were all major successes at the time, all appearing on the Year-End charts across multiple years, hence the parody videos would've enjoyed high visibility.

So why the decline since the '90s? Well, allusions like these rely on the audience having already been exposed to the original and stored it in memory. Steadily over the past 20-odd years, less and less of the target audience has reliably tuned into music videos -- and music in general. Moreover, music videos have gotten a lot more bland and featureless -- and who's going to remember a forgettable video when it's referred to in a later parody?

By "forgettable," I don't mean "inferior" in a subjective way, but objectively lacking in identifying features -- things that help you explain them to others to jog their memory. Simple memory tests would show how easy or difficult a video is to recall. "Y'know, that video where the glamor models are playing the instruments and the singer is wearing a shirt and tie?" (Actually, he made three videos like that -- the one where the models are wearing pink, or wearing black? And is he wearing a suit or just a shirt?) Try doing that for recent videos -- "Y'know, that one where those rappers are boasting around a pool and a bunch of big booty hoes are shaking their asses..." Oh right, that one.

I think this also explains why parody songs have died off since the heyday of Weird Al Yankovic in the '80s and early '90s. You can't get the listener to recognize the target song if it has no memorable riffs, melody, solo, or distinctive vocal delivery. It can only end up sounding like a parody of the whole genre, whose performers are all interchangeable (emo, crunk, etc.). That would also have been difficult in the '50s and early '60s, when most pop music sounded so similar and lacking in identity.

Thus, the peak of pop parody during the '80s and early '90s suggests a peak in those years for the distinctiveness of both the music and the videos of hit songs.

Related: Ornament is meant to make things more memorable

TV audiences in love with annoying children, the second wave

Here is a New York Post article reviewing some of the many annoying child and teenage actors on hit TV shows right now. I haven't seen any of them, so I'm not sure how central they are, but it sounds like they're at least regular cast members, some of them at the core.

If there are so many of them littered throughout popular shows, the general audience must appreciate them, notwithstanding a vocal minority. And even if the general audience doesn't care for a particular character, that's more a failed attempt at creating the "prominent child character" the audience craves than a sign that viewers don't want kids on screen. In today's climate of nuclear family-centric cocooning, viewers just can't get enough of watching children.

However, this isn't the only time when you would have been assaulted by annoying kids when turning on the TV. During the previous heyday of permissive parenting in the Mid-century, one of the most popular shows was Dennis the Menace, starring a more sympathetic but still off-putting Baby Boomer brat who couldn't act. The less popular yet more iconic show Leave It To Beaver starred an even more annoying kid who couldn't act.

Circa 1960, status-striving and dog-eat-dog competitiveness were nearing a low point, so at least those earlier examples would not have made you angry with a smug, dismissive attitude. Still, they were children who couldn't act, they were central characters, and they were going to get on your nerves for being so dorky, bratty, and wussy.

Since Mid-century cocooning had all but melted away by the 1980s, it was damn unlikely that you were going to suffer annoying children on television. Here is a site that lists the top 30 shows in the Nielsen ratings for each year, which you can browse if you're unfamiliar with them.

For the early and middle parts of the decade, there was nary a child to be seen, let alone a central character whose immaturity and inability to act would have made you change the channel. There was that blonde daughter from Family Ties who was a mopey sourpuss, but she was mostly out of the picture before she got to high school. I'm sure there are other marginal examples like that, but none in which the kid was one of the main characters.

What's striking about the hit shows back then is how grown-up everyone is. Dallas, Magnum P.I., Cheers, Miami Vice. Viewers then were so maturity-minded that they put The Golden Girls, a show about four senior citizens, into the upper layer of ratings. Directly related to that is how unrelated most of the characters are — annoying kids tend to crop up in shows that focus on families.

Toward the end of the '80s, as cocooning was about to set in, there were still only a couple of annoying kids on TV — the son Jonathan in Who's the Boss? and Rudy from The Cosby Show.

Then as the family-centered shows of the '90s rode the wave of helicopter parenting and cocooning, we got a deluge of annoying kids. Darlene and DJ from Roseanne, the little boy from Family Matters, the Olsen twins and Stephanie from Full House, the blond nerdlinger from Step by Step, and all of the kids from Home Improvement.

Everybody's favorite show about nothing, Seinfeld, was a holdout in this regard, and in hindsight that was a key factor in making the show so enjoyable — no fucking kids. Not even teenagers. It was more of an ironic and self-aware incarnation of Cheers adapted for the postmodern Nineties.

These changes back and forth are the result of folks being more community-minded and public-oriented vs. more family-minded and home-oriented. The community and public places are not made up of children, but adults (mostly). Only when people start locking themselves indoors do they dwell on the ankle-biters that are part of private, family life.