Thursday, November 20, 2014

Culture Shock 11.20.14: Bullying spoils science achievement

And then the shirt hit the fan....
The perpetually aggrieved are why we can't have nice things.

Last week, the European Space Agency did something incredible. It landed an unmanned spacecraft, launched more than 10 years ago, on a comet roughly 300 million miles away.

To give you a ballpark idea of how far that is, the average distance from Earth to the moon is 238,900 miles. From the Earth to Mars is 140 million miles. And from here to Jupiter is 484 million miles.

It was an amazing feat of science and engineering, and for a little while, those of us who weren't alive for the Apollo moon missions got just a little taste of what it must have been like to watch Neil Armstrong take his "one small step" into the history books.

It isn't likely a lot of people will recall where they were and what they were doing when they heard a probe had touched down on comet 67P/Churyumov-Gerasimenko. With a name like that, it isn't likely many people will remember 67P/Churyumov-Gerasimenko at all. But in the annals of space exploration, the Rosetta mission is a pretty big deal.

That's why it's so frustrating that all some people could talk about afterward was one rocket scientist's bowling shirt. Matt Taylor, the lead mission scientist, made the mistake of thinking he could wear a shirt, made by a friend and given to him for his birthday, during the live Internet feed for the Philae lander's touchdown. Little did he suspect that doing so would make him a target for people who search for reasons to be offended.

The shirt wasn't stereotypical "rocket scientist" gear. It was covered with cartoon images of sexy women holding guns and posing provocatively, the sort of thing you used to see painted on the side of vans. Suddenly, Taylor's accomplishment wasn't the story; his "sexist" shirt was.

The most shrill response came from The Verge, which went after Taylor and the ESA in an article headlined "I don't care if you landed a spacecraft on a comet, your shirt is sexist and ostracizing." ("Ostracizing"? Really?) The article, by Chris Plante and Arielle Duhaime-Ross, goes on to say, "This is the sort of casual misogyny that stops women from entering certain scientific fields."

That sentiment was echoed by other bloggers and on Twitter. It didn't matter that the friend who made the shirt, Elly Prizeman, is a woman, or that she took to her own blog to defend Taylor and thank him for the "sweet gesture" of wearing it on one of the most important days of his life.

When I was growing up in the 1970s, I was taught that women could do any job men could do, that women were just as tough as men, that they didn't need knights in shining armor to protect them and should be just as free with their sexuality as men have always been. That was feminism.

What passes for feminism now, however, says women are afraid to go into science-related fields because male co-workers might wear loud bowling shirts sewn by their female friends. It says those female friends are mindless dupes of the patriarchy. And because I'm male, this strain of feminism says I need to shut up, "check my privilege" and stop "mansplaining." I'm especially not supposed to have an opinion about what feminism is, nor cite any female scholars or writers who agree with me.

Bloomberg View columnist Virginia Postrel coined a wonderful term for this feminism of constant outrage: "link-bait feminism." Find any slight, real or imagined, no matter how small, and take offense. Blog about it, attach a sensational headline, and watch the outrage go viral. Rarely is the outrage even genuine.

This sort of cynical offense-stoking cheapens feminism and renders it increasingly irrelevant. Only 23 percent of U.S. women identify as feminists, according to a HuffPost/YouGov poll. The brand is tainted, because link-bait feminism isn't really about equal rights, equal pay or reproductive rights. It's about a right not to be offended, even if you're only pretending to be offended. That's not feminism; it's Puritanism.

The link-bait feminists won the battle. They bullied Taylor into a tearful apology. But with any luck, this is the overreach that will cost them the war, allowing feminism to become relevant again.

Thursday, November 13, 2014

Culture Shock 11.13.14: 'Twin Peaks' is happening again

I'll see you again in 25 years.
True to her word, it looks like Laura Palmer will indeed see FBI Agent Dale Cooper again in 25 years. Whether that's exactly what "Twin Peaks" creators David Lynch and Mark Frost had in mind back in 1990 is another matter.

Laura's prediction, in Agent Cooper's dream at the end of the first season's third episode, was just one of the many cryptic clues dropped during the show's brief, yet groundbreaking run.

When Lynch and Frost announced last month that "Twin Peaks" would return in 2016 for nine new episodes to air on Showtime, the news lit up the Internet like nothing short of announcing a new pope or a new "Star Wars" trilogy. As the giant told Agent Cooper, "It is happening again."

"Twin Peaks" returns to a television landscape radically changed from the one it shook up back in 1990 and '91. Back then, "Twin Peaks" was unique. Now, quirky shows with oddball characters and serialized storytelling that once was reserved for soap operas are virtually the norm.

"Twin Peaks" is in large part responsible for that.

When Lynch brought "Twin Peaks" to ABC in 1990, it was almost unheard of for respected filmmakers to slum in the television ghetto. Now that's common, too, and no one but the most obnoxious art-film snobs looks down on TV.

Yet that's not all that's changed about television in the intervening quarter century.

"Twin Peaks" is many things. It mixes comedy, melodrama, horror, science fiction, urban legends, bleeding-edge physics and Buddhist philosophy into the most compelling murder mystery since Jack the Ripper. "Who killed Laura Palmer?" was the second coming of "Who shot J.R.?"

Yet structurally, "Twin Peaks" is deceptively simple. It's a parody of soap operas, and not the nighttime variety such as "Dallas" and "Dynasty," which reigned over the 1980s, but the daytime kind.

Like virtually every daytime soap ever aired, "Twin Peaks" is set in a fictional town with a sordid underbelly. A logging/resort community near the Canadian border, Twin Peaks falls into the long tradition of made-up burghs, from Salem to Genoa City, that attract more than their share of drama.

The show's mood music, courtesy of composer Angelo Badalamenti, swells with overwrought intensity at every romantic interlude. And the multi-generational cast reflects the storytelling necessities of daytime soaps, which traditionally shift their plots toward younger cast members in the summer to take advantage of teens being out of school.

"It is happening again. It is happening again."
Beyond that, "Twin Peaks" trades in all of the familiar tropes of daytime drama.

Infidelity? Aplenty. Convenient cases of amnesia? Nadine coming out of a coma thinking she's a teenager and back in high school qualifies. Characters presumed dead who turn out to have faked their deaths? That runs in the Packard family. A shady businessman who owns most of the town and schemes to gobble up or destroy everything he doesn't? Well, of course. "Twin Peaks" even has that most soapy of soap opera tropes, the identical twin who appears out of the blue.

James, a lovesick teen with resting pout face, seems to have just arrived from daytime TV. He goes through soul mates faster than most men do TV channels. He starts as Laura's "secret boyfriend" and moves on to her best friend Donna before the corpse is cold. Next he's on to Laura's identical cousin, then back to Donna and finally into the arms of an older woman, who so obviously plans to Double Indemnity him that we realize it faster than James can say Barbara Stanwyck.

In case all that doesn't clue us in, almost everyone in Twin Peaks is hooked on a soap opera within the soap opera, the amusingly overwrought "Invitation to Love": a parody within a parody.

But can "Twin Peaks" return as the same soap parody it was? In 1990 ABC, CBS and NBC each aired between three and four hours of soap opera programming each weekday. Today they air 3½ hours total between them. The daytime soap is a dead format, wrapped in plastic.

How the soap opera's demise will play into what Lynch and Frost have planned is, for now, just another of their unsolved mysteries. It could even involve a long lost twin.

"Would you like to play with fire? Would you like to play with Bob?"

Thursday, November 06, 2014

Culture Shock 11.06.14: Book reveals Wonder Woman's secrets

"The Secret History of Wonder Woman"
The past year has seen a surprising surge in Wonder Woman scholarship, with about a half dozen books, at least, delving into one or another aspect of the character's rich history. And with Wonder Woman's 75th anniversary still two years away, more are in the offing.

The highest profile entry so far is Jill Lepore's "The Secret History of Wonder Woman" (Knopf, $29.95). Lepore, a Harvard history professor and New Yorker staff writer, offers what is the most detailed and compelling look to date at Wonder Woman's fascinating and controversial creator, William Moulton Marston, and the women who shaped his life, from his childhood until his death in 1947.

The basics are already common knowledge to anyone with an interest in Wonder Woman and access to Google. Marston, who wrote his Wonder Woman stories under the name Charles Moulton, was a Harvard-educated psychologist and author credited with helping develop the lie detector. In 1941, he turned to comic books as a way of reaching young readers, especially girls, with his vision of a strong, independent woman. And so Wonder Woman was born, appearing first in "All-Star Comics" No. 8 and immediately graduating to the main feature in "Sensation Comics" No. 1.

Marston was also a practicing polyamorist with an interest in bondage and discipline. He lived with both his wife, Elizabeth Holloway Marston, and assistant, Olive Byrne. He had children by both, and both were instrumental in Wonder Woman's origin. His Wonder Woman stories are filled with episodes of bondage and spankings. Wonder Woman courted controversy from the outset, but especially after Marston's death, when her adventures became Exhibit A in Fredric Wertham's brief against comics.

Lepore takes us beyond the Wikipedia summary of Marston by delving into his private letters and journals, which previously had been seen only by members of his family. The Marston that emerges is more con man than scholar, a shameless self-promoter never above exaggerating his accomplishments in a usually futile effort to get ahead. His lie detector experiments were more pseudoscience than science, and until he created Wonder Woman, his career trajectory was one of downward mobility. Each university post was less prestigious and secure than the last.

Jill Lepore
Photo by Dari Michele
His inability to advance in academia is what led him to popular entertainment as an outlet for his radical ideas, first in the movies and later in comics, aided by the opening provided by Byrne's brother, Jack Byrne, an editor for the pulp magazine publisher Fiction House, which published comics starring one of the most popular pre-Wonder Woman heroines, Sheena, Queen of the Jungle.

For all his failings as a scientist, Marston was a successful and manipulative showman, with a remarkable capacity to lure the media to his dog and pony demonstrations. He brought hucksterism to comics long before Stan Lee created his carnival barker persona to promote Marvel Comics.

The deceit that permeated Marston's professional life extended to his family life. The polyamorous relationship at the heart of the family would have scandalized Depression era society, and it appears to have included a fourth member, Marjorie Huntley, whom the Marstons met before they met Byrne. Secrecy was inevitable, but even the children, at Byrne's insistence, were kept in the dark.

Lepore also places Wonder Woman within the context of competing strains of feminism, finding in her the influence of both 19th century female supremacists and birth control activist Margaret Sanger, who founded what would become Planned Parenthood. Sanger was also Byrne's aunt.

Yet the Amazon heroine herself appears in Lepore's narrative mostly to show how Marston's ideas and home life informed his stories. Lepore's account falls short when she gets to the comics industry. She downplays other female comic characters, such as Sheena, and misses obvious connections. For instance, during a brief stint advising Universal Studios, Marston consulted on "The Man Who Laughs." That film's title character would directly inspire the Joker, but Lepore passes over the opportunity to even mention Marston's connection, however minor, to Batman's arch foe.

That aside, Lepore gives both fans and scholars a lot to digest. This is the start of a reexamination of Wonder Woman and her creator, not the last word.

Thursday, October 30, 2014

Culture Shock 10.30.14: '30s horror possessed of timeless glamour

Zita Johann, left, and Boris Karloff in 1932's "The Mummy."
One typically doesn't think of horror films as glamorous. Quite the opposite.

Glamour is enchantment. It appeals to our dreams and desires. It's something we want to escape to. Glamour is Louis Vuitton's "L’Invitation Au Voyage" ads, featuring model Arizona Muse, who in one ad dashes through the Louvre in Paris and hops aboard a hot air balloon, and in a subsequent ad lands her balloon in Venice just in time for an elegant fancy dress ball hosted by a regal David Bowie.

Horror is anti-glamour. It's something we seek to escape from, whether it's something as subtle as a prickling sensation of foreboding or something as blatant as a machete-swinging maniac.

When horror becomes glamorous, it ceases to be horror. The "Twilight" films are soap opera fantasies that happen to involve vampires and werewolves. They're someone's wish fulfillment.

The exceptions to the rule are the horror films of the early 1930s, especially those produced by Universal Pictures. And it is the paradoxical glamour of these early, archetypal horror films that helps explain why they are so enduring, long after they lost their ability to startle jaded audiences.

As some of the first horror pictures of the sound era, "Dracula," "Frankenstein" and "The Mummy" set a benchmark other filmmakers have rarely approached. In terms of cultural impact, only Britain's Hammer Films comes close to equaling what Universal and its early imitators accomplished 80 years ago. And Hammer went about it in an altogether different way, emphasizing shocks and "Kensington Gore" over mood and atmosphere. When film historians speak of "Hammer glamour," they refer to the studio's curvaceous leading ladies, with their pin-up looks and plunging necklines, rather than to the films themselves. It's an altogether different type of glamour.

What makes the horror films of the 1930s different is just that: They're of the 1930s.

In her paradigm-setting 2013 book "The Power of Glamour: Longing and the Art of Visual Persuasion," Bloomberg View columnist and former Reason magazine editor Virginia Postrel zeroes in on the '30s as the decade of glamour.

Postrel writes that the films of the '30s created a visual shorthand for glamour that extends across the decades to the present day. They introduced Middle America to "the high-contrast surfaces and streamlined forms of American art deco, the satin gowns and dramatically lit portraits of screen goddesses, the distant shots of the New York skyline, the sleek nightclubs and penthouse apartments, the languorous cigarette smoke."

Universal's horror films transport Dracula and Frankenstein's monster, creatures of Victorian times and earlier, to what was then the present day, or a vague facsimile. Unmoored from their history, they become intruders in a chrome-plated world that evokes an imagined and desired future.

Bela Lugosi, left, and Helen Chandler in 1931's "Dracula."
The title credits of Tod Browning's "Dracula" (1931) play over a stylized bat symbol that's all art deco curves and none of the sharp, Gothic angles you'd expect. The contemporary setting liberates the women from their Victorian layers and corsets, allowing them to float freely inside elegant gowns that barely cling to their shoulders. Even Bela Lugosi's Dracula manages to blend in with style, looking equally at ease in a top hat and tails as he does in a cape. Only his accent gives him away.

In Karl Freund's "The Mummy" (1932), modern Cairo blends old and new into a glamorously exotic locale, where the city's 20th century lights pinprick the skyline beneath the pyramids. And absorbing the view from her balcony is Zita Johann's Helen, named for antiquity's most beautiful woman yet wearing a modern dress and smoking a cigarette. Like all 1930s glamour, Helen is modern yet timeless.

The real menace of the early Universal monsters is the threat they bring to the glamorous world to which we'd like to escape. That is what makes them so compelling to this day.

But the 1930s didn't last, and neither did horror's glamour. "The Wolf Man" (1941), with its doomed, aristocratic hero taking the blue collar form of Lon Chaney Jr., was the beginning of the end.

Abbott and Costello lay just ahead.

Thursday, October 23, 2014

Culture Shock 10.23.14: 'The Great Pumpkin' is a test of faith

"A Charlie Brown Christmas" may be the first and most celebrated "Peanuts" TV special, but "It's the Great Pumpkin, Charlie Brown" is the best by almost any standard.

The animation is smoother, the writing and acting are more assured, and the entire production seems more fully formed, which it is, given that CBS wasn't entirely sold on "A Charlie Brown Christmas" until after it aired and the ratings came in. In particular, CBS wasn't sold on using children as voice actors or the jazz score by West Coast pianist Vince Guaraldi and his trio. But the special's ratings triumph led the network to go all in on two "Peanuts" specials the following year: the seldom-aired "Charlie Brown's All-Stars" and "It's the Great Pumpkin, Charlie Brown."

"The Great Pumpkin" has more than snazzier production values going for it. It better captures the melancholy spirit of the comic strip. Of all the "Charlie Brown" specials in the world, it's the Charlie Browniest.

"A Charlie Brown Christmas," which first aired in 1965, is "Peanuts" creator Charles M. Schulz at his most hopeful. Linus gives a soliloquy on the true meaning of Christmas, Charlie Brown gives a ragged little Christmas tree a chance, and at the end even Lucy gives Charlie Brown a break, admitting he did pick a good tree after all. Everyone sings carols, and the special fades out with a happy ending — a merry Christmas to all, and to all a good night. God bless us, every one.

Premiering Oct. 26, 1966, "It's the Great Pumpkin, Charlie Brown" is Schulz at his most cynical. The Christmas special reaffirms his faith, but the Halloween special tears it back down. It isn't just happenstance that Linus, for once, ends up with his hopes dashed.

Charlie Brown is the receptacle of all of Schulz's insecurities, but Linus is his philosophical mouthpiece. "It's the Great Pumpkin" shows Schulz is keenly aware Linus can, at times, be more than a little overbearing.

Linus, who proselytized for Christmas a year earlier, is a prophet for the Great Pumpkin, who, according to official doctrine, rises each year from the most sincere pumpkin patch and flies through the air with toys for all the good little children. Linus has little regard for the Great Pumpkin's more high-profile rival, "that fellow with the red suit and white beard who goes, 'Ho, ho, ho!' " He leaves Charlie Brown to sigh about how they are separated by "denominational differences."

Halloween becomes a test of faith for both Linus and Charlie Brown.

Linus, naturally, never sees the Great Pumpkin. But rather than question the object of his faith, he blames a momentary lapse. Even a little slip, he says, can cause the Great Pumpkin to pass you by.

The Great Pumpkin is a jealous squash, and he leaves Linus literally out in the cold.

Charlie Brown fares even worse. He first puts his faith in Lucy (and a signed document) when he goes to kick the football she so temptingly holds for him. And once again, Charlie Brown is done in, this time not just by his faith in humanity, but by a legal technicality: the document wasn't notarized.

Yet the worst is to come. Charlie Brown goes trick-or-treating Halloween night and ends up with nothing but a bag of rocks. It's probably the funniest joke in all the "Peanuts" specials, yet it's also the most mean-spirited. We thought it was just his fellow kids who gave Charlie Brown a hard time, but no. Apparently the whole town hates him.

Audiences, however, still love the blockhead, and they love to see him miserable. ABC's Oct. 15 broadcast of "It's the Great Pumpkin" scored 6.3 million viewers and a 2.1 rating among adults 18-49, according to Entertainment Weekly, topping most of the other network shows that aired that night.

In all the gloom and doom of "The Great Pumpkin," one ray of hope cuts the darkness. Schulz eases up just long enough for Lucy to go outside in the early morning and bring her shivering brother to bed.

Sisterly love doesn't conquer all, but if even Lucy has a heart, maybe there is hope for the rest of us.

Subsequent "Peanuts" specials don't go as dark, and by 1973's "A Charlie Brown Thanksgiving," Schulz is doing happy endings again. Staring into the abyss once a year is enough.

Thursday, October 16, 2014

Culture Shock 10.16.14: Cosplay isn't a sign of stagnation

Thinkstock photo
I've never been one for cosplay. Putting together a good costume takes a lot of time and money, to say nothing of sewing skills I'm sadly lacking, as my high school home economics teacher would attest.

The most dressed up I ever get for comic book or sci-fi conventions is a T-shirt emblazoned with some obscure pop-culture reference. But I've always envied people who have the time, patience and know-how to pull off a really great costume. Little did I suspect they were an indicator of economic stagnation and poor job prospects. Who'd a thunk it? So, imagine my surprise when I read an article at The Week headlined "Why the rise of cosplay is a bad sign for the U.S. economy."

The author, James Pethokoukis, a fellow at the American Enterprise Institute, makes a bold and, it seems to me, wrongheaded claim. It goes like this: More and more Americans, especially millennials, are into cosplay, short for "costume play." And this is because dressing up as fantasy characters is an escape from dead-end jobs and economic malaise.

But maybe I should let Pethokoukis speak for himself: "When you're disillusioned with the reality of your early adult life, dressing up like Doctor Who starts looking better and better."

Pethokoukis argues by way of analogy, noting that Japan has lots of young cosplayers, lots of underemployed young people and an economy that's been no better than anemic for the past 20 years. This, I gather, adds up to something, but I have no idea what.

Despite the nagging feeling that there must be something more to what Pethokoukis is saying, if there is, I can't find it. Taken as is, Pethokoukis' argument is so wrong I barely know where to begin, but I'll start with dollars and cents.

Cosplaying isn't for the poor of spirit or bank account. It costs a lot of money to make a good costume. Some cosplayers spend hundreds if not thousands of dollars on their wardrobes. Some wear multiple costumes during the course of a weekend convention. Some even call upon the services of professional makeup artists. And none of that accounts for travel and hotel costs. Attending a sci-fi or comic convention isn't cheap, whether you're in costume or not.

Pethokoukis knows this and even refers to the "big bucks" cosplayers invest in their costumes.

Now maybe he assumes they all live with and sponge off their parents to supplement whatever money they make from their menial jobs. Regardless, it takes a wealthy society to be able to afford such pastimes. Cosplay isn't a sign of economic trouble, but a reminder of how rich we are, even after the Great Recession. But the problems with his argument don't end there.

It's hard to calculate how many cosplayers there are in the U.S., but we can at least get an idea of attendance at the conventions cosplayers frequent. Attendance at one of the largest, San Diego's Comic-Con International, has been growing for years, since well before the Great Recession. The fastest growth was from 2001 to 2005, when attendance doubled and reached 100,000 for the first time. Since the recession, attendance has hovered between 125,000 and just over 130,000.

That doesn't look like a post-recession flight from reality to me. It looks more like a trend line flattening out. But enough foreplay. Time to get to the crux of Pethokoukis' article, such as it is.

Though unstated, Pethokoukis makes a common but unwarranted assumption: that there is something special about cosplay. But the fact is, people escape mundane reality in lots of ways, and dressing up like fantasy characters is just one of them, albeit the one that's most easily ridiculed.

What if we applied the same illogic to jock pastimes that Pethokoukis applies to geek pastimes? Maybe we should be looking at the increasing popularity of fantasy football and baseball? Maybe the real indicators of a lousy economy are guys who turn their dens into shrines to their favorite college or pro football teams? Yes, these are expensive hobbies, too, but why let that stop us?

If Pethokoukis' argument describes reality, then cosplaying millennials aren't the only ones trying to escape it. Fortunately, they aren't the ones I think have taken a flight from reality.

Thursday, October 09, 2014

Culture Shock 10.09.14: It's the end of Saturday morning as we knew it

Print advertisement for NBC's 1983 Saturday morning cartoon lineup. It was the
debut season for "Mr. T" and "Alvin and the Chipmunks," while "Thundarr the
Barbarian" moved to NBC from ABC, where it had aired the previous two seasons.
For as long as I can remember, "children's advocates" have hated children's television.

They always said the same thing: Children's television was too violent, too dumb and too commercial. And because kids watched "too much" of it, it was, by implication, too entertaining.

Not anymore. Mark your calendar, for this is a date that shall live in infamy: Saturday, Oct. 4, 2014, was the first Saturday since the 1960s when there were no Saturday morning cartoons on broadcast network television. For those of us who were kids during the Golden Age of Saturday morning cartoons, in the 1970s and '80s, it's Saturday mourning in America.

At last, the self-appointed children's advocates have slain their dragon.

In place of the animated cartoons that Generation X and the millennials grew up with are a bunch of live-action "educational and informational" programs. They're designated by the little "E/I" logo on the screen, which means the broadcaster is counting every second of them toward its government-mandated quota of E/I programming. It doesn't matter if anyone watches; it just matters that it's there and that it's "quality," as defined by the children's advocates.

Kids, meanwhile, have responded just as you'd expect. Those who can have flocked to cable TV and the children's section of Netflix, both of which operate blessedly free of the dictates of the Federal Communications Commission, for the most part. For now, anyway.

How did this happen? How did children wake up in a world with no Saturday morning cartoons?

It started 24 years ago when Congress passed the Children's Television Act of 1990. The act was the culmination of 20 years of agitation by activist groups such as Action for Children's Television, founded by Peggy Charren, who became the go-to talking head whenever the national news media needed someone to pontificate about kiddie TV, because why would you ever ask a kid?

The Children's Television Act limited advertising during both cable and broadcast children's programming and mandated that broadcasters devote a set amount of airtime each week to E/I shows.

The act's first victims were the cartoons that aired after school each weekday. The ad restrictions made them less profitable, which was the kiss of death in the highly competitive broadcast syndication market. Stations quickly dropped cartoons and added more talk shows and TV judges. Indirectly, we have the CTA to blame for Judge Judy and her ilk.

The CTA's full impact didn't hit Saturday mornings until later, as the FCC "clarified" the act and spelled out exactly what "educational and informational" meant, always tightening the screws.

NBC was the first to fall. The proud peacock that once had aired "The Smurfs" and "Spider-Man and His Amazing Friends" farmed out its Saturday morning airtime to corporate sibling NatGeo.

Print advertisement for CBS's 1982 Saturday morning cartoon lineup.
CBS and ABC were next, followed by Fox. When the end came, only The CW was still airing cartoons, all of them Japanese imports. It was a painful, lingering death. Saturdays deserved better.

The image of children getting out of bed at the crack of dawn to watch Saturday morning cartoons along with a sugary cereal chaser has become a cliché. But it's no less true. My generation looked forward to Saturday mornings — filled with Superfriends and Snorks — as if each were a mini Christmas. The networks trumpeted their new Saturday morning lineups each fall with preview specials in prime time. It was a big freaking deal.

Sure, kids still have Cartoon Network, Nickelodeon and the Disney channels, but Saturday morning's passing matters. Maybe those old cartoons weren't "educational" in the approved sense, but they were a springboard for our imaginations. More than that sugary cereal, those cartoons fueled us, not just for the rest of the day but for life.

"Scooby-Doo," for one, taught us real-life lessons. We learned not to worry about scary-looking ghosts, because the odds were those ghosts were just con men trying to pull a fast one.

With cartoons teaching lessons like that, it's no wonder Congress was so eager to replace them with FCC-approved boredom.