Thursday, July 30, 2009

Culture Shock 07.30.09: Comic-Con grows into Hollywood's biggest party


It's all over but the fumigation.

More than 125,000 people attended this year's Comic-Con International in San Diego this past weekend. To give you an idea how many people that is, Comic-Con could qualify as the fifth-largest city in Alabama, just after Huntsville and nearly 2 1/2 times the size of Decatur.

But despite the name, Comic-Con isn't just about comic books, nor has it been for the past decade. Comic-Con is now the biggest entertainment event this side of the Oscars, and in the grand scheme of things, it's far more important than any mere awards ceremony.

Comic-book writers, artists and their fans, like the 300 or so who attended the first Comic-Con in 1970, are now second-class citizens — make that third-class citizens — at their own party. They've been given a big, collective wedgie by Hollywood's entertainment elite. It's just like junior high school all over again.

The comics geeks who started Comic-Con are, as they say, victims of their own success. All it took was movie studios realizing the convention's attendees represent a demographic that can make or break a summer blockbuster. And a sample size of 125,000 is hard to beat, even if far too many attendees forget to bathe during all the excitement, no matter how many times con officials remind them.

Plus, most of Hollywood's top-grossing films of late, from "Spider-Man" to "Iron Man" to "The Dark Knight," are based on comic-book characters. If you have a big sci-fi or adventure movie due out in the next year, you skip Comic-Con at your peril. And you better arrive with some pretty jaw-dropping preview footage if you want to stir up interest on Twitter and the movie blogs.

Judging by Twitter, this year's big winner at Comic-Con was "Tron Legacy," the sequel to 1982's groundbreaking cult classic "Tron." "Tron Legacy" was a "trending topic" on Twitter this past weekend — meaning everyone was talking about it.

Once you've seen the test footage screened at Comic-Con, you'll know why. Not only are the first film's light cycles back, upgraded for the 21st century, so is the original's star, Jeff Bridges. And in more ways than you might suspect.

This year's "Tron Legacy" test footage was a more polished version of footage screened at Comic-Con in 2008. That screening created enough positive buzz to make Disney confident in greenlighting the sequel. Such is the power of Comic-Con.

The film positioned as 2010's early frontrunner, "Iron Man 2," probably didn't need the publicity, but disappointing the Comic-Con hordes would have been a damaging P.R. blunder. That's not how you reward an audience that helped turn a lesser-known Marvel Comics hero into a box-office champ, to the tune of $318 million in North America alone. So, this year, director Jon Favreau and star Robert Downey Jr. wowed the audience with five minutes of footage, including glimpses of new co-stars Scarlett Johansson (Black Widow) and Mickey Rourke (Whiplash) and great one-liners like Downey's Tony Stark quipping, "I have successfully privatized world peace."

Proving it is now a force on this side of the Atlantic, too, the British sci-fi series "Doctor Who" elicited cheers as preview footage of this year's Christmas episode debuted at Comic-Con, and I suspect more than a few fans in Great Britain are jealous their American cousins got the first look.

But just as sci-fi and adventure films have taken over what was once a small event devoted to comic books, this year yet another party crasher threw its weight around.

Legions of "Twilight" fans invaded Comic-Con to see new clips of the upcoming sequel, "New Moon." And the sound those "Twilight" fanatics made whenever they saw a shirtless Robert Pattinson could be heard by dogs in several surrounding states.

No, Comic-Con definitely isn't what it used to be. And that's not always for the best.

Thursday, July 23, 2009

Culture Shock 07.23.09: 40 years later, moon hoax theories persist


This week, America marked the 40th anniversary of the Apollo 11 moon landing. But even after four decades, some people still don't believe.

Earlier this year, I met an earnest young man who was absolutely convinced we never went to the moon. He said the U.S. government faked Apollo 11 and the five subsequent lunar landings for propaganda purposes.

I didn't ask, but I can only assume he believes the Apollo 13 mission was faked, at great expense, just so Tom Hanks could one day turn "Houston, we have a problem" into a catchphrase.

You may think you've seen NASA archival footage of the Apollo astronauts traipsing around the lunar surface and hitting the occasional golf ball, but it all really took place on a super-secret soundstage. Yep, it was all just an elaborate hoax designed to fool the Soviets and the American people, too.

That's how the theory goes, anyway. And the fellow I met this past spring is not alone in buying into it. The moon hoax movement dates to the end of the Apollo program.

In 1974, Bill Kaysing, a former technical writer for Rocketdyne, self-published "We Never Went to the Moon: America's Thirty Billion Dollar Swindle." The book makes his case for why going to the moon was impossible. For example, Kaysing claims, no space capsule could safely pass through the Van Allen radiation belt that surrounds the Earth.

Astronomer Phil Plait, who spent 10 years working on the Hubble Space Telescope and currently blogs for discovermagazine.com, debunks the major moon hoax claims at his Web site, Bad Astronomy.

Conspiracy theorists haven't improved much on Kaysing in the past 35 years. The wacky moon hoax theories you hear on late-night talk radio pretty much all originate with his book. Four years before he died, Kaysing resurfaced on the ludicrous 2001 Fox special "Conspiracy Theory: Did We Land on the Moon?" True to form, he made the same old, shopworn arguments that scientists have refuted time after time.

Don't get me wrong. I'm the last person who would say you should trust the government. And I'm sure the more than $33 billion spent on the Apollo program could, in theory, have secretly funded a lot of nefarious black ops programs. But the evidence of our having gone to the moon is overwhelming.

Just this past week, NASA released photos, taken from a satellite in lunar orbit, that plainly show the Apollo landing sites, abandoned instruments and even the footpaths the astronauts left behind. Everything was exactly where it was supposed to be.

If it is a conspiracy, it's one too sophisticated for our federal government to have pulled off. I don't give the folks in Washington that much credit.

The late Robert Anton Wilson used to draw a distinction between conspiracy theories, which could be true, and paranoid conspiracy theories that claim there is one giant Conspiracy behind everything. He ought to know. Wilson co-authored "The Illuminatus! Trilogy," a sprawling science-fiction epic that satirizes just about every conspiracy theory imaginable.

The moon hoax conspiracy definitely falls into the paranoid category simply because of its mind-boggling scale. And to add to the paranoia, just about everyone I've encountered who believes the moon landing was faked also believes a lot of other conspiracy theories, too.

This is where it gets really crazy, because not all of these conspiracy theories are compatible. If you don't believe in the moon landings, how can you believe an alien spacecraft crashed in 1947 near Roswell, N.M., and we've been reverse engineering its advanced technology ever since?

You would think we could have used some of that alien tech to mount a real moon mission.

Hmmm. Maybe we did.

Thursday, July 16, 2009

Culture Shock 07.16.09: Meteor's deep impact brings Armageddon to your television

I was flipping channels Sunday night and happened upon the end of the world.

Jason Alexander of "Seinfeld" fame was in a military command post trying to help the Air Force — at least I think it was the Air Force — shoot down a huge asteroid that was threatening to wipe out all life on Earth.

The fate of the Earth in the hands of George Costanza? We're doomed.

Fortunately, it was just the first half of NBC's two-part miniseries "Meteor: Path to Destruction." Unfortunately, the second part airs Sunday, and at this point, I suspect many viewers would just as soon it not.

The gratuitous subtitle, "Path to Destruction," may be to avoid confusion with the 1979 feature film "Meteor," which starred Sean Connery and Natalie Wood. Or maybe not, since I'm one of only three people who remember that James Bond has already dealt with this exact same plot device.

Disaster epics like NBC's "Meteor" always seem to come in pairs. Last month, ABC aired its own ludicrous miniseries, "Impact," in which the moon ends up on a collision course with Earth. This recalls the summer when "Deep Impact" and "Armageddon" duked it out at the box office, with the louder and sillier of the two, "Armageddon," reaping the bigger take, thanks to Bruce Willis' star power and Liv Tyler's ability to pout on cue.

There must be an audience for movies about big rocks falling from the sky or else Hollywood wouldn't keep making them. And why not? After all, it's a certainty that the Earth will end up in the crosshairs of a devastating comet or asteroid — eventually. It's happened before. Just ask the dinosaurs.

In fact, we have a close call in our near future. According to some calculations, there is a 1 in 45,000 chance that the asteroid Apophis — named for the Egyptian god of chaos and darkness — will strike the Earth on April 13, 2036. And while 1 in 45,000 may seem like pretty long odds, it's enough to make some astronomers sweat just a little.

With Apophis and other near-Earth objects looming in space, scientists have begun putting a lot of thought into how we might divert any that could become potential planet killers. Most of the ideas so far involve nudging the threatening asteroid off course by a few degrees, either by attaching a rocket engine to it or by maneuvering a probe close enough that the probe's gravity can tug the asteroid off target.

But nothing on the drawing board resembles Hollywood's preferred method of dealing with threats from space. Call it the Michael Bay Method, after "Armageddon" director Michael Bay.

From "Bad Boys" to "Transformers: Revenge of the Fallen," Bay has demonstrated that there is one solution to every problem — blow something up.

Yet despite what you may have seen in "Armageddon," blowing up an asteroid or comet is probably the last thing you want. Instead of one big rock, you end up with a lot of smaller rocks, carrying almost the same total mass, raining down on you, minus only the pieces small enough to burn up in the atmosphere.

George Costanza didn't get that memo, which is why he and the military are launching every nuclear missile they can get their hands on, plus Patriot missiles to shoot down anything that survives the big boom.

If that seems absurd to you, that's because it is absurd. But you haven't seen anything until you've seen a soldier shoot down a small meteor with a shoulder-fired Stinger missile. Yes, "Meteor: Path to Destruction" is full of comedy gold. They should have called it "Meteor: Path to Wackiness."

I don't ask for much from network television. But is it too much to ask that each new movie about a big rock from space not be dumber than the one that came before it?

Thursday, July 09, 2009

Culture Shock 07.09.09: 'Twilight' is the Lifetime take on vampire movies

There are things worse than "Twilight." Root canals, for instance. But when it comes to major Hollywood productions, "Twilight" is amazingly, stupendously awful.

How awful is that? It's somewhere between "Ishtar" and "Battlefield Earth" on the scale of movie awfulness. It's like a Lifetime Television made-for-cable version of a vampire movie, except it doesn't star Tori Spelling.

By this point, I realize, I've offended a lot of people. Millions, probably.

"Twilight" was the No. 7 theatrical release of 2008, grossing more than $191 million domestically. More than 3 million copies of the DVD sold on its first day of availability. And the four books in Stephenie Meyer's "Twilight" saga to date have sold a combined 7.7 million copies in the United States alone.

The movie version of the second book, "New Moon," is scheduled for release Nov. 20, and it will probably be one of the year's top-grossing movies, too, because it seems a lot of people really, really love "Twilight," in all of its incarnations.

Now, I'm not one to idly engage in fomenting panic and hysteria. I'm the guy who usually uses this space to tell you not to worry, because video games, trashy TV, popular music and comic books will not turn your children into juvenile delinquents, teenage mothers or that most pathetic of social outcasts, game show hosts.

But I do worry that America is raising a generation that will think vampires sparkle in the sunlight.

Whenever I see some teenage girl reading one of Meyer's books, I feel like yelling, "When I was your age, our vampires didn't sparkle in the sunlight! They shriveled up and died! Why, some of them even caught fire! And the ones who didn't at least had the common decency to lose their powers during the day! And we liked it that way!"

Then I shake my fist and tell the girl to get off my lawn.

But, not wanting to be an old fogey, I recently subjected myself to "Twilight" — after putting it off as long as I possibly could.

Reading the book was out of the question. I skimmed a few pages, and that was about all I could take. People on the Internet are churning out Kirk/Spock fan fiction written with more style. So, I watched the movie, instead.

Did I mention the movie is awful?

Still, I learned a lot by watching "Twilight." For example, I learned that vampires are basically just superheroes who crave blood, except they don't have to drink the blood of humans. Only bad vampires do that, and the bad vampires don't show up — bringing the plot along with them — until the end of the movie. Also, vampires love baseball, which gives them a chance to show off their superpowers.

Is this a vampire movie or an episode of "Smallville"?

The rest of "Twilight" is like one of those Lifetime movies in which the girl is stalked by some creepy guy who turns out to be a serial killer. Only this time, the guy is the hero, and I'm supposed to like him because he's dreamy, plus he's a vampire. And the girl is, like, totally cool with him stalking her and watching her while she sleeps. Ick.

Did I mention the girl is a teenager and the vampire was born in 1901? But that's OK, because "Twilight" is the most sexless vampire movie ever. This is the only time that drinking blood hasn't been a metaphor for anything. Besides, the lead actors, Kristen Stewart and Robert Pattinson, have all the chemistry of a lump of lead.

Immediately after watching "Twilight," I felt like I needed to crack open my skull and scrub my brain with bleach. But instead, I watched a real vampire movie, "Dracula A.D. 1972," starring Christopher Lee as Dracula and Peter Cushing as the vampire-hunting Professor Van Helsing.

Dracula hates sunlight, and when he stalks you, it's not because he has a bad case of puppy love. Now that's a vampire.

Thursday, July 02, 2009

Culture Shock 07.02.09: Summer '09 goes into celebrity-death overload

It has been a bad summer for 1970s celebrities.

Ed McMahon, Farrah Fawcett and Michael Jackson, all of whom reached the heights of their popularity during the '70s and early '80s, pulled a celebrity-death trifecta last week, dying within days of each other. They followed David Carradine, star of the cult-favorite TV series "Kung Fu," who died early last month.

But the Grim Reaper's reign of terror didn't end there. Even lesser-known celebrities were not immune. Impressionist and frequent "Late Show with David Letterman" guest Fred Travalena and TV pitchman extraordinaire Billy Mays died during the past few days, too.

(Sky Saxon, lead singer for the 1960s band The Seeds, also died. But I'm only counting celebrities I had heard of beforehand.)

"Mystery Science Theater 3000" alum Frank Conniff probably summed it up best when he posted to his Facebook page: "David Carradine, Ed McMahon, Farrah Fawcett, Michael Jackson, Billy Mays and now Fred Travalena. Just three dead celebrities away from a kick-ass episode of 'Hollywood Squares' in the afterlife."

But Paul Lynde has been keeping heaven's center square warm since 1982, so I figure the afterlife needs only two more.

Mays' death also briefly brought to mind actress Natasha Richardson's passing earlier this year. She died hours after sustaining a head injury while skiing, and early reports suggested Mays might have suffered a similar fate. Not long before his death, Mays took a blow to the head when the commercial airliner he was aboard experienced a hard landing. Preliminary autopsy results released Monday, however, ruled out head trauma in Mays' case.

Media reaction to this epidemic of celebrity deaths has been a bit strange. A lot of my friends noted Jackson's death happened just in time to overshadow Fawcett's. Her passing could have topped newscasts all weekend long, but in Jackson's wake, it was almost an afterthought.

Then Mays died, and he seemed to get a lot more attention than a TV pitchman normally would, as if to give people something to talk about besides the King of Pop.

I was surprised to learn Mays actually had a compelling life story. He was discovered while selling products on the famed Boardwalk in Atlantic City, N.J. For a TV pitchman, that has to be the fairytale equivalent of a would-be movie star being discovered at a drugstore soda counter at Hollywood and Vine.

Has Mays upped the ante? When Ron Popeil, the most famous TV pitchman of all, dies, will CNN devote wall-to-wall coverage to him? I can see it now: Even the History Channel could get involved, devoting an entire episode of "Modern Marvels" to Mr. Microphone and the Pocket Fisherman.

The weird thing about celebrity deaths is how so many people take them personally. A conversation about Fawcett's death inevitably turns to "Hey, when I was young, I used to have that poster of her in the swimsuit." Any discussion of Jacko will eventually lead to someone talking about owning "Thriller" — on vinyl — and doing the moonwalk.

Of course, virtually all of Generation X owned that poster and had a copy of "Thriller" — on vinyl — and did the moonwalk. Or at least tried to do the moonwalk. All of these "personal" experiences are as close to universal experiences as American culture gets. Yet people feel some strange urge to personalize the deaths of celebrities they probably never met and certainly didn't really know. It's probably a primal, instinctual reaction. After all, death is supposed to mean something, and we all feel like we know these people simply because we've invited them all into our living rooms.

So what if Ed never came around anymore and we hadn't seen Farrah in years and Michael wasn't the Michael we all used to know? They were the old friends we hadn't really thought about for a long time — until we learned they were gone.

And Billy was the sitcom neighbor who always let us borrow a tub of OxiClean.

Thursday, June 18, 2009

Culture Shock 06.18.09: The 'revolution' was Twittered, and I was there

"The revolution will not be televised. It will, apparently, be Twitterized."

With that tweet this past weekend, I achieved my 15 minutes of Twitter fame. I'll be available for autographs.

As protesters demonstrated in the streets of Tehran during the hours following Iran's still-contested national election, the best source of information from the Iranian capital was Twitter. The Iranian government quickly blocked the country's access to some Web sites and social networks like Facebook, but it couldn't completely block Twitter, which allows users to send brief messages to the Internet via the Web and mobile phones.

Lesson learned: If people have access to the right technology, then no government, no matter how authoritarian, can completely cut them off from the rest of the world. If you want to run a totalitarian state, you have to keep your people in the Stone Age, like North Korea does.

Twitter users use hash tags, a # followed by a keyword, to designate tweets on specific topics so that anyone can follow the online conversation. The "#iranelection" tag rose quickly Saturday to be the top "trending topic" on Twitter. As of Tuesday night, it was still No. 1, despite challenges from "Real Housewives," "Taylor Swift" and "WeirdAl" (short for parody musician "Weird Al" Yankovic, who stormed the Web with a Doors-themed music video about Craigslist).

For the first time, the world experienced a social uprising taking place in real time, 140 characters per tweet. That's a definite step up from Twitter's previous claim to fame — Ashton "Mr. Demi Moore" Kutcher's successful bid to become the first person on Twitter with 1 million followers.

In addition to people tweeting from Iran, people from around the world chimed in on Twitter to express their support for the protesters. Twitter users in the United States also tweeted their dismay about the lack of news coverage on the American cable news channels. The general sentiment was, how could CNN, Fox News Channel and MSNBC stick to their canned weekend programming when there was a "revolution" going on in Iran?

That was when I made my slightly tongue-in-cheek tweet about the revolution being "Twitterized" instead of televised.

As someone in the news business, I recognize as well as anyone the difficulties involved in switching gears to cover a breaking story, especially one on the other side of the globe, in a country that isn't exactly friendly to foreign reporters. But anything is better than the prerecorded fluff that the 24-hour news channels use to fill their weekend schedules. At least give me some think-tank nerds discussing what they think might be going on in Iran. Even if I think they're clueless hacks, I can at least shout at the TV, which is what I do most of the time I watch CNN and FNC, anyway.

Before I knew it, my tweet had been re-tweeted, including by an actual ABC News reporter working in the Middle East. Mine was the tweet re-tweeted around the world. Yay, me!

Inadvertently, I had been swept up in the #iranelection furor, but I have mixed feelings about that. It's not as if the pro-reform side in Iran doesn't have an agenda, too, and who really knows who is on the other side of a tweet? Some early Twitter reports were wrong, and some, probably, were propaganda.

Also, some Twitterers in America are now demanding that the U.S. support the pro-reform backers of Mir Hossein Mousavi against Iran's reactionary President Mahmoud Ahmadinejad. I suspect, however, that explicit American support would undermine the pro-reformers with the broader Iranian population. Memories of the United States' support of Shah Mohammad Reza Pahlavi run deep.

That's one problem with Twitter. Getting caught up in the moment can get the better of your judgment.

Thursday, June 11, 2009

Culture Shock 06.11.09: The anti-Oprah backlash goes mainstream

After years of expanding its influence into every aspect of our lives, the most powerful force in modern America finally is facing organized resistance.

I'm speaking, of course, of Oprah Winfrey.

Like Cher and Madonna, Oprah has long since grown beyond needing a last name, despite lacking any apparent musical talent. So, a backlash was inevitable. What's surprising is that it took so long to take shape.

Newsweek magazine kicked off the rebellion in earnest with a June 8 cover story featuring a photo of a deranged-looking Oprah below the big, bold words "Crazy Talk." It's the first time I can recall that any major news outlet has portrayed Oprah in such an unflattering light.

The article inside was an exhaustive exposé of the dubious health advice Oprah and her guests regularly peddle to her millions of viewers — advice that could actually make you sick. Or worse.

Oprah responded with a mealy-mouthed press release that said, "I trust the viewers, and I know that they are smart and discerning enough to seek out medical opinions to determine what may be best for them."

As if that's a good excuse for dispensing questionable medical advice in the first place. Never mind that Oprah is downplaying her own influence.

Since launching her nationally syndicated talk show in 1986, Oprah has expanded her media empire into all corners. She dethroned Phil Donahue to become the undisputed ruler of daytime talk. But that wasn't enough. Soon, she moved on to producing movies — and even sometimes acting in them.

Anyone else would have been satisfied. But not Oprah. She had greater ambitions. She launched a magazine called — what else? — O, The Oprah Magazine, which has featured Oprah on every cover since its inception. She also co-founded the Oxygen TV channel, which she then sold to NBC Universal. And she recently entered into an agreement with Discovery Communications to take over the Discovery Health Channel, which will become OWN: The Oprah Winfrey Network.

Oprah has become an all-encompassing brand, and what she's selling is Oprah. Donald Trump has nothing on her.

Oprah's Book Club has choked the best-seller lists with mountains of middlebrow novels and tell-all confessionals. Even a scandal involving James Frey's alleged memoir, "A Million Little Pieces," couldn't stop her.

And finally, to the consternation of Hillary Clinton's supporters, Oprah arguably swayed the 2008 Democratic presidential primaries to Clinton's main rival, Barack Obama. And you know how that story ended.

That's real power. For all his reputation as a diabolical media mogul, News Corp. CEO Rupert Murdoch, whose holdings include the Fox News Channel, only wishes he had Oprah's ability to sway viewers.

But the anti-Oprah revolt didn't really start until Oprah announced plans to launch a talk show hosted by former Playboy Playmate Jenny McCarthy, who has gone from centerfold to anti-vaccination activist. McCarthy has already appeared numerous times on Oprah's TV show to promote her claim that childhood vaccines, including the MMR vaccine, which protects children against measles, mumps and rubella, cause autism.

Having already inflicted Dr. Phil McGraw on an unsuspecting public, Oprah now stands ready to give McCarthy a platform to promote her dangerous claims, which have absolutely no scientific backing but, if taken seriously, could leave children vulnerable to potentially deadly but easily preventable illnesses.

Bloggers and a few outspoken scientists have been taking Oprah to task for a while, but the Newsweek article has moved the fight against Oprah's destructive influence into the mainstream. All I can say is, it's about time.

Thursday, June 04, 2009

Culture Shock 06.04.09: 'Evil Dead' director Sam Raimi returns with 'Drag Me to Hell'

After seven years of guiding Spider-Man's big-screen adventures, Sam Raimi has returned to the type of filmmaking that made him a hero to legions of film geeks.

"Drag Me to Hell" is a welcome change from the overly serious, poorly lit and increasingly tiresome "torture porn" horror movies that have flooded theaters ever since "Saw" became a surprise hit. Filled with as many laughs as scares, "Drag Me to Hell" sees Raimi going back to the combination of gruesome horror, slapstick comedy and just plain questionable taste that made "The Evil Dead," "Evil Dead 2" and "Army of Darkness" cult favorites.

Christine Brown (Alison Lohman) is a bank loan officer with her eye on a vacant assistant manager position. Unfortunately, to get the job, she has to prove to her boss that she can make the hard decisions. Even more unfortunately, the hard decision she makes is to deny an elderly woman a third extension on her overdue mortgage.

Bad move. The elderly woman in question is a grotesque gypsy named Mrs. Ganush, brilliantly played by Lorna Raver. And as we all know from the movies, gypsies are either kindly, wise women who tell you how to kill werewolves or evil, spell-casting crones. Guess which one Mrs. Ganush is.

In this age of record foreclosures, this is one little old lady most of us probably wouldn't mind seeing tossed out of her home of 30 years.

Enraged at having been shamed in public, Mrs. Ganush targets Christine for a grisly revenge. An evil goat spirit — don't ask — will harass Christine for three days. Then, on the fourth day, it will drag her spirit to hell to be tormented for all eternity.

Hey! That's just like the movie's title!

So, with the help of her sympathetic boyfriend (Justin Long, the Mac guy in the "I'm a Mac, I'm a PC" commercials) and a fortune teller (Dileep Rao), Christine tries to undo the curse before it's too late. In the meantime, however, she has to deal with threats that range from embarrassing to life-threatening, as the evil goat spirit — seriously, how often does anyone get to type that? — makes her life miserable. You just know that dinner party at her boyfriend's parents' house isn't going to end well.

If you've seen Raimi's "Evil Dead" movies, you know pretty much what to expect. "Drag Me to Hell" is full of hideous corpses vomiting embalming fluid, invisible monsters slapping people around and, of course, projectile eyeballs. It's not a Sam Raimi horror movie without someone's eyeball getting knocked out of its socket and flying across the room.

Amazingly, Raimi makes all of his old tricks work while staying within the confines of a PG-13 rating. It helps that the violence in his movies, however bloody, has always had more in common with the Three Stooges than with "Saw III."

Two scenes in particular, one in a parking garage and the other in a graveyard, show that Raimi hasn't lost his flair for over-the-top violence that can make you jump out of your seat one moment and leave you holding your sides the next.

Lohman takes it all in stride, whether she is getting thrown across a room, attacked by an animatronic goat, nearly drowned in an open grave or assaulted by a possessed handkerchief — again, don't ask.

If "Drag Me to Hell" has a flaw, it's that the twist ending comes as no surprise. It's obvious from the start of the final act. But given Raimi's irreverent approach to the material, maybe that is intentional. Why bother trying to hide an obvious plot twist? You might as well let everyone in on the joke as soon as possible.

And in case you're wondering, yes, Raimi's 1973 Oldsmobile Delta 88, which has appeared in all of his other movies, shows up here, too. In fact, the only thing missing is Bruce Campbell ("Burn Notice"), who stars in Raimi's "Evil Dead" trilogy and has cameos in the "Spider-Man" movies.

But even without Campbell, "Drag Me to Hell" is lots of fun. If any summer movie tops it for pure entertainment value, that will be a surprise.

Thursday, May 28, 2009

Culture Shock 05.28.09: Alabama beer snobs can finally toast to victory

A proposal that state lawmakers once considered dead on arrival now has beer connoisseurs toasting victory.

Last week, Gov. Bob Riley signed into law a bill raising the legal limit on the alcohol content of beer sold in Alabama from 6 percent by volume to 13.9 percent. That has opened the door to a new world of beers that will finally be available here legally.

According to Free the Hops, the grass-roots organization that lobbied for the higher limit, only a couple of the 100 top-rated beers in the world were available in Alabama before the law took effect.

Distributors and stores still have some bureaucratic paperwork to sort through, but the new limit comes in time for the Magic City Brewfest on June 5 and 6 in Birmingham, which will feature a number of previously forbidden brands. Unfortunately, the bill passed too late for Huntsville's Rocket City Brewfest earlier this month.

This victory for beer lovers in Alabama follows years of setbacks. At the end of the Legislature's 2007 session, a similar bill to raise the beer alcohol limit won the Shroud Award as the "deadest bill of the legislative session."

In 2008, another attempt to raise the limit led to one of the most bizarre episodes of political theater ever to play out on the floor of the state House of Representatives. Questioning why anyone would want to allow gourmet, craft-brewed beer into Alabama, Rep. Alvin Holmes, D-Montgomery, uttered the now infamous line, "The beer we got drink pretty good, don't it?"

When informed that some Free the Hops supporters were German employees at the Mercedes plant in Vance, near Tuscaloosa, Holmes exclaimed, "From Germany!"

You'd think we fought two world wars just to keep from drinking German beer.

Mind you, the Germans are serious about their beer. Local legend has it that Cullman County lost the Mercedes plant to Vance because Cullman County is dry.

But this year, the Free the Hops bill finally passed, no thanks to Holmes, who voted no, of course.

Apart from his apparent grudge against the Germans, Holmes is a staunch defender of "the beer we got" — mass-produced brands like Budweiser, Miller and Coors. I guess you could say he's a blue-collar beer guy. And that isn't surprising, given that beer has, until recently, been mostly a blue-collar beverage.

When you think about beer's place in popular culture, you probably think about Homer Simpson chugging mugs of Duff at Moe's bar, or maybe you think about the Bandit hauling 400 cases of bootleg Coors from Texarkana, Texas, to Atlanta while trying to elude Sheriff Buford T. Justice.

Then there's David Lynch's 1986 film "Blue Velvet," in which the villain, Frank Booth, played by Dennis Hopper, colorfully explains why Pabst Blue Ribbon, the ultimate blue-collar brew, is better than Heineken.

Class may have played a role in prolonging the fight to raise Alabama's beer alcohol limit. Overcoming the state's usual religious objections to anything related to beer, wine and spirits would have been easier if all of the state's beer drinkers had thought they had a stake in it. But most probably agreed with Rep. Holmes, and they didn't care about expensive, craft-brewed beers with hard-to-pronounce names.

With most of the state's beer drinkers sitting on the sidelines drinking their Bud Lights, it was up to just the connoisseurs to get the bill passed. But now the fight is over, and Alabama is open to the best beers in the world.

Personally, I'm not too bitter that it took so long, and I'm willing to share the spoils. There'll soon be enough good beer for everyone.

Thursday, May 21, 2009

Culture Shock 05.21.09: I am not Spock, and neither is President Obama

I don't know who started it, but it's gone on long enough. So, knock off the Obama/Spock comparisons.

It seems like every columnist, pundit and blogger inside the Washington, D.C., Beltway is comparing the president to the world's most famous pointy-eared, green-blooded extraterrestrial. But I'm not buying it.

President Barack Obama is not Spock. I'll grant that the president sometimes comes across as cold and emotionless, and with a little Photoshop manipulation, he even looks a bit like a Vulcan. But the similarities end there.

For one thing, President Obama isn't someone who makes decisions based solely on logic. When he describes what he is looking for in a new U.S. Supreme Court justice, for example, he says he wants someone with "empathy."

"I will seek someone who understands that justice isn't about some abstract legal theory or footnote in a casebook," the president said. " ... I view that quality of empathy, of understanding and identifying with people's hopes and struggles, as an essential ingredient for arriving at just decisions and outcomes."

Last I checked, empathy was one of those human emotion thingies.

If anything, the president seems a lot more like Dr. McCoy, who, I should note, had one of his finest moments in a "Star Trek" episode titled "The Empath."

Now, I am certainly not saying that the president is Dr. McCoy. As McCoy would say, "I'm a doctor, not a politician." And the last thing I want is to start another silly Obama/"Star Trek" character meme. Anyway, I'm surprised none of the president's supporters have likened him to Scotty, the USS Enterprise's chief engineer and resident "miracle worker," who can fix anything just in the nick of time. But maybe they're downplaying expectations.

The comparison of Obama and Spock is supposed to be positive, at least as long as you overlook that whole "pon farr" thing, the sometimes violent Vulcan mating season that occurs every seven years. Past administrations, however, have elicited more negative comparisons to sci-fi characters.

If I had a bar of gold-pressed latinum (that's a "Star Trek" reference) for every time someone compared former Vice President Dick Cheney to Darth Vader or Emperor Palpatine, I could retire the national debt and have enough money left over to replace those two Death Stars.

And don't think it was just a bunch of wacky liberals who got mileage out of comparing President George W. Bush's administration to the Galactic Empire in "Star Wars." Some of President Bush's most ardent supporters made such comparisons, too. Some friends they were.

Writing for The Weekly Standard, a conservative magazine, Jonathan V. Last argues that "the truth is ... (George) Lucas confused the good guys with the bad. The deep lesson of Star Wars is that the Empire is good."

Last's reasoning is almost too twisted to be believed. He writes, "The destruction of Alderaan is often cited as ipso facto proof of the Empire's 'evilness' because it seems like mass murder — planeticide, even. As Tarkin prepares to fire the Death Star, Princess Leia implores him to spare the planet, saying, 'Alderaan is peaceful. We have no weapons.' Her plea is important, if true. But the audience has no reason to believe that Leia is telling the truth."

Now, maybe I'm reading too much into it, but I think Last just compared Princess Leia to Saddam Hussein and Alderaan to Iraq, which, as it turned out, also didn't have any weapons — at least not of the "mass destruction" variety.

If Republicans are wondering why they lost the most recent presidential election, maybe it's because they embraced their inner Empire.

As you can see, comparing political leaders to sci-fi characters brings nothing but trouble.

Thursday, May 14, 2009

Culture Shock 05.14.09: There's no escaping the legacy of being a Trekkie

I suspect that no matter what I say about the new "Star Trek" movie, I'm going to come across like a deranged Trekkie.

If I say I love it, that means I'll lap up anything with the name "Star Trek" attached. If I say it sucks, however, that just means I'm an obsessive fanboy who can only nitpick the new movie for failing to live up to the sainted Gene Roddenberry's vision.

All of you newcomers to the "Trek" franchise have it easy. If the new movie is your first encounter with the USS Enterprise and its crew, no one is going to fault you for either liking it or hating it. But if you're an old-timer like me, there is no getting away from the past.

Regardless of what I think about director J.J. Abrams' reboot of the "Trek" franchise, it's seemingly impossible for me to escape all of the stereotypes built up in the past 43 years of "Trek" fandom. Middle-aged men who wear Spock ears. A woman who showed up for jury duty wearing her homemade Starfleet uniform. Couples whose wedding ceremonies were conducted in Klingon. Trekkie obsessions take the form of "infinite diversity in infinite combinations," as the old Vulcan saying goes.

As for me, I've never been to a "Star Trek" convention. I've never worn a Starfleet uniform, and I can barely read Klingon, much less speak it. I'm just a guy who has seen every episode and every movie and maybe read a "Star Trek" comic book or 20. So what if I can quote "The Wrath of Khan" pretty much verbatim? I can do the same thing with "Smokey and the Bandit" and "Raiders of the Lost Ark." Does that make me a freak?

Don't answer that.

If you prick me, do I not bleed? Is my blood green?

This is my Kobayashi Maru — the "no-win scenario" made legendary in "Star Trek II: The Wrath of Khan." But like Capt. James T. Kirk, I don't believe in the no-win scenario.

So, what's my verdict on the "Star Trek" reboot? It's OK. It's not terrible, like "Star Trek V: The Final Frontier" or the last two "Next Generation" films, "Insurrection" and "Nemesis." But don't let the hype fool you because it's no "Wrath of Khan" or "First Contact," either. Abrams has given us a film that, by "Trek" standards, is pretty much middle of the pack.

The new "Trek" gets a lot of things right. Zachary Quinto ("Heroes") gives a credible performance as the young Mr. Spock, and Karl Urban is near perfect as Dr. McCoy. After a while, even Chris Pine's Capt. Kirk starts to grow on you, even if he doesn't attempt to channel any of William Shatner's staccato delivery and Canadian accent, which is almost certainly for the best. No one is going to out-Shatner William Shatner.

Still, these are not exactly the same characters I grew up with. The plot involves an insane Romulan captain (Eric Bana), who travels from the 24th century to the 23rd century and changes history, creating an alternate timeline. The main fallout is that Kirk grows up without his father and seems destined never to become captain of anything — until, that is, he's challenged to enter Starfleet Academy.

Fortunately, the original Spock (Leonard Nimoy) arrives from the future to set things back on track and, ironically, add a touch of the old-style "Trek" humanism to the proceedings.

But — and you knew there was a "but" coming — Abrams' "Star Trek" suffers from plot holes so large even a cadet could pilot a starship through them and science that's dodgy even by "Trek" standards.

And those are minor faults compared to some of the decisions Abrams and the screenwriters make. Turning the young Chekov (Anton Yelchin) into a hyperactive Wesley Crusher clone is not a good idea, nor is giving Scotty (Simon Pegg) an Oompa Loompa for a sidekick. Meanwhile, poor Eric Bana's Nero will go down as one of the least inspired villains in "Trek" history. He has no good lines and almost nothing to do but growl occasionally.

Still, "Star Trek" is an enjoyable thrill ride, and it sets up sequels that have a brand new "Trek" universe to play in.

Anyway, those nagging faults probably won't bother anyone but deranged Trekkies — like me. And who am I kidding? That's what we Trekkies live long and prosper for.

Thursday, May 07, 2009

Culture Shock 05.07.09: Finding madness at the bottom of the world

When Werner Herzog told the National Science Foundation he wanted to make a documentary in Antarctica, he warned them in advance. This was not going to be a heartwarming movie about marching penguins.

Nevertheless, the most arresting scene in "Encounters at the End of the World" does involve a penguin.

On a rocky outcropping at the bottom of the world, Herzog interviews a researcher who has spent so much time among the penguins that he barely speaks to humans. Desperate to get the researcher to talk, Herzog asks questions about penguin sex habits. The researcher seems skeptical about reports of gay penguins. Then Herzog asks if there is such a thing as madness among penguins. Do any penguins ever get so fed up with their colony that they simply go insane?

The researcher says he has never seen a penguin bashing its head against a rock, but sometimes penguins become disoriented and end up in places they shouldn't be.

Then, as if on cue, a penguin demonstrates the madness Herzog is seeking.

The lone penguin refuses to follow the others to the water to feed and refuses to return to the colony. After a few minutes, it finally turns and waddles toward the mountains, located far into the bleak, icy continent's interior. Herzog tells us that when a penguin sets out like this, nothing will stop it. It will press on until it meets its fate, which is certain death.

No marching penguins. No penguins with happy feet. It is this one penguin, which Herzog describes as "deranged," that captures Herzog's interest and, one suspects, his heart as well. If the famed German filmmaker has a soft spot for anything, it's suicidal madness. He's a romantic, albeit one with a dark outlook on life.

His earlier documentary, 2005's "Grizzly Man," covers similar terrain. Herzog tells the story of Timothy Treadwell, using, in part, Treadwell's own movie footage, left behind after his death in 2003. Treadwell's derangement, if you want to call it that, cost him his life.

Over the course of 13 summers, Treadwell, an environmentalist and activist, lived among the grizzly bears of Katmai National Park and Preserve in Alaska. He clashed frequently with the National Park Service, which cited him numerous times for reckless behavior. In 2003, Treadwell and his girlfriend were mauled to death and eaten by at least one bear.

Treadwell thought he was protecting the bears from humans, when in reality he needed protection from the bears. For Herzog, that is a kind of madness.

Unlike Treadwell, the scientists and support staff in "Encounters at the End of the World" are professionals with a regard for safety and a respect for the dangers the remote, untamed continent presents. Working near an active volcano or underwater beneath several feet of solid ice, they find those dangers impossible to ignore.

After seeing some of the small but monstrous predators that lurk in the Antarctic's waters, Herzog speculates that it must have been similar terrors that drove our distant ancestors, millions of years ago, to flee the water for land. Fish evolved into amphibians and escaped the ferocious depths.

But for Herzog, that was only a temporary reprieve, and humanity's eventual end can't be prevented by combating climate change or embracing New Age, environmentalist philosophies. Herzog is nothing but dismissive of people he deems to be "tree huggers." He takes a longer view.

The vast majority of all of the species that have ever lived on Earth — more than 99 percent of them, in fact — are extinct. Those are pretty long odds against our ever avoiding the way of the dinosaurs.

Yet, as depressing as this all sounds, there is something captivating about Herzog's fatalism. In spite of it all, he thinks people's hopes and dreams matter, and he is fascinated by the odd assortment of dreamers who come each summer to live in one of the world's most hostile environments — people, like that penguin, who just want to get away from it all, no matter the cost. If that is derangement, Herzog seems to sympathize.

"Encounters at the End of the World" is now available on DVD.

Thursday, April 30, 2009

Culture Shock 04.30.09: It's 'Red Dawn' in America again

I've seen my share of ill-advised remakes. "Halloween." "Psycho." Two unnecessary retreads of "King Kong." But they're nothing compared to what lurks on the horizon for next year.

That's when MGM plans to unleash its remake of 1984's "Red Dawn."

The original "Red Dawn" has aged about as well as legwarmers, Aquanet hair and pre-ripped jeans.

The movie's plot can best be described as "The Breakfast Club" meets "Invasion U.S.A." And by "Invasion U.S.A.," I mean either the 1952 Red Scare flick about godless commies descending on America or the 1985 Chuck Norris movie about godless commies descending on America. Take your pick.

"Red Dawn" opens with the Russians and Cubans invading the U.S. Afterward, it's up to a bunch of high school students to mount a resistance. Along the way, they get training from a downed Air Force pilot played by Powers Boothe, which allows them to stage guerrilla raids and fight their way to friendly territory.

The movie is best known for launching the careers of Patrick Swayze, Charlie Sheen and C. Thomas Howell.

OK, let's be honest about that. C. Thomas Howell hasn't had much of a career. But that's what he gets for agreeing to star in "Soul Man."

Anyway, back when Ted Turner still controlled TBS, rumor had it that he personally programmed the channel by spinning a big wheel with movie titles printed on it. Then he'd air whichever movie the arrow landed on. Unfortunately, the wheel listed the titles of only three movies: "The Beastmaster," "Road House" and "Red Dawn."

Now, it could be that I'm the person who started that rumor. That's neither here nor there. The point is, "Red Dawn" aired on TBS a lot when I was a teenager.

"Red Dawn" was written and directed by John Milius, who also gave us Arnold Schwarzenegger in "Conan the Barbarian." As one of Hollywood's few out-of-the-closet conservatives, Milius apparently decided that it was his duty to make an anti-communist movie that was every bit as paranoid and absurd as the anti-business and anti-nuclear movies Hollywood was churning out at about the same time.

Seven years later, the Soviet Union would be gone. But "Red Dawn" would still be on cable TV and still taking itself way too seriously.

But for all of its dreary, depressing earnestness, "Red Dawn" is now a cult favorite, which is probably why someone at MGM thinks it is prime remake material. There's just one tiny, little problem. The Soviet Union is no more. It has ceased to be. It has expired and gone to meet its maker. It is an ex-country.

But never fear. There still is one Big Bad out there to serve as the heavy — China.

According to a review of the remake's script, the new "Red Dawn" has the Chinese invading the U.S., with the Russians coming in later to back them up. So, basically, China is the new Soviet Union and Russia is stuck with the role of Cuban flunky. My, how the Russian Bear has fallen.

The irony here is that the new "Red Dawn" is probably being shot with camera equipment made in China. And that brings up another glaring flaw with this remake. As unlikely as a Soviet invasion of the U.S. was in 1984, a Chinese invasion in 2010 is even more far-fetched.

I'm pretty sure we never bought movie equipment from the former Soviet Union, but we buy almost everything from China. Last year alone, the U.S. and China did $409 billion worth of business with each other, with China racking up a $266 billion trade surplus, which it then used to buy a sizable chunk of U.S. debt.

China is far more likely to turn us over to a bill collector than invade. And that might be worse. Think of all the harassing phone calls.

Thursday, April 23, 2009

Culture Shock 04.23.09: There's no scream like a Wilhelm scream

What would the movies be without screams?

The thought of it is almost too terrible to entertain. It's like imagining Sonny without Cher, Oates without Hall or Garfunkel without Simon. And we don't have to imagine any of those. We know exactly how bad they are.

Often, screams make the movie, especially horror movies, where bloodcurdling screams of sheer, abject terror are a necessity. A subset of Hollywood actresses is even known for their ability to scream on screen. "Scream queens," they're called.

But one scream has taken on a life of its own. You've heard it. In fact, you've heard it probably dozens of times, even if you've never heard of it.

Among Hollywood sound editors, the scream is legendary. It even has a name. It's the Wilhelm scream.

The Wilhelm scream was originally recorded for "Distant Drums," a 1951 Warner Bros. film starring Gary Cooper. It made its debut when a character was eaten by an alligator.

The scream is, needless to say, distinctive. Unfortunately, it's impossible to get across in print. But I'll try. It goes something like "aaaaAAAAuuuuhhhh!"

If, however, you prefer a more lifelike — or, in this case, death-like — example, an audio file of the scream is available on the Wilhelm scream's Wikipedia page.

The scream didn't get its name until its second appearance, in the 1953 Western "The Charge at Feather River." In that film, an arrow strikes a minor character, named Wilhelm, in the leg. Wilhelm, of course, screams, as people with arrows in their legs tend to do. The rest, as they say, is history.

Today, the Wilhelm scream is more famous than either of the first two movies in which it appeared. A band is named after it, YouTube videos celebrate it and nearly 150 movies feature it in one way or another. And the list is growing.

I was reminded of the Wilhelm scream last weekend, as I was indulging in a marathon viewing session of the British sci-fi series "Primeval," which is currently airing on the Sci-Fi Channel. Wilhelm gives a big shout-out once every episode. If you're playing a "Primeval" drinking game, taking a shot whenever you hear the Wilhelm scream is a must — along with taking a shot whenever a character goes in search of a dangerous prehistoric predator without taking along a firearm. But that is another column.


The Web site Hollywood Lost and Found has done an excellent job of uncovering the Wilhelm scream's history, along with cataloging the films in which the scream appears. Over the years, the Wilhelm scream has become something of an inside joke among sound editors, who insert the scream, where appropriate, as a way of saying "hi!" to their peers.

At first, the scream appeared only in Warner Bros. films, but by the 1970s it was seemingly everywhere. It's used in the "Star Wars" films and the "Indiana Jones" movies. You say you have a Nazi falling out of a truck and onto the hood of a jeep? That sounds like a job for the Wilhelm scream. A stormtrooper falling down a Death Star shaft or an alien falling into the Sarlacc pit? That calls for a Wilhelm scream.

In space, no one can hear you scream. Unless it's a Wilhelm scream.

Cropping up mostly when movie characters fall from great heights or die in explosions, the Wilhelm scream, unlike those characters, refuses to stay down. And sometimes it even shows up in a situation like the one for which it was originally intended. In "Indiana Jones and the Temple of Doom," the scream's third and final appearance comes when the villain is eaten by alligators.

The scream has been a part of films as diverse as "Kill Bill, Vol. 1" (in which it appears at least twice during the same fight scene) and "Toy Story" (Buzz Lightyear knocked through a window).

But what of the man who recorded the scream in the first place? No one knows for sure, but the best evidence suggests the real-life "Wilhelm" was Sheb Wooley, who is best known for the song "The Purple People Eater."

I bet if someone were eaten by a Purple People Eater, the last thing out of that person's mouth would probably sound like a Wilhelm scream.

Thursday, April 16, 2009

Culture Shock 04.16.09: Warning: You may need to Google these retro references

Assuming you live long enough, there comes a time when you suddenly realize you can't understand what the younger generation is talking about. And apparently, that works both ways.

Explaining Twitter to someone who grew up with rotary dial telephones requires an answer that is longer than 140 characters. Meanwhile, whenever I explain rotary phones to someone under the age of 20, they look at me as if they think I knew Alexander Graham Bell personally.

Bell died in 1922, by the way.

But according to Ralph Keyes, technology isn't the only thing that leads to communication breakdowns between generations.

Keyes, author of "I Love It When You Talk Retro: Hoochie Coochie, Double Whammy, Drop a Dime and the Forgotten Origins of American Speech," says people my age and older — mostly older — might as well be speaking Aramaic, as far as young people are concerned.

(I'm assuming everyone knows what Aramaic is, even though few people still speak it, because it's the language Jesus spoke. Also, it was mentioned in 1975's "Monty Python and the Holy Grail.")

Writing at Editor & Publisher magazine's Web site, Keyes is particularly hard on my profession, accusing us of "retrotalk," which he says is "terminology rooted in our past that may not be familiar to younger readers. Or immigrants. Or anyone at all, for that matter."

Keyes picks on New York Times columnist David Brooks — admittedly, an easy target, although not as easy a target as Brooks' colleague Thomas Friedman.

Brooks' sin was comparing Hillary Clinton to Howard Beale, the fictional newsman in the 1976 movie "Network," who famously said, "I'm as mad as hell, and I'm not going to take this anymore!"

Or maybe not so famously, Keyes thinks.

But I guess he doesn't know that part of Beale's speech is an Internet hit, featured in the YouTube video "40 Inspirational Speeches in 2 Minutes." The video has racked up more than 1 million views so far. And it's nothing more than a cleverly edited montage cobbled together from inspirational speeches featured in movies spanning seven decades.



Who would have thought something so retro could be so popular with all those young people who use YouTube?

Keyes also cites a Miami Herald writer, who confused his editor by referring to Mayberry in a column. It seems the editor grew up in a home without a television and wanted to know where Mayberry was.

Do you mean to tell me that someone actually needs to have seen "The Andy Griffith Show" to know that the fictional town of Mayberry is where the show took place? Well, Shazam!

I think there are some bits of cultural knowledge people should just know, regardless of their age. After all, I've never managed to sit through "Gone with the Wind" (1939), but I do know tomorrow is another day.

Sometimes, if you don't get the reference, it isn't your fault. Some writers probably are too obscure, gleefully tossing out references to dusty cultural artifacts just as Dennis Miller did during his "Saturday Night Live" years. For example, does anyone remember Dennis Miller? Anyone? Bueller? Bueller?

On the other hand, is there anyone who doesn't know the origin of the phrase "Beam me up, Scotty"? If high school literature teachers think I should remember the importance of the line "To be or not to be," then I can expect the average Joe on the street to know what I mean if I ask, "Where's the beef?" Especially if I ask in a crotchety old woman's voice.

If, however, the phrases "Beam me up, Scotty" and "Where's the beef?" are a mystery to you, please look them up on Google. But if the word "Google" has you confused, well I can't help you with that.

Explaining Google is a lot like explaining Twitter. And I just ran out of space.

Thursday, April 09, 2009

Culture Shock 04.09.09: Movie incentives offer opportunities for state — and me

"SpaceCamp" and "The Long Walk Home" were shot in Alabama. Some of "Close Encounters of the Third Kind" was filmed here, too, although not the famous Devils Tower sequence, obviously.

But apart from "Big Fish" in 2003, it has been a while since a major Hollywood production set up shop in Alabama. The 1999 film "Crazy in Alabama," for example, stars Speake, Ala., native Lucas Black and is set in Alabama. But it was filmed in Louisiana.

That sort of thing sticks in the craw of Alabama's state legislators. Not only is it an affront to the state's honor — if you care about such things — it means millions of dollars are going to states like Louisiana that are standing in for the real Heart of Dixie. And who in Alabama doesn't care about that?

So, this year the Alabama Legislature finally passed an incentive package — basically a lot of tax breaks — designed to lure Hollywood producers to the state. But first, the proposal's backers had to win over the state's most powerful lobby, the Alabama Education Association.

AEA had opposed previous tax breaks for the movie industry, claiming they would cost the state's education budget much-needed funding.

Now, correct me if I'm wrong, but if Alabama was doing virtually no movie business before the tax breaks, and thus making almost no money off movie productions, how exactly were the tax breaks going to cost anything? Any percentage of zero is still zero. So, either someone in the AEA is pretty bad at math, which is a sobering thought, or AEA was just looking for a payoff.

It's almost like a Hollywood script. "The Godfather," maybe. After all, AEA Executive Secretary Paul Hubbert is a bit like the Don Corleone of Alabama politics. He makes politicians offers they can't refuse.

Well, this year AEA got its payoff, in the form of a law making it easier to tax out-of-state corporations that do business in Alabama, and dropped its opposition to the movie incentives bill, which, with the governor's signature, is now law.

Now, Alabama can not only play itself in the movies, it can stand in for other places as well. In theory, it could become the South's answer to Vancouver, British Columbia, which doubles for just about everywhere. Ever see Jackie Chan's "Rumble in the Bronx"? It is set in New York City, which makes the occasional appearance of Vancouver's mountains in the background of some scenes unintentionally hilarious.

But Hollywood's deep pockets weren't the only things on legislators' minds when they passed the incentives bill. Lawmakers were thinking about jobs, too.

Athens State University and Calhoun Community College are moving forward with a program to train students for jobs in the movie and television business. Soon, downtown Decatur, Ala., could be home to classes that prepare young Alabamians to be gaffers, best boys and other movie professionals with strange job titles. Someone around here could be the next Steven Spielberg, or at least the next Roger Corman.

This is a great opportunity. If there had been a film school around here when I graduated from high school, do you think I would have gotten degrees in economics and political science? Not a chance.

But it is still good news for me because I'll need trained professionals to work on my film, "Kung Fu Biker Zombies vs. the Vampire Strippers from Hell in 3-D." Granted, I don't have a script yet, just a plot synopsis scribbled on a cocktail napkin one late night at The Brick Deli & Tavern. But I feel good about this project. And with the tax breaks Alabama is offering, I might be able to bring it in under the $30 million budget I have in mind.

Sure, my original plan was for the movie to be set in Nevada, but it's not too late to make revisions for the sake of my native state.

Thursday, April 02, 2009

Culture Shock 04.02.09: 'Atlas' shrugs up sales charts as economy falls

More than 50 years after its publication, Ayn Rand's bestseller "Atlas Shrugged" is again moving up the sales charts.

According to The Economist, "Atlas Shrugged" has reached as high as No. 33 on Amazon.com's best-seller list, briefly topping President Barack Obama's "The Audacity of Hope."

Renewed interest in the novel began with the economic downturn, starting with the Federal Reserve's interest rate cuts last year. Sales spikes then coincided with the mortgage and bank bailouts, and the passage of the president's economic stimulus package.

Every time the economy takes a hit or the federal government grows larger, new readers flock to Rand's novel. But why, after all this time, is Atlas still shrugging?

Although set in the 1950s and somewhat dated by its focus on the railroad industry, "Atlas Shrugged" still reads like dystopian science fiction, describing a near-future world on the brink of economic collapse as governments restrict, regulate and expropriate private businesses in a failed attempt to keep the system going. To a lot of people, apparently, that near future seems a lot like now.

In the novel, the few businessmen, artists, thinkers and other producers who have not compromised their principles or been co-opted by the government go on a "strike of the mind." Tired of being taxed and regulated, they abandon their companies and other projects. They deprive the world of their creative talents. And, eventually, they join the book's mysterious hero, John Galt, in a secret hideaway. There, they watch as the world crumbles and plan their return, when they will rebuild along lines that respect the individual, creativity and, by implication, laissez-faire capitalism.

Over the years, Rand has taken a beating both for her literary talents and her philosophy of rational self-interest, which she called Objectivism. A small number of academic philosophers treat Rand's ideas seriously, and I was a student of one of them at Auburn University in the early 1990s. But the official Objectivist movement, led by the Ayn Rand Institute, hasn't served Objectivism well, instead treating Rand's novels and essays as holy writ.

Yet, as a literary figure, Rand endures, to the consternation of the literary establishment. She was born in Russia, and her novels, especially "Atlas Shrugged" and "The Fountainhead," combine the scope of Russian epics with a distinctly American pulp style. They are an odd mix of highbrow and lowbrow sensibilities that continues to enthrall and inspire readers. Rand even weaves science fiction into her works, especially "Atlas" and her novella "Anthem," which was reprinted in a 1953 pulp sci-fi magazine.

Rand's sometimes unlikely admirers include Angelina Jolie and Brad Pitt. Jolie is among the contenders to star in a movie version of "Atlas Shrugged," which is in development at Lionsgate for a tentative 2011 release. Perhaps the novel's newfound popularity will finally get the long-delayed project in front of the camera.

Meanwhile, conservative pundits, particularly Michelle Malkin, are citing anecdotal cases of people "going Galt" — voluntarily reducing their income to avoid paying higher taxes. Of course, one wonders where these would-be John Galts were when President George W. Bush was increasing federal discretionary spending by nearly 50 percent.

There is a bitter irony to Rand's resurgence. Most if not all of the blame for the U.S. economy's woes lies with former Federal Reserve Chairman Alan Greenspan, a former acolyte of Rand's. Many of Rand's other followers — and economists sympathetic to Rand's pro-capitalist politics — have denounced Greenspan as a sellout, some warning beforehand that his policies would lead to trouble. Certainly, the Greenspan who wrote in favor of the gold standard and against the Fed in the 1960s, when he was part of Rand's inner circle, is hard to square with Greenspan the Fed chairman.

It makes you wonder if Greenspan ever imagined himself as one of Rand's villains, because he makes for a pretty convincing one.

Thursday, March 26, 2009

Let the ‘Battlestar Galactica’ arguments begin

After a four-hour miniseries, 73 episodes, a TV movie and numerous Internet “webisodes,” “Battlestar Galactica” has finally reached its journey’s end. Now the arguments can begin.

If you haven’t yet seen the series finale, consider yourself warned. There are spoilers ahead.

The two-hour conclusion of “Battlestar Galactica,” which aired Friday, gave its major characters closure, but it left some major plot points unanswered. Or, to be more exact, it answered them ambiguously.

Were the apparitions that appeared to Baltar and Caprica Six really angels? Was everything that happened in the series really the will of God, or some deity-like facsimile? What was Starbuck anyway? Was she an angel or a demon? And who or what resurrected her after her apparent death in season 3?

The only thing certain is that the ending has people talking. Fans who have followed “Galactica” since the 2003 miniseries seem split. Some think they were cheated. Some think producer/writer Ronald D. Moore delivered an emotionally satisfying conclusion. Some think Moore was just “making it up” as he went and didn’t know how to end the series. And others are just confused.

Put me in the camp that is pleased with how Moore ended the series. All of the major characters reached the end of their personal story arcs and were better for the journey, even if some of those endings were bittersweet. As for the unanswered questions, science fiction is still fiction. In science, it is nice to have answers, but in fiction it is usually more fun to be left with questions.

I don’t really care what Starbuck was or how she returned from the dead. What is important is that her character finally found a purpose — along with inner peace. And when she simply vanished without a trace, her mission at last complete, that floored me as much as anything on television has in a long, long time. It was as close to perfect as television gets.

Did Moore and his writing staff make things up as they went? Sure. You can’t expect a TV series to come out looking like a novel. I suspect a lot of sci-fi fans have taken away the wrong lesson from “Babylon 5,” J. Michael Straczynski’s epic “novel for television,” which spanned five seasons. Straczynski started out with a definite beginning, middle and end in mind. But he still had to make major changes as the series progressed, some of which improved on his original plan.

Even when it comes to novels of the regular paper variety, it is rare for an author’s outline to survive the first draft. The difference is that when you read a novel, you’re getting a finished product, but when you watch a TV series — even a so-called novel for television — you’re getting a work in progress, one episode at a time. You have to expect the writers to change their minds, try new ideas and abandon ones that aren’t working. They wouldn’t be doing their jobs if they just stuck to their original plan. Few writers are clever enough to get it right the first time.

In any case, it could be worse. At least “Battlestar Galactica” didn’t end like “The X-Files.” It would take a PowerPoint presentation, complete with flow charts, to explain the ins and outs of that show’s conspiracy-laden mythology.

“Battlestar Galactica” leaves its fans with lots of great moments and lots more to debate. That puts it in some pretty special company. Other classic SF series that had controversial endings include the British series “The Prisoner” and the animated Japanese series “Neon Genesis Evangelion,” both of which still set off heated arguments years and, in the case of “The Prisoner,” decades after they left the air. Both also remain extremely popular. AMC is producing a miniseries remake of “The Prisoner,” while in Japan, new feature-film retellings of “Evangelion” are still landing in theaters.

So, thanks, “Battlestar Galactica,” both for the memories and for the arguments to come.

Thursday, March 19, 2009

Sci-Fi Channel finally does away with the ‘Sci-Fi’

What is in a name? Quite possibly more than even Shakespeare ever dreamt.

The Sci-Fi Channel, which has gone by just “Sci Fi” for most of the decade, announced Monday that it will have a new name as of July 7.

Say goodbye to Sci Fi, and say hello to SyFy — different spelling, same pronunciation. And unlike the generic term “sci-fi,” which was coined by the late Forrest J Ackerman, SyFy has the advantage of being something the network can trademark, said Bonnie Hammer, former Sci-Fi Channel president and current head of parent company NBC Universal Cable Entertainment.

Unfortunately, SyFy has the disadvantage of being lame, if not downright insulting to sci-fi fans. Yet it took marketing geniuses years to come up with it — if they actually did. The name originally belonged to a Web site called SyFy Portal, which sold the SyFy name and reopened as Airlock Alpha.

Sci Fi executives have been looking for a new name and a new image for a long time. In fact, they seem embarrassed by the idea of being associated with science fiction.

TV Week cites television historian Tim Brooks, who helped launch the Sci-Fi Channel when he worked at USA Network, Sci Fi’s sister channel.

“The name Sci Fi has been associated with geeks and dysfunctional, antisocial boys in their basements with video games and stuff like that, as opposed to the general public and the female audience in particular,” Brooks said.

So, in case you were wondering what Sci Fi’s executives think of their core audience — the geeks and “dysfunctional” fanboys who have watched the channel religiously since it started up in 1992 — now you know.

Current Sci Fi President Dave Howe was a little more diplomatic.

“If you ask people their default perceptions of Sci Fi, they list space, aliens and the future,” he said in a New York Times story. “That didn’t capture the full landscape of fantasy entertainment: the paranormal, the supernatural, action and adventure, superheroes.”

Well, we certainly can’t have a TV channel devoted to space, aliens and the future, now can we? In fact, during her tenure as president, Hammer pretty much let everyone know of her contempt for real science fiction. She canceled “Farscape,” a popular series set in space and loaded with extraterrestrials, and packed the schedule with “reality” and paranormal programs like “Crossing Over with John Edward.”

From the beginning, Sci Fi aired a lot of horror and fantasy programming, and most sci-fi fans accepted that. In the publishing world, science fiction, fantasy and horror have always been linked. But then Sci Fi aired “Braveheart,” a movie that isn’t sci-fi, fantasy or horror. It’s just bad history. After that, nothing was off limits. Not even professional wrestling.

With the new name, SyFy, Hammer finally has what she always wanted: something that sounds like sci-fi and looks a little like sci-fi, but isn’t sci-fi.

Sure, SyFy will still air science fiction, with “Caprica” — a prequel series to “Battlestar Galactica” — and a new “Stargate” series yet to come. But I would expect more wrestling and action movies if I were you. Toss in some “C.S.I.” reruns, and SyFy would probably look a lot like Spike TV.

Along with the new name, the channel is also getting a new slogan, “Imagine Greater,” and you have to wonder how long it took the marketing guys to come up with that.

What does it even mean? Greater than what? Maybe they mean “Imagine More” or “Imagine Better.” Is there a language on Earth in which “Imagine Greater” even makes sense? Maybe it sounds better in the original Klingon, or would if SyFy didn’t hate aliens.

Hey, SyFy executive folks, how about trying to “Imagine Grammar” instead?

Thursday, March 12, 2009

Children’s brains may evolve for Information Age

If TV doesn’t rot your child’s brain, the Internet will, according to the latest scary speculation.

Susan Greenfield, an Oxford University neuroscientist, says exposure to social networking Web sites like Facebook and Twitter, along with video games and fast-paced TV shows, may be rewiring children’s brains.

I can think of two reasonable responses to this claim: “So what?” and “So what?”

At one level, this is exactly the sort of thumb-sucking response that has always greeted new media. Since the 1930s, do-gooders have blamed jazz, comic books, rock ’n’ roll, television, and now video games and the Internet, for everything wrong with “the children” — even when there isn’t anything wrong with children.

Like Paul Lynde in the Broadway musical “Bye Bye Birdie,” they ask, “Why can’t they be like we were, perfect in every way? What’s the matter with kids today?”

Juvenile crime across the board is down compared to 10 years ago. Fewer teens are having sex, and the ones who do are more likely to use protection. Fewer teens are getting abortions.

OK. I give. What exactly is the matter with kids today? If online social networking is bad for them, it hasn’t shown up in the statistics yet.

But let’s assume MySpace and Facebook really are rewiring children’s brains. In fact, I would be surprised if that were not the case. The real question is “Why is this a problem?”

This isn’t the first time the environment has rewired our brains. Before our ancestors developed written languages, they had excellent memories. But writing has taken away our ability to remember epics like “The Iliad.” Still, that seems a good trade-off.

Adults often complain that change happens too quickly and that the world is becoming too fast-paced. If children’s brains are adapting to the pace of modern society, that seems like a net plus. A brain that is better at filtering information from many different sources in a short span of time is a brain that is better suited to the Information Age.

How fast-paced is the world today? One useful measure is the length of the average movie shot. An average shot in a major motion picture today is 2 or 3 seconds long. By comparison, film critic Roger Ebert notes, the average shot in “Citizen Kane” (1941) is 11.4 seconds. By today’s standards, “Pulp Fiction” is a slow movie, which isn’t surprising given that Quentin Tarantino is, in many respects, an old-fashioned filmmaker. The average shot in “Pulp Fiction” is 7.9 seconds.

There is a question of cause and effect here. Are movies contributing to children’s shorter attention spans by rewiring their brains? Maybe, but I doubt that’s the whole answer. I suspect, instead, that movies are mainly catering to the shorter attention spans that have resulted from the quickening pace of life in general, with movies being a small part of that.

Phil Edholm, chief technology officer at Nortel, created a minor stir last year when he suggested that Attention Deficit Hyperactivity Disorder, at least in its mild form, may be an evolutionary adaptation for multitasking in the digital era. Parents of ADHD children were not amused. But maybe there is some truth to his speculation.

Scientists at Northwestern University have discovered that a genetic variation associated with ADHD seems to help people in nomadic cultures survive. The variation also seems to encourage novelty-seeking behavior. The researchers speculate that some ADHD-associated traits may have been beneficial in our prehistoric past.

If modern society is taking on attributes that make it more like our hunter-gatherer past — for example, greater mobility and a faster pace than agricultural societies — then genetic traits that were advantageous in the past could reassert themselves.

If so, that’s not cause for alarm. That might be just evolution.