Thursday, July 30, 2009

Culture Shock 07.30.09: Comic-Con grows into Hollywood's biggest party


It's all over but the fumigation.

More than 125,000 people attended this year's Comic-Con International in San Diego this past weekend. To give you an idea how many people that is, Comic-Con could qualify as the fifth-largest city in Alabama, just after Huntsville and nearly 2 1/2 times the size of Decatur.

But despite the name, Comic-Con isn't just about comic books, nor has it been for the past decade. Comic-Con is now the biggest entertainment event this side of the Oscars, and in the grand scheme of things, it's far more important than any mere awards ceremony.

Comic-book writers, artists and their fans, like the 300 or so who attended the first Comic-Con in 1970, are now second-class citizens — make that third-class citizens — at their own party. They've been given a big, collective wedgie by Hollywood's entertainment elite. It's just like junior high school all over again.

The comics geeks who started Comic-Con are, as they say, victims of their own success. All it took was movie studios realizing the convention's attendees represent a demographic that can make or break a summer blockbuster. And a sample size of 125,000 is hard to beat, even if far too many attendees forget to bathe during all the excitement, no matter how many times con officials remind them.

Plus, most of Hollywood's top-grossing films of late, from "Spider-Man" to "Iron Man" to "The Dark Knight," are based on comic-book characters. If you have a big sci-fi or adventure movie due out in the next year, you skip Comic-Con at your peril. And you better arrive with some pretty jaw-dropping preview footage if you want to stir up interest on Twitter and the movie blogs.

Judging by Twitter, this year's big winner at Comic-Con was "Tron Legacy," the sequel to 1982's groundbreaking cult classic "Tron." "Tron Legacy" was a "trending topic" on Twitter this past weekend — meaning everyone was talking about it.

Once you've seen the test footage screened at Comic-Con, you'll know why. Not only are the first film's light cycles back, upgraded for the 21st century, so is the original's star, Jeff Bridges. And in more ways than you might suspect.

This year's "Tron Legacy" test footage was a more polished version of footage screened at Comic-Con in 2008. That screening generated enough positive buzz to give Disney the confidence to greenlight the sequel. Such is the power of Comic-Con.

The film positioned as 2010's early frontrunner, "Iron Man 2," probably didn't need the publicity, but disappointing the Comic-Con hordes would have been a damaging P.R. blunder. That's not how you reward an audience that helped turn a lesser-known Marvel Comics hero into a box-office champ, to the tune of $318 million in North America alone. So, this year, director Jon Favreau and star Robert Downey Jr. wowed the audience with five minutes of footage, including glimpses of new co-stars Scarlett Johansson (Black Widow) and Mickey Rourke (Whiplash) and great one-liners like Downey's Tony Stark quipping, "I have successfully privatized world peace."

Proving it is now a force on this side of the Atlantic, too, the British sci-fi series "Doctor Who" elicited cheers as preview footage of this year's Christmas episode debuted at Comic-Con, and I suspect more than a few fans in Great Britain are jealous their American cousins got the first look.

But just as sci-fi and adventure films have taken over what was once a small event devoted to comic books, this year yet another party crasher threw its weight around.

Legions of "Twilight" fans invaded Comic-Con to see new clips of the upcoming sequel, "New Moon." And the sound those "Twilight" fanatics made whenever they saw a shirtless Robert Pattinson could be heard by dogs in several surrounding states.

No, Comic-Con definitely isn't what it used to be. And that's not always for the best.

Thursday, July 23, 2009

Culture Shock 07.23.09: 40 years later, moon hoax theories persist


This week, America marked the 40th anniversary of the Apollo 11 moon landing. But even after four decades, some people still don't believe.

Earlier this year, I met an earnest young man who was absolutely convinced we never went to the moon. He said the U.S. government faked Apollo 11 and the five subsequent lunar landings for propaganda purposes.

I didn't ask, but I can only assume he believes the Apollo 13 mission was faked, at great expense, just so Tom Hanks could one day turn "Houston, we have a problem" into a catchphrase.

You may think you've seen NASA archival footage of the Apollo astronauts traipsing around the lunar surface and hitting the occasional golf ball, but it all really took place on a super-secret soundstage. Yep, it was all just an elaborate hoax designed to fool the Soviets and the American people, too.

That's how the theory goes, anyway. And the fellow I met this past spring is not alone in buying into it. The moon hoax movement dates to the end of the Apollo program.

In 1974, Bill Kaysing, a former technical writer for Rocketdyne, self-published "We Never Went to the Moon: America's Thirty Billion Dollar Swindle." The book makes his case for why going to the moon was impossible. For example, Kaysing claims, no space capsule could safely pass through the Van Allen radiation belt that surrounds the Earth.

Astronomer Phil Plait, who spent 10 years working on the Hubble Space Telescope and currently blogs for discovermagazine.com, debunks the major moon hoax claims at his Web site, Bad Astronomy.

Conspiracy theorists haven't improved much on Kaysing in the past 35 years. The wacky moon hoax theories you hear on late-night talk radio pretty much all originate with his book. Four years before he died, Kaysing resurfaced on the ludicrous 2001 Fox special "Conspiracy Theory: Did We Land on the Moon?" True to form, he made the same old, shopworn arguments that scientists have refuted time after time.

Don't get me wrong. I'm the last person who would say you should trust the government. And I'm sure the more than $33 billion spent on the Apollo program could, in theory, have secretly funded a lot of nefarious black ops programs. But the evidence of our having gone to the moon is overwhelming.

Just this past week, NASA released photos, taken from a satellite in lunar orbit, that plainly show the Apollo landing sites, abandoned instruments and even the footpaths the astronauts left behind. Everything was exactly where it was supposed to be.

If it is a conspiracy, it's one too sophisticated for our federal government to have pulled off. I don't give the folks in Washington that much credit.

The late Robert Anton Wilson used to draw a distinction between conspiracy theories, which could be true, and paranoid conspiracy theories that claim there is one giant Conspiracy behind everything. He ought to know. Wilson co-authored "The Illuminatus! Trilogy," a sprawling science-fiction epic that satirizes just about every conspiracy theory imaginable.

The moon hoax conspiracy definitely falls into the paranoid category simply because of its mind-boggling scale. And to add to the paranoia, just about everyone I've encountered who believes the moon landing was faked also believes a lot of other conspiracy theories.

This is where it gets really crazy, because not all of these conspiracy theories are compatible. If you don't believe in the moon landings, how can you believe an alien spacecraft crashed in 1947 near Roswell, N.M., and we've been reverse engineering its advanced technology ever since?

You would think we could have used some of that alien tech to mount a real moon mission.

Hmmm. Maybe we did.

Thursday, July 16, 2009

Culture Shock 07.16.09: Meteor's deep impact brings Armageddon to your television

I was flipping channels Sunday night and happened upon the end of the world.

Jason Alexander of "Seinfeld" fame was in a military command post trying to help the Air Force — at least I think it was the Air Force — shoot down a huge asteroid that was threatening to wipe out all life on Earth.

The fate of the Earth in the hands of George Costanza? We're doomed.

Fortunately, it was just the first half of NBC's two-part miniseries "Meteor: Path to Destruction." Unfortunately, the second part airs Sunday, and at this point, I suspect many viewers would just as soon it didn't.

The gratuitous subtitle, "Path to Destruction," may be there to avoid confusion with the 1979 feature film "Meteor," which starred Sean Connery and Natalie Wood. Or maybe not, since I'm one of only three people who remember that James Bond has already dealt with this exact same plot device.

Disaster epics like NBC's "Meteor" always seem to come in pairs. Last month, ABC aired its own ludicrous miniseries, "Impact," in which the moon ends up on a collision course with Earth. This recalls the summer when "Deep Impact" and "Armageddon" dueled at the box office, with the louder and sillier of the two, "Armageddon," reaping the bigger take, thanks to Bruce Willis' star power and Liv Tyler's ability to pout on cue.

There must be an audience for movies about big rocks falling from the sky or else Hollywood wouldn't keep making them. And why not? After all, it's a certainty that the Earth will end up in the crosshairs of a devastating comet or asteroid — eventually. It's happened before. Just ask the dinosaurs.

In fact, we have a close call in our near future. According to some calculations, there is a 1 in 45,000 chance that the asteroid Apophis — named for the Egyptian god of chaos and darkness — will strike the Earth on April 13, 2036. And while 1 in 45,000 may seem like pretty long odds, it's enough to make some astronomers sweat just a little.

With Apophis and other near-Earth objects looming in space, scientists have begun putting a lot of thought into how we might divert any that could become potential planet killers. Most of the ideas so far involve nudging the threatening asteroid off course, either by attaching a rocket engine to it or by maneuvering a probe close enough that the probe's gravity can tug the asteroid off target.
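For a rough sense of why a gentle nudge can be enough, here's a back-of-the-envelope sketch in Python. It's my illustration, not anything from the scientists' actual proposals, and it uses the crudest possible model: a small velocity change simply piles up into displacement over the years remaining before impact. The 1 cm/s nudge and 20-year head start are numbers assumed purely for illustration.

```python
# A back-of-the-envelope sketch (mine, not NASA's): why a tiny, early nudge
# can turn an asteroid hit into a miss. Crudest possible model: a small,
# constant velocity change accumulates into displacement over the remaining
# flight time. The 1 cm/s nudge and 20-year lead time are assumed values.

EARTH_RADIUS_KM = 6371
SECONDS_PER_YEAR = 365.25 * 24 * 3600

def drift_km(delta_v_cm_per_s: float, lead_time_years: float) -> float:
    """Displacement accumulated from a small, constant velocity change."""
    delta_v_km_per_s = delta_v_cm_per_s / 100 / 1000  # cm/s -> km/s
    return delta_v_km_per_s * lead_time_years * SECONDS_PER_YEAR

d = drift_km(1.0, 20)
print(f"{d:,.0f} km of drift, about {d / EARTH_RADIUS_KM:.1f} Earth radii")
# -> roughly 6,300 km, about one Earth radius: enough, in this simple model,
#    to slide a dead-center impact out to a near miss.
```

If anything, real orbital mechanics works in the deflector's favor: the drift compounds along the asteroid's orbit with each pass, so the earlier you start, the smaller the nudge you need.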

But nothing on the drawing board resembles Hollywood's preferred method of dealing with threats from space. Call it the Michael Bay Method, after "Armageddon" director Michael Bay.

From "Bay Boys" to "Transformers: Revenge of the Fallen," Bay has demonstrated that there is one solution to every problem — blow something up.

Yet despite what you may have seen in "Armageddon," blowing up an asteroid or comet is probably the last thing you want. Instead of one big rock, you end up with a lot of smaller rocks raining down on you with almost the same total mass, minus the pieces that are small enough to burn up in the atmosphere.

George Costanza didn't get that memo, which is why he and the military are launching every nuclear missile they can get their hands on, plus Patriot missiles to shoot down anything that survives the big boom.

If that seems absurd to you, that's because it is absurd. But you haven't seen anything until you've seen a soldier shoot down a small meteor with a shoulder-fired Stinger missile. Yes, "Meteor: Path to Destruction" is full of comedy gold. They should have called it "Meteor: Path to Wackiness."

I don't ask for much from network television. But is it too much to ask that each new movie about a big rock from space not be dumber than the one that came before it?

Thursday, July 09, 2009

Culture Shock 07.09.09: 'Twilight' is the Lifetime take on vampire movies

There are things worse than "Twilight." Root canals, for instance. But when it comes to major Hollywood productions, "Twilight" is amazingly, stupendously awful.

How awful is that? It's somewhere between "Ishtar" and "Battlefield Earth" on the scale of movie awfulness. It's like a Lifetime Television made-for-cable version of a vampire movie, except it doesn't star Tori Spelling.

By this point, I realize, I've offended a lot of people. Millions, probably.

"Twilight" was the No. 7 theatrical release of 2008, grossing more than $191 million domestically. More than 3 million copies of the DVD sold on its first day of availability. And the four books that comprise Stephenie Meyer's "Twilight" saga to date have sold a combined 7.7 million copies in just the United States.

The movie version of the second book, "New Moon," is scheduled for release Nov. 20, and it will probably be one of the year's top-grossing movies, too, because it seems a lot of people really, really love "Twilight," in all of its incarnations.

Now, I'm not one to idly engage in fomenting panic and hysteria. I'm the guy who usually uses this space to tell you not to worry, because video games, trashy TV, popular music and comic books will not turn your children into juvenile delinquents, teenage mothers or that most pathetic of social outcasts, the game show host.

But I do worry that America is raising a generation that will think vampires sparkle in the sunlight.

Whenever I see some teenage girl reading one of Meyer's books, I feel like yelling, "When I was your age, our vampires didn't sparkle in the sunlight! They shriveled up and died! Why, some of them even caught fire! And the ones who didn't at least had the common decency to lose their powers during the day! And we liked it that way!"

Then I shake my fist and tell the girl to get off my lawn.

But, not wanting to be an old fogey, I recently subjected myself to "Twilight" — after putting it off as long as I possibly could.

Reading the book was out of the question. I skimmed a few pages, and that was about all I could take. People on the Internet are churning out Kirk/Spock fan fiction written with more style. So, I watched the movie, instead.

Did I mention the movie is awful?

Still, I learned a lot by watching "Twilight." For example, I learned that vampires are basically just superheroes who crave blood, except they don't have to drink the blood of humans. Only bad vampires do that, and the bad vampires don't show up — bringing the plot along with them — until the end of the movie. Also, vampires love baseball, which gives them a chance to show off their superpowers.

Is this a vampire movie or an episode of "Smallville"?

The rest of "Twilight" is like one of those Lifetime movies in which the girl is stalked by some creepy guy who turns out to be a serial killer. Only this time, the guy is the hero, and I'm supposed to like him because he's dreamy, plus he's a vampire. And the girl is, like, totally cool with him stalking her and watching her while she sleeps. Ick.

Did I mention the girl is a teenager and the vampire was born in 1901? But that's OK, because "Twilight" is the most sexless vampire movie ever. This is the only time that drinking blood hasn't been a metaphor for anything. Besides, the lead actors, Kristen Stewart and Robert Pattinson, have all the chemistry of a lump of lead.

Immediately after watching "Twilight," I felt like I needed to crack open my skull and scrub my brain with bleach. But instead, I watched a real vampire movie, "Dracula A.D. 1972," starring Christopher Lee as Dracula and Peter Cushing as the vampire-hunting Professor Van Helsing.

Dracula hates sunlight, and when he stalks you, it's not because he has a bad case of puppy love. Now that's a vampire.

Thursday, July 02, 2009

Culture Shock 07.02.09: Summer '09 goes into celebrity-death overload

It has been a bad summer for 1970s celebrities.

Ed McMahon, Farrah Fawcett and Michael Jackson, all of whom reached the heights of their popularity during the '70s and early '80s, pulled a celebrity-death trifecta last week, dying within days of each other. They followed David Carradine, star of the cult-favorite TV series "Kung Fu," who died early last month.

But the Grim Reaper's reign of terror didn't end there. Even lesser-known celebrities were not immune. Impressionist and frequent "Late Show with David Letterman" guest Fred Travalena and TV pitchman extraordinaire Billy Mays died during the past few days, too.

(Sky Saxon, lead singer for the 1960s band The Seeds, also died. But I'm only counting celebrities I had heard of beforehand.)

"Mystery Science Theater 3000" alum Frank Conniff probably summed it up best when he posted to his Facebook page: "David Carradine, Ed McMahon, Farrah Fawcett, Michael Jackson, Billy Mays and now Fred Travalena. Just three dead celebrities away from a kick-ass episode of 'Hollywood Squares' in the afterlife."

But Paul Lynde has been keeping heaven's center square warm since 1982, so I figure the afterlife needs only two more.

Mays' death also briefly brought to mind actress Natasha Richardson's passing earlier this year. She died hours after sustaining a head injury while skiing, and early reports suggested Mays might have suffered a similar fate. Not long before his death, Mays took a blow to the head when the commercial airliner he was aboard experienced a hard landing. Preliminary autopsy results released Monday, however, ruled out head trauma in Mays' case.

Media reaction to this epidemic of celebrity deaths has been a bit strange. A lot of my friends noted Jackson's death happened just in time to overshadow Fawcett's. Her passing could have topped newscasts all weekend long, but in Jackson's wake, it was almost an afterthought.

Then Mays died, and he seemed to get a lot more attention than a TV pitchman normally would, as if to give people something to talk about besides the King of Pop.

I was surprised to learn Mays actually had a compelling life story. He was discovered while selling products on the famed Boardwalk in Atlantic City, N.J. For a TV pitchman, that has to be the fairytale equivalent of a would-be movie star being discovered at a drugstore soda counter at Hollywood and Vine.

Has Mays upped the ante? When Ron Popeil, the most famous TV pitchman of all, dies, will CNN devote wall-to-wall coverage to him? I can see it now: Even the History Channel could get involved, devoting an entire episode of "Modern Marvels" to Mr. Microphone and the Pocket Fisherman.

The weird thing about celebrity deaths is how so many people take them personally. A conversation about Fawcett's death inevitably turns to "Hey, when I was young, I used to have that poster of her in the swimsuit." Any discussion of Jacko will eventually lead to someone talking about owning "Thriller" — on vinyl — and doing the moonwalk.

Of course, virtually all of Generation X owned that poster and had a copy of "Thriller" — on vinyl — and did the moonwalk. Or at least tried to do the moonwalk. All of these "personal" experiences are as close to universal experiences as American culture gets. Yet people feel some strange urge to personalize the deaths of celebrities they probably never met and certainly didn't really know. It's probably a primal, instinctual reaction. After all, death is supposed to mean something, and we all feel like we know these people simply because we've invited them all into our living rooms.

So what if Ed never came around anymore and we hadn't seen Farrah in years and Michael wasn't the Michael we all used to know? They were the old friends we hadn't really thought about for a long time — until we learned they were gone.

And Billy was the sitcom neighbor who always let us borrow a tub of OxiClean.