Thursday, December 25, 2008

End of piracy lawsuits an early gift to music lovers

For many Americans, it was an early Christmas present.

Friday, the Recording Industry Association of America announced it is ending its five-year campaign of suing people who share music via the Internet.

The RIAA declared victory, but to anyone reading between the lines, the triumphant bluster seemed a lot more like an admission of defeat. Although the RIAA claimed its lawsuits were necessary to save the music industry, no data indicates the suits had any impact on people’s behavior.

According to the Wall Street Journal, the RIAA initiated legal proceedings against 35,000 people — ranging from college students to grandmothers — during its war on music piracy. That number, however, was just a small fraction of the 19 percent of Internet users estimated to have shared music via peer-to-peer networks. So, unsurprisingly, fear of being sued wasn’t much of a deterrent. You might as well worry about being struck by lightning. From 2003 to 2007, that 19 percent figure remained virtually constant, and the number of songs downloaded actually increased.

Most of the RIAA’s targets settled out of court, and, ultimately, the RIAA spent more on legal fees than it recovered from shaking down defendants.

Yet, the RIAA’s president, Cary Sherman, claims the suits were a success. In an interview, he points to the spectacular growth of authorized downloads from for-profit sites like iTunes. But all that proves is that a lot of consumers are willing to pay for digital music — if, of course, the big music companies are willing to offer it.

During most of the past decade, the music industry hurt itself by not selling music downloads. Instead, the RIAA sued Web sites like Napster to prevent people from sharing songs. Then, when peer-to-peer software made it possible for computer users to share music with each other directly, without needing centralized hubs like Napster, the RIAA went after individuals.

When that didn’t work, the music companies finally, grudgingly, started entering into agreements to sell music downloads online, first through iTunes and, later, through other sites like MySpace. But by then, many of that pesky 19 percent were already used to not paying for music. And the music industry had only itself to blame.

Still, selling music downloads has become big business, even as sales of compact discs continue to fall. Who wants a bunch of CDs littering the house when you can store thousands of songs on a computer or iPod?

According to NPD Group, paid music downloads rose 29 percent in the third quarter of 2008 compared to the same period in 2007, with iTunes and Amazon gaining 2.8 million music buyers, or about 15 percent of Internet users, CNET reported. Two major music labels, Warner Music Group and Universal Music Group, reported strong gains in digital sales, with Warner posting a 27 percent increase in the third quarter. Universal reported that its digital sales more than made up for the continued decline in CD sales.

Not coincidentally, as the music industry finally began offering legal downloads, the number of people illegally downloading music finally started to creep lower in 2008, falling to 14 percent, according to NPD.

Does that look like the result of the RIAA’s lawsuits, or the result of the RIAA’s members finally getting around to offering a legal alternative? Consumers aren’t greedy, and they want their favorite music artists to make money, even if that also means giving money to the music labels that wanted to sue them.

Nevertheless, the RIAA says it still plans to send warning letters to Internet service providers when it discovers computer users who are major copyright violators. The RIAA wants ISPs to limit or block Internet access for repeat offenders. But I have to wonder, is an ISP more likely to heed the RIAA or its own paying customers?

Somehow, I doubt the RIAA’s new strategy will be any more successful than its old one. And with legal downloads becoming easier to get, I also doubt it will matter much either way.

Thursday, December 18, 2008

Childhood holds memories of the best Christmas gift ever

I suspect most people can recall one special Christmas — the Christmas when they received a present so cool, so amazing that it could only be the Best. Christmas. Present. Ever.

For Ralphie Parker, the hero of “A Christmas Story,” it was a Red Ryder carbine-action 200-shot Range Model air rifle BB gun with a compass in the stock and a thing that tells time. I’m still not sure about the “thing.” A clock? A sundial?

For me, it was the Christmas of 1982, when I received my Atari 2600 game console. OK, yeah, what I’d really wanted was a state-of-the-art ColecoVision game system. But it was more expensive than the Atari, and money was tight. Besides, it wasn’t as if any of my friends had a ColecoVision, anyway. 

They all had 2600s, too. Except for the ones who had Mattel’s Intellivision system or Magnavox’s Odyssey2.

Actually, I don’t know anyone who owned an Odyssey2. But I do remember the floor model that mostly gathered dust at the local Otasco store.

Before the Christmas of the Atari, my best present ever had been a Mego brand Batmobile with Batman and Robin action figures. That must have been about 1975. Obviously, in just seven years, my requirements for a satisfactory Christmas present had gotten a lot more stringent.

Christmas ’77 would have been a great Christmas if the first batch of “Star Wars” toys had arrived on time. George Lucas was smart enough to keep all of the licensing rights to his movies, but not smart enough to get the toys into stores in time for the holidays. But maybe some children actually had fun playing with their Early Bird Gift Certificates, which were basically IOUs for action figures.

Some children this year will get MP3 players that have more processing power than my first game system did. It’s easy to forget, especially when the economy is in the dumps, but we’re a lot wealthier as a nation now than we were just 30 years ago.

In the early ’80s, William Shatner appeared in ads extolling the Commodore Vic-20 — “the wonder computer of the 1980s” — which cost “under $300.” The wristwatch I’m wearing now is smarter than a Vic-20, and for about $300 I can buy a decent desktop PC.

Each new Christmas brings rising expectations. I was once happy to get this thing called a “turntable” on which I placed a large, wax disk called a “record album.” When spun on the turntable, the album would play music, along with assorted hisses, pops and crackles.

Today, I’d insist that Santa Claus bring me an iPod that could hold thousands of songs. And, no, a Zune would not be a reasonable substitute.

But even with rising expectations, you reach a point when nostalgia trumps all else. I’ve received better, more advanced Christmas gifts since that Atari 2600, but none that can top it as my best gift ever. After a certain age, Christmas loses something, and not just because children stop believing in Santa. I suspect it’s when puberty hits, and suddenly you have more important things on your mind — like girls or boys or whatever — than what’s under the tree.

That’s when Christmas goes into a time capsule, waiting to be opened only when you have children of your own. And the memories in that capsule can’t be surpassed.

Lexus is running a TV advertisement in which a young child talks excitedly about how his new Big Wheel is the greatest Christmas present ever — at least until the ad shifts to the same child, now an adult, staring starry-eyed at a new Lexus in his driveway.

No one has ever given me a car for Christmas, so I’m not exactly certain how I would react. But, somehow, I think it wouldn’t top a Big Wheel. Or an Atari.

Thursday, December 11, 2008

Uncle Forry inspired Hollywood with his love of sci-fi, horror

The world’s greatest fan has died.

But Forrest J Ackerman was more than just a fan. With his boundless enthusiasm for horror and science fiction, he inspired a generation of Hollywood filmmakers.

Ackerman, 92, died Dec. 4 of heart failure at his home in Los Angeles. His death elicited tributes from the many who knew him and the many more who only knew of him.

How different might the world have been without the man everyone called Uncle Forry? 

“Many of the technicians, special effects masters and filmmakers that work in the realms that Forry loved … do so in no small part based on the childhood passion that Forry gave them,” wrote Harry Knowles at his Web site, Ain’t It Cool News.

Knowles is often guilty of overstatement — just read some of his movie reviews — but not this time. For many young, would-be storytellers, Uncle Forry was the center of the universe.

Imagine a world without “Star Wars” or “Jaws.” George Lucas and Steven Spielberg are two of the many directors, writers and producers who were captivated by Ackerman’s magazine, Famous Monsters of Filmland. Imagine a world without “The Martian Chronicles” and “Fahrenheit 451.” Ray Bradbury credits Ackerman with helping him get his start as a writer.

“Forry changed my life,” Bradbury said in 2000, when both he and Ackerman visited Florence. “He paid attention when no one else cared.”

As Bradbury later told The Associated Press, Ackerman paid Bradbury’s way to New York for an authors meeting that launched Bradbury’s career.

“I hadn’t published yet, and I met a lot of these people who encouraged me and helped me get my career started, and that was all because of Forry Ackerman,” Bradbury said.

It started in the 1920s, when Ackerman, then 9 years old, discovered Amazing Stories. That magazine, which inaugurated the Golden Age of science fiction, gave the young Ackerman his lifelong love of the genre. He would keep that first issue his entire life, and it would become the cornerstone of his private museum of SF and horror memorabilia.

But at that point, Forry was still just a fan. In his teens, he and several other teenage SF fans — including future DC Comics editors Julius Schwartz and Mort Weisinger — founded the first science fiction fanzine, The Time Traveller (spelled with two L’s). It was a start, but Forry’s big breakthrough came in February 1958. That was when Warren Publishing unveiled the first issue of Famous Monsters of Filmland magazine, with Ackerman as its editor.

Aimed at adolescents and young teenagers, Famous Monsters featured stories about the greatest horror, fantasy and sci-fi filmmakers of the day. It capitalized on the newfound popularity of classic monster movies from the 1930s and ’40s, which were beginning to air on TV stations throughout America.

Note that I’m now using the term “sci-fi” instead of “science fiction” or “SF.” That’s because Uncle Forry coined it. Without Ackerman, the Sci-Fi Channel would have to call itself something else.

Famous Monsters let its readers in on the inner workings of sci-fi and horror movies. Some of those readers got the idea that they could make movies, too.

Ackerman left Famous Monsters in the late 1970s, and the magazine’s original incarnation ceased publication a few years later. But Ackerman still found ways to inspire the next generation of fans.

Every weekend, Ackerman opened his home to visitors, who traveled from just about everywhere to see the artifacts he had spent decades assembling. When I met Uncle Forry eight years ago, he had a few mementos with him: the cape Bela Lugosi wore in the stage production of “Dracula,” the ring Boris Karloff wore in “The Mummy” and face paint that had belonged to Lon Chaney Sr. And each item had a story.

Uncle Forry’s gift for storytelling is what set him apart. He seemed to know everything and everyone, and he counted people like Bradbury, Karloff and Vincent Price among his closest friends.

Sure, he edited a magazine and several books. And he had small roles in numerous films made by the people he helped lure into moviemaking. But he remained, first and foremost, a fan. And his infectious love of sci-fi helped change the world.

Thursday, December 04, 2008

The Rickroll is dead; long live the Rickroll

I hereby declare the end of the Rickroll.

The Internet prank, which involves a seemingly cool Web link that goes instead to a video of ’80s pop star Rick Astley singing “Never Gonna Give You Up,” is passé now that the ultimate Rickroll has been pulled.

During NBC’s annual broadcast of the Macy’s Thanksgiving Day Parade last week, Astley got the last laugh. He appeared unexpectedly on the Cartoon Network float and lip-synced his most infamous hit single.

In an instant, Astley had Rickrolled more people than any online prankster could ever dream of. No one can top that, so it’s time to stop trying.

The Rickroll is over. Now we need a replacement.

Fortunately, the 1980s are a treasure trove of ghastly songs that soared to the top of the pop charts. I blame cocaine abuse.

“The Girl Is Mine.” Before they tangled over the rights to the Beatles catalog, Michael Jackson and Paul McCartney recorded two duets. The first, and worst, was 1982’s “The Girl Is Mine” from Jackson’s “Thriller.” As the first single off the most popular album of all time, “The Girl Is Mine” was everywhere. Now that the drugs have worn off, however, we can recognize what an unbearably hokey song it is.

“Ebony and Ivory.” Earlier that year, McCartney teamed up with Stevie Wonder for another duet, “Ebony and Ivory.” The song reached No. 1 on both the U.S. and British pop charts, proving that bad taste doesn’t respect geographic boundaries. As trite as it is awful, “Ebony and Ivory” uses the ebony and ivory keys of a piano as a metaphor for racial harmony. And in case you don’t get the message, McCartney and Wonder’s harmonizing is also a metaphor for racial harmony. Listening to the song today is like having a piano dropped on your head.

Let this be a lesson to everyone: If McCartney wants to record a duet with you, just say no.

“Ice Ice Baby.” This song was Vanilla Ice’s contribution to bad music, giving the world insightful lyrics like “Will it ever stop? Yo, I don’t know/Turn off the lights, and I’ll glow.” Vanilla Ice, aka Robert Matthew Van Winkle, almost won a Grammy for this atrocity. However, he lost to ...

“U Can’t Touch This.” Yes, there was a time when people thought MC Hammer was cool. He was so cool, in fact, that you couldn’t touch him — presumably because you’d get frostbite if you did. So, he penned the song “U Can’t Touch This.”

Most recently, the man otherwise known as Stanley Kirk Burrell was getting in touch with his spiritual side as the host of a show on the Trinity Broadcasting Network. But whether or not his soul is immortal, his song is — like a vampire sucking the life out of anyone it encounters.

“Who Can It Be Now?” I will admit a nostalgic fondness for Men at Work’s other hit, “Down Under,” a reggae-tinged anthem to the manly virtues and poor dietary habits of Australians. But “Who Can It Be Now?” hasn’t aged well. Neither has …

“Somebody’s Watching Me.” This is the only hit by Rockwell, son of Motown Records founder Berry Gordy. Both “Who Can It Be Now?” and “Somebody’s Watching Me” deal with paranoia and have lyrics such as “Is it the man come to take me away?/Why do they follow me?” and “When I’m in the shower/I’m afraid to wash my hair/’Cause I might open my eyes/And find someone standing there.”

Why were these songs so popular in the ’80s? Again, I fault cocaine, which is said to cause paranoid delusions. These were songs cokeheads could relate to.

But probably the best potential successor to “Never Gonna Give You Up” is …

“Girl You Know It’s True.” Milli Vanilli won a Grammy for Best New Artist after releasing this earnestly lame single in 1989. But the duo had to give back their award because someone else actually recorded the vocals. The truth, however, is this song is bad no matter who sings it.

The ’80s gave us so much terrible music that I could go on for hours. But it takes a truly heartfelt, unironic and irredeemably shallow song to replace “Never Gonna Give You Up.” It set the standard.

Wednesday, November 26, 2008

Auburn and Alabama fans are driven by tribal instinct

If you think we’ve come a long way from our primitive ancestors, look around at your neighbors this weekend, and you’ll likely see we haven’t evolved all that much.

In fact, you may only have to look in a mirror.

A significant proportion of Alabamians will be decked out in tribal colors — either orange and blue for Auburn University, or crimson and white for The University of Alabama. But the odd thing is that most people wearing those colors in anticipation of the annual Auburn/Alabama football game didn’t attend either university.

It doesn’t end there. Some members of each tribe turn entire rooms of their homes into shrines filled with football memorabilia. They dress the family dogs in team sweaters. They walk down the wedding aisle to the tune of their school’s fight song. They name their children after dead coaches.

Making fun of such excess is easy. Actually, I encourage it. But that extreme behavior is, in a way, only natural. In fact, it has roots in 200,000 years of human prehistory.

Our earliest human ancestors lived in the African savanna and survived by hunting and gathering. The rule of the day was eat or be eaten. In such an unforgiving environment, cooperation within a tribe was a necessity.

Every living being, humans included, is a complex machine with one biological purpose — to spread its genetic material. “Be fruitful and multiply” isn’t just a Biblical commandment. It is encoded in our DNA.

Members of hunter-gatherer societies best fulfilled their genetic imperative by cooperating with the people they saw every day, mostly close relations. That’s how tribal bonds formed, and it’s why, even today, people have an enormous capacity for social cooperation. Evolutionary biologists call it “reciprocal altruism,” but everyone else calls it “you scratch my back, and I’ll scratch yours.” (In fact, our chimpanzee cousins practice this literally when they groom each other.) Our brains are wired for it because it’s a powerful survival mechanism, and thus favored by natural selection.

But there is a downside to our mental wiring. While it encourages cooperation within a tribe, it often leads to conflict with other tribes, especially if those other tribes are competing for the same resources. We are “us,” and they are “them.”

Down through the centuries, that tribal instinct has led to racism, ethnic hatred, religious conflict and, on a larger scale, wars between nations fighting to control land, resources and wealth.

Fortunately, human societies evolve far more quickly than human biology does. So, we’ve come up with ways to close the distance between us and them. The most successful is trade. Frédéric Bastiat, a 19th century French economist, said, “When goods don’t cross borders, armies will.” Today, large-scale trade between nations is the norm, not the exception. The more trade there is between any two given countries, the less likely it is that those countries will end up at war with one another. As a result, the number of wars going on at any particular time has dropped steadily since the end of the Cold War.

Within nations, individuals are also interconnected by trade. We all do business with each other, and most of our old tribal divisions have broken down or substantially weakened. But we still have that tribal instinct, which now expresses itself, usually, in more peaceful ways — like identifying with collegiate and professional sports teams. We have rituals, like tailgating, that we share with fellow tribesmen, and we compete with other tribes by participating in office betting pools.

We’re driven to join tribes, which is why people who never attended Auburn or Alabama still choose sides. And all tribes seek to grow, which is why natives often pressure newcomers from out of state to pick a team.

You can join a tribe even if you don’t care about sports. For better or worse, tribes form around everything from TV shows (Trekkies) to rock bands (Deadheads) to political candidates (Obamaniacs).

So, when you’re putting on your war paint for the big game Saturday, remember that you’re part of a tradition that dates to the Stone Age. And if your team loses, just tell yourself that football is still better than hunting zebras.

Thursday, November 20, 2008

New ‘Star Trek’ prequel promises everything you don’t want to know

Growing up in the 1970s and watching reruns of “Star Trek,” I always wondered what Capt. Kirk was like when he was young.

Wait. No I didn’t. Actually, the thought never crossed my mind. But now that it has, I’m reasonably sure I don’t want to know what James T. Kirk was like before he became captain of the USS Enterprise.

But the idea of going back in time to explore the early years of beloved characters is just too tempting for Hollywood. So, on May 8, 2009, we’ll get to see Kirk, Spock, McCoy and the rest of the Enterprise crew as they were before they began their famous five-year mission.

The trailer for director J.J. Abrams’ new “Star Trek” debuted last weekend in front of “Quantum of Solace,” and it confirmed my worst suspicions. It’s no wonder some longtime “Star Trek” fans have already dubbed this latest installment in the franchise “Dawson’s Trek.” I, however, prefer to think of it as “Young Phasers.”

The preview opens with a vintage 20th century sports car speeding across a desert, and just before the car flies off a cliff, its young driver jumps out. That’s when a 23rd century police officer walks up and asks the boy his name. “My name is James Tiberius Kirk,” the mop-headed little punk says.

Yes, before he became the greatest captain in Starfleet history, Kirk was an annoying brat with authority issues. Does that remind you of any other famous sci-fi character you know?

The trailer then jumps forward, where we get to see a brooding, twentysomething Kirk (Chris Pine) riding a motorcycle and watching the Enterprise’s construction at a Starfleet shipyard. (My inner science geek insists that I now note how silly it is to build a ship like the Enterprise on Earth rather than in orbit.) Meanwhile, voiceover narration describes Kirk’s troubled youth and unfulfilled destiny, and I half expect Samuel L. Jackson to appear and tell me Kirk is the chosen one who will bring balance to the Force.

Sorry. I was thinking about that other sci-fi character again. You know — the one who used to be cool. The one who was the baddest bad guy ever to appear on a movie screen. Until, that is, we saw him as a whiny 8-year-old and then, later, as an even whinier young adult.

Darth Vader killed incompetent subordinates on a whim and struck fear into an entire galaxy. Anakin Skywalker, on the other hand, complained a lot and missed his mommy. Are we really supposed to buy that Anakin could eventually become Vader?

When you venture into an iconic character’s past, you’re treading on sacred ground. We knew before the “Star Wars” prequels that Anakin had a tragic fall from grace, but actually seeing that fall only cheapened it. So, do we really want to take the risk of watching Kirk grow up? We know he was the only Starfleet cadet ever to beat the no-win scenario of the Kobayashi Maru test. (He cheated, and then received a commendation for original thinking.) Do we really want to see the gory details?

The fact that Pine looks like he just stepped out of an Abercrombie & Fitch advertisement doesn’t help. No one ever confused William Shatner with an underwear model.

Face it. No matter how cool you are as an adult, you still don’t want your parents showing your friends old photographs and home movies of you. Unfortunately, Abrams’ “Star Trek” reboot looks an awful lot like Kirk’s home movies. How is he supposed to score with alien women and the occasional android if everyone sees what he was like as a petulant youth?

Hannibal Lecter was a scarier villain before “Hannibal Rising” delved into his childhood and revealed the trauma that turned him into Hannibal the Cannibal. Darth Vader was more menacing before we learned he was an idiot whose only qualification for being a Jedi was a high midi-chlorian count. (They probably have antibiotics for that.) And, trust me, Capt. Kirk is cooler without our knowing about all of his youthful screw-ups.

Thursday, November 13, 2008

Is ‘Star Trek’ actors’ feud real or a fraud?

Just when you thought it was safe to go to a “Star Trek” convention, Kirk and Sulu are fighting again.

Tales of how William Shatner is hated by most of his former “Star Trek” co-stars — with the exception of Leonard Nimoy — are legendary. But during the past few years, it seemed most of the old feuds had subsided. That is, until the latest round of sparring broke out between Shatner and George Takei.

The latest flare-up began when Shatner complained on his Web site that Takei hadn’t invited him to his wedding. Takei countered that he had, in fact, invited Shatner and issued the following statement: “It is unfortunate that Bill was unable to join us for our wedding as he indeed was invited to attend. It is our hope that at this point he joins us in voting NO on Proposition 8, which seeks to eliminate the fundamental right for same-sex couples to marry in California.”

The situation deteriorated from there. Shatner said something about Takei having a “psychosis,” and Takei responded by basically calling Shatner an egomaniac.

The back-and-forth continued. Takei sounded off on “Entertainment Tonight,” then Shatner retaliated with a video on YouTube.

It’s times like this that I’m reminded of a scene in “Fight Club”:

Narrator: If you could fight any celebrity, who would you fight?
Tyler Durden: Alive or dead?
Narrator: Doesn’t matter. Who’d be tough?
Tyler Durden: Hemingway. You?
Narrator: Shatner. I’d fight William Shatner.

Maybe Takei is like the unnamed narrator of “Fight Club.” He just really, really wants to fight William Shatner. At the rate it’s going, the Shatner/Takei feud will soon reach the epic heights of Bette Davis vs. Joan Crawford.

Or will it?

I don’t have any evidence to back it up, but I have a suspicion that the war of words between Shatner and Takei is an elaborate put-on, just like the feuds in professional wrestling. In fact, it reminds me a lot of the greatest faux feud of them all: Andy Kaufman vs. Jerry “The King” Lawler.

In the early 1980s, Kaufman started wrestling women as part of his comedy act, proclaiming himself the “World Intergender Champion.” Lawler, a popular wrestler in Memphis, became incensed, believing that Kaufman was making a mockery of wrestling by beating up women. So, he challenged Kaufman to a match, which lasted all of a few seconds, as Lawler quickly dispatched Kaufman with a pile driver. When Kaufman got out of the hospital, he threatened to sue Lawler.

From there, the feud continued on television. Both Kaufman and Lawler appeared on “Late Night with David Letterman,” where Lawler slapped Kaufman and Kaufman, after a profanity-laced tirade, threatened to sue, well, just about everyone.

That led back to the ring, where Kaufman schemed with wrestling manager Jimmy Hart to trick Lawler. The plan worked, and instead of Lawler pile-driving Kaufman again, Lawler fell to his own signature move, delivered by other rival wrestlers.

But as Lawler now admits, the whole thing was, in wrestling jargon, a work, a scripted stunt engineered by the two of them. And it worked so well that some people still believe that maybe, just maybe, it was real.

So, is that what Shatner and Takei are up to?

Think about it. The feud serves both Shatner’s and Takei’s interests. Takei can use it as a platform to speak out for gay marriage. Meanwhile, Shatner can use it to do what he does best, which is play up his exaggerated public persona, something he has been doing successfully since he portrayed himself in the 1998 comedy “Free Enterprise” and in the first batch of Priceline commercials. It’s a win-win.

But maybe I’m guilty of wishful thinking. Having to take sides in a battle of “Star Trek” icons is like a child having to pick sides in a divorce.

Probably the only way we’ll ever know for sure is if Shatner tries to pile-drive Takei or Takei hits Shatner with a steel chair.

Thursday, November 06, 2008

Orson Welles remains a study in fact vs. fiction

Tim Burton’s 1994 film “Ed Wood” features a scene in which one of the greatest directors who ever lived meets one of the worst.

Frustrated with the progress of his latest project, “Plan 9 From Outer Space,” Edward D. Wood Jr. (Johnny Depp) storms into a Hollywood bar only to meet his hero, Orson Welles (Vincent D’Onofrio). After complaining about his own struggles with financial backers, Welles gives Wood the confidence to return to the set and finish shooting “Plan 9,” making Welles, ironically, responsible for a movie frequently cited as the worst ever made, usually by people who have not seen “Doomsday Machine” (1972).

Of course, like many events depicted in “Ed Wood,” the meeting between Wood and Welles never happened. Oh, yes, Edward D. Wood Jr. was a real director who made some really bad movies, but one must never let truth stand in the way of art. And the meeting between Wood and Welles works beautifully because Wood and Welles had so much in common, except that Welles was a genius and Wood was, to put it charitably, not.

Several years ago, I met Forrest J Ackerman, the original editor of Famous Monsters of Filmland magazine. Ackerman was also once a literary agent, representing such renowned authors as Ray Bradbury. He also represented Wood. “I was his illiterary agent,” Ackerman quipped.

Wood struggled to get money to make his schlocky sci-fi and horror movies, and after “Plan 9,” he faded into a more obscure obscurity than he had occupied before, directing sleazy sex films and hitting his friends up for work until his death in 1978 at age 54.

Welles’ problem, however, wasn’t a lack of talent. He had loads of it. Director, screenwriter, producer, actor — Welles did it all. At age 23, he conquered radio. During an Oct. 30, 1938, broadcast of H.G. Wells’ “The War of the Worlds,” he inadvertently fooled more than a million listeners into believing the Earth was really being invaded by Martians.

He then moved to Hollywood and seemed primed to conquer it, too. But “Citizen Kane,” the film that would eventually establish Welles as a great filmmaker, was a disappointment at the box office. Afterward, Hollywood studios were reluctant to give Welles the total creative control he enjoyed with “Kane.” RKO, which had released “Citizen Kane,” re-edited and re-shot portions of Welles’ next film, “The Magnificent Ambersons,” and Universal took over Welles’ 1958 thriller “Touch of Evil,” which was finally re-released in a form close to Welles’ vision 10 years ago.

Welles turned to financing his films independently. He spent years trying to complete his adaptation of “Don Quixote” without success, just one of many projects he left unfinished, usually because of a lack of money. Only one of those films, “The Other Side of the Wind,” seems likely to ever be released. Director Peter Bogdanovich has been working on editing Welles’ footage into a finished product.

As an actor, Welles often took jobs just to finance his movies. Most famously, he became a TV pitchman. He would “sell no wine before its time.” His legendary voice lent authority to everything from frozen peas to a dubious documentary about Nostradamus. He even had a part — his last, as it turned out — in the animated feature “Transformers: The Movie.” As he put it, “I play a big toy who attacks a bunch of smaller toys.”

But still, Welles carried on, directing films that include an inspired adaptation of Franz Kafka’s “The Trial” and his not-quite-a-documentary “F for Fake,” which was his last completed movie.

Long overlooked, “F for Fake” is one of Welles’ finest films. It’s a meditation on art and fakery, taking as its subjects art forger Elmyr de Hory and Clifford Irving, who wrote a hoax biography of Howard Hughes and a real biography of de Hory. But more than that, “F for Fake” is a magic trick. Welles was an amateur magician and knew all the tricks, especially the fine art of misdirection.

You never let truth stand in the way of art.

Thursday, October 30, 2008

Palin, economy can’t scare people from Halloween

If on Halloween night you should find yourself surrounded by Sarah Palins, don’t be alarmed. It’s not an attack of the clones.

The Alaska governor and Republican vice presidential candidate may or may not be a maverick, but she definitely is one of the hottest Halloween costumes this year. And you can take “hottest” in any sense you like.

Ricky’s Halloween Costume Superstore in New York City has come up with a “Miss Alaska” costume that is little more than a sash that says “Miss Alaska” on one side and “Miss Vice President” on the other. It does, however, include a facsimile pair of Palin’s stylish Kazuo Kawasaki eyeglasses. It’s up to the wearer to supply a star-spangled bikini like the one the model wears on the Ricky’s Web site. But what do you expect for only $22.99? It’s a costume priced for a sagging economy.

Somehow, I suspect most Palin impersonators this Halloween will stick with the standard-issue latex mask. At least I hope so, unless they happen to have a Miss Alaska figure.

As if Halloween night weren’t scary enough, the streets are sure to be crawling with people dressed as — shudder — politicians. And because this is a presidential election year, there will be plenty of Barack Obamas and John McCains to go along with such evergreens as Presidents Nixon, Reagan and Clinton.

The proprietors of online retailer Annie’s Costumes claim that since 1980, the presidential candidate to sell the most Halloween masks at Annie’s has gone on to win the election. So, in case you’re wondering, Obama has a slight lead.

That leaves Joe Biden. Is anyone dressing up as the Democratic VP nominee? Anyone? Hello?

Even with the economy slumping toward a seemingly inevitable recession, Americans seem eager to spend money on celebrating Halloween. Of course, maybe that shouldn’t be a surprise. The Golden Age of horror movies — Bela Lugosi’s “Dracula” and Boris Karloff’s “Frankenstein,” among many others — occurred during the Great Depression. Economic hardship and the macabre seem to go together.

Nearly two-thirds of Americans plan to celebrate Halloween this year, according to the National Retail Federation, which expects $5.8 billion in Halloween-related sales, up from $5.7 billion last year.

Popular non-political costumes include the Joker (as played by Heath Ledger in “The Dark Knight”) and Iron Man. But unfortunately the Iron Man costume you can buy off the rack isn’t as cool as the one in the movie.

And if you want something really scary, the most popular movie-related costumes for groups, according to, are the women of “Sex and the City.” Who hasn’t run in terror from Sarah Jessica Parker’s hats?

Everyone from Wal-Mart to Papa John’s is trying to cash in on Halloween, reported The New York Times.

“Halloween is a huge day for Papa John’s,” a spokesman for the pizza chain told The Times, “and having it fall on a Friday is a double benefit because we know there will be a lot of big parties.”

Of course, if your costume is a Sarah Palin bikini, you should probably steer clear of pizza until after Halloween.

Cable television always skews heavily toward horror programming during October, but this year offers more than usual. The Sci-Fi Channel, for example, has replaced its traditional “13 Days of Halloween” with “31 Days of Halloween” and has seen its year-on-year ratings rise as a result, The Times reported.

“With the financial turmoil, people want to escape and think about something else,” said Sci-Fi President Dave Howe.

“Something else” apparently means pleasant things like zombies, ghosts and masked serial killers. Just keep the slashers away from my 401(k).

Thursday, October 23, 2008

‘Cinematic Titanic’ invites you to go down with the ship

I love bad movies — the really bad ones that are so bad they’re good. But some are so mind-numbingly awful I can’t imagine watching them alone.

Thankfully, I don’t have to. I can board the “Cinematic Titanic” and go down with the ship, along with the creator and original cast members of “Mystery Science Theater 3000.”

What made “Mystery Science Theater 3000” so great was that no matter how terrible the movie, those three wisecracking shadows on the screen could make it not just bearable, but enjoyable. Along with Tom Servo, Crow T. Robot and Joel Robinson (later replaced by Mike Nelson), I not only survived “Manos: The Hands of Fate” and “Red Zone Cuba,” I lived to watch them again and again.

When MST3K ended after 11 seasons, it seemed like all hope was lost. I was again left to watch the worst of the worst alone.

Eventually, Michael J. Nelson, along with fellow MST3K alums Bill Corbett and Kevin Murphy, would return with RiffTrax, a Web site that lets you download audio commentaries to play along with DVDs of recent movies. It’s an experiment in giving an old-style MST3K riffing to recent theatrical releases.

But Hollywood doesn’t make bad movies like it used to. “Star Wars: Episode I — The Phantom Menace” isn’t just bad, it’s sad. Watching it is like counting every penny wasted to produce it. A truly bad movie needs more than stilted dialog and dumb characters. It needs wobbly sets dressed in dime-store Halloween cobwebs or last year’s Christmas tinsel. It needs condiments standing in for blood and entrails. In short, it needs character. Also, with RiffTrax you don’t get to see shadows talking back to the movie.

While RiffTrax is a noble effort, leave it to the rest of the MST3K cast and writing staff to get as close to the old MST3K format as possible. The result is “Cinematic Titanic.”

“Cinematic Titanic” features MST3K creator Joel Hodgson (Joel Robinson’s civilian identity), Trace Beaulieu (the original Crow T. Robot), Frank Conniff, Mary Jo Pehl and J. Elvis “Josh” Weinstein, who was the voice and puppeteer of Tom Servo when MST3K aired on Minnesota’s KTMA.

So far, the “Cinematic Titanic” crew has released four DVDs, and the response has been so positive that the plan is to release a new DVD just about every month, with each movie thoroughly eviscerated for your enjoyment. All “Cinematic Titanic” releases are available exclusively at, which also offers digital downloads via

And what a worthy batch of cinematic abortions it has been so far:

“The Oozing Skull”: The timely tale of a dictator who loves his people so much he wants his brain transplanted into a young, healthy body. He can’t just die and let any old strongman oppress his people, you know. Unfortunately his HMO doesn’t cover brain transplants, so the dictator must call on a mad scientist.

“Doomsday Machine”: It’s the end of the world, and the only survivors aren’t exactly the strongest swimmers in the gene pool. For starters, they’re on a spaceship to Venus, which isn’t exactly the most hospitable planet for refugees. Mike Farrell (“M*A*S*H”) and Casey Kasem appear in bit roles. (They needed the money, I’m sure.)

“The Wasp Woman”: A non-classic from Roger Corman, the king of B movies. This movie is why we now test cosmetics out on rabbits. A Wasp Rabbit isn’t nearly as big a deal.

“Legacy of Blood”: A dysfunctional family must survive a week in a creepy mansion in order to claim an inheritance. One of them doesn’t want to share. Stop me if you’ve heard this plot before.

Each film is a big, fat target for the put-downs the CT crew hurls at it. And best of all, we get to see the gang’s silhouettes on screen. For MST3K fans, it’s just like old times. And for everyone else, there’s no better way to watch lousy horror and sci-fi movies.

Trust me, you don’t want to watch these movies alone.

Thursday, October 16, 2008

Infamous clone of ‘The Exorcist’ is due new respect

“The Exorcist” inspired its fair share of low-budget imitators, and probably the most famous — or infamous, depending on your point of view — is 1974’s “Beyond the Door.”

Warner Bros., which released “The Exorcist,” sued the makers of “Beyond the Door” for copyright infringement but lost, clearing the way for other Italian “Exorcist” clones like “The Antichrist” (1974), which added sex and exploitation to the mix. Released on DVD in 2002, “The Antichrist” is now, sadly, out of print.

Meanwhile, “Beyond the Door” became a hit, grossing about $15 million in the U.S. That may not sound like a lot of money now, but it was a respectable sum in the early ’70s for an independently produced, Italian horror movie.

A staple at mom-and-pop video stores during the heyday of VHS rentals, “Beyond the Door” is only now available on DVD, along with two commentary tracks, interviews and trailers. Best of all, the newly remastered film looks better than it did when it originally played to theater audiences across America.

At last, we can assess this blatant — but non-infringing — attempt to cash in on the success of “The Exorcist” on its own merits.

First, to be fair to “Beyond the Door,” it’s more than just an “Exorcist” clone. It’s also a rip-off of Roman Polanski’s 1968 shocker “Rosemary’s Baby.” If Linda Blair were not only possessed by the devil but also expecting his little bundle of joy, it would look a lot like “Beyond the Door.”

Jessica and Robert Barrett are a happily married couple in San Francisco. Robert (Gabriele Lavia of Dario Argento’s “Deep Red”) is a music producer, while Jessica (Juliet Mills, most recently seen in the TV soap opera “Passions”) is a housewife. They already have two children: a pre-teen daughter who never goes anywhere without a copy of Erich Segal’s “Love Story” tucked under her arm and a young son with a pea soup fetish.

Children didn’t have juice boxes in the ’70s, so I guess it was normal back then for a boy to slurp pea soup through a straw — and to have a Warholesque poster of a soup can hanging above his bed.

No, wait, there’s nothing normal about that at all. But the conspicuous, not to mention absurd, emphasis on pea soup does blatantly conjure images of a key scene in “The Exorcist.”

The Barretts already are stuck with a daughter who swears like a sailor and a son who is just plain weird. So, the last thing they need is another child, especially one who is the spawn of Satan. But if the devil wants a baby, he’s not going to let a little thing like birth control get in the way. So, Jessica unexpectedly finds herself pregnant. And worse still, the fetus is growing at an unnatural rate. Satan, it seems, is in a hurry. Plus, the movie is only an hour and 40 minutes long.

To make matters worse, Jessica’s ex-boyfriend Dimitri (Richard Johnson of Lucio Fulci’s “Zombie”) is lurking in the shadows, occasionally emerging to intone, “The child must be born!” He is not exactly a disinterested party. Having let Jessica escape from his satanic cult years earlier, he’s now once again doing Satan’s bidding.

As the devil’s baby grows inside her, Jessica goes from wanting to abort it to threatening to kill anyone who tries to take it from her. She levitates above the bed, turns her head 360 degrees and occasionally speaks in a demonic voice. Pretty much, it’s all the same stuff we saw in “The Exorcist,” only with ridiculous dialog and those two annoying children.

But you can’t dismiss “Beyond the Door” that easily. Mills delivers a chilling and, at times, grotesque performance, and director Ovidio G. Assonitis is no slouch behind the camera. While he filmed the interior scenes in Italy, Assonitis shot most of the exteriors in San Francisco, and he makes good use of that gorgeous backdrop.

Even if Assonitis set out to do nothing more than hop on the “Exorcist” bandwagon, he did so ably. It’s no surprise “Beyond the Door” was such a success when it was first released. And now it’s a welcome addition to any horror aficionado’s DVD collection.

Thursday, October 09, 2008

Horror master Dario Argento concludes 30-year-old trilogy

Just in time for Halloween, the final installment of Italian horror maestro Dario Argento’s “Three Mothers” trilogy has hit DVD. So, the question is: Was it worth the 30-year wait?

“Mother of Tears” completes a cycle that began with Argento’s 1977 masterpiece “Suspiria” — a film that routinely makes most “scariest horror movie” lists — and continued in 1980 with “Inferno.” Inspired by “Suspiria de Profundis,” a 19th century “prose poem” by English essayist Thomas de Quincey, the trilogy tells of three witches known as the Three Mothers.

Hundreds of years old and leaving only misery and death in their wake, the three are the Mother of Sighs, based in Freiburg, Germany, and the oldest of the Mothers; the Mother of Darkness, based in New York City; and Mater Lachrimarum — the Mother of Tears — who lives in Rome, Italy.

“Suspiria” and “Inferno” dealt with the Mother of Sighs and the Mother of Darkness, respectively. Now, it’s the Mother of Tears’ turn to unleash chaos on unsuspecting mortals.

And that’s just what she does, when she reawakens after an urn containing her relics is unearthed in a church graveyard and opened by two unsuspecting museum curators.

As he did in “Suspiria,” Argento opens with a grisly murder that makes everything that follows seem almost tame. Without cataloging the — ahem! — gory details, it’s a new height of carnage even for Argento, who has never exactly been shy when it comes to gore and bloodletting.

One curator’s death scene is so brutal, it would probably be more at home in a Lucio Fulci film. If Argento is Italy’s answer to Alfred Hitchcock, Fulci was Italy’s George Romero, except his films are even gorier than Romero’s.

The other curator, Sarah, played by Dario’s daughter, Asia Argento (“Land of the Dead,” “The Last Mistress”), escapes the slaughter only through the supernatural intervention of her dead mother, played by Asia’s real-life mother, Daria Nicolodi (“Deep Red,” “Tenebre”).

As Mater Lachrimarum’s influence spreads, Rome descends into chaos. People randomly commit acts of murder and destruction, while witches from around the world converge to celebrate the Third Mother’s rebirth.

Meanwhile, on the run from the witches and their minions, Sarah learns her mother was a white witch who died, not in an accident, but during a failed attempt to kill the Mother of Sighs. (She did, however, weaken the First Mother, leading up to the events in “Suspiria.”) Now, having inherited her mother’s supernatural abilities, only Sarah can stop the Mother of Tears before she can plunge the world into a new Dark Age.

While not perfect, “Mother of Tears” is a satisfying conclusion to Dario’s saga. His films have always been more about visual flair than plot coherence, and this one is no exception. (If you’re looking for a flawless Argento story, you’ll have to go back to the one he co-wrote for Sergio Leone’s “Once Upon a Time in the West.”) The script relies heavily on coincidence, starting with the urn just happening to be opened by someone already connected to the Three Mothers.

Some of the casting is also suspect. Unless ghosts continue to age, which would make the afterlife suck even more, Nicolodi is 30 years too old for her role. As for the Mother of Tears herself, Israeli model-turned-actress Moran Atias spends most of the movie naked, which makes up somewhat for her deficiencies as an actress.

But nobody watches Argento’s films for their plots, and “Mother of Tears” has everything you could want from the master director, from gorgeous cinematography to meticulously staged death scenes. Of course, the film also gives Argento’s critics plenty of new ammunition — especially critics who believe Dario really, really hates women.

Even if it isn’t a masterpiece like “Suspiria” or “Deep Red” (1975), “Mother of Tears” is still probably Argento’s best film since 1982’s “Tenebre.” And that makes it well worth a rental on a cool October night.

Thursday, October 02, 2008

The Lone Ranger gets a new faithful ‘Indian’ sidekick

Johnny Depp is a busy man.

Last week, Depp announced he has signed deals to reprise the role of Capt. Jack Sparrow in a fourth “Pirates of the Caribbean” movie and play the Mad Hatter in Tim Burton’s new version of “Alice in Wonderland.”

Unlike the latter tale’s Queen of Hearts, I have a difficult time believing six impossible things before breakfast. Actually, I have a hard time believing six impossible things before lunch. So, when I read about a third role Depp is undertaking, I didn’t believe it. Not at first, anyway. I checked with at least a half dozen sources before it finally sank in.

Depp has also agreed to portray Tonto, the Lone Ranger’s “faithful Indian sidekick,” in a new version of “The Lone Ranger” to be produced by “Pirates” producer Jerry Bruckheimer.

Just repeat it to yourself: “Johnny Depp is Tonto.”

All three films are Disney projects, and the executives at Disney no doubt hope their take on the Lone Ranger is more successful than the last one, 1981’s “The Legend of the Lone Ranger,” which starred two nobodies as the Lone Ranger and Tonto and was a colossal box-office flop. At least Michael Horse, who played Tonto, went on to better things, like a supporting role on “Twin Peaks.”

This is a big risk for Depp. The question is, will Native American activists be more upset that a white actor is playing a Native American character, or will they be more upset that Tonto is in the movie in the first place? I understand that in some circles, Tonto is viewed as a negative stereotype.

This could be the worst case of Native American miscasting since Burt Reynolds starred in “Navajo Joe” or William Shatner starred in “White Comanche.” Take your pick.

Actually, Depp has played a Native American before, in 1997’s “The Brave,” which he also directed.

Fun fact: Iron Eyes Cody, who portrayed “the crying Indian” in a famous 1970s public service announcement, was actually Italian. Jay Silverheels, however, who played Tonto on television in the 1950s, was actually a Native American — and also a Canadian.

While nothing is official yet, George Clooney is reportedly in negotiations to wear the mask, white hat and powder-blue cowboy outfit. Hopefully, this turns out better than the last time Clooney played a masked crime fighter. Otherwise we can look forward to Christian Bale resurrecting the Lone Ranger franchise 10 years from now in a film titled “The Lone Ranger Begins,” to be followed by either “The Lone Knight” or “The Dark Ranger.”

So, let me get this straight: The actor who killed the 1990s Batman franchise might star in a new franchise based on a character who flopped in his most recent movie outing. And the casting of another actor could lead to angry protests. Who greenlights these projects?

Still, I’ll admit I’m interested in seeing exactly how Depp plays Tonto, if for no other reason than it’s certain to be weird. Keith Richards inspired Depp’s portrayal of Capt. Jack, while Depp based his performance as Ichabod Crane in “Sleepy Hollow” on Angela Lansbury. Who will Depp channel for his portrayal of Tonto? Whoever it ends up being, I expect Depp to blow Clooney off the screen. And Depp has, on occasion, made even bad movies watchable — like “Once Upon a Time in Mexico.”

The Lone Ranger was created for radio by Fran Striker, who also created the Green Hornet. In fact, the Green Hornet is the Lone Ranger’s grand-nephew, which I guess makes it appropriate that the upcoming “Green Hornet” movie also involves an odd casting decision.

Stephen Chow is set to direct as well as co-star as the Green Hornet’s partner, Kato, taking on the role that the legendary Bruce Lee filled in the 1960s TV series. If anyone can stand in for Lee, it’s Chow. He’s not the issue.

What I want to know is, who thinks of Seth “Knocked Up” Rogen when casting the Green Hornet?

Of course, this means that in a weird, only-in-Hollywood way, Seth Rogen is related to George Clooney. And that may be stranger than anything Depp can come up with.

Thursday, September 25, 2008

Emmy honors go to programs that have found a niche

So, did you watch the Emmy Awards on Sunday night? Me neither.

This year’s awards telecast was the lowest-rated Emmy ceremony in history, and not just because of its five lackluster hosts: Ryan Seacrest, Heidi Klum, Tom Bergeron, Howie Mandel and Jeff Probst. You’d probably find more talent in a police lineup.

In Hollywood, the Emmy Awards show’s low ratings didn’t come as a surprise.

“Industry insiders expected the awards to sink to a grim new watermark given that AMC’s low-rated ‘Mad Men’ was the most-nominated drama and NBC’s modestly rated ‘30 Rock’ was the most-honored comedy,” according to The Hollywood Reporter. “Some of the other shows — such as FX’s ‘Damages’ and HBO’s ‘John Adams’ — likewise drew niche audiences.”

But as they say in computing, this is a feature, not a bug.

If programs aimed at niche audiences are finally attaining the level of quality necessary to rack up Emmy nominations, then we’re finally seeing the promise of having upwards of 100 cable channels from which to choose. This is how it’s supposed to work.

Say the words “golden age,” and most people instinctively think of some period in the past. That’s because they remember the good parts and forget the bad. But as I’ve written before, the Golden Age of Television is now.

If you like comedy, there are at least two cable channels devoted just to comedy. If you like cartoons, there are at least three cartoon channels. How about sports? There are five flavors of ESPN alone, never mind the other sports channels. If classic movies are your thing, you have two channels from which to choose. If you like gritty dramas about morally ambiguous characters who swear a lot, FX is the place for you. And if you like sci-fi, there is a channel called Sci-Fi that has, on occasion, been known to show sci-fi programming.

There’s a Travel Channel, a Golf Channel, a History Channel and a Food Network.

And if you’re still nostalgic for the television of your childhood, there are at least two channels for that. Sometimes I, too, am a sucker for 40-year-old repeats of “Mission: Impossible” and “The Man from U.N.C.L.E.”

Anyway, my point is that in this sea of channels, each one aimed at some niche audience, it’s inevitable that we’ll get a decent number of daring, well-made, risk-taking shows.

AMC is a channel that has struggled to find its place. It began as American Movie Classics and aired classic movies without commercial interruption. But it was pushed out of that market by two other channels with stronger movie libraries — The Fox Movie Channel and Turner Classic Movies.

It took several years, but AMC seems to have found its niche with a format of recent movies and original dramas. Two of those dramas won big at this year’s Emmy Awards. “Mad Men,” a period piece set at an ad agency in the 1960s, won the award for best drama series. Meanwhile, Bryan Cranston took home the Emmy for best lead actor in a drama series for “Breaking Bad,” in which he plays a terminally ill man who becomes a meth dealer to support his family. Neither is the sort of show that would have found a home back when there were only three broadcast networks and PBS.

The only thing that could ruin this Golden Age is the Federal Communications Commission, which is considering mandating that cable systems offer channels on an a la carte basis, meaning customers would pay only for the channels they want. The FCC claims that would lower cable bills by letting people drop channels they don’t want. Some in Congress have been agitating for a la carte cable TV, too, including presidential candidate Sen. John McCain.

Whether your total cable bill would go down with a la carte is debatable and depends on how many channels you want. What is not debatable is that the per-channel price of cable TV would rise and some channels with niche audiences wouldn’t have enough viewers to survive.

That would be bad news for shows like “Mad Men,” and for our new era of diverse TV programming.

Thursday, September 18, 2008

No link can be found between crime and video gaming — again

I know this may come as a disappointment to some people, but when it comes to violent crime, you have less to worry about than a decade ago.

Why would good news like that disappoint anyone? Well, as it happens, there are people who have made their careers by trying to scare the rest of us. There is an entire industry of child psychologists, “public health experts” and lawyers whose livelihoods depend on Americans being scared to death that we are raising a generation of dangerous criminals reared on violent movies and video games.

The numbers, however, tell a different story.

According to FBI statistics released Monday, violent crime was down in 2007. That follows two straight years of slight increases, but it also continues the overall trend since 1993: the violent crime rate plunged for 10 years, then leveled out and has remained mostly stable ever since.

You certainly wouldn’t get the idea that violent crime is falling or stable from watching a typical episode of “Nancy Grace” on Headline News. You might, however, think that “Grand Theft Auto” is turning the nation’s children into vicious gangsters with no regard for human life.

Yet while violent crime stubbornly refuses to skyrocket, more American children are playing video games than ever before.

A new study released by the Pew Internet & American Life Project finds that 97 percent of American children play video games. Now, I don’t want to seem judgmental, but if your child is in that remaining 3 percent, he is probably going to be shunned by his peers. You might want to invest in a Wii.

Not all video games are created equal, of course. While there are some child psychologists — I suspect they are shilling for the theme park lobby — who insist that all video games can turn children into snarling sociopaths, most of the focus is on violent games.

Well, fully half of the boys in the Pew survey listed games rated either M (mature) or AO (adults only) among their favorites. That means efforts to keep M and AO games away from children have failed miserably. It also means it doesn’t matter. Mature and adults-only video games aren’t creating delinquents any more than comic books did during the comics scare of the 1950s.

As a rather antisocial person, I don’t regard “civic engagement” as a particularly great thing. Still, for those of you who do, I have good news: The Pew study shatters the conventional wisdom that video games make children into antisocial hermits.

The study found that the young people who spent the most time playing video games were no less likely to be involved in their communities than the young people who spent relatively little time gaming. That surprised even me because you would expect some sort of trade-off. But whatever it is gamers are giving up to have time for their video games, it doesn’t appear to be community involvement.

As an aside, do you think that when books were invented, people worried about books making children antisocial? There’s no more solitary activity than reading.

Ultimately, despite claims to the contrary, no study has found hard evidence that watching or participating in violent entertainment leads children to be more violent. The studies claiming to find such a link don’t account for the fact that naturally violent children might be drawn toward violent games.

There are, however, studies that have found children are more aggressive right after playing violent video games or watching violent movies, but that effect is only temporary. It’s nothing that would lead to an increase in violent crime, which is why there isn’t an increase in violent crime. And it’s no different than the increased aggressiveness most people experience right after watching an exciting sporting event. Anything that gets you excited is going to make you temporarily more aggressive, whether it’s “Halo 3” or the Auburn/Alabama football game.

Personally, I tend to become temporarily more aggressive after watching the evening news.

Thursday, September 11, 2008

Don’t worry; it’s only the end of the world again

If you’re reading this, it means the world didn’t end when scientists in Europe fired up the Large Hadron Collider on Wednesday.

The 17-mile-long particle accelerator constructed beneath the border of France and Switzerland will seek to unravel the mysteries of the universe by smashing particles into each other at nearly the speed of light. With any luck, scientists might even create tiny, artificial black holes, which will exist for a fraction of a second before evaporating. That is, assuming these black holes don’t destroy the Earth, like something out of a bad sci-fi movie.

Those crazy scientists, always tampering in God’s domain. But, hey, someone’s got to do it.

Coincidentally, a few nights ago, I saw “The Quiet Earth,” a 1985 sci-fi movie from New Zealand, for the first time. Its premise is that a science experiment gone wrong causes nearly all animal life on Earth to disappear. Only three people survive — two men and one woman — which leads to exactly the sort of sexual tension you’d expect. And one of the men just happens to be a scientist who worked on the ill-fated science project. After all, someone has to tell the other two — and the audience — what’s going on.

I learned three things from “The Quiet Earth.” First, if you don’t actually like people, being the (almost) last man on Earth isn’t so bad. Second, I have a low tolerance for full-frontal nudity and cross-dressing when it involves an unattractive, middle-aged man. Third, no matter how unattractive the last man on Earth is, the last woman on Earth is guaranteed to be at least cute, if not a supermodel, which takes me back to my first point.

Ever wonder what it would feel like to be the last person on Earth? If you went to a movie theater this past weekend, you might know. The few people who did go to the movies during the first weekend after Labor Day found themselves surrounded by a lot of empty seats.

Yes, the summer movie season is over. Dead. And Nicolas Cage killed it.

His new movie, “Bangkok Dangerous,” was the weekend’s top grossing movie, but it took in only about $7.8 million, making for Hollywood’s worst weekend since “Dickie Roberts: Former Child Star,” starring David Spade, stunk up theaters in 2003.

Rest assured, when the end of the world comes, Cage will still be here, and probably still making terrible movies, along with a cast of post-apocalyptic mutants. If “Bangkok Dangerous,” “Next,” “Ghost Rider” and “The Wicker Man” can’t kill his career, nothing can. He could make “Zandalee 2” and someone would still hire him.

(If you’re too young to remember, “Zandalee” used to be in heavy rotation on late-night Cinemax. And, actually, it’s better than Cage’s “Wicker Man” remake.)

With the 2008 blockbusters behind us, the Internet gossip turned toward next year’s movie slate, and “Star Trek” fans celebrated the franchise’s 42nd anniversary by dissecting the latest rumors surrounding next year’s new “Star Trek” movie.

So, what is this new “Star Trek” movie, anyway? It has a young cast taking over the roles made famous by William Shatner, Leonard Nimoy and the rest. But is it a prequel? A sequel? A reboot?

Actually, it seems to be all three. Spock (Nimoy) will travel back in time to stop a plot to kill a young Capt. Kirk (Chris Pine) before Kirk can do all of the things that will make him a Starfleet legend — like save the universe and have sex with dozens of alien species, not counting the androids.

Presumably Spock is only partly successful, as rumor has it that he creates a totally new timeline. If we’re lucky, it’ll be a timeline in which “Star Trek: Voyager” never happens.

Hopefully, Old Spock will try to avoid running into Young Spock (Zachary Quinto) because bad things happen when a time traveler meets himself.

How bad? Well, you could cause the end of the world.

Thursday, September 04, 2008

That announcer guy from the movies dies at 68

In a world where a movie’s opening weekend could mean the difference between a blockbuster and a flop, one man held the key to box office gold.

You probably don’t know his name, and you probably never saw his face, but you know his voice.

Don LaFontaine was the voice of Hollywood. Not just any voice, but “a deep voice that sounds like a 7-foot-tall man who has been smoking cigarettes since childhood,” as he put it when parodying himself in a movie trailer for “The Hitchhiker’s Guide to the Galaxy.”

But that voice fell silent Monday, when LaFontaine died of complications from a lung-related illness. He was 68.

Still don’t know who I’m talking about? If that’s the case, I have three words for you: “In a world.”

With those three simple words, LaFontaine became more than a man, more than a voiceover artist, more than the Picasso of movie marketing hyperbole. He became a legend.

In a gravelly baritone that made even romantic comedies seem ominous, LaFontaine narrated more than 5,000 movie trailers and more than 350,000 television commercials, according to his Web site. Some of his best known trailer voiceovers include “The Terminator,” “Terminator 2: Judgment Day” and “Batman Returns.”

LaFontaine’s voice was everywhere, as was his “In a world …” catchphrase, which became such a running gag that he parodied it himself, playing “that announcer guy from the movies” in a TV commercial for GEICO insurance: “In a world where both our cars were totally underwater …”

In a 2007 interview, he explained why he came up with his “In a world…” gimmick.

“We have to very rapidly establish the world we are transporting (the audience) to,” he said. “That's very easily done by saying, ‘In a world where ... violence rules.’ ‘In a world where ... men are slaves and women are the conquerors.’ You very rapidly set the scene.”

In a world where you have only 60 seconds to grab people’s attention, LaFontaine was king. He redefined what it meant to be a voiceover announcer. So, if he wasn’t narrating a particular trailer for some upcoming movie, nine times out of 10, it was someone who sounded a lot like him.

Probably his closest competitor in the voiceover business was the somewhat mellower East Coast announcer Hal Douglas, who for years was the voice of Miramax, A&E, The History Channel and The WB. Before The WB merged with UPN to become The CW network, you could always count on Douglas to alert you to that “very special episode of ‘The Gilmore Girls’ ” airing later that night.

In one of those showbiz twists, Douglas probably has gotten just as much mileage out of “In a world…” as LaFontaine did, which is why they’re both known, depending on whom you ask, as the “In a world” guy. And when Jerry Seinfeld wanted to parody over-the-top movie trailers in the trailer for his 2002 documentary “Comedian,” he turned to Douglas, who dutifully rattled off such movie clichés as “In a world,” “In a land,” “In a time” and “In a land before time.”

In 1997, LaFontaine and Douglas teamed with four other voiceover artists for the short film “5 Men and a Limo,” which was shown before the 26th Annual Hollywood Reporter Key Art Awards and satirized each artist’s various catchphrases.

“5 Men” is now a popular video on YouTube, as is a short film about LaFontaine himself, “Don LaFontaine: The Voice,” which opens with LaFontaine solemnly intoning these words:

“Throughout history, man has marveled at the vast complexity of the universe. Without a single unified voice, humanity has been left searching for answers to the unknown. Now one man has the power to change that and to spread his voice across the Earth for all of mankind to hear. One man — me!”

In a world without Don LaFontaine, the movies are going to seem a lot less exciting.

Thursday, August 28, 2008

‘Dark Knight’ success ushers in dark days for all

Memo to Warner Bros.: You’re doing it wrong.

As the parent company of DC Comics, Warner is increasingly desperate for superhero franchises that can match those of Marvel Studios, the corporate sibling of DC’s longtime rival, Marvel Comics.

But apart from the Batman series — ably resurrected by director Christopher Nolan — Warner hasn’t had much luck. “Superman Returns” was a bloated, expensive bore, and the only thing epic about “Catwoman” was how much of a disaster it was. “Catwoman” managed to rack up Razzie awards for worst picture, worst actress, worst screenplay and worst director. Who lets someone named Pitof direct a movie, anyway?

Warner has batted around movies based on other DC characters for years, but none has gotten past the script stage. Most recently, Warner shelved a Wonder Woman film written by “Buffy the Vampire Slayer” creator Joss Whedon.

Another memo to Warner Bros.: You’re not going to do better than a Whedon-scripted “Wonder Woman,” so stop trying.

Never fear, though. Warner has a plan to finally bring DC’s stable of superheroes to the big screen. First, Warner has scrapped its troubled “Justice League” movie for the foreseeable future. Instead it will focus on solo adventures for Justice League members like Wonder Woman, the Flash and, of course, Superman.

The move echoes Marvel’s strategy of establishing Iron Man, the Hulk, Thor and Captain America separately before having them team up for “The Avengers” in 2011.

So far, so good.

Next, Warner will reboot the Superman franchise and hope we all forget how dire “Superman Returns” is.

Again, this is following Marvel’s example. After director Ang Lee’s “Hulk” disappointed Marvel executives, they opted to start from scratch with this year’s “The Incredible Hulk.”

Now, I could nitpick and point out that “Hulk” cost less to make and will probably earn more (adjusted for inflation) than “The Incredible Hulk,” but for some reason, everyone in Hollywood thinks Lee’s film was a flop while the new one is a success. And in Hollywood, perception is all that matters.

Again, so far, so good. Director Bryan Singer, who did a great job of bringing Marvel’s “X-Men” to the screen, made such a mess of “Superman Returns” that it’s difficult to imagine where a straightforward sequel could go. Just take a mulligan.

But after that, Warner’s plan begins to reek. The studio has learned all of the right lessons from its competition — but all the wrong lessons from its own success.

To understand Warner’s mistake, you first need to think like a studio executive. So, I’ll give you a few minutes to take some tequila shots and bang your head into a brick wall.

Ready? Here goes.

The thinking at Warner Bros. goes like this: “The Dark Knight” was a “dark” movie. It made a lot of money. Hey! All of our other superhero movies should be dark, too!

Really, wasn’t the last Superman movie dark, moody and angst-ridden enough? Singer even darkened the colors of Superman’s costume to fit the mood of his gloomy picture.

See, Batman is a dark character. He’s named for a bloodthirsty, nocturnal mammal. So, his movies should be dark. Superman, however, is not a dark character. Neither is Wonder Woman, nor the Flash. Green Lantern? He’s not dark, either.

Meanwhile, none of Marvel’s most successful movies is dark, and the best of them — “Iron Man,” “Spider-Man” and the second “X-Men” — all have a pretty good sense of humor.

Last memo to Warner Bros.: You want a successful superhero movie? Just stay true to the spirit of the characters. Marvel does that. “The Dark Knight” and “Batman Begins” did it. A “dark” Superman movie does not.

I predict dark days ahead for Warner Bros. — in more than one sense of the term.

Thursday, August 21, 2008

Beware: When celebrities attack — each other!

MTV’s “Celebrity Deathmatch” used to be a guilty pleasure of mine. What could be better than watching Claymation facsimiles of celebrities fight each other to the death?

Actually, one thing would be better: watching the actual celebrities fight to the death. Unfortunately, that’s probably illegal. I’ll have to check with my attorney. So, when it comes to real celebrities settling scores, I have to settle for a war of words.

Since sparing us the sight of her on network TV, Roseanne Barr has retreated to her Web site, where her blog functions as a virtual shotgun aimed at people who can still find work.

First, Barr fired a volley at Angelina Jolie, Jolie’s oft-estranged father Jon Voight and her boyfriend Brad Pitt. Voight had committed the unpardonable sin of campaigning for Sen. John McCain for president, while Jolie has had the temerity not to make up her mind yet between McCain and Sen. Barack Obama. And what’s Barr’s problem with Pitt, you ask?

Oh, why not just let Barr speak for herself, typos and all: “jon voight your evil spawn angelina jolie and her vacuous hubby brad pitt make about forty million dollars a year in violent psychopathic movies and give away three of it to starving children trying to look as if they give a crap about humanity as they spit out more dunces that will consume more than their fair share and wreck the earth even more.”

Ah, Roseanne, such class. It’s just like when you sang the national anthem.

But lest you think Barr is just carrying water for the Obama campaign, she’s not exactly thrilled he’s the Democratic presidential nominee. And whom does she blame for Obama’s rise? One guess: Oprah Winfrey.

Barr writes, “It is estimated that her endorsement of Obama swung over one million women's votes from Hillary, who remains the only candidate who polls to win against mccain. Oprah can rest well now knowing that her dogs will not have to pay inheritance tax over the next four years if she dies.”

Now, I could spend the rest of this column pointing out Barr’s factual errors and explaining the rules of capitalization. But there’s no fun in that, and it would be beside the point. Who cares what any of these Hollywood types think about McCain, Obama or inheritance taxes?

If celebrity political endorsements mattered, the Republicans wouldn’t have elected a president since Abraham Lincoln. And even Lincoln wasn’t beloved by all actors of his day. Just ask John Wilkes Booth.

But really, as far as celebrity smackdowns go, this isn’t a big one. Compared to the heavyweight couple of Jolie and Pitt, Barr is a mere featherweight, and her career has already suffered a TKO. It’s Bambi meets Godzilla, and, ironically, Roseanne isn’t Godzilla.

The movie and music industries have much better match-ups to offer. In fact, there is an entire Web site devoted to celebrity feuds.

Many are obvious, like the ongoing hostilities between former couples Kim Basinger and Alec Baldwin or Burt Reynolds and Loni Anderson. Others are legendary, like the titanic struggle between screen legends Bette Davis and Joan Crawford. Others still are just bizarre: There’s a feud between Tina Turner and Elton John? Really?

Still, rivalries involving actors and musicians are comparatively lame. If it’s a war of words you want, then you want people who really know how to use their weapons. That’s why literary feuds are the best.

Ernest Hemingway vs. Gertrude Stein, Tom Wolfe vs. John Updike, Gore Vidal vs. Truman Capote — to say nothing of Gore Vidal vs. Norman Mailer and Gore Vidal vs. William F. Buckley Jr. — those were some epic battles.

Yes, dear Gore — so many enemies, and you’ve outlived them all. That’s how you win, you see, by getting the last word.

Thursday, August 14, 2008

What’s left for us when our bad girls go good?

What are we supposed to gossip about now? It’s been weeks, months even, since America’s reigning bad girls have caused trouble. How can the nation survive?

It may be a minor miracle, but Paris Hilton and Britney Spears have cleaned up their acts. And while Lindsay Lohan’s troubles seem to go deeper than the merely embarrassing, she has managed to avoid making headlines, too. This may be a threat to the republic as we know it. I can imagine celebrity gossip Web sites going dark, tabloid newspapers closing shop, even entire cable TV channels going out of business. Isn’t the economy in enough trouble already?

Granted, the situation could change between the time I write this column and when you read it, but as of right now, the three arrested adolescents seem to have given up their hard-partying ways and are — gasp! — behaving like responsible adults.

Without their favorite targets, Hollywood’s packs of celebrity-stalking photographers are apparently suffering, which is just fine as far as some people are concerned.

“If you notice, since Britney started wearing clothes and behaving; Paris is out of town not bothering anybody anymore — thank God — and evidently, Lindsay Lohan has gone gay, we don’t seem to have much of an issue” with the paparazzi, Los Angeles Police Chief William Bratton told a Los Angeles television station.

A quick Google search of the headlines doesn’t turn up much. Lindsay makes the news for saying she is avoiding the paparazzi. Well, that’s exactly the problem, isn’t it? Meanwhile, a British publication goes gaga for her model of BlackBerry. That’s nothing to excite the masses. The most salacious gossip to be found is about whether she is romantically linked with a female DJ named Samantha Ronson. But Lindsay isn’t talking.

Paris, of course, has been in the news a lot during the past week. But it wasn’t her fault. There she was, minding her own business, and all of a sudden, presidential candidate Sen. John McCain was on her case.

The McCain campaign used images of Paris and Britney in an advertisement attacking McCain’s Democratic rival, Sen. Barack Obama of Illinois, as just a “celebrity” candidate. Paris responded with an Internet video that had both gossips and political pundits talking for days.

Now, I used to be involved in politics, so take it from me: When you lose a debate with Paris Hilton, your campaign is having a bad week. Yes, I know Paris didn’t write her own material — please! — but, frankly, her energy plan makes more sense than either McCain’s or Obama’s. What does it mean when Paris is serving up better copy for The Economist magazine than for The National Enquirer? It means the end of Western Civilization, that’s what!

Just when you think maybe Britney can save us, she lets us down, too. New photos of the former pop princess appeared last week showing her once again in fit condition. If she’s lost weight, that means she has managed to work up discipline of some kind, and that’s no good if you’re a tabloid editor desperate for Britney’s next hair-shaving meltdown.

What about Britney’s younger sister, notorious teen mom Jamie Lynn? She and her new baby are doing just fine, thank you. And Jamie Lynn seems to have learned from Britney’s mistakes. So far, she hasn’t been photographed driving around with her infant precariously perched in her lap. She’s no help at all.

The same goes for Paris’ former “Simple Life” co-star, Nicole Richie, who after a rash of legal problems and rumors of an eating disorder, has settled into a — ahem! — simple life of domestic bliss with boyfriend Joel Madden and daughter Harlow.

Efforts to turn Disney star Miley Cyrus to the Dark Side have met with mixed results. Despite attempts to manufacture a scandal out of her Vanity Fair photos, Miley remains on the straight and narrow.

These are indeed dark days for us all. Never mind what the bobbleheads on the E! channel will talk about. What will I write about on slow news days?

Thursday, August 07, 2008

After 25 years, Return to Forever returns for more

They’ve gotten the band back together.

Twenty-five years after their first, brief reunion and 32 years after their last album, Return to Forever’s classic lineup is back and touring the country.

The revived jazz fusion band performed Saturday in front of a capacity crowd of about 2,750 at Atlanta’s plush Cobb Energy Performing Arts Centre, which opened last fall. It was the 50th stop on the Return to Forever summer reunion tour.

Now, I’ve been to my fair share of concerts, but I’d never before been to one where every song earned a standing ovation. (Standing-room-only shows don’t count.) The years since Return to Forever’s break-up have only increased fans’ enthusiasm. Some in the front row handed their old vinyl RTF albums to the roadies for the band to sign. Others paid hundreds of dollars extra for VIP packages that included meeting the band backstage.

Led by keyboardist Chick Corea and featuring guitarist Al Di Meola, bassist Stanley Clarke and drummer Lenny White, Return to Forever was one of three bands during the 1970s that defined jazz fusion. Along with Weather Report and guitarist John McLaughlin’s Mahavishnu Orchestra, Return to Forever mixed jazz improvisation with electric instruments and Motown-inspired rhythms, creating a sound that has influenced a generation of both jazz and rock artists.

Jazz fusion was the flipside of progressive rock, which saw bands like Pink Floyd and Yes blend rock music with elements of jazz and classical music.

In conjunction with the tour, Return to Forever has released “The Anthology,” a two-disc set featuring remastered tracks from the group’s four best albums: “Hymn of the Seventh Galaxy,” “Where Have I Known You Before,” “No Mystery” and “Romantic Warrior.”

In the years following RTF’s breakup, Di Meola and Clarke launched successful solo careers. Di Meola’s lightning-paced, “shred” guitar style probably influenced even more heavy metal guitarists than jazz players. Clarke, meanwhile, divided his efforts between solo albums and scoring Hollywood movies, including “Boyz n the Hood” and “The Transporter.”

Throughout the ’80s and ’90s, Corea continued with both solo projects and his new group, the Chick Corea Elektric Band.

But to see and hear Return to Forever on Saturday, you’d think the quartet had never been apart. With Clarke taking center stage, the four launched into their classic catalog, deftly trading solos and playfully trying to one-up one another.

The first set featured the group’s spacey, sci-fi influenced tracks like “Hymn of the Seventh Galaxy,” “Vulcan Worlds” and “The Sorceress.”

But RTF really shined in the second set, with Di Meola and Clarke switching to acoustic instruments. Di Meola launched into a medley of his solo pieces, including covers of Argentine tango master Astor Piazzolla’s compositions as well as his own signature song, “Mediterranean Sundance,” which elicited an enthusiastic response from the audience.

Di Meola may not be as fast as the 19-year-old prodigy who wowed critics and audiences alike in the ’70s, but he makes up for it with even better technique and the sort of emotional depth that comes with time. And he’s still plenty fast with his fingers.

Corea followed with a keyboard solo, setting up Clarke, whose virtuoso, marathon solo on upright bass had people in the balcony calling out, “I love you, Stanley!”

For the encore, RTF returned to its electric bread and butter for one of my favorite of the band’s tunes, “Duel of the Jester and the Tyrant.”

Three decades may have passed since RTF’s last recording, but those old songs seem timeless. But what else should you expect from a band named Return to Forever?

Thursday, July 31, 2008

‘Shaun of the Dead’ team’s ‘Spaced’ hits DVD

It was like a near-death experience. I saw my life flash before my eyes. Except it was divided into 14 half-hour episodes, and everyone was British. Oh, and it wasn’t actually my life.

Before they achieved fame, fortune and international acclaim with their hit films “Shaun of the Dead” and “Hot Fuzz,” actor/writer Simon Pegg and director Edgar Wright garnered a cult following with the TV series “Spaced,” which aired for two seasons on Britain’s Channel 4.

Now, finally, nearly a decade after it first aired, “Spaced” is available on DVD in the U.S.

Written by and starring Pegg and Jessica Stevenson, “Spaced” follows two twentysomething slackers who pretend to be a couple so they can rent an apartment advertised as for “professional couples only.”

But that’s not really what the show is about. If it were, it would be nothing more than the British version of “Three’s Company,” which is already just the American version of Britain’s “Man About the House.”

In fact, the whole “pretending to be a couple” thing comes up only twice as a major plot point, so forget I even mentioned it. Seriously.

Daisy (Stevenson) is an aspiring magazine writer who would rather do anything other than write. Her roommate Tim (Pegg) wants to draw comic books, but in the meantime he works in a store selling comic books.

Tim’s best friend is Mike, played by Nick Frost, who, at the risk of being typecast, also plays Pegg’s best friends in “Shaun of the Dead” and “Hot Fuzz.” Mike wants more than anything to join the army. Unfortunately, he got kicked out for stealing a tank and trying to invade Paris.

I know what you’re thinking. You don’t need a tank to invade Paris. Just roll down the Champs-Elysées in a Volkswagen Beetle, and the French will surrender faster than you can say David Hasselhoff. But never mind that.

Daisy’s best friend is Twist (Katy Carmichael), who is either incredibly shallow or incredibly evil. Tim can’t decide which.

Tim and Daisy’s neighbor Brian (Mark Heap) is an artist who paints in bold strokes of anger, pain, fear and aggression. And their landlady, Marsha (Julia Deakin), is a chain-smoking alcoholic with a thing for Brian.

Although set in London, “Spaced” captures perfectly the American Generation X experience of arrested adolescence, underemployment and relationships forged by shared cultural experiences. It’s like that Winona Ryder movie “Reality Bites” except it doesn’t suck.

No, far from sucking, “Spaced” is simply brilliant. And it’s without a doubt the most fanboy-friendly sitcom ever made.

A year after breaking up with his girlfriend, Tim must deal with the end of an even more important relationship — the one he had with George Lucas:

“ ‘The Phantom Menace’ was 18 months ago, Tim!” says Tim’s boss, Bilbo (Bill Bailey).
“I know, Bilbo, but it still hurts!” Tim says. “That kid wanted a Jar Jar doll!”

Under Wright’s direction, “Spaced” creates a surreal, cinematic world, where a seemingly ordinary conversation can turn into a re-enactment of “Dawn of the Dead” and a paintball fight is a good excuse to reference “Platoon.”

“Spaced” is no ordinary sitcom. It’s a sitcom for a generation incapable of having a conversation without referencing popular movies or TV shows. You know, like a typical episode of “Family Guy.” In short, it’s for people like me.

But not to worry if you don’t know the difference between “Babylon 5” and “Deep Space Nine.” The new DVDs include a handy subtitle feature that explains all of the references.

Thursday, July 24, 2008

‘Dark Knight’ takes spot atop best superhero flicks

“The Dark Knight” lived up to the hype, earning every penny of its record-setting, $158.4 million opening weekend and capping a summer of superhero movies that didn’t stink.

The second installment in director Christopher Nolan’s rebooted Batman franchise now sits squarely atop my personal top 10 list of the best superhero movies. Chalk up most of the credit to the late Heath Ledger’s performance as the Joker, which is now the definitive interpretation of Batman’s arch nemesis. But even without Ledger’s contribution, “The Dark Knight” would be a compelling thriller. I didn’t check my watch during the entire 2½ hours, which is really something, given my short attention span and bladder the size of a walnut.

One of the summer’s other superhero offerings also shook up my rankings. While “The Incredible Hulk” and “Hellboy II: The Golden Army” are on the outside looking in, “Iron Man” rocketed to the No. 2 spot.

Structurally, “Iron Man” is a by-the-numbers origin story. What sets it apart is Robert Downey Jr.’s fantastic performance as Iron Man’s alter ego, billionaire inventor/playboy Tony Stark. Downey’s Stark is the perfect blend of humor, ego and heroism. As with most Marvel Comics superheroes, the man is more interesting than the superman, and Downey makes the most of that. Yes, the action scenes are great, but they pale next to Tony Stark just being Tony Stark.

“The Dark Knight” may be better overall, but “Iron Man” is more fun. Marvel has “Iron Man 2” on its schedule for 2010, and 2010 can’t get here fast enough.

And now, the rest of my top 10:

“Batman Begins”: In retrospect, Nolan’s first outing with Christian Bale as the Caped Crusader seems like little more than a practice run for “The Dark Knight.” But that doesn’t keep it from being a great superhero movie in its own right. And “Batman Begins” gives Bale a chance to shine as Batman without having to compete for screen time with the Joker. Whether played by Ledger, Jack Nicholson (“Batman”) or Mark Hamill (“Batman: The Animated Series”), the Joker always steals the scene, no matter who is in the Bat suit.

“Batman” (1989): Speaking of Nicholson, his hammy take on the Joker was right at home in Tim Burton’s first Batman film. So what if “Batman” isn’t as dark or as serious as “Batman Begins” and “The Dark Knight”? It is still a showcase for Burton’s moody, fairytale version of Gotham City. And for the record, Michael Keaton is still the big screen’s best Batman, even if Bale is a better Bruce Wayne.

“Danger: Diabolik” (1968): Chances are you haven’t seen this psychedelic masterpiece from Italian director Mario Bava. Better known for his horror movies — which have influenced filmmakers from Martin Scorsese to Quentin Tarantino — Bava turns a popular Italian comic book about a costumed thief into a glowing spectacle of pop art. John Philip Law (“Barbarella”) stars as Diabolik, who steals from the rich and gives to himself, while also taking down pompous politicians and rival criminals.

“X2: X-Men United”: The only “X-Men” movie worth repeat viewings, “X2” hits all the main themes of Marvel’s flagship series about mutants sworn to defend a world that hates and fears them. Plus, it has fantastic set pieces, like the attack on the X-Men’s mansion and Magneto’s escape from prison. And how can you not love a movie that so shamelessly rips off the ending of “Star Trek II: The Wrath of Khan”?

“Spider-Man”: “Spider-Man 2” has better acting, better directing, better cinematography and better special effects. But the original has the better script and translates Stan Lee’s corny, soap-opera-style superheroics to the screen without seeming corny itself.

“Unbreakable”: Before he became a bad joke, M. Night Shyamalan did direct two good movies, and “Unbreakable” holds up better than “The Sixth Sense” because it’s not as dependent on its twist ending.

“The Incredibles”: Writer/director Brad Bird scores with his Oscar-winning animated feature about a family of superheroes and a villain who is the ultimate fanboy gone bad.

“Hulk” (2003): Ang Lee’s movie alienated more fans than it thrilled, but its comic-book-inspired editing and psychological depth mark “Hulk” as interesting and ambitious, if flawed.

Thursday, July 17, 2008

Obama cartoon stirs controversy in post-irony U.S.

Maybe it’s true. Maybe irony really is dead.

At least I heard it was dead. “Death of irony” speculation was all the rage after Sept. 11, 2001. It was bigger than the death of Elvis. Mind you, I never grasped the link between the 9/11 attacks and irony. Still, there were people on TV who assured me there indeed was a link. You know, just like there was a link between Saddam Hussein and al-Qaida.

But irony’s passing didn’t really hit me until Monday, when I heard the collective gasp of the nation’s political bloggers and cable TV pundits going into shock. Soon afterward, they all got their chance to beat up The New Yorker magazine for its current issue, which features a cover illustration of Democratic presidential candidate Sen. Barack Obama and would-be first lady Michelle Obama.

The cartoon depicts Sen. Obama wearing traditional Arab garb and doing the now infamous “fist bump” with his wife, who is dressed as a 1960s-style “black militant,” complete with military fatigues, AK-47, bandoleer and afro.

To make the matter even more scandalous, Mr. and Mrs. Obama are shown in the Oval Office, which, for added effect, is decorated with a portrait of Public Enemy No. 1 Osama bin Laden hanging above a fireplace in which burns an American flag.

There is a word that perfectly describes The New Yorker’s cartoon. That word is “irony.”

Instead, Obama’s campaign went with “tasteless” and “offensive.” The campaign of Sen. Obama’s Republican rival, Sen. John McCain, apparently at a loss for its own adjectives, released a statement basically saying, “Ditto.” Surely, this is the start of what will be the cleanest, most civil presidential campaign since George Washington ran against nobody.

The reaction all across the Internet and CNN was much the same, not that any of the outraged parties actually believed The New Yorker was guilty of racism, stereotyping or just plain old libel. They all recognized the cartoon for what it was — irony. They were just afraid all the unwashed masses out in Middle America wouldn’t get the joke. Because, as you know, irony is dead. Gone and forgotten.

This saddens me. Irony was my third-favorite literary device, after sarcasm and double entendres and just ahead of puns.

I mean, really, how can anyone without an advanced literature degree from Harvard or Columbia be expected to recognize irony now? It takes years of graduate studies to know that The New Yorker’s illustration really meant the opposite of what it depicted — that it was, in fact, satirizing the many conspiracy theories that have grown like kudzu around the Obama campaign.

If Jonathan Swift were to write his 1729 essay “A Modest Proposal” today, who knows what would happen? Why, the Irish might actually eat their own children, just as Swift suggested. But at least that would add a little protein to the typical Irish diet, which consists solely of potatoes and whiskey.

Obama is not a radical Islamic terrorist just waiting to seize control of the U.S. government. He is exactly what he says he is — an agent of change. Except he means “C.H.A.N.G.E.,” which stands for Cannibals, Homosexuals and Atheists for Nuking Grandmothers Everywhere.

Irony is not something we Americans do well. Consider Alanis Morissette’s song “Ironic,” with lyrics like “It’s like rain on your wedding day” and “It’s the good advice that you just didn’t take,” none of which are ironic because irony expresses the opposite of its literal meaning. Irony would be something like getting divorced on your wedding day, while rain on your wedding day isn’t ironic, just unfortunate. Unless you like rain.

Of course, a song about irony that contains no irony might, in fact, be ironic. And, in any case, Morissette is a Canadian. Isn’t it ironic?

Thursday, July 10, 2008

An old Hollywood gimmick makes a comeback — in 3-D!

A 1950s movie theater gimmick is making a comeback.

The latest cinematic adaptation of Jules Verne’s “Journey to the Center of the Earth” opens in theaters Friday. And apart from a new leading man in the form of Brendan Fraser (“The Mummy”), this version sports a new look. Moviegoers will see Verne’s prehistoric world at the center of the Earth as never before — in 3-D.

“Journey” is the first live-action, action/adventure movie to open in the Real D Cinema format. Theaters screened last year’s CGI “Beowulf” in both 3-D and 2-D. The recent Hannah Montana and U2 concert films also used Real D Cinema.

You still must wear special glasses to get the 3-D effect, but the new glasses are a far cry from the red-and-blue filters of years past. They use polarized lenses that allow each eye to see only one of two projected images, creating the illusion of three dimensions.

Special effects wizards love new toys, so it was a given that 3-D would make a comeback in a more user-friendly format. But declining movie attendance is probably even more responsible for the new generation of 3-D movies. You have to give the audience something it can’t get at home.

That, of course, is exactly how 3-D movies began.

In the ’50s, theaters faced their first real competition, a newfangled entertainment device called a “television,” or “TV” for short.

The prospect of families staying home to watch the black-and-white flicker of Jackie Gleason and Ed Sullivan terrified theater owners. Theaters needed something new to lure people to the big screen.

Hollywood experimented with various 3-D formats, but the best remembered is actually one of the least used — anaglyph 3-D, which required audiences to wear headache-inducing cardboard glasses with colored filters, usually one red and one cyan.

Anaglyph 3-D turned up mostly on B movies and short subjects. Despite its disadvantages, it was easier to use given the technology of the day. Other ’50s 3-D formats, which relied on two projectors and polarized glasses similar to those now in use, were hard to maintain. That made anaglyph 3-D perfect for low-budget movies.

Hollywood produced anaglyph 3-D movies into the early 1960s, and, ironically, some of these films eventually made their way to television, the medium they were produced to combat.

By the 1980s, 3-D was all but dead in theaters, with rare exceptions like “Amityville 3-D,” “Jaws 3-D” and “Friday the 13th Part III.” Still, that was enough to trigger a boomlet in old 3-D movies airing on TV.

That was when I had my first 3-D experience, when a local TV station aired the 1961 horror movie “The Mask.”

“The Mask” was mostly in 2-D but featured 3-D dream sequences. Whenever the main character donned a demonic mask, he hallucinated being in a spooky, 3-D landscape. Viewers knew it was time to put on their 3-D glasses when a voice intoned, “Put the mask on now!”

As a bonus, the TV station also aired a 3-D “Three Stooges” short.

This was a big deal for me and my friends. We all had our parents drive us to the convenience stores where we could pick up our 3-D glasses. Then we waited impatiently for Saturday night to finally arrive, so we could all experience the magic of 3-D.

Yes, it was cool. But I can’t say I enjoyed the eye strain that came with those glasses.

Still, 3-D had come full circle. What started as a gimmick for theaters had become a gimmick for TV stations. And, yes, I still have the pair of 3-D glasses I got to watch “The Mask.”

So, the question is, has technology come far enough to make 3-D mainstream, or is this another gimmick in the making?