Thursday, March 26, 2009

After a four-hour miniseries, 73 episodes, a TV movie and numerous Internet “webisodes,” “Battlestar Galactica” has finally reached its journey’s end. Now the arguments can begin.
If you haven’t yet seen the series finale, consider yourself warned. There are spoilers ahead.
The two-hour conclusion of “Battlestar Galactica,” which aired Friday, gave its major characters closure but left some big plot points unanswered. Or, to be more exact, it answered them ambiguously.
Were the apparitions that appeared to Baltar and Caprica Six really angels? Was everything that happened in the series really the will of God, or some deity-like facsimile? What was Starbuck anyway? Was she an angel or a demon? And who or what resurrected her after her apparent death in season 3?
The only thing certain is that the ending has people talking. Fans who have followed “Galactica” since the 2003 miniseries seem split. Some think they were cheated. Some think producer/writer Ronald D. Moore delivered an emotionally satisfying conclusion. Some think Moore was just “making it up” as he went and didn’t know how to end the series. And others are just confused.
Put me in the camp that is pleased with how Moore ended the series. All of the major characters reached the end of their personal story arcs and were better for the journey, even if some of those endings were bittersweet. As for the unanswered questions, science fiction is still fiction. In science, it is nice to have answers, but in fiction it is usually more fun to be left with questions.
I don’t really care what Starbuck was or how she returned from the dead. What is important is that her character finally found a purpose — along with inner peace. And when she simply vanished without a trace, her mission at last complete, that floored me as much as anything on television has in a long, long time. It was as close to perfect as television gets.
Did Moore and his writing staff make things up as they went? Sure. You can’t expect a TV series to come out looking like a novel. I suspect a lot of sci-fi fans have taken away the wrong lesson from “Babylon 5,” J. Michael Straczynski’s epic “novel for television,” which spanned five seasons. Straczynski started out with a definite beginning, middle and end in mind. But he still had to make major changes as the series progressed, some of which improved on his original plan.
Even when it comes to novels of the regular paper variety, it is rare for an author’s outline to survive the first draft. The difference is that when you read a novel, you’re getting a finished product, but when you watch a TV series — even a so-called novel for television — you’re getting a work in progress, one episode at a time. You have to expect the writers to change their minds, try new ideas and abandon ones that aren’t working. They wouldn’t be doing their jobs if they just stuck to their original plan. Few writers are clever enough to get it right the first time.
In any case, it could be worse. At least “Battlestar Galactica” didn’t end like “The X-Files.” It would take a PowerPoint presentation, complete with flow charts, to explain the ins and outs of that show’s conspiracy-laden mythology.
“Battlestar Galactica” leaves its fans with lots of great moments and lots more to debate. That puts it in some pretty special company. Other classic science fiction series with controversial endings include the British series “The Prisoner” and the animated Japanese series “Neon Genesis Evangelion,” both of which still set off heated arguments years and, in the case of “The Prisoner,” decades after they left the air. Both also remain extremely popular. AMC is producing a miniseries remake of “The Prisoner,” while in Japan, new feature-film retellings of “Evangelion” are still landing in theaters.
So, thanks, “Battlestar Galactica,” both for the memories and for the arguments to come.
Thursday, March 19, 2009
Sci-Fi Channel finally does away with the ‘Sci-Fi’
What is in a name? Quite possibly more than even Shakespeare ever dreamt.
The Sci-Fi Channel, which has gone by just “Sci Fi” for most of the decade, announced Monday that it will have a new name as of July 7.
Say goodbye to Sci Fi, and say hello to SyFy — different spelling, same pronunciation. And unlike the generic term “sci-fi,” which was coined by the late Forrest J Ackerman, SyFy has the advantage of being something the network can trademark, said Bonnie Hammer, former Sci-Fi Channel president and current head of parent company NBC Universal Cable Entertainment.
Unfortunately, SyFy has the disadvantage of being lame, if not downright insulting to sci-fi fans. Yet it took marketing geniuses years to come up with it — if they actually did. The name originally belonged to a Web site called SyFy Portal, which sold the SyFy name and reopened as Airlock Alpha.
Sci Fi executives have been looking for a new name and a new image for a long time. In fact, they seem embarrassed by the idea of being associated with science fiction.
TV Week cites television historian Tim Brooks, who helped launch the Sci-Fi Channel when he worked at USA Network, Sci Fi’s sister channel.
“The name Sci Fi has been associated with geeks and dysfunctional, antisocial boys in their basements with video games and stuff like that, as opposed to the general public and the female audience in particular,” Brooks said.
So, in case you were wondering what Sci Fi’s executives think of their core audience — the geeks and “dysfunctional” fanboys who have watched the channel religiously since it started up in 1992 — now you know.
Current Sci Fi President Dave Howe was a little more diplomatic.
“If you ask people their default perceptions of Sci Fi, they list space, aliens and the future,” he said in a New York Times story. “That didn’t capture the full landscape of fantasy entertainment: the paranormal, the supernatural, action and adventure, superheroes.”
Well, we certainly can’t have a TV channel devoted to space, aliens and the future, now can we? In fact, during her tenure as president, Hammer pretty much let everyone know of her contempt for real science fiction. She canceled “Farscape,” a popular series set in space and loaded with extraterrestrials, and packed the schedule with “reality” and paranormal programs like “Crossing Over with John Edward.”
From the beginning, Sci Fi aired a lot of horror and fantasy programming, and most sci-fi fans accepted that. In the publishing world, science fiction, fantasy and horror have always been linked. But then Sci Fi aired “Braveheart,” a movie that isn’t sci-fi, fantasy or horror. It’s just bad history. After that, nothing was off limits. Not even professional wrestling.
With the new name, SyFy, Hammer finally has what she always wanted: something that sounds like sci-fi and looks a little like sci-fi, but isn’t sci-fi.
Sure, SyFy will still air science fiction, with “Caprica” — a prequel series to “Battlestar Galactica” — and a new “Stargate” series yet to come. But I would expect more wrestling and action movies if I were you. Toss in some “C.S.I.” reruns, and SyFy would probably look a lot like Spike TV.
Along with the new name, the channel is also getting a new slogan, “Imagine Greater,” and you have to wonder how long it took the marketing guys to come up with that.
What does it even mean? Greater than what? Maybe they mean “Imagine More” or “Imagine Better.” Is there a language on Earth in which “Imagine Greater” even makes sense? Maybe it sounds better in the original Klingon, or would if SyFy didn’t hate aliens.
Hey, SyFy executive folks, how about trying to “Imagine Grammar” instead?
Thursday, March 12, 2009
Children’s brains may evolve for Information Age
If TV doesn’t rot your child’s brain, the Internet will, according to the latest scary speculation.
Susan Greenfield, an Oxford University neuroscientist, says exposure to social networking Web sites like Facebook and Twitter, along with video games and fast-paced TV shows, may be rewiring children’s brains.
I can think of two reasonable responses to this claim: “So what?” and “So what?”
At one level, this is exactly the sort of thumb-sucking response that has always greeted new media. Since the 1930s, do-gooders have blamed jazz, comic books, rock ’n’ roll, television and now video games and the Internet for everything wrong with “the children” — even when there isn’t anything wrong with the children.
Like Paul Lynde in the Broadway musical “Bye Bye Birdie,” they ask, “Why can’t they be like we were, perfect in every way? What’s the matter with kids today?”
Juvenile crime across the board is down compared to 10 years ago. Fewer teens are having sex, and the ones who do are more likely to use protection. Fewer teens are getting abortions.
OK. I give. What exactly is the matter with kids today? If online social networking is bad for them, it hasn’t shown up in the statistics yet.
But let’s assume MySpace and Facebook really are rewiring children’s brains. In fact, I would be surprised if that were not the case. The real question is “Why is this a problem?”
This isn’t the first time the environment has rewired our brains. Before our ancestors developed written languages, oral cultures trained prodigious memories; bards could recite epics like “The Iliad” by heart. Writing let that skill atrophy. Still, that seems a good trade-off.
Adults often complain that change happens too quickly and that the world is becoming too fast-paced. If children’s brains are adapting to the pace of modern society, that seems like a net plus. A brain that is better at filtering information from many different sources in a short span of time is a brain that is better suited to the Information Age.
How fast-paced is the world today? One useful measure is the length of the average movie shot. An average shot in a major motion picture today is 2 or 3 seconds long. By comparison, film critic Roger Ebert notes, the average shot in “Citizen Kane” (1941) is 11.4 seconds. By today’s standards, “Pulp Fiction” is a slow movie, which isn’t surprising given that Quentin Tarantino is, in many respects, an old-fashioned filmmaker. The average shot in “Pulp Fiction” is 7.9 seconds.
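For what it’s worth, “average shot length” is plain arithmetic: a film’s running time divided by its number of shots, or, equivalently, the mean of the individual shot durations. Here is a quick Python sketch, using hypothetical durations rather than measurements from any real film:

```python
def average_shot_length(shot_durations):
    """Mean shot length in seconds: total screen time / number of shots."""
    return sum(shot_durations) / len(shot_durations)

# Hypothetical durations, in seconds, for a short action sequence.
scene = [2.1, 1.8, 3.0, 2.4, 1.6, 2.5]
print(f"ASL: {average_shot_length(scene):.1f} seconds")  # ASL: 2.2 seconds
```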
There is a question of cause and effect here. Are movies contributing to children’s shorter attention spans by rewiring their brains? Maybe, but I doubt that’s the whole answer. I suspect, instead, that movies are mainly catering to the shorter attention spans that have resulted from the quickening pace of life in general, with movies being a small part of that.
Phil Edholm, chief technology officer at Nortel, created a minor stir last year when he suggested that Attention Deficit Hyperactivity Disorder, at least in its mild form, may be an evolutionary adaptation for multitasking in the digital era. Parents of ADHD children were not amused. But maybe there is some truth to his speculation.
Scientists at Northwestern University have discovered that a genetic variation associated with ADHD seems to help people in nomadic cultures survive. The variation also seems to encourage novelty-seeking behavior. The researchers speculate that some ADHD-associated traits may have been beneficial in our prehistoric past.
If modern society is taking on attributes that make it more like our hunter-gatherer past — for example, greater mobility and a faster pace than agricultural societies — then genetic traits that were advantageous in the past could reassert themselves.
If so, that’s not cause for alarm. That might be just evolution.
Thursday, March 05, 2009
Book publishers risk repeating the music industry’s mistakes
At the risk of slipping into “grumpy old man” mode: life used to be so much simpler.
You bought a book, and it was yours to do with as you pleased. You could read it silently to yourself. You could read it aloud to someone else. You could dog-ear the corners, underline passages and make notes in the margin. And when you were finished, you could sell the book or give it away.
If someone wanted to borrow it, your only problem was getting it back in good condition. I’m still upset about the book I lent out that ended up covered in vegan mayonnaise.
Nowadays, you can purchase and download digital books, which seems like a great idea — until you learn that publishers want to control how you use them.
You young people and your Kindles! You don’t know what it used to be like. When I was your age, books had pages! Now get off my lawn!
Sorry. That was Old Man Grumpy talking.
Actually, I really do think electronic books are a great idea. Who wouldn’t like to keep hundreds of books stored on one convenient, easy-to-read digital device? For me, that is the main attraction of electronic readers like Amazon.com’s Kindle. You start to worry when you have so many books that they threaten to bury you beneath an avalanche.
Unfortunately, owning a book downloaded from Amazon isn’t the same as owning a book. You can’t resell it, and you can’t let someone borrow it without also letting them borrow your Kindle. And I wouldn’t take the chance of a friend accidentally dropping my $360 reader into a jar of Vegenaise.
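The reason is that the book is locked to the device: the file is encrypted, and only your reader holds the key. Here is a toy Python illustration of that general idea, built on the Fernet cipher from the third-party “cryptography” package. To be clear, this is a sketch of the principle, not Amazon’s actual DRM scheme:

```python
from cryptography.fernet import Fernet, InvalidToken

# In real DRM the key is buried in the device; here we just generate one.
my_reader = Fernet(Fernet.generate_key())

# "Buying" the book delivers ciphertext usable only with my reader's key.
locked_book = my_reader.encrypt(b"Call me Ishmael.")
print(my_reader.decrypt(locked_book))  # readable on my device

# A friend's reader holds a different key, so lending the file is useless.
friends_reader = Fernet(Fernet.generate_key())
try:
    friends_reader.decrypt(locked_book)
except InvalidToken:
    print("Wrong device key. No book for you.")
```

The file you “bought” is unreadable without a key you don’t control, which is the whole point.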
Making matters worse, some publishers seem bent on limiting Kindle’s features.
Amazon’s new Kindle 2 includes a text-to-speech feature that allows it to read aloud any book loaded onto it. That feature, however, scared publishers and the Authors Guild, which claimed text-to-speech would threaten sales of audio books.
Not everyone was concerned. Bestselling author and recent Newbery Medal recipient Neil Gaiman blogged that when you buy a book, “you’re also buying the right to read it aloud, have it read to you by anyone, read it to your children on long car trips, record yourself reading it and send that to your girlfriend etc.”
Kindle’s text-to-speech isn’t a threat to audio books, he concluded, because audio books are, in effect, performance art. A Kindle reading a book to you in a robotic voice isn’t the same as the vocal performance you get with a true audio book.
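If you want to hear the difference for yourself, any off-the-shelf speech synthesizer will do. Here is a minimal sketch using pyttsx3, an open-source text-to-speech library for Python. The Kindle runs its own engine, of course; this is just an illustration of what machine-read prose sounds like:

```python
import pyttsx3  # offline text-to-speech; install with "pip install pyttsx3"

engine = pyttsx3.init()
engine.setProperty("rate", 150)  # words per minute; steady, metronomic pacing
engine.say("It was a pleasure to burn.")
engine.runAndWait()  # blocks until the sentence has been spoken
```

The delivery is flat and tireless: fine for getting through a chapter, nothing like an actor’s performance.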
But rather than fight a protracted copyright battle, Amazon gave in, offering rights holders the option of disabling text-to-speech on selected books.
The publishing industry is repeating the music industry’s mistakes. Music companies lost money while they tried to fight and then restrict music downloads. Only in the past year have the major recording labels learned their lesson: When customers buy digital media, they don’t want restrictions.
Ironically, Amazon helped pioneer music downloads free of the Digital Rights Management software that limits what you can do with a song after you’ve purchased it. After Amazon started selling DRM-free songs, industry leader iTunes followed. Now downloads are the music industry’s one area of sales growth.
By caving to the demands of publishers and the guild, Amazon may be hurting itself. Kindle books already come with use restrictions you don’t find with other electronic books. The Internet is full of DRM-free books you can download — many in the public domain and free of charge. Author and blogger Cory Doctorow gives away entire novels online, which he says helps drive sales of his physical books.
Doctorow’s sales benefit, no doubt, from the fact that a lot of people still like to own real books. But as devices like the Kindle improve, and reading an electronic book becomes more like reading a real one, that may not remain the case.
Book publishers should concentrate on finding business models that don’t risk alienating their customers. Otherwise, they could end up playing catch-up just like the music industry.