Thursday, January 29, 2009
The never-ending saga for America’s transition to digital television just got a little more never-ending.
The U.S. Senate voted Monday for a four-month delay that would encourage but not require over-the-air TV stations to continue broadcasting both analog and digital signals until June 12. House Republicans blocked the measure Wednesday, but Democrats plan to bring it up again and have the votes to pass it. After June 12, all stations would finally turn off their analog signals.
President Barack Obama made the delay a top priority upon taking office, arguing that the original Feb. 17 deadline would leave millions of Americans unable to receive over-the-air TV broadcasts.
Despite a year of public service announcements and a government program to subsidize people buying analog-to-digital converter boxes for their old TV sets, it seems a lot of people still haven’t gotten the message. And some who did get it are out of luck.
The converter boxes range in price from $40 to $80, and the Commerce Department has run out of money for the coupon program intended to help people pay for them. The program’s funding limit is $1.34 billion, but the number of people on the waiting list for the $40 coupons has reached 2.6 million. Dangle “free” money around, and the world will beat a path to your door.
This, however, is just the latest screw-up in the long and convoluted transition to digital television. Congress and the Federal Communications Commission have been bungling the process for 20 years. Concessions to broadcasters delayed DTV implementation, leaving consumers with little incentive, until recently, to buy DTV-ready TV sets or converter boxes.
Those concessions are on top of the fact that the government gave additional broadcast spectrum to the stations free of charge.
One rule of economics is that people care more about something when they have to pay for it. But according to one estimate, TV broadcasters received up to $100 billion worth of prime broadcast real estate for nothing. That certainly didn’t give TV stations an incentive to quickly implement DTV and offer high-quality digital programming that would entice viewers to buy digital televisions. Instead, most stations just sat on that spectrum for years.
Now that broadcasters are finally ready to go all-digital, this latest delay has some of them asking for even more concessions. PBS chief executive Paula Kerger said Monday that continuing to broadcast in analog until June 12 would cost PBS affiliates $22 million, and she wants the federal government — meaning taxpayers — to pick up the tab.
As with most things the government does, there was a better way.
Congress and the FCC could have abandoned the fiction that the broadcast spectrum is and must remain a public resource, which broadcasters merely use so long as they promote “the public interest,” which is whatever politicians say it is. Broadcast spectrum is valuable property. Congress could have auctioned it off, forcing broadcasters to pay market prices. That would have ensured the spectrum went to broadcasters prepared to exploit it as quickly and as fully as possible.
Then, Congress could have allowed broadcasters to auction off their old analog spectrum to recoup some of their costs. As it is, as soon as broadcasters shut off their analog signals, that spectrum will go to wireless services and local public safety agencies. But there is no reason why TV stations couldn’t have sold those frequencies to wireless companies or back to the government. That would have been a fair concession to broadcasters.
Of course, this mess means nothing to the vast majority of TV viewers. More than 80 percent get their TV via cable or satellite. This expensive, drawn-out process is about the 20 percent of viewers who still rely on rabbit ears. And that number would be smaller if Congress and the FCC hadn’t spent most of the past 40 years restricting pay-TV’s growth and stifling competition — yet another gift to over-the-air broadcasters.
Thursday, January 22, 2009
Obama is not America’s first celebrity president
There already was a fine line between being a politician and being a celebrity, but President Barack Obama has just erased it.
If the millions of people who turned the National Mall into a giant mosh pit during Tuesday’s inaugural ceremonies weren’t evidence enough, you only have to look to the thousands more who filled stadiums last year to hear candidate Obama speak. It’s difficult to tell the difference between a politician and a rock star when you have people fainting in the audience.
You may chalk that up to “hope” or “change you can believe in,” but more likely it’s simply the culmination of the obvious: Presidential elections have, for a long time, been mostly high-stakes popularity contests. Instead of getting a recording contract, you get to be the most powerful man in the world.
William Henry Harrison won the election of 1840 with the help of a song, “Tippecanoe and Tyler Too.” The song’s title doubled as his campaign slogan. Even back then, branding was everything, although it helped that the Democrats that year were too busy beating up each other to worry about Harrison and the Whig Party.
Franklin Pierce won the election of 1852 in a landslide, aided in part by a campaign biography written by his old pal and college roommate Nathaniel Hawthorne, whose novel “The Scarlet Letter” has tortured high school students for decades. Say what you will about Pierce’s presidency, his political propaganda had the best pedigree ever. The closest President Obama could come to topping that was writing two autobiographies of his own.
Even if he isn’t the first celebrity president, President Obama has played the part better than any of his predecessors — although I have heard that President James Buchanan could really tell a joke. He has even suggested CNN medical correspondent Dr. Sanjay Gupta as the nation’s surgeon general. What’s next? “America’s Most Wanted” host John Walsh as FBI director?
One thing is certain. The new president is well on his way to being the most commercialized public figure since Elvis Presley. You can’t watch television without being bombarded by advertisements for Obama commemorative plates and laser-dyed Obama coins. That’s in addition to the ubiquitous T-shirts, posters and tote bags. Can velvet paintings of the commander in chief be far behind?
But it gets stranger. The New York Times found Obama hot sauce, Obama soap and even Obama toilet paper, which is just decorative because the ink on the paper is toxic. What would Surgeon General Gupta have to say about that? Nothing good, I bet.
Then it gets downright insane. If a T-shirt isn’t enough of a statement, you can always wear the Obama thong, although hopefully not in public. And then, the Times reported, there is the Obama sex toy, which, given recent federal court decisions, is illegal under Alabama law, unless used for a legitimate medical purpose, whatever that means.
The world hasn’t seen such a marketing blitz since George Lucas unleashed the first batch of “Star Wars” lunchboxes and bed sheets.
At this rate, how long will it be before presidential elections are determined by text messaging, just like “American Idol” winners are? Some people have already proposed allowing Americans to vote for president via the Internet, which isn’t much different.
Other countries have the same idea. According to The Hollywood Reporter, “Jeopardy!” host Alex Trebek will soon be back in his native Canada to host a reality show called “Canada’s Next Prime Minister.” Will the winner become a serious contender? Who knows? But the Canadian Broadcasting Corp. has already licensed the show’s format to the BBC, so Great Britain won’t be far behind.
When voters start following candidates from campaign stop to campaign stop — like fans following the Grateful Dead — we’ll know the age of celebrity presidents has truly arrived.
Thursday, January 15, 2009
Poe gains respect in time for his 200th birthday
He died Oct. 7, 1849, four days after being discovered delirious and “in great distress” on the streets of Baltimore. Poe never regained his senses, so he never said how he came to be wandering the streets alone. He was 40 years old, and his cause of death remains mostly speculation. Newspaper reports at the time attributed it, more or less, to alcoholism. Thus began the legend of Poe — the tortured, self-destructive, unfairly maligned and often misunderstood artist.
Poe himself, however, began 200 years ago Monday.
He was born Jan. 19, 1809, in Boston. His parents died when he was young, and John and Frances Allan of Richmond, Va., took him in. The couple never adopted him, but they did give him his middle name, and because of his time growing up in Virginia, Poe always thought of himself as a Southern gentleman. The Southern literary establishment reciprocated by claiming Poe as one of its own.
While some critics debate just how Southern Poe was, there seems little question that the bleak, gothic style of Poe’s poems and stories fits well with the Southern mindset after the Civil War. The crumbling castles and corrupt bloodlines of Poe are a non-regional counterpart to the wrecked plantations and ruined families of the South. Obvious comparisons between Poe and William Faulkner are the stuff of which college and high school term papers are made.
But it was much later before the rest of America’s literary establishment — the critics and the college professors — gave Poe his due.
Poe did himself no favors. He disliked most of the powerful and influential literary figures of his day, except Nathaniel Hawthorne. He practically dared later critics to take sides against him.
Sure, the French have always loved Poe, but they also think Jerry Lewis is a comic genius. So, what do they know?
Stephen King counts Poe as an influence. So did H.P. Lovecraft, the early 20th century pulp writer who admired Poe above all other American writers.
In 2003, King received the National Book Foundation’s medal for distinguished contribution to American letters, which dismayed self-appointed keepers of the American literary canon like Yale professor Harold Bloom. Meanwhile, the Library of America recently published a collection of Lovecraft’s stories, admitting Lovecraft into the very canon Bloom so loves to defend.
In the 1960s, director Roger Corman made a series of films loosely based on Poe’s stories and starring Vincent Price. Some critics now rank Corman and Price’s Poe adaptations — considered cheap, drive-in fare at the time — among the finest horror films ever produced.
Poe’s own reputation has undergone a similar rehabilitation, and perhaps it’s because his critics have finally caught up to him. They now have a healthy appreciation for a finely crafted weird tale filled with horror and despair.
But Poe has never needed the appreciation of the literary gatekeepers to capture his readers’ imaginations. And even some people who have never read his poems can quote the first lines of his most famous work, “The Raven.”
“Once upon a midnight dreary…”
So, on Monday, the 200th anniversary of Poe’s birth, a 60-year-old ritual will continue. A person concealed beneath a heavy coat and hat will visit Poe’s grave in Baltimore and leave three roses and a half-empty bottle of cognac. It’s a ritual begun by one of Poe’s admirers and carried on still, supposedly as a family tradition.
Few people have ever tried to discover the mysterious mourner’s identity. Knowing would spoil the mystery, and mystery has been part of Poe’s legend since the day he was laid to rest.
Thursday, January 08, 2009
Critical favorites undergo changes for ’09
The new year means change, and television is no exception. Three critically acclaimed TV shows are either winding down, with an eye toward spin-offs, or undergoing radical makeovers.
First is “Scrubs,” which returned this week for its eighth and possibly final season. The hospital-based comedy had been a staple on NBC but is now on ABC. I guess NBC has used up the good karma it earned by picking up the final season of “Taxi” in 1982 after ABC canceled it.
“Scrubs” has had a good run, and this slightly abbreviated season of 18 episodes should give it a satisfying conclusion. I know other critics love the American version of “The Office” more, but I’ll nominate “Scrubs” as the funniest half hour on television since Fox canceled “Arrested Development.”
Still, even after this year, “Scrubs” could continue.
The show’s star, Zach Braff, is definitely leaving. Creator and executive producer Bill Lawrence is probably moving on, too. But there is persistent talk in Hollywood of either a spin-off or a continuation of the series with a new cast.
I think it’s best to let “Scrubs” end now. Eight seasons is a lot for a TV show — especially a good one, because the good die young.
Besides, the last time a medical show mixing comedy and drama spawned a spin-off, we got “After M*A*S*H.” And I don’t think anybody wants another “After M*A*S*H.”
Meanwhile, I’m awaiting the return of “Battlestar Galactica” on Jan. 16 on the Sci-Fi Channel.
The mid-season cliffhanger was the most shocking — and depressing — of the series’ run, which has always been full of dramatic and daring plot twists.
The show’s producers have promised more of the same for the series’ final 10 episodes, which will close out arguably the best science fiction series ever aired on American television. Who would have expected so much from “Galactica” given the show’s pedigree? The current version of “Galactica,” which won a Peabody Award in 2005, is in an entirely different universe than its cheesy 1970s counterpart.
To get viewers ready, Sci-Fi is posting 10 mini-episodes on its Web site, Scifi.com. Also, the first half of the current season, season 4, was released Tuesday on DVD.
What’s next after “Battlestar Galactica”? Sci-Fi has set a 2010 premiere for “Caprica,” a prequel series set decades before the events of “Galactica.” Prequels are usually a bad idea (for example, the “Star Wars” prequels), but the early footage from “Caprica” looks promising.
The biggest news of the past week, however, came from Britain, where the BBC announced the 11th actor to take on the lead role of “Doctor Who,” the longest-running science fiction series on television.
Twenty-six-year-old Matt Smith will become the youngest actor to play the 900-year-old Time Lord, replacing David Tennant, who became one of the series’ most popular Doctors, probably second only to Tom Baker, who held the role from 1974 to 1981. In Britain, picking a new Doctor is more important than picking a new James Bond.
Smith has a short but impressive resume, but his hiring has raised about the same amount of controversy as Auburn University’s latest pick for head football coach. Fortunately, I have confidence in incoming “Doctor Who” producer Steven Moffat, writer of most of the best “Doctor Who” stories since the series’ return in 2005.
Like soccer, “Doctor Who” is something the entire English-speaking world obsesses over — except the United States. The show was a cult phenomenon here during its original run (1963-1989). But the revived series has gained a larger following after airing on Sci-Fi and BBC America. And if any sci-fi series can challenge “Galactica” for best ever, it’s not “Star Trek.” It’s “Doctor Who.”
Tennant will remain the Doctor for a few specials throughout 2009, but in 2010 the team of Smith and Moffat will take the long-lived program into its sixth decade.
Thursday, January 01, 2009
This New Year’s, resolve to fight pessimism
It’s the start of a new year, a time when most of us are hopeful about what the future will bring.
Unfortunately, that hope tends to fade quickly, and not only in tough times. Even though America is the “land of opportunity,” most Americans — in fact, most people everywhere — are pessimistic about the long run.
That is the bad news. The good news is our pessimism is unfounded.
Bryan Caplan, a professor of economics at George Mason University in Virginia, calls this “pessimistic bias.” It is one of four biases he cites in his 2007 book, “The Myth of the Rational Voter,” as leading to bad public policy.
Caplan shows that people are generally too pessimistic about long-term economic trends. In short, they believe things are bad when they’re actually good, and they believe things are terrible when they’re merely bad. This is despite overwhelming evidence that, in the long run, economic conditions are improving overall.
Long-run improvement wasn’t always the case. Throughout most of history, living conditions barely improved at all.
In his recent book “A Farewell to Alms,” Gregory Clark, chairman of the economics department at the University of California, Davis, writes that for most of the last 100,000 years, living conditions stayed the same — awful. Even with improvements in technology, the arts and science, people ate poorly and died young. Short-term improvements in living standards led to larger populations that then overwhelmed available resources, causing living standards to drop. Over the long run, life for the vast majority of people remained nasty, brutish and short.
But in 1800, all of that changed. The Industrial Revolution made long-run improvements in living standards possible, at least in countries that had an industrial revolution. In the 21st century, we’re reaping the benefits of more than 200 years of economic growth.
Still, the pessimistic bias remains, with many people and interest groups predicting the End of the World is just around the corner. Professional pessimists still cite books written in the early 1970s that predicted a worldwide economic collapse by the 1990s. Even the current recession isn’t that bad.
With the U.S. and most of the rest of the industrialized world in a recession, some measure of short-term pessimism is understandable. But as a whole, we’re almost always more pessimistic than we should be, whatever the economic conditions.
Yet, while the bias toward economic pessimism is important — especially if, as Caplan argues, it leads to voters voting irrationally — it’s not our only pessimistic bias.
Various studies have found that people overestimate their cancer risk, think divorce rates are increasing when the opposite is true and greatly overestimate the threat of terrorist attacks following 9/11. Pessimistic bias is everywhere.
French researchers have found that people are most pessimistic when they think they have no control over a situation. Thus, many have a pessimistic bias against airplane travel, even though flying is far safer than being behind the wheel of a car.
Caplan’s George Mason colleague Tyler Cowen devotes a chapter of his 1998 book, “In Praise of Commercial Culture,” to cultural pessimism, the belief that art, music and culture as a whole are always getting worse.
“The later we stand in history, the more likely that our favorite cultural products will lie in the relatively distant past,” he writes. “The passage of time implies that the entirety of the past contains an increasing amount of culture relative to any single point in time, such as the present. Cultural pessimism therefore appears increasingly persuasive over time.”
Persuasive? Yes. But not justified, as Cowen shows.
So, this New Year’s Day, resolve to hang on to that optimism you’re feeling. Even when everything around you seems to be falling apart, chances are at least some of that pessimism that taunts you is just in your head.