
Thursday, December 29, 2011

Culture Shock 12.29.11: Not everything was annoying in 2011

Once again, it's time to say goodbye and good riddance to another year.

As far as years go, 2011 isn't winning any awards, but at least it isn't the end of the world. For that, we have to wait until next December, or so I'm told by people who don't really understand how calendars work.

On Dec. 21, 2012, the Mayan calendar "runs out." This uneventful event is apparently regarded as cosmically significant by people who don't realize that our own Gregorian calendar runs out every Dec. 31 — at which point it cycles back to the beginning, as it has since it was introduced in 1582, replacing the Julian calendar, which ran out with virtually identical frequency, give or take a few doomsdays.

At this point, however, if the world did end, I'd be hard pressed to say we didn't have it coming.

When I look back at 2011 and see that one of the cultural high points was the return of "Beavis and Butt-Head" to MTV, I know the pickings are pretty slim.

Still, in keeping with the spirit of the season, here are a few things that did not annoy me in 2011:

The sixth season of "Doctor Who" was the best since the show's revival in 2005, and Matt Smith firmly established himself as my favorite Doctor since Tom Baker's tenure in the 1970s.

"House" is never going to be as good as the first three seasons were, but at least the current Cuddy-free season is an improvement over last year's. Nothing against departed co-star Lisa Edelstein, but after the writers decided to have House and Cuddy get together — and then break up — her leaving was the only thing that could save the show. It has been more than 20 years since Dave and Maddie's kiss of death on "Moonlighting," yet TV writers still tempt fate.

C'est la vie.

At the movies, the best of the best was Werner Herzog's 3-D documentary "Cave of Forgotten Dreams," which I reviewed a few weeks ago.

For superhero movies, it was a down year, with "X-Men: First Class" the best of the bunch, despite glaring flaws like January Jones' non-performance. I have higher hopes in 2012 for "The Avengers," but I'm worried about "The Dark Knight Rises," which seems dangerously close to taking the whole "taking Batman seriously" thing way too seriously.

Apart from "The Avengers," the two films I'm most anticipating are both prequels: Ridley Scott's "Alien" prequel "Prometheus," and Peter Jackson's return to Middle Earth with "The Hobbit: An Unexpected Journey," featuring Martin Freeman as Bilbo Baggins. (If you're honest, you'll admit "The Hobbit" is a much better story than the bloated "Lord of the Rings" trilogy.) Speaking of Martin Freeman, he and Benedict Cumberbatch return next year for a second season of "Sherlock," the BBC's modern-day version of Sherlock Holmes, from the creative team of Mark Gatiss ("League of Gentlemen") and Steven Moffat ("Doctor Who").

The best books I read in 2011 were mostly nonfiction: "The Tell-Tale Brain: A Neuroscientist's Quest for What Makes Us Human" by V.S. Ramachandran and "Pauline Kael: A Life in the Dark," Brian Kellow's biography of the still-influential New Yorker movie critic whose style the rest of us all lamely try to imitate.

I'll have to take an incomplete on Haruki Murakami's newly translated novel "1Q84," which I've just started and which is approximately the length of the Tokyo phone directory.

And lastly, on the music front, a word of advice: If you have a chance to see the Alabama Shakes perform live, take it. This little band, originally from Athens — as am I, so I confess a slight bias — is probably about to hit it big, and deservedly so.

For the Shakes, 2011 wasn't a bad year at all. But 2012 will be even better.

Thursday, July 21, 2011

Culture Shock 07.21.11: No, the Internet is not making you stupid

Google the question "Is Google making us stupid?" and you'll get roughly 640,000 results.

Like every new technology, the Internet is going through that phase when it gets the blame for everything wrong with the world.

The Internet exploded into the mainstream in the mid-1990s, so it's been a very long phase. But 15 or so years is nothing. Fifty years after Newton N. Minow, then-chairman of the Federal Communications Commission, called television a "vast wasteland," TV still takes heat for all sorts of societal ills, from youth violence (even though youth violence has been steadily going down since the 1970s) to childhood obesity (which you could easily blame on any sedentary activity, like reading a book).

The claim that the Internet is making us stupid got a level-up this past week after publication of a study showing that the Internet is actually changing how we think and remember things.

Now, as it happens, the authors of the study said, explicitly, that their findings did not — I repeat, not — mean the Internet is making us stupid.

"I don't think Google is making us stupid — we're just changing the way that we're remembering things," lead author Betsy Sparrow of Columbia University told the BBC. "If you can find stuff online even while you're walking down the street these days, then the skill to have, the thing to remember, is where to go to find the information."

Basically, our brains are getting better at knowing where and how to find stuff, but at the expense of just remembering stuff, and that's because we're unconsciously training, and rewiring, our brains to work that way by how we use the Internet.

For some people, this is a self-evidently bad thing.

Yet if so, why stop there? The Internet isn't the first technology to alter the workings of our brains.

Centuries ago, our ancestors had fantastic memories that put ours, even pre-Internet, to shame. Bards and poets in the ancient Greek world could recite, from memory, epic tales of gods and heroes that, when written down today, run on for hundreds of pages. True, they used repeating lines and other mnemonic tricks to keep everything straight, but they really did have memories built for recalling a lot of information.

If behavior can rewire the human brain, then that's exactly what you'd expect, because the Greeks who lived in the dark ages between the fall of the Bronze Age city-states and the rise of classical Athens were, by and large, illiterate. This is when bards created the oral tradition of epic poetry that would eventually find its way into print as "The Iliad" and "The Odyssey."

The finished works we have today, attributed to the blind poet Homer, may not have been written down until the classical period, centuries after their composition began. Before that, these unwritten stories were passed down by memory, bard to bard, generation to generation.

Arguably the greatest, most influential works of Western literature were composed by illiterates, who relied not on writing but on their remarkable recall.

So, if you're really worried about technology robbing people of their ability to remember things, start with writing. That's where the trouble all began.

Thursday, October 22, 2009

Culture Shock 10.22.09: Gap between art, science no longer so wide


If I'd been better at math in high school, I probably would have become a scientist.

When I was young, I was obsessed with cosmology and paleontology, which is a fancy way of saying I was really into dinosaurs and outer space. While all of my classmates were reading "Where the Wild Things Are," I was reading elementary-level books about biology and astronomy.

I taught myself to spell by memorizing dinosaur names and the nine (now eight) planets of the solar system.

In any case, my math skills were less than stellar, so eventually I gravitated from science geek to art geek. Still, I've tried to maintain something more than a layman's knowledge of science.

The conventional wisdom is that art and science are incompatible ways of looking at the world, separated by a Grand Canyon of misunderstanding, distrust and outright hostility.

Certainly that's the impression one gets from 19th century poet John Keats, who lamented that Isaac Newton had "destroyed the poetry of the rainbow by reducing it to a prism."

The Romantic movement, to which Keats belonged, was a reaction to the scientific, rationalist philosophy of the Enlightenment, which had taken hold in Europe and America a century earlier. Today, in some circles, it's still fashionable to badmouth the Enlightenment, which, besides the scientific method, gave us the Declaration of Independence.

Newton, however, is getting his revenge. Neuroscientists and biologists are increasingly close to having a scientific understanding of why we make art in the first place.

Neuroscientist V.S. Ramachandran has speculated that some people are capable of creating great art because their brains are better wired than most for metaphorical thinking.

They may have more neural pathways, for example, between the part of the brain used for speech and the part used for identifying color. That could lead to a better ability to make associations between things that don't seem all that related, which is what poets do all the time.

The membrane separating science and art is not quite as impermeable as most people think, and it allows travel in both directions.

Albert Einstein was arguably engaged in artistic thinking when he imagined himself on a beam of light and thereby unlocked the door to a new understanding of space and time. The metaphor came first, the equations later.

More to the point, science could become as much an inspiration to artists as religion and mythology have been.

There is some evidence of that already. John Boswell has begun a project he calls Symphony of Science, online at www.symphonyofscience.com. So far, he has created two music videos spliced together from clips of Carl Sagan, Stephen Hawking, Richard Feynman and other scientists. It's art for the Information Age, inspired by science.

Who knows? Boswell's work could end up inspiring a new generation of artists. Or even scientists, depending on their math scores.

Thursday, March 12, 2009

Children’s brains may evolve for Information Age

If TV doesn’t rot your child’s brain, the Internet will, according to the latest scary speculation.

Susan Greenfield, an Oxford University neuroscientist, says exposure to social networking Web sites like Facebook and Twitter, along with video games and fast-paced TV shows, may be rewiring children’s brains.

I can think of two reasonable responses to this claim: “So what?” and “So what?”

At one level, this is exactly the sort of thumb-sucking response that has always greeted new media. Since the 1930s, do-gooders have blamed jazz, comic books, rock ’n’ roll, television, and now video games and the Internet for everything wrong with “the children” — even when there isn’t anything wrong with children.

Like Paul Lynde in the Broadway musical “Bye Bye Birdie,” they ask, “Why can’t they be like we were, perfect in every way? What’s the matter with kids today?”

Juvenile crime across the board is down compared to 10 years ago. Fewer teens are having sex, and the ones who do are more likely to use protection. Fewer teens are getting abortions.

OK. I give. What exactly is the matter with kids today? If online social networking is bad for them, it hasn’t shown up in the statistics yet.

But let’s assume MySpace and Facebook really are rewiring children’s brains. In fact, I would be surprised if that were not the case. The real question is “Why is this a problem?”

This isn’t the first time the environment has rewired our brains. Before our ancestors developed written languages, they had excellent memories. But writing has taken away our ability to remember epics like “The Iliad.” Still, that seems a good trade-off.

Adults often complain that change happens too quickly and that the world is becoming too fast-paced. If children’s brains are adapting to the pace of modern society, that seems like a net plus. A brain that is better at filtering information from many different sources in a short span of time is a brain that is better suited to the Information Age.

How fast-paced is the world today? One useful measure is the length of the average movie shot. An average shot in a major motion picture today is 2 or 3 seconds long. By comparison, film critic Roger Ebert notes, the average shot in “Citizen Kane” (1941) is 11.4 seconds. By today’s standards, “Pulp Fiction” is a slow movie, which isn’t surprising given that Quentin Tarantino is, in many respects, an old-fashioned filmmaker. The average shot in “Pulp Fiction” is 7.9 seconds.

There is a question of cause and effect here. Are movies contributing to children’s shorter attention spans by rewiring their brains? Maybe, but I doubt that’s the whole answer. I suspect, instead, that movies are mainly catering to the shorter attention spans that have resulted from the quickening pace of life in general, with movies being a small part of that.

Phil Edholm, chief technology officer at Nortel, created a minor stir last year when he suggested that Attention Deficit Hyperactivity Disorder, at least in its mild form, may be an evolutionary adaptation for multitasking in the digital era. Parents of ADHD children were not amused. But maybe there is some truth to his speculation.

Scientists at Northwestern University have discovered that a genetic variation associated with ADHD seems to help people in nomadic cultures survive. The variation also seems to encourage novelty-seeking behavior. The researchers speculate that some ADHD-associated traits may have been beneficial in our prehistoric past.

If modern society is taking on attributes that make it more like our hunter-gatherer past — for example, greater mobility and a faster pace than agricultural societies — then genetic traits that were advantageous in the past could reassert themselves.

If so, that’s not cause for alarm. That might be just evolution.