I am quite eager to comment on the recent explosion of e-commentary regarding Nicholas Carr’s new book. Bloggers have already done an excellent job summarizing the responses to Carr’s argument. Further, Clay Shirky and Jonah Lehrer have both argued convincingly that there’s not much new about this sort of reasoning. I’ve also argued along these lines, using the example of language itself as a radical departure from pre-linguistic living. Did our predecessors worry about their brains as they learned to represent the world with odd noises and symbols?
Surely they did not. And yet we can also be sure that the brain underwent a massive revolution following the acquisition of language. Chomsky’s linguistics would of course obscure this fact, preferring us to believe that our linguistic abilities are an amalgamation of things we already possessed: vision, problem solving, auditory and vocal control. I’m not going to spend too much time arguing against the modularist view of cognition, however; chances are, if you’re here reading this, you’re already pretty convinced that the brain changes in response to cultural adaptations.
It is worth sketching out a stock Chomskyan response, however. Strict nativists like Chomsky hold that our language abilities are the product of an innate grammar module. Although typically agnostic about the exact source of this module (it could have been a genetic mutation, for example), nativists argue that the plasticity of the brain has no potential other than slightly enhancing or degrading our existing abilities. You get a language module, a cognition module, and so on, and you don’t have much choice as to how you use that schema or what it does. On this view, the development of language wasn’t something radically new that changed the brains of its users, but rather a novel adaptation of capacities we already had, and still have.
To drive the point home, it’s not surprising that notable nativist Steven Pinker is on record as simply not buying the ‘changing our brains’ hypothesis:
“As someone who believes both in human nature and in timeless standards of logic and evidence, I’m skeptical of the common claim that the Internet is changing the way we think. Electronic media aren’t going to revamp the brain’s mechanisms of information processing, nor will they supersede modus ponens or Bayes’ theorem. Claims that the Internet is changing human thought are propelled by a number of forces: the pressure on pundits to announce that this or that “changes everything”; a superficial conception of what “thinking” is that conflates content with process; the neophobic mindset that “if young people do something that I don’t do, the culture is declining.” But I don’t think the claims stand up to scrutiny.”
Pinker makes some good points: I agree that a lot of the hype is driven by the kinds of thinking he mentions. Yet I do not at all agree that electronic media cannot and will not revamp our mechanisms for information processing. In contrast to the nativist account, I think we have better reason than ever to suspect that the relation between brain and cognition is not 1:1 but dynamic, evolving with us as we develop new tools that stimulate our brains in unique and interesting ways.
The development of language massively altered the functioning of our brains. Given the ability to represent the world externally, we no longer needed to rely on perceptual mechanisms in the same way. Our ability to discriminate among various types of plants, or sounds, is clearly inferior to that of our non-linguistic brethren. And so we come full circle: the things we do change our brains. It is also the case that our brains are incredibly economical. We know, for example, that only hours after limb amputation, somatosensory neurons invade the newly dormant cells, reassigning them rather than letting them die off. The brain is massively plastic; Nicholas Carr certainly gets that much right.
Perhaps the best way to approach this question is with an excerpt from social media. I recently asked my fellow tweeps:
To which an astute follower replied:
Now, I do realize that this is really the central question in the ‘shallows’ debate. Granting the basic fact that our brains are quite plastic, we can all readily accept that we’re subjecting ourselves to some very intense stimulation. Most social media users, and general internet users, shift rapidly from task to task, tweet to tweet. In my own workflow, I may open dozens and dozens of tabs, searching for the one paper or quote that can propel me to a new insight. Sometimes I get confused and forget what I was doing. Yet none of this interferes at all with my ‘deep thinking’. Eventually I go home and read a fantastic sci-fi book like Snow Crash. My imagination of the book is just as good as ever, and I can’t wait to get online and start discussing it. So where is the trade-off?
So there must be a trade-off, right? Tape a kitten’s eyes shut and its visual cortex is reassigned to other sensory modalities. The brain is a ruthless economist, and if we’re stimulating something new we must be losing something old. Yet what did we lose with language? Perhaps some vestigial abilities to sense and smell. Yet we gained the power of the sonnet, the persuasion of rhetoric, the imagination of narrative, the ability to travel to the moon and murder the earth.
In the end, I’m just not sure the internet provides the right kind of stimulation for such a trade-off. We’re not going to lose our ability to read. In fact, I think I can make an extremely tight argument against the specific hypothesis that the internet robs us of our ability to deep-think. Deep thinking is itself a controversial topic. What exactly do we mean by it? Am I deep thinking if I spend all day shifting between 9 million tasks? Nicholas Carr says no, but how can he be sure those 9 million tasks are not converging on a central creative point?
I believe, contrary to Carr, that internet and social media surfing is a unique form of self-stimulation and expression. By interacting together in the millions through networks like Twitter and Facebook, we’re building a cognitive apparatus that, like language, does not function entirely within the brain. By increasing access to information, and the customizability of that access, we’re ensuring that millions of users have access to all kinds of thought-provoking information. In his book, Carr says things like ‘on the internet, there’s no time for deep thought; it’s go go go’. But that is only one particular usage pattern, and it ignores ample research suggesting that posts online may in fact be more reflective and honest than in-person utterances (I promise, I am going to do a lit-review post soon!).
Today’s internet user doesn’t have to conform to whatever Carr thinks is the right kind of deep thought. Rather, we can ‘skim the shallows’ of Twitter and Facebook for impressions, interactions, and opinions. When I read a researcher, I no longer have to spend years attending conferences to get a personal feel for their work. I can instead look at their Wikipedia article, read the discussion page, and see what’s being said on Twitter. In short, skimming the shallows makes me better able to choose the topics I want to investigate deeply, and lets me learn about them in whatever temporal pattern I like. YouTube with a side of Wikipedia and blog posts? Yes please. It’s a multi-modal, whole-brain experience that isn’t likely to conform to ‘on/off’ dichotomies. Sure, something may be sacrificed, but it may not be. It might be that digital technology has enough of the old (language, vision, motivation) plus enough of the new that it just might constitute, or bring about, radically new forms of cognition. These will undoubtedly change our cognitive style, perhaps obsoleting Pinker’s Bayesian mechanisms in favor of new, digitally referential ones.
So I don’t have an answer for you yet, ToddStark. I do know, however, that we’re going to have to take a long hard look at the research reviewed by Carr. Further, it seems quite clear that there can be no one-sided view of digital media. It’s no more intrinsically good or bad than language. Language can be used to destroy nations just as it can tell a little girl a thoughtful bedtime story. If we’re too quick to make up our minds about what internet cognition is doing to our plastic little brains, we might miss the forest for the trees. The digital media revolution gives us the chance to learn just what happens in the brain when it’s got a shiny new tool. We don’t know the exact nature of the stimulation, and finding out is going to require a look at all the evidence, for and against. Further, it’s a gross oversimplification to talk about internet behavior as ‘shallow’ or ‘deep’. Research on usage and usability tells us as much: there are many ways to use the internet, and some of them probably get us thinking much more deeply than others.