Nicholas Carr, the Pulitzer Prize nominee, discusses the inherent ‘shallowness’ of Web 2.0 technologies and the troubling consequences for our brains
More than 50 years ago, one of Marshall McLuhan’s predictions was that the world would become ‘a tightly-networked global village marked by the return of tribalism’. Some would say he’s not that far off. Do you think anything about today’s hyper-connected environment would surprise him?
I certainly think McLuhan would be amazed by the fact that our telephones are now used more for exchanging text messages than voice messages, because he very much believed that electronic media would herald the end of the era of the written word and that we would move into a world of oral communication and rich images. While he was very prescient in many ways, there is a lot about our heavily technologized world that doesn’t quite fit with his predictions.
You have said that a technology’s ‘intellectual ethic’ has a profound effect on the way we think. What is the intellectual ethic of the Internet?
Inherent in any media technology – from the telephone to TV to Twitter – is an emphasis on some ways of thinking and a de-emphasis on others. If you look at the Internet, what it emphasizes is the ability to supply lots of information in many forms very quickly. As a result, it encourages us to browse through information in a similar way – by grabbing lots of bits of information in many forms simultaneously. What it doesn’t encourage is more attentive ways of thinking – the kind of thinking that requires us to shield ourselves from distractions and focus on something for a long time – the mode of thinking that underpins deep reading, contemplation, reflection and introspection. All of these ways of using our minds – which, to me, are very important – are de-emphasized by the Internet. We’re not practicing them as much anymore, and I worry that, as a society, we are in danger of losing them.
Describe the role long-term memory plays in intelligence, and the role the Internet is playing in its development.
To understand how the Internet affects us, you really have to look at how our brains work – and in particular, at the way intelligence is formed. Our working memory is basically the contents of our consciousness at any given moment – it’s what we are aware of as we go through our waking hours. The fundamental quality of working memory is that it has an incredibly small capacity. Back in the 1950s there was a famous paper published titled “The Magical Number Seven, Plus or Minus Two,” which argued that working memory could only hold about seven elements at any time. Today, even that is considered an exaggeration: it is now believed it can only hold about two to four pieces of information at a time.
What is crucial about building individual intelligence – particularly deep, conceptual knowledge and understanding – is the ability to move information that comes into our working memory over to our long-term memory, which is not constrained at all in terms of its capacity. The term for this is ‘memory consolidation’, and the Internet begins to disrupt this process by keeping our working memory overloaded. When we’re constantly taking in new bits of information – as we do when we browse the Web, check e-mail or read text messages on our cell phones – we take things in very quickly, and because our working memory has such a small capacity, it has to shepherd information in and out very quickly to make room for new incoming material. If your working memory is constantly overloaded and you never pay focused attention to one thing, you aren’t consolidating information into your long-term memory, and as a result, you’re not building all the mental connections between information and experiences and emotions that are essential to developing a rich intellect. Research shows that these connections are essential to conceptual thinking and critical thinking, and even to certain types of creative thinking.
You have said that the most unsettling economic phenomenon produced by the Internet is that, “The sharecroppers operate happily in an attention economy while their overseers operate happily in a cash economy.” Please explain.
The metaphor of sharecropping applies to a lot of the social media that we have seen arise over the last six years -- often collectively termed ‘Web 2.0’. In the United States after the Civil War, a lot of plantation owners began to allow sharecroppers to farm on small plots of land on their plantations. However, the plantation owners basically took all of the economic value this created, by renting out the tools to the sharecroppers and taking a large amount of the crops they produced. In a similar way, when you look at sites like YouTube or Facebook, these huge companies are enabling people to create information of various sorts and distribute it through their sites; but all of the economic value created -- usually from advertising -- goes back to the company rather than to the individuals who are actually drawing people there.
There is an element of exploitation to this model, but it doesn’t feel like exploitation to the people who are engaged in social media, because they aren’t interested in money so much as attention and the ability to express themselves and communicate with others. If you look at the economic value of each individual’s work on these sites, it’s actually quite trivial; where it becomes sizable is when you aggregate all of those little bits of media production and sell them to advertisers, as these big companies are doing. This is a new form of media business that, on one hand, sells the ideal of ‘social production’ but, on the other hand, uses that ideal to make a lot of money without funnelling it back to the content producers.
You believe that computing is turning into a utility, much like electricity, and that as this happens, it will send ‘shock waves’ across society: “If the electric utility helped create the vast middle class, the computing utility may help destroy it.” Please explain.
One hundred years ago, if a company wanted to use electricity to power its machinery, it had to build and operate its own electric generator -- so that’s what hundreds of factories did. This was the basis for Thomas Edison’s business, which ultimately became General Electric. Then we saw the arrival of the electric utility, and suddenly companies didn’t have to run their own generators anymore. The generation and distribution of electricity was centralized into huge utilities, and companies, as well as individuals, could simply plug in to the electric grid. I think what we’ve seen over the last 10 years is a similar shift in the nature of computing. At one time, every company had to run its own data centre to operate all of its applications.
Individuals had to keep all their software and their data on their own hard drive inside their PC. But that mode of computing is now shifting to much more of a centralized or utility model: increasingly, companies and individuals simply go to the Internet and tap into big, centralized data centres to store their data, to gain access to data and to run applications. This is commonly referred to as ‘cloud computing’, and I believe it represents as fundamental a technological shift as we saw with the centralization of electricity.
After the building of the electric grid, all sorts of changes took place in business and society: people brought new appliances into their homes and companies used abundant electricity to refashion their manufacturing processes and so forth. I think we’re starting to go through a similar, radical kind of transformation with the arrival of the cloud computing industry. More and more companies are beginning to say, we don’t have to buy our servers and data storage gear and so forth; we can just plug into the Internet and get it all from central suppliers. Likewise, individuals increasingly don’t go out and buy software programs, they just go online and download them from these centralized services.
And how will this affect employment?
One of the many consequences of the arrival of the electric grid was a growth in employment, particularly industrial employment. While lots of craftspeople lost their jobs as businesses tapped into this cheap power to build large, centralized manufacturing plants and distribute their goods across large areas, the number of manufacturing jobs created was far greater than the number lost. Unfortunately, we are not seeing a similar scenario with the rise of centralized and cheap computing. We appear to be seeing a net loss of jobs, because cheap computing allows so much work to be automated. It’s still too early to say whether this trend will hold; we might end up seeing a large growth in middle-class jobs, but personally, I’m very concerned – as are many economists – that the growth in cheap, centralized computing is going to destroy more middle-class jobs than it creates.
You believe that the ‘wisdom-of-crowds effect’ has been exaggerated. Please explain why.
To date, all the major examples of how the Internet enables crowds of people to produce goods indicate that crowds are very good at some things, but not others. For instance, in the Linux open-source model, crowds have been shown to be very good at finding bugs in massive amounts of software code; and with Wikipedia, crowds have been shown to be very good at editing individual encyclopedia entries. What this means is that a company can use crowds of online people to do these sorts of routine, repetitive tasks – which used to be very expensive because they required a lot of labour. So by enabling crowds, you can produce things that were more difficult to produce before.
However, what online crowds don’t seem to be very good at is innovation. We haven’t seen much evidence that crowds of people can come up with great new products or business ideas in the way that talented individuals and small groups can. And crowds of people don’t seem to be very good at ‘polishing’ products, either – taking a good product and getting it into the final form required to bring it to market and have it succeed. There too, you need individuals and small groups of people working together. You can’t compare, for instance, open-source software applications with the types of consumer applications that Apple produces by hiring very talented software engineers and designers. I think it’s very easy to exaggerate the capabilities and the impact of crowds of online volunteers or contributors, and it’s important for business leaders to recognize that crowds can never replace talented individuals when it comes to creating value for a business.
Most people don’t have the option of ‘disconnecting’ from Web 2.0, because it is so essential to modern work and social life. For those who want to maintain a capacity for deep thinking and human understanding, what is your advice?
The Internet isn’t going anywhere, but if people value calmer, more contemplative thought, they need to change the way they use the technology. The goal should be to clear considerable portions of your day for working, conversing, thinking and playing without the mediation (or the interruption) of screen technologies. Once you start doing that – and I would emphasize that even modest cutbacks in connectivity are difficult to pull off -- I think you’ll fairly quickly begin to have a better perspective on the proper role for digital media in your life. You’ll realize that we often reach for a computer or a smartphone out of laziness, boredom or habit, and that resisting that temptation can be healthy, particularly for the depth of our intellectual and social lives.
Nicholas Carr’s most recent book, The Shallows: What the Internet Is Doing to Our Brains, was nominated for the 2011 Pulitzer Prize. He is also the author of The Big Switch: Rewiring the World, from Edison to Google (2008) and Does IT Matter? (2004). A columnist for The Guardian in London, he has written for The Atlantic, The New York Times, The Wall Street Journal, Wired and The Financial Times, and sits on the board of the World Economic Forum’s cloud computing project. Read his blog at roughtype.com.
[This article has been reprinted, with permission, from Rotman Management, the magazine of the University of Toronto's Rotman School of Management]