By Jennifer O’Connell
In the last 24 hours, I have read roughly 57,626 words.
I know this because I kept a diary.
For a while now, I’ve had a niggling suspicion that I’m not reading anymore; that my brain has responded to the cacophony of noise and chatter to which it is constantly subjected by sinking into a sullen torpor.
It’s like my hard drive has been trying to run too many applications at once, and now it’s telling me to take myself off into a quiet corner to press Ctrl/Alt/Delete.
I can remember the precise moment when it started.
I was sitting around a table in a radio studio in Montrose, taking part in a discussion hosted by Ryan Tubridy. The panel included former education minister Gemma Hussey. We were talking about the internet in general, and social networking tools in particular.
‘‘Where does anyone find the time?" Hussey wanted to know.
‘‘So many people seem to spend so much time sitting in front of a screen clicking buttons, does it mean they have a lack of inner resources? Does it mean there’s nothing else going on in their heads, or their lives? How would you ever read a decent book?"
Afterwards, her disapproving words echoed in my head. Was she right?
Like many people - journalists in particular - I am always on my laptop, and my right thumb twitches uneasily if it’s not attached to my iPhone.
It is also incontrovertibly the case that I don’t consume novels at the prolific rate I once did.
Am I becoming an empty-headed exhibitionist? Could too much technology be making me stupid?
The diary proved one thing: I am reading more words than ever before. It’s how I’m reading that’s the problem.
My reading matter over the last 24 hours comprised a dozen pages of a non-fiction book I’m reviewing; four feature-length articles from a newspaper, and a couple of pages of Grazia magazine; some research notes culled from the internet; a handful of e-mails; part of a free sample of Wolf Hall, Hilary Mantel’s book on Thomas Cromwell, which I downloaded via the new Kindle app for iPhone; a recipe for raspberry tiramisu; a short poem I came across on the internet; the children’s book, Tyrannosaurus Drip by Julia Donaldson; and somewhere between 500 and 600 Twitter updates, each of up to 140 characters.
And yet. Fifty-seven thousand words is most of a short novel. I wouldn’t expect to read a short novel and instantly forget most of it. But somehow, I’ve managed to absorb only superficially - or not at all - the vast bulk of what I’ve read this weekend. I do remember most of what was in the research notes, and I know Tyrannosaurus Drip off by heart. Otherwise, there is a vast expanse of nothingness.
It seems quite likely that I am suffering from a phenomenon identified 18 months ago by the American writer Nicholas Carr in an essay in the Atlantic Monthly magazine, in which he wrote how ‘‘immersing myself in a book or a lengthy article used to be easy. My mind would get caught up in the narrative or the turns of the argument, and I’d spend hours strolling through long stretches of prose. That’s rarely the case any more. Now my concentration often starts to drift after two or three pages. I get fidgety, lose the thread, begin looking for something else to do."
In Carr’s essay, he went on to question whether the internet might have detrimental effects on cognition, diminishing our capacity for concentration and contemplation.
His question tapped into fears harboured secretly - and not so secretly - by many people, from parents and educators to technology addicts and technophobes, and from RTE radio presenters and former education ministers to Daily Mail readers.
Nick Shackleton Jones, the manager of online and informal learning at the BBC, wrote recently how he believes that ‘‘it is likely that there are quite dramatic changes in brain development occurring on a massive scale as a result of the sudden introduction of pervasive technologies which change the nature of communication’’.
These effects might, he wrote, include ‘‘areas such as attention-span, distractibility, memory, mood and self-control’’.
Is this what has happened to me? In reality, we can’t know if Shackleton Jones, or Hussey, or Carr is right - at least, not until there is time for some conclusive, longitudinal research to be carried out on the topic.
But some early studies do appear to bear out some of these fears.
A study of online habits conducted at University College London found that students were increasingly ‘‘power-browsing’’ articles, rather than reading them in depth.
Over a five-year period, the research team pulled data from logs which documented the behaviour of visitors to two websites providing access to academic journals, articles, and e-books.
They found that people using the sites exhibited ‘‘a form of skimming activity’’ - hopping from one source to another, rarely returning to pages they’d previously visited.
They typically read a page or two from any article or book, before apparently getting distracted and moving on.
‘‘It almost seems that they go online to avoid reading in the traditional sense," the researchers concluded.
Part of the problem may be that technology allows us to fool ourselves that we are getting lots done - the more windows you have open on your laptop, the more productive you feel.
But a study conducted at Stanford University in the US found that what it called ‘high-tech jugglers’ are, in fact, a lot less effective than old-fashioned single-taskers.
‘‘People who are regularly bombarded with several streams of electronic information do not pay attention, control their memory or switch from one job to another as well as those who prefer to complete one task at a time," the study found.
The researchers put 100 students through a series of three tests, and concluded that ‘heavy media multitaskers’ are suffering from what they called a ‘filter failure’.
‘‘They’re suckers for irrelevancy," said Communications Professor Clifford Nass, one of the researchers involved. ‘‘Everything distracts them."
The Stanford researchers are still investigating whether chronic media multitaskers are simply born with an inability to concentrate, or - more worryingly - have damaged their cognitive control by willingly taking in so much at once.
Another oft-quoted study by Microsoft points the finger of blame at the latter. It found that someone distracted by an e-mail message alert took an average of 24 minutes to return to the same level of concentration.
Anne Collins, a senior librarian at the Business Information Centre in the Ilac Library in Dublin, has noticed our increasingly fickle attention spans translating into diminishing patience on the part of library users, and a desire for ever more instant results.
‘‘I think people are less inventive about how they look for information - they expect to get everything very much instantaneously, and if they don’t, they get frustrated and may be inclined to give up. Often, they have little idea how to search for anything beyond typing a word into Google," she says.
‘‘Students come into the library now and it sometimes hasn’t entered their heads to look in a book for answers."
The notion that technology has changed the way we access and absorb information is hardly a revolutionary one. But could it also be affecting us at a deeper level? Could it - as Carr suggests in his essay - be changing not just how we look for information, but our cognitive processes as well?
Until 20 years ago, researchers used to believe that the brain was static, and that it was on an inexorable downward spiral throughout our lives. We now know that this isn’t true - the human brain is actually incredibly elastic.
Numerous studies have proven that experience can shape brain anatomy and physiology.
What this means is that we are capable of developing not just new synapses - the connections between neurons - in response to stimulus, but entirely new neurons, or brain cells, right into adulthood.
The brain’s capacity to grow and learn is, as Emily Dickinson put it, ‘‘wider than the sky’’.
So it is not unthinkable, then, that the brain could reprogramme itself to better suit the qualities favoured by this new, more superficial way of learning.
Of course, there’s always another possibility. Human beings - especially those of a more mature vintage - have an almost unbroken record of heralding every new development as though it marked the end of civilisation as we knew it - particularly when it’s a development more likely to appeal to the young.
In Plato’s time, Socrates bemoaned the development of writing. He feared that, as people came to rely on the written word as a substitute for the knowledge they used to carry inside their heads, they would ‘‘cease to exercise their memory and become forgetful’’; they would also be able to pass themselves off as ‘‘very knowledgeable when they are, for the most part, quite ignorant’’.
When Gutenberg developed the printing press in the 15th century, there were similar concerns that the easy availability of books would lead to intellectual laziness, making men ‘‘less studious’’ and weakening their minds.
Jane Ruffino, a journalist and specialist in the history and archaeology of cartography, points out that many of the same suspicions surrounded the advent of widespread mapping in the 17th century; and again ‘‘later when photography came in, people were afraid that it would destroy the experience of seeing things. In fact, what it did was to free painting up to become more abstract’’.
Douglas Adams perhaps described the phenomenon best in his 2002 book, The Salmon of Doubt: ‘‘Anything that is in the world when you’re born is normal and ordinary and is just a natural part of the way the world works. Anything that’s invented between when you’re 15 and 35 is new and exciting and revolutionary and you can probably get a career in it. Anything invented after you’re 35 is against the natural order of things."
So could this be merely an extension of the hysteria that greeted the arrival of everything from the first printed pamphlets to the domestic television set and the sight of Elvis’s gyrating hips in living rooms around America?
Because, despite all these threats to our established ways of knowing, human civilisation has somehow survived and prospered. We made space for the new way of doing things, without abandoning the old.
Inevitably, some things were lost along the way - much valuable local knowledge vanished with map-making; storytelling skills were compromised by the development of printing; television did indeed encroach on family mealtimes.
But much, much more was gained.
I want to remember what the world was like before there was Twitter, or Wikipedia, or YouTube, or blogging, or Google, or LexisNexis, or Facebook, or LinkedIn; before even there was Windows or e-mail. Before all that, if you were a journalist, there were really only two information sources: other people and the library.
Though it’s hard to imagine now, journalists used to spend a lot of time in libraries. Those with romantic notions of the pre-technology era will point to all the things we have lost since we stopped using libraries as our primary source of information: time; space; peace; opportunities for reflection and unexpected discoveries. The chance to let those synapses do their thing.
As a newly-minted reporter, I passed many hours in the Ilac Business Library, digging through extensive cuttings files for background information on the week’s profile subject, or searching one of the early, subscription-based online resources. The library was our Google.
I am half-surprised to find that it’s still open. In fact, it’s not merely open - it’s positively kicking.
According to senior librarian Anne Collins, libraries have (contrary to what you might believe) prospered in the age of the internet.
‘‘Last year, we answered 50,000 enquiries from members of the public. In 2008, and again in 2009, almost 500,000 people came to the library. In 2008, they borrowed 188,000 books. In 2009, that number had grown to 218,500," she says.
‘‘So yes, people are still reading; they’re still getting their information from traditional sources as well as from online ones."
Collins says her job these days is less about digging out obscure book titles and more about helping people navigate the vast universes of information now available to them.
‘‘Indisputably, technology has enhanced all our lives. But it does also mean that students have lost some of their research skills. So our role these days is much more about helping them to develop the tools to find what they’re looking for; and to learn how to assess the quality of various information sources. Weeding out what’s a quality source and what’s not is a huge part of what we teach them. We won’t do the research for them, but we will teach them how to research."
Despite this, Collins is almost unequivocally positive about the changes technology has made to the role of libraries.
‘‘Do libraries still have a role to play? Certainly. There’s more of a role than ever, but it’s an increasingly hand-in-hand role."
Universities have always been early adopters when it comes to internet-based technologies.
So it’s perhaps no surprise that dire predictions heralding the end of learning and the dawning of an age of stupidity leave the director of NUI Galway’s Centre for Excellence in Learning and Teaching wholly unruffled.
Iain McLaren thinks there is too much emphasis on the perceived downsides of technology in education: the superficiality of learning, poor concentration, the temptation to plagiarise.
‘‘It’s true that there’s more information than ever available to students, and when there’s so much information, there’s a pressure on them to be a bit more superficial. And yes, students scavenge like everyone else when they’re on a deadline," he says.
‘‘But absorbing lots of information quickly is a skill that’s important in certain environments - for example, when they start work, they might have to write a report quickly, using lots of different sources."
McLaren points out that most universities, including his own, have sophisticated plagiarism software systems to ensure that ‘researching’ doesn’t become ‘cutting and pasting’.
‘‘We use a programme which scans essays for any information that has been cut and pasted from anywhere else on the web, or from any other essay ever submitted by a student in any university which uses the same system. So yes, Google has made it easier to plagiarise - but technology has also made it much, much easier to detect."
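The article doesn’t name the software McLaren’s university uses, but the core idea behind many such text-matching tools can be sketched in a few lines. The snippet below is a hypothetical illustration, not the actual system: it compares an essay against a single source by counting shared word n-grams (‘shingles’), a basic building block of text-similarity detection.

```python
def ngrams(text, n=5):
    """Split text into overlapping word n-grams (shingles)."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap_score(essay, source, n=5):
    """Fraction of the essay's shingles that also appear in the source.

    A score near 1.0 suggests whole passages were copied verbatim;
    paraphrased text breaks up the shingles and scores much lower.
    """
    essay_shingles = ngrams(essay, n)
    if not essay_shingles:
        return 0.0
    return len(essay_shingles & ngrams(source, n)) / len(essay_shingles)
```

A real detection system would compare each essay against an indexed corpus of web pages and previously submitted essays rather than a single source, but the matching principle is the same.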
McLaren believes that, rather than making learning more superficial, technology has helped to do the opposite.
‘‘Part of the art of being a student has always been to learn to be quite tactical about when you can get away with being superficial, and when you need to spend a bit of time researching something.
‘‘Our university uses a system which allows students to write online journals - blogs, essentially - which are only accessible by the students and their lecturers. The students use them to update their tutors on the progress of an essay, for example. So it means lecturers can be much more hands-on during the whole process - they can tell a student if they’re on the right track, or if they need to delve a bit deeper into something," he says.
‘‘It’s made the whole process much more inclusive and made it easier for students to approach their lecturers with a question or a problem."
McLaren suggests much of the suspicion surrounding the effect of new technology on learning is down to old-fashioned Luddism.
‘‘On the whole, students tend to be much more willing to adapt and experiment than their parents were. I sometimes think our fears are just a product of our minds being swamped by the possibilities it all offers," he says.
It is indisputable that technology has made education more accessible; there are clear benefits to opening up learning courses by using the internet as a platform.
Without ever having paid a cent in registration or tuition fees, you can download entire university courses across a wide range of subjects through iTunes or YouTube and watch them on your laptop or iPhone.
For teachers willing to experiment, technology has opened up exciting new possibilities in the classroom too.
In Scotland, Ollie Bray, a deputy head teacher and prolific blogger, is constantly blurring the lines between modern and traditional teaching methods. He has used Twitter as a way to gather real-time data on weather conditions during the recent freeze; employs Google Earth as a tool to teach geography; and Star Wars as an engaging way to teach climate regions to his students.
‘‘In my first year of teaching I had an autistic student in my geography class who was fascinated by Star Wars," he wrote on his blog. ‘‘Getting him interested in geography was, of course, a different matter. One day, I had a brainwave and taught the Standard Grade Climate Regions Unit in the style of Star Wars planets. The tundra became Hoth the ice planet, and so on.
‘‘It worked brilliantly, and the rest of the class enjoyed it as well. Although, I have to admit, the next unit on industry was a bit problematic to get them motivated for."
Like the headlines last year which declared that ‘Facebook causes cancer’, proclaiming that ‘technology makes us stupid’ seems like a hysterical overreaction based on not very much evidence.
Even evidence that once seemed solid appears less compelling as time goes by. The authors of an oft-cited 1998 study entitled Internet Paradox: A Social Technology That Reduces Social Involvement and Psychological Well Being?, which had discovered small but perceptible negative effects in social involvement and psychological wellbeing among internet users, have since changed their minds.
‘‘In a three-year follow-up of the original sample, we find that negative effects dissipated over the total period," they wrote recently.
‘‘We also report findings from a longitudinal study in 1998-1999 of new computer and television purchasers. This new sample experienced overall positive effects of using the internet on communication, social involvement and well-being."
Similarly, video games - the scourge of every parent of a teenager in the 1990s - have been found to have many positive benefits, including improved coordination and reaction times, says Piaras Kelly, an account director at PR firm Edelman and busy blogger who admits he is ‘‘online 24/7’’.
Kelly gets irritated when video games ‘‘are dismissed as children’s toys’’, and believes much of the negative press attention given to new media is being fed by intellectual snobbery.
‘‘I’m watching television shows and reading books I’d never have read without the internet - I just read something called Pride, Prejudice and Zombies - it’s like a remix of the original Pride and Prejudice. That may be hacker culture, but it’s another form of culture nonetheless - and it’s a book I’d never have read otherwise."
In Britain, the psychologist Tanya Byron was commissioned by the government to conduct a review of the studies looking at the effect of technology and video games on children’s brain development. She found ‘‘no clear evidence of desensitisation in children’’, noted that ‘‘children actively involved in sport play on consoles for the same amount of time as those who are not’’, and concluded that technology could be ‘‘specifically useful for those with learning difficulties and disabilities’’.
Of course, it is possible that the dizzying pace at which we are able to access and accumulate information now will eventually have some cognitive impact. Or perhaps we’ll just accept that we’re not very good multitaskers and get better at shutting down some of those windows occasionally.
But there is plenty of concrete evidence to suggest that, when used correctly, technology can actually be a way to enhance learning.
‘‘This idea that technology makes us do anything is to suggest that we have no agency, that we’re passive consumers - which is really ironic, because the whole thing is about being an active producer," says Jane Ruffino.
‘‘When these things first come in, we feel like we should lament what we’re going to lose. But we won’t lose anything unless we give it up willingly - nobody’s going to take away our books, or our maps. When we put things online, it becomes like an auxiliary brain. It has changed the role of memory, reduced the need for rote memorisation."
As Ruffino points out, in any case, we’ve had all these conversations before. ‘‘In the early 17th century, there was a huge surge in using an archive to represent the self," she says. ‘‘It was at that time that what eventually became the British Library was set up, that the early maps were being drawn up.
"It sparked all this debate about what you include and what you throwaway; how the self should be represented, what your public self was, and what your private self should be. It’s a really striking parallel to the question of having an offline identity and an online identity which we’re debating today in the context of Twitter and Facebook.
‘‘We’re having the same conversations now that we’ve been having for 450 years. And we still don’t have any answers."
This feature article first appeared in The Sunday Business Post on February 14, 2010