It’s not a new concept: Technology is changing the way we think. But one prominent researcher at a recent conference discussed a more controversial idea: Technology could be moving us away from innovation and progress, and closer to the Stone Age in terms of how we process information—a scary thought, considering the country’s desperate call for 21st-century thinking.
This disturbing theory comes from Nicholas Carr, a Pulitzer Prize finalist and author of the New York Times bestseller The Shallows: What the Internet Is Doing to Our Brains. And at the 15th annual American Association of School Librarians (AASL) conference in Minneapolis, Minn., Carr emphasized that it’s not just adults who should be worried.
“Schools and libraries are good places to see a snapshot of the cultural mindset on digital issues and change, and what they’re showing us is that instant access to information is everywhere,” said Carr.
Carr began his opening keynote by relating his own experiences with technology and the internet, saying that one day he realized he was having a harder time concentrating on a single task.
“My mind wanted to jump around and not go word-to-word in a linear way. I thought: My mind wants to behave like the internet, like my smart devices,” he explained.
Carr then began to research why this brain pattern change could be happening. The answer, he found, lies in neuroscience and psychology.
According to Carr’s research, the internet is hardly the first invention to reshape the way our brains operate; historically, objects like the map and the clock did the same.
“With the map, our mind became more abstract. Instead of being focused on what we see visually, we began to think outside of our own sensory limitations. With the mechanical clock, our minds became more synchronized, in that we began to change our habits and tasks to coincide with the scientific measurement of time,” Carr explained.
Carr calls these inventions “intellectual technology,” or the tools we use to think—to find, store, organize, analyze, and share information and ideas. These tools change our patterns of thought.
Along with intellectual technology comes “intellectual ethic,” or the assumptions about the mind embedded in and spread by an intellectual technology—the medium’s “message,” said Carr.
“This occurs because the more our environment changes, the more our habits change, and the more our brains change to suit the environment, since your brain is always trying to be efficient. This malleability is called neuroplasticity,” he said.
The good, the bad, and the Stone Age
As we use technologies like smart phones and the internet, our brains are changing as well, and Carr argued that although we acquire skills, such as increased visual-spatial intelligence (being aware of many moving parts at once), we also weaken our “mindful knowledge acquisition,” inductive analysis, critical thinking, imagination, and reflection. [See “Can gaming change education?” and “Rethinking research in the Google era.”]
“Our brain is becoming saturated with information, and it’s becoming harder for us to hold onto meaningful information, if we can even pick out what’s meaningful anymore,” he explained.
The need to acquire many bits of information is nothing new, Carr said. Supposedly, early man needed to gather as much information as quickly as possible just to survive.
The brain releases dopamine, a chemical associated with pleasure, every time we receive new information, said Carr. The printed page and reading eventually changed that pattern, but now we’re reverting to those older habits.
For example, Carr quoted a recent study showing that the average adult spends 8.5 hours a day looking at screens, compared with just 20 minutes a day reading from printed pages.
“This is a problem, because our brain has a ‘bottleneck’ when we go back to these old habits, meaning that our working memory has a ‘cognitive overload’—which can negatively affect our long-term memory and our ability to evaluate information and distinguish what’s useful from what’s just trivia,” he said.
What does this mean for schools?
With the influx of technologies such as smart phones in the classroom and iPads for children as young as kindergartners, Carr said there’s reason to be concerned—and this puts more pressure on schools and librarians to help students learn the deeper skills of critical thinking, introspection, and analysis.
“For schools,” he explained, “it may be hard to cut back on technology, because with technology it’s very easy to calculate the benefits with immediate data and observations—for example, seeing increased attention to subjects, more participation, et cetera. It’s not always easy to define the negatives, such as long-term memory retention loss and an inability to analyze correctly.”
Therefore, Carr said, it’s critical for librarians and teachers to guide students to open-ended thinking and problem solving. Also, using real books for a change might not hurt, he said.
Carr closed his keynote with a quote from the late David Foster Wallace’s 2005 commencement speech at Kenyon College: “Learning how to think … means being conscious and aware enough to choose what you pay attention to and choose how you construct meaning from experience.”