To never confront the possibility of getting lost is to live in a state of perpetual dislocation. If you never have to worry about not knowing where you are, then you never have to know where you are. —Nicholas Carr, The Glass Cage
One time the internet went down at the library and it was like the Apocalypse. Patrons at the public computers and on their laptops saw their pages not loading and came to the desk to ask what was wrong. We told them the internet was down, that it would have to be restarted and would be up again in a few minutes. Most were satisfied by this, if a bit peeved, and waited for order to be restored. But it was a serious outage and the wait was much longer than usual, so people at the computers just left. The lab, usually almost or completely full, was a ghost town.
Just as I was thinking (a bit judgmentally) how odd it was that people who temporarily didn’t have internet would just leave instead of using other parts of the library (like, you know, books ‘n’ stuff), I realized that the library catalog was down too. Without this mechanism that we use to search for items and get their call number for retrieval, I was librarianing in the dark. If someone came to the desk looking for a specific book, I had no way of a) knowing if we had it and it was on the shelf, or b) where it was among the thousands of books neatly lining the stacks before me. I knew generally where books on certain topics were—sports in the 790s, the 200s had religion, and so on—but without a specific call number I’d have to navigate the sea of spines book by book until by providence or luck I found the item within a reasonable amount of time.
The internet was restored, the computer lab filled again, and the catalog came back to life. No crises came to pass during this internet-less interlude, but I did wonder if I knew as much as I thought I did. Did I, as a librarian, rely too heavily on access to the online catalog to do my job? Without internet connectivity or hard-copy reference material, would we still be able to provide the information people ask for every day? Even looking up a phone number is much more easily done on Google than through a paper trail.
The times we’re not connected to the internet are becoming less and less frequent, so existential crises like mine don’t have to last long. But the questions lingered as I read Nicholas Carr’s new book, The Glass Cage: Automation and Us. It asks questions that apply not only to libraries but to every facet of our lives: do humans rely too heavily on technology? And if so, what is that reliance doing to us? Well-versed in the effects of technology on human behavior, Carr, author of The Shallows and The Big Switch, posits that automated technology, though responsible for many improvements in industry, health care, transportation, and many other areas, can also degrade our natural skill-learning abilities and generate a false sense of security in technology that aims (yet often fails) to be perfect in an imperfect world.
Carr points to two phenomena that, taken separately or together, exacerbate our worst human tendencies and stunt the mental and physiological growth required for mastering complex tasks. Automation complacency, writes Carr, “takes hold when a computer lulls us into a false sense of security. We become so confident that the machine will work flawlessly, handling any challenge that may arise, that we allow our attention to drift.” Exhibit A: the opening library anecdote. Knowing that the online catalog will reliably provide the information I need when I ask for it, I’m far less likely to retain useful knowledge, despite its usefulness and the pleasure I get from learning it.
The second phenomenon is automation bias, which occurs “when people give undue weight to the information coming through their monitors. Even when the information is wrong or misleading, they believe it. Their trust in the software becomes so strong that they ignore or discount other sources of information, including their own senses.” I’ve experienced this too. One time a patron asked for the phone number of a business; because I was dealing with multiple things at once, I provided the first number that came up on Google without confirming its validity through another source, like the business’s website or the Yellow Pages. Turns out that number was outdated and the search engine hadn’t indexed the new one yet. But because I’d done that before with numbers that were accurate, I trusted Google in that moment for expediency’s sake, when I should have been more discerning.
Whatever technological tools Carr cites—airplane autopilot, assembly-line manufacturing, GPS—the theme that emerges is that after a certain point, the more automated technology takes away from humans, the more we lose. This runs counter to the utopian belief that the iPhones and Google Glasses and self-driving cars of the world make our lives better by making them easier, that by ceding difficult tasks to machines we will be able to focus on more important things or use that extra time for leisure.
To some extent that is true, but there’s also a dark side to this bargain. By abdicating power over how we interact with the world, we stop being doers with agency over our skills and trades and become monitors of computer screens—supervisors of fast, mysterious, and smart machines that almost always seem to know more than us. This dynamic puts us at cross-purposes with the tools that should be working with us and for us, not in our stead. Humans’ greatest ability, writes Carr, is not to cull large amounts of data and make sense of complex patterns: “It’s our ability to make sense of things, to weave the knowledge we draw from observation and experience, from living, into a rich and fluid understanding of the world that we can then apply to any task or challenge.”
Automated tools like GPS, which we rely upon and follow without much question, take away that innate ability of sense-making and even dampen our desire to make our own observations based on first-hand experience. I should replace “we” with “I” here, because I struggle greatly with navigation and love being able to know where I am and get to where I’m going. But navigation is more than following the blue line directly from point A to point B as if A and B are the only data points that matter. The point of navigation is the map itself, the ability to make assessments based on acquired knowledge and turn that knowledge into informed action. When a computer does all that for us in a microsecond, then what’s the point of knowing anything?
Ominous implications like this are the star of The Glass Cage, which casts a discerning eye on the assumptions, implicit and explicit, that govern our relationship with technology. It’s a relationship that can be fruitful and healthy for everyone involved, but it also needs some work. Thankfully, Nicholas Carr has done the work for us in The Glass Cage. All we have to do is sit back and receive this knowledge.