Tag Archives: Nicholas Carr

Favorite Books of 2014


My favorite books from 2014 are all nonfiction, a thoroughly unsurprising result of it being way easier for me to get through a 700-page historical tome than a 200-page novel. Sorry, novels: this year it was especially true that the truth is stranger and more fascinating than fiction.

Deep: Freediving, Renegade Science, and What the Ocean Tells Us about Ourselves by James Nestor

I wrote about Deep at ThinkChristian and will keep writing about it to get people to read it. Despite submerging to depths few humans can withstand, Nestor only breaks the surface of what there is to know about the ocean and the people who explore it. He nimbly interweaves his experience learning how to freedive, which is like scuba diving sans equipment, with science of the deep and what we’ve yet to illuminate about the dark depths of our world.

Five Came Back: A Story of Hollywood and the Second World War by Mark Harris

Good to see this getting love from other year-end lists. The adept synthesizing Harris did in his first book, Pictures at a Revolution, shows up again in Five Came Back, which follows five top Hollywood directors through their unique wartime experiences. They encountered nearly every major part of the war, at home and abroad, and brought back hard-won lessons and personal experiences that informed and molded their postwar work.

The Glass Cage: Automation and Us by Nicholas Carr

Wrote about this in October. It’s important to convey that Carr doesn’t think automation is bad (Alan Jacobs makes this clear in his review at Books & Culture), only that we have to make sure that it doesn’t make us worse off. Because there’s so much automation can do for us, it’s easy to start ceding other things to it without considering the consequences. Carr provides a good foundation for that consideration.

The Hard Way on Purpose: Dispatches from the Rust Belt by David Giffels

A series of essays on living in Akron, heart of the Rust Belt and perpetual underdog. Giffels writes about LeBron James, the Cleveland Browns, Chuck Taylor, about watching all his friends leave and the travails of Ohio living. Midwesterners who have seen their town, however big or small, decay amidst the wreckage of industrialization and unforgiving weather will find something familiar and bittersweet in Giffels’ writing.

What If? Serious Scientific Answers to Absurd Hypothetical Questions by Randall Munroe

I just got this at a used bookstore because I couldn’t resist. Owning it will give me a chance to better absorb the wonderfully rendered comic scenarios and Munroe’s dry humor, which I first devoured in one sybaritic sitting. Never before had I considered what would happen if someone tried to hit a baseball pitched at 90 percent the speed of light, but thanks to this book I now know. Great fodder for book groups and coffee tables of nerds.

The Glass Cage

To never confront the possibility of getting lost is to live in a state of perpetual dislocation. If you never have to worry about not knowing where you are, then you never have to know where you are. —Nicholas Carr, The Glass Cage

One time the internet went down at the library and it was like the Apocalypse. Patrons at the public computers and on their laptops saw their pages not loading and came to the desk to ask what was wrong. We told them the internet was down, that it would have to be restarted, and that it would be back up in a few minutes. Most were satisfied by this, if a bit peeved, and waited for order to be restored. But it was a serious outage and the wait was much longer than usual, so people at the computers just left. The lab, usually almost or completely full, was a ghost town.

Just as I was thinking (a bit judgmentally) how odd it was that people who temporarily didn’t have internet would just leave instead of using other parts of the library (like, you know, books ‘n’ stuff), I realized that the library catalog was down too. Without this mechanism that we use to search for items and get their call number for retrieval, I was librarianing in the dark. If someone came to the desk looking for a specific book, I had no way of knowing a) whether we had it and it was on the shelf, or b) where it was among the thousands of books neatly lining the stacks before me. I knew generally where books on certain topics were—sports in the 790s, religion in the 200s, and so on—but without a specific call number I’d have to navigate the sea of spines book by book until by providence or luck I found the item within a reasonable amount of time.

The internet was restored, the computer lab filled again, and the catalog came back to life. No crises came to pass during this internet-less interlude, but I did wonder if I knew as much as I thought I did. Did I as a librarian rely too heavily on access to the online catalog to do my job? Without internet connectivity or hard-copy reference material, would we still be able to provide the information people ask for every day? Even looking up a phone number is much more easily done on Google than through a paper trail.

The times we’re not connected to the internet somehow are becoming less and less frequent, so existential crises like mine don’t have to last long. But the questions lingered as I read Nicholas Carr’s new book, The Glass Cage: Automation and Us. It asks questions that apply not only to libraries but to every facet of our lives: do humans rely too heavily on technology? And if so, what is that reliance doing to us? Well-versed in the effects of technology on human behavior, Carr, author of The Shallows and The Big Switch, posits that automated technology, though responsible for many improvements in industry, health care, transportation, and many other areas, can also degrade our natural skill-learning abilities and generate a false sense of security in technology that aims (yet often fails) to be perfect in an imperfect world.

Carr points to two phenomena that, taken separately or together, exacerbate our worst human tendencies and stunt the mental and physiological growth required for mastering complex tasks. Automation complacency, writes Carr, “takes hold when a computer lulls us into a false sense of security. We become so confident that the machine will work flawlessly, handling any challenge that may arise, that we allow our attention to drift.” Exhibit A: the opening library anecdote. Knowing that the online catalog will reliably provide the information I need when I ask for it, I’m much more liable not to retain useful knowledge, despite the value of retaining it and the pleasure I get from learning it.

The second phenomenon is automation bias, which occurs “when people give undue weight to the information coming through their monitors. Even when the information is wrong or misleading, they believe it. Their trust in the software becomes so strong that they ignore or discount other sources of information, including their own senses.” I’ve experienced this too. One time a patron asked for the phone number of a business; because I was dealing with multiple things at once, I provided the first number that came up on Google without confirming its validity through another source, like the business’s website or the Yellow Pages. Turns out that number was outdated and the search engine hadn’t indexed the new one yet. Because I’d done that before with numbers that were accurate, I trusted Google in that moment for expediency’s sake when I should have been more discerning.

Whichever technological tool Carr cites—airplane autopilot, assembly-line manufacturing, GPS—the theme that emerges is that, after a certain point, the more automated technology takes away from humans, the more we lose. This runs counter to the utopian belief that the iPhones and Google Glasses and self-driving cars of the world make our lives better by making them easier, that by ceding difficult tasks to machines we will be able to focus on more important things or use that extra time for leisure.

To some extent that is true, but there’s also a dark side to this bargain. By abdicating power over how we interact with the world, we stop being doers with agency over our skills and trades and become monitors of computer screens—supervisors of fast, mysterious, and smart machines that almost always seem to know more than us. This dynamic puts us at cross-purposes with the tools that should be working with us and for us, not in our stead. Humans’ greatest ability, writes Carr, is not to cull large amounts of data and make sense of complex patterns: “It’s our ability to make sense of things, to weave the knowledge we draw from observation and experience, from living, into a rich and fluid understanding of the world that we can then apply to any task or challenge.”

Automated tools like GPS, which we rely upon and follow without much question, take away that innate ability of sense-making and even dampen our desire to make our own observations based on first-hand experience. I should replace “we” with “I” here, because I struggle greatly with navigation and love being able to know where I am and get to where I’m going. But navigation is more than following the blue line directly from point A to point B as if A and B are the only data points that matter. The point of navigation is the map itself, the ability to make assessments based on acquired knowledge and turn that knowledge into informed action. When a computer does all that for us in a microsecond, then what’s the point of knowing anything?

Ominous implications like this are the star of The Glass Cage, which casts a discerning eye on the assumptions, implicit and explicit, that govern our relationship with technology. It’s a relationship that can be fruitful and healthy for everyone involved, but it also needs some work. Thankfully, Nicholas Carr has done the work for us in The Glass Cage. All we have to do is sit back and receive this knowledge.

Wait…

The Glass Cockpit


Is the Internet making us smarter or stupider? It’s a question Q the Podcast recently tackled in a lively and in-depth debate between lots of smart and interesting people. There is enough evidence to support both sides of the debate. But what I concluded after listening to the show was that for all of the doomsday talk about the technologies and processes that have become embedded in our digitized culture within the last decade or so, how we use the Internet is ultimately not up to the Internet.

No matter how enticing the apps and social networks we frequent, how addicting the silly games we enjoy, or how efficient the tools we use, there is still a human being making decisions in front of a screen. So while I certainly sympathize with those who profess addiction (willing or otherwise) to tweeting or checking Facebook, I remind everyone who uses technology of any kind of Uncle Ben’s famous maxim: “With great power comes great responsibility.” We as autonomous, advanced-brain human beings have the power to do or not to do things. It’s a great power to have, but it also requires perseverance. The allure of instant gratification the usual Internet suspects provide won’t be defeated easily. It takes a willpower heretofore unknown to modern peoples. It takes resolve to fight temptation that is equal to or greater than the temptation itself.

Do you have what it takes? Do I? Eh, it’s day to day.

But flipping this entire argument on its head is Nicholas Carr’s recent article in The Atlantic called “All Can Be Lost: The Risk of Putting Our Knowledge in the Hands of Machines,” which delves into the burgeoning world of automation. He writes about how we’ve become increasingly reliant on computers to perform more elaborate and complicated tasks that had previously been done by humans. The benefit of this is that we’re able to get tasks done quicker and more efficiently. The downside is that some human services are no longer required, which means the skills needed to perform those services are eroding.

Carr uses the example of airplane pilots, who have been increasingly relegated to monitoring digital screens (the “glass cockpit”) as the computers do the heavy lifting, with pilots only sometimes taking the plane’s reins. While the usefulness of autopilot is obvious, when computers take away control of the primary functions of flying, they are also taking away the neurological and physiological skills pilots have honed over years of flying.

This is a problem, says Carr, because “knowing demands doing”:

One of the most remarkable things about us is also one of the easiest to overlook: each time we collide with the real, we deepen our understanding of the world and become more fully a part of it. While we’re wrestling with a difficult task, we may be motivated by an anticipation of the ends of our labor, but it’s the work itself—the means—that makes us who we are.

Computer automation, he says, disconnects the ends from the means and thereby makes getting what we want easier without having to do the work of knowing. This just about nails social media, doesn’t it? It’s so easy to get what we want these days that the work we used to have to do is no longer required of us. To research a paper in college, one had to go to the physical library, pull out a physical book, and transcribe quotes by hand; now a quick Google search and copy-paste will get that done in a jiff (or is it GIF?).

This isn’t a bad thing. I’m thankful that many tasks take eons less time than they used to. (I mean, typewriters are cool, but they’re not very amenable to formatting or mistakes.) My point is it’s important to understand how and why we use technology the way we do, and to acknowledge that we have agency over that use. To disregard that agency is to refuse to accept responsibility for our own power. And we know what happens then.