Tag: technology

  • The Glass Cockpit


    Is the Internet making us smarter or stupider? It’s a question Q the Podcast recently tackled in a lively and in-depth debate between lots of smart and interesting people. There is enough evidence to support both sides of the debate. But what I concluded after listening to the show was that for all of the doomsday talk about the technologies and processes that have become embedded in our digitized culture within the last decade or so, how we use the Internet is ultimately not up to the Internet.

    No matter how enticing the apps and social networks we frequent, how addictive the silly games we enjoy, or how efficient the tools we use, there is still a human being making decisions in front of a screen. So while I certainly sympathize with those who profess addiction (willing or otherwise) to Tweeting or checking Facebook, I remind everyone using technology of any kind of Uncle Ben’s famous maxim: “With great power comes great responsibility.”

    We as autonomous, advanced-brain human beings have the power to do or not to do things. It’s a great power to have, but it also requires perseverance. The allure of instant gratification the usual Internet suspects provide won’t be defeated easily. It takes a willpower heretofore unknown to modern peoples. It takes resolve to fight temptation that is equal to or greater than the temptation itself.

    Do you have what it takes? Do I? Eh, it’s day to day.

    But flipping this entire argument on its head is Nicholas Carr’s recent article in The Atlantic called “All Can Be Lost: The Risk of Putting Our Knowledge in the Hands of Machines,” which delves into the burgeoning world of automation. He writes about how we’ve become increasingly reliant on computers to perform more elaborate and complicated tasks that had previously been done by humans. The benefit is that we’re able to get tasks done more quickly and efficiently. The downside is that some human services are no longer required, which means the skills needed to perform those services are eroding.

    Carr uses the example of airplane pilots, who have been increasingly relegated to monitoring digital screens (the “glass cockpit”) as the computers do the heavy lifting and the pilots only sometimes take the plane’s reins. While the usefulness of autopilot is obvious, when computers take away control of the primary functions of flying they are also taking away the neurological and physiological skills pilots have honed over years of flying.

    This is a problem, says Carr, because “knowing demands doing”:

    One of the most remarkable things about us is also one of the easiest to overlook: each time we collide with the real, we deepen our understanding of the world and become more fully a part of it. While we’re wrestling with a difficult task, we may be motivated by an anticipation of the ends of our labor, but it’s the work itself—the means—that makes us who we are.

    Computer automation, he says, disconnects the ends from the means and thereby makes getting what we want easier without having to do the work of knowing. This just about nails social media, doesn’t it? It’s so easy to get what we want these days that the work we used to have to do is no longer required of us. To research a paper in college, one had to go to the physical library and pull out a physical book and transcribe quotes by hand; now a quick Google search and copy-paste will get that done in a jiff (or is it GIF?).

    This isn’t a bad thing. I’m thankful that many tasks take eons less time than they used to. (I mean, typewriters are cool, but they’re not very amenable to formatting or mistakes.) My point is it’s important to understand how and why we use technology the way we do, and to acknowledge that we have agency over that use. To disregard that agency is to refuse to accept responsibility for our own power. And we know what happens then.


  • Bad Tesseractors

    Remember in The Avengers when it was revealed that Selvig, a scientist Loki brainwashed to do his bidding, had programmed a failsafe measure into the device he had created to harness the power of the Tesseract, and that failsafe was the villain Loki’s own scepter? Imagine that scenario with the good and evil dynamic reversed and you’ve got a pretty good idea of the new revelation, courtesy of ProPublica and The New York Times, that the NSA has been circumventing many of the encryption and security tools put in place to protect online communications from prying eyes.

    NSA agent.

    For this disclosure we can thank former NSA contractor and current Russian tourist Edward Snowden, whose data dump contained documents that uncovered the NSA’s secret efforts to undermine Internet security systems using code-breaking supercomputers and collaboration with tech companies to hack into user computers before their files could be encrypted.

    The most nefarious aspect of this revelation, however, is the NSA’s attempt to “introduce weaknesses” into encryption standards used by developers that would allow the agency to hack into computers more easily. So now, not only has the NSA flouted basic civil rights and U.S. law, they’re simply playing by their own rules. They couldn’t win the right to insert a “back door” into encryption standards in their 1990s court battles, so they gave the middle finger to the law and tried again anyway, but this time in secret. It’s a betrayal of the social contract the Internet was founded on, says engineer Bruce Schneier, and one that needs to be challenged by engineers who can design a more secure Internet and set up proper governance and strong oversight.

    The worst part of all this is that there’s probably some twisted legal justification for this somewhere. Starting in Bush’s administration and continuing into Obama’s, the dark world of “homeland security” has received both tacit and explicit approval from the executive, legislative, and judicial branches for its increasingly Orwellian surveillance techniques — all in the name of “national security.” I’m sure there’s a lot of good being done behind the scenes at the NSA, CIA, and other clandestine organizations, but really, who are we kidding?


  • A Ship-Shape Ticker

    John Harrison in a 1767 portrait.

    If I could bring Google Maps back to early eighteenth-century Britain, I’d be a millionaire. See, figuring out a ship’s longitudinal coordinates was a huge problem back then. So much so that the British Parliament offered a prize of what amounts to $2.2 million in today’s dollars to anyone who could produce a practical method for pinpointing a ship’s location.

    Latitude was pretty easy: All you needed was the sun and some charted data. But longitude had theretofore only been discernible by sheer instinct and guesswork, which often led to ships crashing into unforeseen hazards and hundreds of casualties. Even renowned navigators armed with compasses (which were still unreliable at the time) had to basically hope they weren’t going the opposite way or that the ship didn’t run aground.

    That’s where John Harrison came in. Dava Sobel’s Longitude: The True Story of a Lone Genius Who Solved the Greatest Scientific Problem of His Time tells the story of this lovably odd son of a carpenter with no formal scientific training who created a revolutionary maritime clock. Previous ship clocks couldn’t keep time in bad weather, but Harrison’s was self-stabilizing and self-lubricating so that it wouldn’t wear down and wouldn’t be affected by the briny sea air and turbulent waters.
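
    Why does a reliable clock solve longitude at all? Because the Earth rotates 15 degrees every hour, the gap between local solar time (read from the sun at noon) and the time a chronometer keeps for a reference port translates directly into east-west position. Here is a minimal sketch of that arithmetic in Python; the three-hour offset is an invented example, not a voyage from the book:

        # Longitude from a chronometer: the Earth turns 360 degrees in 24 hours,
        # i.e. 15 degrees per hour.
        def longitude_from_clock(local_solar_hour: float, reference_hour: float) -> float:
            """Degrees east (+) or west (-) of the reference meridian."""
            return (local_solar_hour - reference_hour) * 15.0

        # Example: it is local noon on deck while the chronometer, still set to the
        # home port, reads 3 p.m.: the ship is 45 degrees west of home.
        print(longitude_from_clock(12.0, 15.0))  # -45.0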

    Harrison responded to Parliament’s challenge for a longitudinal tool, but unlike other people with crackpot submissions, he wasn’t in it for the money. He was like the Nikola Tesla of maritime horology: eccentric, hermetic, obsessive, but in it only for the joy of the scientific challenge itself. And like Tesla with Thomas Edison, Harrison had a natural antagonist in Nevil Maskelyne, a royal astronomer appointed to Parliament’s “Board of Longitude,” which controlled the terms of the prize money. Maskelyne had his heart set on the lunar distance method, which involved gauging the moon’s distance from another star to calculate the local time, and he gave Harrison all kinds of politically motivated headaches along the way in order to give the lunar method a head start. Harrison’s son even had to resort to writing King George III (the King George) to get some help moving the intransigent Board along. Turns out the young monarch was a science geek himself and gladly helped the Harrisons out (just as he was levying heavy taxes on an increasingly disgruntled colonial America).

    Overall, Sobel’s book, though heavily biased toward Harrison, is an accessible, breezy account of his engineering process, the radical innovations he made in every version of his “chronometer,” and the obstacles he had to surmount to achieve recognition from a skeptical scientific community. Take some time to read it.


  • The Millennials Will Be All Right

    I finally read Joel Stein’s Time magazine piece on the Millennial Generation, called “The Me Me Me Generation.” For the record, unlike some of my Millennial cohorts I hate “selfies” (the term and the thing it describes), I don’t feel entitled to a great job right out of school, and I don’t sleep next to my phone. But I don’t think the article deserved all of the antipathy it received from the blogosphere. I thought it was a fair if slightly fogeyish and surface-level assessment of overall generational characteristics. The problems my generation struggles with — like narcissism and a sense of entitlement — are so noticeable largely because of the times we live in, with everything more public and social technology more widespread. You don’t think the Baby Boomers would have peppered Instagram with pictures from Woodstock? Or that Gen-Xers would have had entire Spotify playlists dedicated to their collections of sad and angsty ballads? The manifestations of narcissism by young people today merely reflect the condition that plagues all humankind: We’re selfish creatures, no matter how old we are or how many Twitter followers we have.

    The combination of the influence of technology and how we collectively were reared — being told how special we were by over-protective helicopter parents — also contributes to how we are currently growing into adulthood. Generally speaking, we’re able to postpone full emergence into adulthood and still live with our parents because (a) we can and our parents don’t seem to mind (or at least don’t say so), and (b) the economy sucks and has changed so much that traditional jobs and careers aren’t as feasible anymore. The Boomers were anxious to get out of the house and their parents were eager for them to leave, so naturally the way things are done now clashes with the way of the past. Welcome to The Present Reality.

    Having said that, we can’t abdicate responsibility for making choices about our lives. We don’t have to live with our parents or check Facebook ten times a day or start a YouTube channel to get famous, but we do anyway (well, not me, but the collective We certainly do). And that doesn’t just go for Millennials: Facebook usage is declining among younger people because their parents (Boomers! shakes fist) have slowly taken over. Magazine columnists can try to pin the narcissism epidemic on young people all they want, but when I go to restaurants nowadays I see just as many parents on their phones as younger people, if not more. We can’t simply blame the times and the technology for our behavior, because we’re human beings with the capacity to choose whether to succumb to societal forces or to instead carve our own path, peer pressure be damned.

    I think we’ll be all right. Like generations before us, we have a great opportunity to make things better. That will involve some pushing back against the political and cultural acrimony that has characterized the Boomers’ ascendency and reign, but every generation has had to clean up the messes of its predecessors. We Millennials will inevitably make mistakes, and our kids will have been formed by them in some way, for better or for worse. Let’s just hope it’s for the better.


  • Data Dumped: On The Freedom Of Forgetting

    Do we have the right to forget the past, and to be forgotten?

    That’s the key question in this article from The Guardian by Kate Connolly, which is part of a larger series on internet privacy. Connolly talks with Viktor Mayer-Schönberger, professor of internet governance at the Oxford Internet Institute, who describes himself as the “midwife” of the idea that people have the legal, moral, and technological right to be forgotten, especially as it relates to the internet’s memory.

    In order to make decisions about the present and the future, Mayer-Schönberger claims, our brain necessarily forgets things, which allows us to think in the present:

    Our brains reconstruct the past based on our present values. Take the diary you wrote 15 years ago, and you see how your values have changed. There is a cognitive dissonance between now and then. The brain reconstructs the memory and deletes certain things. It is how we construct ourselves as human beings, rather than flagellating ourselves about things we’ve done.

    But digital memories will only remind us of the failures of our past, so that we have no ability to forget or reconstruct our past. Knowledge is based on forgetting. If we want to abstract things we need to forget the details to be able to see the forest and not the trees. If you have digital memories, you can only see the trees.

    One of his ideas to combat the negative effects of the permanence of data is to implement an “expiration date” for all data — akin to the “Use By” date on perishable food items — so that it can be deleted once it has served its primary purpose. “Otherwise companies and governments will hold on to it for ever,” he claims.

    A counter-argument for this right-to-be-forgotten strategy is that it could be impossible to implement due to the many back-ups that are made of the same data; if the data exists somewhere, then you’re technically not forgotten. But Mayer-Schönberger pushes back on this, saying even if Google has a back-up somewhere, if you search for the data and “99% of the population don’t have access to it you have effectively been deleted.”

    What’s unclear about his “expiration date” idea is whether it would include a self-destructing mechanism embedded within the data, like how e-books rented from libraries disappear after a predetermined time period, or whether the data’s user could choose to ignore its “Delete By” date. If the data holders are not legally or technologically compelled or obligated in some way to delete the data permanently after an agreed upon time, then this “right to be forgotten” becomes a lot weaker.
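
    What the stronger version of the idea would look like in practice is easy to sketch: every piece of data carries its own “Delete By” timestamp, and the system treats anything past that date as already forgotten. The following is a toy illustration only; the class and field names are my own invention, not anything Mayer-Schönberger or any real storage system prescribes:

        # A toy store in which every record carries an expiration ("Delete By") date.
        # Names and the purge policy are illustrative assumptions, not a real standard.
        from dataclasses import dataclass
        from datetime import datetime, timedelta

        @dataclass
        class Record:
            value: str
            expires_at: datetime

        class ExpiringStore:
            def __init__(self) -> None:
                self._items: dict[str, Record] = {}

            def put(self, key: str, value: str, ttl_days: int) -> None:
                self._items[key] = Record(value, datetime.utcnow() + timedelta(days=ttl_days))

            def get(self, key: str):
                record = self._items.get(key)
                if record is None or record.expires_at <= datetime.utcnow():
                    self._items.pop(key, None)  # expired data is treated as forgotten
                    return None
                return record.value

        store = ExpiringStore()
        store.put("old_photo", "embarrassing.jpg", ttl_days=365)  # gone after a year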

    As an aspiring archivist, tech enthusiast, and history buff, I can see where something like this could be detrimental to historians, information managers, and cultural heritage caretakers. One of the Internet’s strengths is its ability to hold a vast amount of easily transmittable information, much more than any era before ours could, so to effectively neuter this ability would hinder present and future historians and archivists in their quest to accurately document the past.

    A historian studying late-1700s American history has only select diaries, newspaper clippings, and other ephemera of deteriorating quality from which to cull contextual information and interpret that time period for modern audiences. Researchers studying the present day, however, have millions of gigabytes of data available to them on the Internet – way too much information for even the Internet Archive or Library of Congress to adequately archive, let alone make sense of.

    But as an individual, having the ability to regain a modicum of control over one’s own data is very appealing. Anyone who has ever posted a photo on Facebook they later regretted, or sent an email they wish they hadn’t, or written an inflammatory blog post years ago could see great value in data that can be, if not irreparably extirpated, then at least banished from digital civilization. This may lead to a less-complete record of our existence, but given how much more data we’re producing overall today than ever before, we will not lack for records anytime soon.

    We should all, I believe, have the right to the digital equivalent of burning a letter we don’t want living on in perpetuity, even though this idea runs counter to the impulses of our over-sharing and hyper-connected world. It is also anathema in archives: just think of all the information in that letter we’ve lost forever! I hear you, imaginary archivist, but, to return to Mayer-Schönberger’s analogy, even if a forest loses a tree — from natural death or manmade causes — it will still be a forest. And as Theodore Roosevelt, a great man of nature and of letters, said, “There are no words that can tell the hidden spirit of the wilderness, that can reveal its mysteries, its melancholy and its charms.”

    The Internet, like a forest, should allow for mystery. Otherwise, where’s the fun in the searching?


  • Science Blows My Mind

    Like many English majors, I struggled with science and mathematics throughout my primary, secondary, and college education. I think it was geometry class sophomore year of high school where I hit a wall, and everything after that was a blur. Ditto with chemistry that year (what in the name of Walter White is a mole anyway?). But that didn’t hinder me from being wholly fascinated with science and nature, and more particularly with the people who know way more about those things than I do.

    I just finished reading Jennifer Ouellette’s Black Bodies and Quantum Cats: Tales from the Annals of Physics, a collection of short essays on various topics within the world of physics. Ouellette, also a former English major and self-professed “physics phobe,” adapted the essays from her column in APS News, a monthly publication for members of the American Physical Society. She tackles scientific topics from the earliest and most fundamental – like da Vinci and the golden ratio, Galileo and the telescope – to more recent discoveries like X-rays, wireless radio, and thermodynamics.

    True to her writing roots, Ouellette manages to take what can be very esoteric and labyrinthine scientific concepts and make them fascinating by linking them to things we regular people can understand: how Back to the Future explains Einstein’s theory of special relativity; Dr. Jekyll and Mr. Hyde representing the dual nature of light; induced nuclear fission as seen in Terminator 2: Judgment Day. These connections are lifesavers for right-brained humanities majors like me, who, instead of seeing “SCIENCE” blaring on the cover and fleeing, get to experience an “A-ha!” moment nearly every chapter.

    But here’s the thing: I love science. I don’t love it like a scientist does, by learning theories and experimenting. I don’t love it because I understand it – Lord knows that’s not the case. Rather, I love it because of what it does. I am consistently flabbergasted by what have become quotidian occurrences in our 21st-century lives. Telephone technology is so quaint these days, but the fact that I can pick up a small device, speak into it, and instantaneously be heard by someone thousands of miles away blows my mind. The fact that I can get inside a large container that will propel itself through the air and arrive at a destination relatively quickly blows my mind. The fact that we can send a small, man-made vehicle into outer space and have it land on another planet blows my freaking mind.

    Science has improved our lives and advanced our knowledge of creation in a million ways. I’m simply grateful for the multitudes of geeks who have labored in that noble cause of discovery. Because of you, we have cell phones and airplanes and cameras and Velcro (did you know that term is a portmanteau of the French words velours [velvet] and crochet [hook]?) and Mars Curiosity and lasers (an acronym for Light Amplification by Stimulated Emission of Radiation) and automobiles and Xerox machines and countless other inventions, many of them engineered by the men and women Ouellette spotlights in her book.

    And that’s just physics. Think about what we know of biology, chemistry, geology, astronomy, and every other sub-category of science. If my mind hadn’t already been blown away earlier, it would have exploded now just thinking about what we know about our Earth and the things that it contains, and also what we have yet to discover.

    Though our country is in turmoil, the Curiosity roves a distant planet. Though we often disagree about basic scientific principles, we still seek to discover. As Carl Sagan said: “For all our failings, despite our limitations and fallibilities, we humans are capable of greatness.” As a sci-curious liberal arts nerd, I can’t wait to see what else we can achieve.


  • Pruning the Rosebushes: What Not to Share

    There’s a scene in Saving Private Ryan when Matt Damon’s Pvt. Ryan and Tom Hanks’ Capt. Miller sit and chat, waiting for the impending German offensive to hit their French town. Ryan’s three brothers had recently died and he can’t remember their faces. The Captain tells him to think of a specific context, something they’d shared together. When the Captain thinks of home, he says, “I think of my hammock in the backyard, or my wife pruning the rosebushes in a pair of my old work gloves.”

    Ryan then tells the story of the brothers’ last night together before the war took them away, his enthusiasm growing as his face brightens with the look of recognition. After he finishes the story, he asks Captain Miller to tell him about his wife and the rosebushes. “No,” the Captain says. “That one I save just for me.”

    In this Age of Oversharing, that’s a refreshing if soon-to-be anachronistic sentiment. I’ll admit to feeling the ongoing urge to inform The World via Twitter of funny or interesting things that happen to me during the day, or to display my pithy wit with a topical one-liner. But lately I’ve been compelled by a new urge, similar to that of Tom Hanks’ laconic Captain Miller, which tells me to think twice before sharing whatever it is I want to share with the world.

    Perhaps this is due to my being an inherently reserved person, reluctant to simply give away every little thought that enters my brain. Some people, I fully realize, aren’t built this way; they want to share themselves and their lives entirely and get fulfillment out of this. That’s perfectly fine. But I like the idea of keeping some moments – the rosebush prunings of our lives – special, not posted on Twitter or Instagram or even a WordPress blog.

    This requires a lot of discipline. Being hyperconnected to social networks makes sharing easy by design, so overcoming the desire to post a picture of a sunset scene you’re sharing with a loved one is tough, especially when the desire to share has been ingrained and even encouraged by our plugged-in culture. But I think a special moment like that becomes a little less special when every one of your Facebook friends and their mother shares it too.

    This notion runs counter to many of my identities. As an amateur techie, I marvel at the capabilities the Web can give ordinary people to express themselves and enhance their lives. As a history buff and librarian/archivist in training, I understand the value of information as the record of history and the zeitgeist of an era. And as a user of Twitter, Instagram, and WordPress, I’ve come to enjoy having easily accessible and usable media to help me share cool photos, links, and thoughts short (on Twitter) and long (on here) whenever and wherever I want.

    In spite of all these conflicts of interest, I’m OK with, once in a while, letting moments and images and quotes pass by undocumented and unshared, if only so I can feel in that moment that I got a glance, however fleeting, at something beautiful or inspiring or funny or tragic or all of the above, and that it’s all mine. The memory of that moment may die with me, but hey, that’s life. No matter how high the camera’s resolution or how eloquent the prose, these second-hand records will never be quite as pure as the real thing, the moments they seek to honor.

    So here’s to, once in a while, living in the moment and only in the moment.


  • Steve Jobs Lives

    This 1987 concept video from Apple predicts not only the iPad and all of its capabilities, but also Siri, the speech-activated personal assistant that will be ubiquitous technology in a few years given how Apple products usually work. (H/T to Andrew Sullivan for the video)

    Andy Baio finds this amazing:

    Based on the dates mentioned in the Knowledge Navigator video, it takes place on September 16, 2011. The date on the professor’s calendar is September 16, and he’s looking for a 2006 paper written “about five years ago,” setting the year as 2011. And this morning, at the iPhone keynote, Apple announced Siri, a natural language-based voice assistant, would be built into iOS 5 and a core part of the new iPhone 4S.

    So, 24 years ago, Apple predicted a complex natural-language voice assistant built into a touchscreen Apple device, and was less than a month off.

    I never had the emotional attachment to Steve Jobs that many others around the web have described, but I do use his products. The iPhone, the MacBook, and the iPod seem so ordinary now, but 24 years ago who could have predicted how they would change the world as they did? I suppose that’s the best compliment you can give a technology geek like Jobs – that what he did changed the world for the better.


  • Facebook vs. Wikileaks

    What’s the difference between Facebook founder Mark Zuckerberg and Wikileaks founder Julian Assange? A recent Saturday Night Live skit with Bill Hader as Assange answered that question: “I give you private information on corporations for free and I’m a villain,” he says. “Mark Zuckerberg gives your private information to corporations for money and he’s Man of the Year.”

    It seems backwards, right? In a perfect world, the release of free information about corporate malfeasance would be celebrated and the selling of private information for profit would be illegal, or at least frowned upon. But we don’t live in a perfect world. Instead, Assange gets arrested and Zuckerberg makes billions and is named Time magazine’s Person of the Year.

    The U.S. government insists on secrecy. Every politician seems to campaign on bringing transparency to Washington and making the government more for, by, and of the people. Yet it never seems to work. So when someone like Assange comes along and pulls back the curtain on important areas of public interest like the wars in Iraq and Afghanistan, the government goes code red.

    Facebook is the opposite. No one is forced to reveal personal information; we do it willingly. And the company takes that information and uses it to sell advertising and make billions of dollars in profit. Zuckerberg believes in total openness—on Facebook and in the world as a whole—yet somehow I think he’d have a problem if Wikileaks revealed how Facebook was using people and their information to make a huge profit.

    I’m not wholly anti-Facebook. I think it’s a great way to communicate and stay in touch with friends and family. And the way things are going it looks like the site will be the Internet one day. But there’s something very unsettling about how disclosure through Facebook is encouraged yet through Wikileaks it’s demonized. And as long as institutions like Time continue to honor this dangerous dichotomy, things won’t change.


  • iPad? I think not

    Originally published in the North Central Chronicle on April 16, 2010.

    With all the near-orgasmic praise Apple’s iPad has received lately, I feel like I should want to get one. But I don’t.

    Let’s be honest: it’s a cool toy. It does most of the things an iPod Touch or iPhone can do but on a bigger, more vibrant LCD screen. It also does some of the things a laptop does but in a more simplified and mobile way. But I don’t see the point of shelling out $500+ for a product fresh out of the factory just because Steve Jobs says it’s the future.

    The Cult of Apple is a little too much for me right now. Sarah Palin whined a lot about the news media fawning over Barack Obama during the presidential campaign, but that was nothing compared to the reception Jobs’ Apple products get every time they are released into the world.

    I understand brand loyalty, but some Apple fans get so wrapped up in their products it becomes hard to take their constant adulation seriously. While Apple’s products are often worthy of the praise they receive—it’s a sleek and dependable brand with great marketing—let’s not get carried away.

    Jobs may be right: the handheld touchscreen technology the iPad embodies will probably eventually become the standard for computing and communication. Like the iPhone and iPod before it, it will get better with every generation they release. And more people will probably buy it once Apple’s competitors like Google and Microsoft release their own version of the tablet computer.

    But the iPad as it is now is not there yet. As the first generation of its kind, it’s going to receive some major upgrades in the next few years. Remember the first-generation iPod? At the time it was revolutionary, but now it’s laughably archaic.

    The iPad, I suspect, will be similar. It’s cool now, but I’m going to let it cook a little longer before I buy what Steve Jobs is selling. Once tablet computers become a legitimate and irreplaceable technology—and are offered by more companies than just Apple—then it will be worthwhile to invest in one.

    Until then, it’s still just a toy. A very expensive toy.


  • Net Neutrality

    This video explains Net Neutrality way better than I ever could…

    Created by Aaron Shekey of Apparently Nothing fame.