Finally made it through Ralph Ketcham’s James Madison: A Biography. Presidential biographies usually take longer to get through than other books, but I clamored for the end of this one. It’s funny how the POTUS books I’ve read thus far usually take on the characteristics of their subjects: Edmund Morris’ Theodore Roosevelt trilogy was expansive yet gripping; McCullough’s John Adams fiery and forthright; Cooper’s Woodrow Wilson stately and academic. It makes sense, then, that Ketcham’s book was as bookish and rational as Madison was.
This was a man who was present at every key moment in the young nation’s history, from the famous (the Declaration and Constitution) to the infamous (fleeing the White House from the British in the War of 1812). Ketcham certainly had a lot to say about these events, as well as the intellectual forebears and philosophies that accompanied Madison throughout his adult life, but decidedly little about the man himself. Perhaps that’s an expectation only modern readers have, to get to know the emotional lives of those we read about as much as their public ones. But, to me, without some deep insight into the subject I’m dedicating my time to, pages of analysis of events and goings-on quickly become a chore.
Or maybe I just need a break from presidential biographies.
If I could take Google Maps back to early eighteenth-century Britain, I’d be a millionaire. See, figuring out a ship’s longitudinal coordinates was a huge problem back then. So much so that the British Parliament offered a prize of what amounts to $2.2 million in today’s dollars to anyone who could produce a practical method for pinpointing a ship’s location.
Latitude was pretty easy: All you needed was the sun and some charted data. But longitude had theretofore only been discernible by sheer instinct and guesswork, which often led to ships crashing into unforeseen hazards and hundreds of casualties. Even renowned navigators armed with compasses (which were still unreliable at the time) had to basically hope they weren’t going the opposite way or that the ship didn’t run aground.
That’s where John Harrison came in. Dava Sobel’s Longitude: The True Story of a Lone Genius Who Solved the Greatest Scientific Problem of His Time tells the story of this lovably odd son of a carpenter with no formal scientific training who created a revolutionary maritime clock. Previous ship clocks couldn’t keep time in bad weather, but Harrison’s was self-stabilizing and self-lubricating so that it wouldn’t wear down and wouldn’t be affected by the briny sea air and turbulent waters.
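Sobel explains this better than I will, but the arithmetic behind why a good clock solves longitude is worth sketching (the numbers below are invented purely for illustration): the Earth rotates 15 degrees every hour, so a clock that faithfully carries your home port’s time lets you turn the gap between home time and local noon into degrees sailed east or west.

```latex
% Back-of-the-envelope illustration (hypothetical numbers):
% the Earth turns 360 degrees in 24 hours, i.e. 15 degrees per hour.
% Suppose the chronometer, still keeping home-port time, reads 14:40
% at the moment the sun shows local noon.
\[
  \Delta t = 14{:}40 - 12{:}00 = 2\tfrac{2}{3}\ \text{hours}
  \quad\Longrightarrow\quad
  \text{longitude} = 15^{\circ}/\text{hr} \times 2\tfrac{2}{3}\ \text{hr} = 40^{\circ}\ \text{west of home}
\]
```

Which is why a clock that neither gained nor lost time at sea was worth a fortune: roughly speaking, a few seconds of daily drift compound into miles of positional error by the end of a long voyage.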
Harrison responded to Parliament’s challenge for a longitudinal tool, but unlike other people with crackpot submissions, he wasn’t in it for the money. He was like the Nikola Tesla of maritime horology: eccentric, hermetic, obsessive, but in it only for the joy of the scientific challenge itself. And like Tesla with Thomas Edison, Harrison had a natural antagonist in Nevil Maskelyne, a royal astronomer appointed to Parliament’s “Board of Longitude,” which controlled the terms of the prize money. Maskelyne had his heart set on the lunar distance method, which involved measuring the moon’s distance from other stars to work out the time back at a known reference point, and he gave Harrison all kinds of politically motivated headaches along the way in order to give the lunar method a leg up. Harrison’s son even had to resort to writing King George III (the King George) to get some help moving the intransigent Board along. Turns out the young monarch was a science geek himself and gladly helped the Harrisons out (just as he was levying heavy taxes on an increasingly disgruntled colonial America).
Overall, Sobel’s book, though heavily biased toward Harrison, is an accessible, breezy account of his engineering process, the radical innovations he made in every version of his “chronometer,” and the obstacles he had to surmount to achieve recognition from a skeptical scientific community. Take some time to read it.
Andrew Sullivan highlighted this post by a woman named Rachael, the daughter of Matt Slick, the founder of Christian Apologetics and Research Ministry (CARM). Rachael is now an atheist, largely in response to what (at least according to her post) was a spiritually abusive upbringing at the hands of her fundamentalist father.
To sum up: For a long time, Rachael was the “perfect” Christian child. She memorized Bible verses, passionately debated esoteric theological principles, and even “spouted off” religious arguments in college philosophy classes. She was so certain of her beliefs and took solace in the strength of her intellectual prowess. But soon the arguments she would make turned into questions of her own. The one that particularly stood out: “If God was absolutely moral, and if the nature of ‘right’ and ‘wrong’ surpassed space, time, and existence, then why were some things a sin in the Old Testament but not a sin in the New Testament?” She concluded that there wasn’t an answer for this:
Everyone had always explained this problem away using the principle that Jesus’ sacrifice meant we wouldn’t have to follow those ancient laws. But that wasn’t an answer. In fact, by the very nature of the problem, there was no possible answer that would align with Christianity. [Emphasis hers.]
She felt a “vast chasm” opening up in her identity, hearing a voice that said The Bible is not infallible. If it’s not infallible, you’ve been basing your life’s beliefs on the oral traditions of a Middle Eastern tribe. The Bible lied to you. “I was no longer a Christian,” she said.
I recount her story here because I think it’s important to see how Rachael has jumped from one religious extreme to another without considering that there’s a middle ground. I’m blessed to have been reared in a positive, spiritually loving Christian home, so I can only imagine how difficult it was for Rachael to have endured such a destructive and rigid environment, and then to have her long-held and cherished assumptions smashed. In that context, I can understand why she has swung so strongly to the opposite end of the spiritual spectrum.
But I don’t think Rachael ever understood the crux of Christianity. She certainly understood it intellectually (or at least her father’s version of it), but by being so thoroughly fixated on the word of the law she seems to have ignored its spirit and its embodiment in Jesus. Reason was her idol, her “summum bonum identity” that was so easily destroyed when it came under attack.
But she has a new idol now. When asked whether she would have traded her childhood for another, Rachael said she wouldn’t:
Without that childhood, I wouldn’t understand what freedom truly is — freedom from a life centered around obedience and submission, freedom to think anything, freedom from guilt and shame, freedom from the perpetual heavy obligation to keep every thought pure. Nothing I’ve ever encountered in my life has been so breathtakingly beautiful. Freedom is my God now, and I love this one a thousand times more than I ever loved the last one.
This is ridiculous. Again: she has an understandably emotional aversion to the concepts of obedience, purity, and God. To her, obedience equals blindly following orders; purity equals punishing oneself for one’s humanity; and God equals a distant deity. But God is not the one who has lied about these things, and worshipping freedom is just as destructive as worshipping religion. Lord knows we Americans love to worship the god of freedom, but that also means we’re enslaved to it. We must have our guns, sugary drinks, money, land, power, sex, and so many other desirable but worthless things. We’re so subject to our whims and selfish desires that anyone trying to fight against them — a politician, pastor, or Jesus himself — is shouted down and has the Constitution thrown in his face.
I believe in freedom just as I believe in beauty, love, grace, joy, and many other blessed things in this world, but I don’t want to be enslaved to them. Only when used in tandem with obedience to their creator can they be fully realized. Rachael barely mentioned Jesus in her article, which I’m guessing is part of why she has such a perverted view of Christianity. Good things alone will never satisfy without willing obedience to Jesus. This true obedience — not the abusive, authoritarian kind of obedience so many erstwhile Christians like Rachael have unfortunately endured — gives us the freedom to rely upon something bigger than our fractured selves.
Despite becoming an atheist (and kind of a smug one at that), Rachael is no less religious than when she was a kid. Now, instead of worshipping words, she’s worshipping the god of her own volition. That probably feels better for her than what she had before, but it’s just as misguided.
We’ve got ourselves a good ol’ fashioned fire-eater here. And like fire itself, this brand of demagogue was a useful tool only until it burned its wielder.
A lawyer by trade, Rhett entered public service in 1826 as a South Carolina state legislator and continued as state attorney general, U.S. representative, and U.S. senator. Rhett came out loudly against the “Tariff of Abominations” during the nullification battles of Jackson’s presidency in the 1830s, preferring secession to submission to a “tyrannical” government:
Aye – disunion, rather, into a thousand fragments. And why, gentlemen! would I prefer disunion to such a Government? Because under such a Government I would be a slave – a fearful slave, ruled despotically by those who do not represent me … with every base and destructive passion of man bearing upon my shieldless destiny.
This, mind you, coming from a man who owned actual slaves. Rhett pushed for secession so hard that even John “Slavery Is A Positive Good” Calhoun wasn’t radical enough for him, which is like someone calling Ron Paul a moderate. But as this great New York Times profile of Rhett shows, that wasn’t even the guy’s best stuff. Through the Charleston Mercury, a newspaper he owned that was run by his son, Rhett spewed all kinds of obloquy and borderline slanderous “Rhett-oric” at Lincoln, Hannibal Hamlin, and African slaves.
His secessionist dreams finally materialized in 1860 when South Carolina disunited itself after Lincoln’s election, prompting Rhett to help convene the Montgomery Convention that established the Confederate government and made Rhett a delegate. But like many a fire-eater who runs head first into the messy business of governing, Rhett soon became disillusioned by Jefferson Davis’ administration (Not seceded enough! Not fighting the Union enough! Wah!) and I’m guessing pretty pissed off by the war’s outcome. Though probably not as pissed off as dying from facial cancer in 1876.
I recently saw the above trailer for Steve McQueen’s upcoming film 12 Years a Slave and immediately got excited to see it on the merits of the trailer, cast, and director alone. But then at the library the following day I happened to see the memoir upon which the film is based and decided to read it.
Twelve Years a Slave is Solomon Northup’s first-hand account of his kidnapping into the cruel world of antebellum Southern slavery and his long-awaited deliverance. Great Scott is his story breathtaking. The book is short yet wonderfully written, so I’d highly encourage you to read it before the movie comes out so you can experience for yourself Northup’s concisely poetic narrative.
One particular passage that stood out was his description of Christmas day, one of the few days all year that the slaves didn’t work:
That morning [the slave] need not hurry to the field, with his gourd and cotton-bag. Happiness sparkled in the eyes and overspread the countenances of all. The time of feasting and dancing had come. … There were to be re-unions, and joy and laughter. It was to be a day of liberty among the children of Slavery.
It’s one of the few ebullient passages in what is otherwise a dark and suffering-filled story, and I like how it shows the slaves drawing their own joy and tangible meaning out of a holiday that was also celebrated by the very men who unjustly enslaved Solomon and his brethren.
Read the book. (And while you’re at it, check out the director Steve McQueen’s film Hunger, which chronicles the harrowing prison hunger strike of IRA rebel Bobby Sands.)
Brett McCracken was right to name Alan Jacobs’ The Pleasures of Reading in an Age of Distraction one of the five books recent college graduates should read. A quick yet deeply insightful read, this book was written, in the words of the author, for “those who have caught a glimpse of what reading can give—pleasure, wisdom, joy—even if that glimpse came long ago.” Jacobs writes not to those who have never liked reading, but instead to those who have grown accustomed to academic (i.e., obligatory) reading or to “checklist” reading, whereby only “classic” or “important” books are deemed worthy of a reader’s time.
Jacobs provides some guidelines for how to read for fun:
Whim: read what you want, when you want to, without shame…
Aspiration: …but don’t get stuck reading the same stuff—branch out
Upstream: seek out the older works that inspired your favorites and be challenged to “swim upstream”
Responsiveness: don’t be afraid to take notes and respond to the text
Slow: you’ll miss the little things if you view reading as simply uploading information; slow down and you’ll absorb more
Though I read a lot as a kid and through adolescence (if mostly in school), I didn’t start reading for fun again until after college graduation. Faced with an entire life of no-requirements reading ahead of me (save for the brief graduate school detour), I plunged head first into reading books that greatly interested me. My palate has consisted mostly of history (specifically the presidential kind), nonfiction, and cultural topics, though I try to throw in a novel once in a while.
Like many people, I’m sure, I struggle with the concept of “so many books, so little time,” wanting to read as much as I can so I can get onto the next book. But in our distracted age, it’s important to practice mindfulness and deep thinking in order to buoy our increasingly attention-deficit brains. I want my mind to be strong and agile now and forevermore, if only so I can still shout out Jeopardy! answers when I’m an old man. Taking notes helps in that regard. Since I mostly use library books and can’t write in the books themselves, I keep a notebook nearby to jot down key points, new words, or cool names for future reference. I don’t take notes on everything; some books, like novels and memoirs, I think should just be enjoyed without the interruption of notes.
But whatever your strategy, I encourage you to read and read a lot. Jacobs’ book is the perfect defibrillator for those who have fallen off the reading wagon but want to get back on. As a formerly indifferent reader, I’m glad I rediscovered some literary locomotion. I can’t wait to see where it takes me.
Marching onward in my quest to read a biography of every U.S. president, I finally made it through Ari Hoogenboom’s Rutherford B. Hayes: Warrior and President. I confess to having held the same vague notions of Hayes that Hoogenboom writes he’s commonly known for: that he won the disputed 1876 presidential election, ending Reconstruction, and that he was just another forgettable (yet unforgettably bearded) president who fell through the cracks between Abraham Lincoln and the twentieth century.
But Rud, as he was known, is a perfect exemplar of the purpose of my biblio-presidential journey: to fill in the gaps of my U.S. history knowledge and give the lesser-known figures a fairer shake than high school textbooks give them. In the end I found Hayes to be a fascinating figure, whose presidency was as bland as his pre- and post-presidency years were compelling.
Hayes was raised in Ohio by a widowed mother and a strong-willed sister who both felt very protective of him. When twentysomething Rud was in Boston attending Harvard Law School, both women would constantly needle him about studying and finding a woman. I’m sure he was glad he took his time looking for a mate because the woman he married, Lucy Webb (the first First Lady to graduate from college), helped sway him away from his social-issue indifference toward support for abolition, temperance, and Christianity (though he could only latch onto very liberal Christian orthodoxy).
His newfound moralism continued into the Civil War, which he entered as a major in the 23rd Ohio Infantry (fighting alongside future president William McKinley, who was a private in the 23rd, and James Garfield, a brigadier general and another eventual POTUS). In the Battle of South Mountain, Hayes led a charge and got shot in the left arm, fracturing the bone, but in a total Teddy Roosevelt move he stanched the wound and continued on in battle, eventually getting stranded between the lines. Sensing the end, he left notes for his family with a wounded Confederate soldier nearby, only to be scooped up by his troops and brought to the hospital. Later in the war, Hayes earned plaudits from General Ulysses Grant that he would brag about for the rest of his life: “His conduct on the field was marked by conspicuous gallantry as well as the display of qualities of a higher order than that of mere personal daring.”
After the war, Rud served in Congress and then as Ohio governor for two non-consecutive terms, the latter of which he parlayed into the Republican nomination for president in 1876. Support of the 14th and 15th amendments and reform of the civil service/appointments system were Rud’s bread and butter during the campaign, which culminated in the “Compromise of 1877,” a.k.a. the most controversial election before 2000. The compromise boiled down to this: If Hayes were awarded the disputed presidency, he would agree to remove all remaining federal troops from the former Confederacy, thereby abandoning the fledgling Republican state governments in the South to the reemergent (erstwhile Confederate) Democrats. In exchange, the Democrats wouldn’t violently storm the inauguration in protest. Some deal. In the end, Hayes and the Republicans chose the presidency over the already withering GOP governments in the South and have earned scorn for ending Reconstruction ever since.
Rud’s presidency continued on, mostly filled with drama over Hayes’ attempted reform of how political appointments were doled out (Hayes: “The president should make appointments instead of Congress!” Congress: “No.”) and more drama over returning to the gold standard, in addition to the drama over the Great Railroad Strike of 1877. (Two fun bits of trivia: Lucy Hayes hosted the first White House Easter Egg Roll in 1878 after Congress banished it from the Capitol grounds, and Rud hosted the 30-year-old Thomas Edison and his new phonograph.) But why the flippancy over Hayes’ single term? Because what he did after it was way more interesting.
In a nod to the third act of John Quincy Adams’ storied career, Hayes unleashed his very progressive views on race, education, and big business and became a social justice crusader way before it was trendy. Among other things, he advocated for universal education as a means to ensure the suffrage and advancement of the recently freed yet woefully unsupported slaves. He served on the National Prison Reform Association board with the young New York state assemblyman Teddy Roosevelt and railed against income disparity and the plight of the poor that corrupt monopolies exacerbated. He was a trustee of Ohio State University (a school he helped to found as Ohio governor) and endorsed the 24-year-old W.E.B. DuBois for an educational scholarship.
Judged strictly on his presidential tenure, Hayes doesn’t inspire much praise. He came about during a time when the party bosses held as much political power and control as the presidents did, if not more. I don’t think all forgotten presidents deserve to have their low reputation reconsidered (I’m coming for you, John Tyler), but viewed holistically I’d say Hayes deserves more than the middling (and slowly dropping) rank he often gets.
After years of hype and speculation, Arrested Development is back thanks to the tireless work of Mitch Hurwitz and the show’s writers. Watching these characters again has been surreal. I had the same feeling when I saw Toy Story 3 and The Hobbit: I’d watched the previous installments (the LOTR trilogy in the case of The Hobbit) so many times that seeing the same characters in new situations was delightful yet a bit disorienting. Which is how I can best describe the new fourth season of Arrested. The narrative Hurwitz and his writers have created is so labyrinthine and arcane that it will certainly require repeated viewings to fully grasp, just like the show’s first three seasons.
After one run through the season, my main critique is that unlike the original run, the new season lacks a core. Instead of focusing on the family unit throughout the narrative, each episode centers on one main character, with the others coming and going while engaged in their own overlapping story. It’s hard to fall back in love with the family when we only see them in bits and pieces over fifteen episodes. When the story falters, it’s because it upended the show’s tried-and-true structure of Michael being the reliable straight-man, the eye of the Bluth storm. He held the show together — sometimes only barely — so that the rest of the ensemble could wig out. However, with this new season’s decentralized narrative and Michael’s uncharacteristic recklessness, there is no reliable foundation; the inmates are now running the asylum. Whenever we’re following a new wacky non-Bluth character (and there are many of them this season) instead of the family we’ve come to love (or at least be amused by), the show almost always lags. This, combined with the fact that episodes are now 30-35 minutes long instead of their network TV-sanctioned 22, makes for an Arrested Development that doesn’t quite feel like the Arrested Development of yore.
I realize this (lack of) structure was largely a logistical necessity; they had to film it in discrete pieces simply because it was so hard to get the full cast together at one time. But that challenge turned into a net positive for the show. Its brand of postmodern meta-comedy is perfectly equipped to handle the structureless vacuum season four has created, as NPR’s interactive guide of the show’s running gags illustrates. When so much of the show’s humor is derived from its interconnected meta-jokes, allusions (or illusions), and sight gags, the central storyline — if there even is one this season — is often irrelevant.
In fact, I sometimes found myself disengaging from the main story of each episode in the hope of catching the many in-jokes, callbacks, and other background tidbits that zoom by. One critic found this meta-humor distracting, saying the season “trades far too easily on callbacks to the early seasons, a sort of unpleasant fan service that is depressing to watch.” Indeed, there are a lot of callbacks, but that’s nothing unique to this season. As the aforementioned guide points out, even season one was making callbacks and allusions to itself in dozens of jokes. I’m sure the in-jokes this season felt like fan service because the fans are essentially why this season exists at all, but they have been the show’s lifeblood since its infancy and almost always impress rather than depress.
And that’s why you always leave a note. Having weighed in on the new season’s development, I give a hearty Huzzah! to Mitch Hurwitz & Co. for giving in to the Internet’s collective conch call to bring the gang back together and for going all-out on it. Often I stared at Netflix in wonder, wanting to slow-clap the writers for what they have attempted with this season. While it might have a slightly lower batting average than the first three seasons, it was a swing for the fences. And like Maeby this season, I think they made it home.
At the climax of The Dark Knight, Joker has Batman trapped on the top of a skyscraper while he waits for the boats full of prisoners and civilians to blow up. The clock strikes midnight — the deadline the Joker gave to those on the boat — but there’s no explosion. For the first time in the movie, the Joker looks surprised and out of control. Batman, despite being momentarily trapped and defenseless, chides him: “What were you trying to prove? That deep down, everyone’s exactly like you? You’re alone.”
I thought of this scene when reading about this brave British woman named Ingrid Loyau-Kennett who confronted the knife-wielding terrorists immediately after their barbarous acts today. They told her they wanted to start a war in London, to which she replied: “It is only you versus many people. You are going to lose; what would you like to do?”
Terror and fear don’t get to win. These men can be angry about people who are dying in Afghanistan, but propagating the “eye for an eye” principle leads only to self-destruction. These cowardly villains can make a grand show of their hate, but they won’t start a war in London nor anywhere else they wish. They won’t win converts to their twisted ideology, save for a few other confused souls. They are alone. And people like Ingrid Loyau-Kennett prove that every day.
I finally read Joel Stein’s Time magazine piece on the Millennial Generation, called “The Me Me Me Generation.” For the record, unlike some of my Millennial cohorts I hate “selfies” (the term and the thing it describes), I don’t feel entitled to a great job right out of school, and I don’t sleep next to my phone. But I don’t think the article deserved all of the antipathy it received from the blogosphere. I thought it was a fair if slightly fogeyish and surface-level assessment of overall generational characteristics. The problems my generation struggles with — like narcissism and a sense of entitlement — are so noticeable largely because of the times we live in, with everything more public and social technology more widespread. You don’t think the Baby Boomers would have peppered Instagram with pictures from Woodstock? Or that Gen-Xers would have had entire Spotify playlists dedicated to their collection of sad and angsty ballads? The manifestations of narcissism by young people today merely betray the human condition that plagues all humankind: We’re selfish creatures, no matter how old we are or how many Twitter followers we have.
The combination of the influence of technology and how we collectively were reared — being told how special we were by over-protective helicopter parents — also contributes to how we are currently growing into adulthood. Generally speaking, we’re able to postpone full emergence into adulthood and still live with our parents because (a) we can and our parents don’t seem to mind (or at least don’t say so), and (b) the economy sucks and has changed so much that traditional jobs and careers aren’t as feasible anymore. The Boomers were anxious to get out of the house and their parents were eager for them to leave, so naturally the way things are done now clashes with the way of the past. Welcome to The Present Reality.
Having said that, we can’t abdicate responsibility for making choices about our lives. We don’t have to live with our parents or check Facebook ten times a day or start a YouTube channel to get famous, but we do anyway (well, not me, but the collective We certainly do). And that doesn’t just go for Millennials: Facebook usage is declining among younger people because their parents (Boomers! shakes fist) have slowly taken over. Magazine columnists can try to pin the narcissism epidemic on young people all they want, but when I go to restaurants nowadays I see just as many if not more parents on their phones than younger people. We can’t simply blame the times and the technology for our behavior, because we’re human beings with the capacity to choose whether to succumb to societal forces or to instead carve our own path, peer pressure be damned.
I think we’ll be all right. Like generations before us, we have a great opportunity to make things better. That will involve some pushing back against the political and cultural acrimony that has characterized the Boomers’ ascendency and reign, but every generation has had to clean up the messes of its predecessors. We Millennials will inevitably make mistakes, and our kids will have been formed by them in some way, for better or for worse. Let’s just hope it’s for the better.
My cousin recently posted a quote from my Uncle Steve, who died of cancer in 2001. He was writing to a friend with whom it appears he was discussing the future and the stress of the unknown:
Admittedly, not knowing the outcome of my plans can create stress. It takes a great deal of courage to grab the reigns of life and ride off into the wilderness, to live in the chaos of the uncertain. But I would rather live in the stress of uncharted seas in the cause of exploring and living, than in the tepid, stale waters of the charted, programmed life. Let me die trying something new. None of my choices are ever wasted.
Wise and encouraging words. I knew Steve only in a limited way since he lived farther away and I was only 14 when he died. I wish I’d known him as an adult.
Do we have the right to forget the past, and to be forgotten?
That’s the key question in this article from The Guardian by Kate Connolly, which is part of a larger series on internet privacy. Connolly talks with Viktor Mayer-Schönberger, professor of internet governance at the Oxford Internet Institute, who describes himself as the “midwife” of the idea that people have the legal, moral, and technological right to be forgotten, especially as it relates to the internet’s memory.
In order to make decisions about the present and the future, Mayer-Schönberger claims, our brain necessarily forgets things, which allows us to think in the present:
Our brains reconstruct the past based on our present values. Take the diary you wrote 15 years ago, and you see how your values have changed. There is a cognitive dissonance between now and then. The brain reconstructs the memory and deletes certain things. It is how we construct ourselves as human beings, rather than flagellating ourselves about things we’ve done.
But digital memories will only remind us of the failures of our past, so that we have no ability to forget or reconstruct our past. Knowledge is based on forgetting. If we want to abstract things we need to forget the details to be able to see the forest and not the trees. If you have digital memories, you can only see the trees.
One of his ideas to combat the negative effects of the permanence of data is to implement an “expiration date” for all data — akin to the “Use By” date on perishable food items — so that it can be deleted once it has served its primary purpose. “Otherwise companies and governments will hold on to it for ever,” he claims.
A counter-argument for this right-to-be-forgotten strategy is that it could be impossible to implement due to the many back-ups that are made of the same data; if the data exists somewhere, then you’re technically not forgotten. But Mayer-Schönberger pushes back on this, saying even if Google has a back-up somewhere, if you search for the data and “99% of the population don’t have access to it you have effectively been deleted.”
What’s unclear about his “expiration date” idea is whether it would include a self-destructing mechanism embedded within the data, like how e-books rented from libraries disappear after a predetermined time period, or whether the data’s user could choose to ignore its “Delete By” date. If the data holders are not legally or technologically compelled or obligated in some way to delete the data permanently after an agreed upon time, then this “right to be forgotten” becomes a lot weaker.
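To make the mechanism a bit more concrete, here is a minimal sketch of what a “Delete By” date attached to a piece of data might look like. Everything in it is hypothetical (the field names, the purge step, the whole design is my own illustration, not anything Mayer-Schönberger proposes); the point is only that the deletion step is ordinary code the data holder controls, which is why the legal obligation matters as much as the technical one.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical sketch of an expiration date attached to a record.
# Nothing here deletes anything on its own; the holder still has to
# run purge(), which is exactly the weakness noted above.

@dataclass
class Record:
    owner: str
    payload: str
    created: datetime
    expires: datetime  # the agreed-upon "Delete By" date


def make_record(owner: str, payload: str, ttl_days: int) -> Record:
    """Create a record whose expiration date is ttl_days from now."""
    now = datetime.utcnow()
    return Record(owner, payload, created=now, expires=now + timedelta(days=ttl_days))


def purge(store: list[Record], now: datetime | None = None) -> list[Record]:
    """Drop every record whose expiration date has passed."""
    now = now or datetime.utcnow()
    return [r for r in store if r.expires > now]


if __name__ == "__main__":
    store = [make_record("example_user", "regrettable photo", ttl_days=365)]
    store = purge(store)  # only effective if the holder actually runs it
    print(len(store), "record(s) remain")
```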
As an aspiring archivist, tech enthusiast, and history buff, I can see where something like this could be detrimental to historians, information managers, and cultural heritage caretakers. One of the Internet’s strengths is its ability to hold a vast amount of easily transmittable information, much more than any era before ours could, so to effectively neuter this ability would hinder present and future historians and archivists in their quest to accurately document the past.
A historian studying late-1700s American history has only select diaries, newspaper clippings, and other ephemera of deteriorating quality from which to cull contextual information and interpret that time period for modern audiences. Researchers studying the present day, however, have millions of gigabytes of data available to them on the Internet – way too much information for even the Internet Archive or Library of Congress to adequately archive, let alone make sense of.
But as an individual, having the ability to regain a modicum of control over one’s own data is very appealing. Anyone who has ever posted a photo on Facebook they later regretted, or sent an email they wish they hadn’t, or written an inflammatory blog post years ago could see great value in data that can be, if not irreparably extirpated, then at least banished from digital civilization. This may lead to a less-complete record of our existence, but given how much more data we’re producing overall today than ever before, we will not lack for records anytime soon.
We should all, I believe, have the right to the digital equivalent of burning a letter we don’t want living on in perpetuity, even though this idea runs counter to the impulses of our over-sharing and hyper-connected world. It is also anathema in archives: just think of all the information in that letter we’ve lost forever! I hear you, imaginary archivist, but, to return to Mayer-Schönberger’s analogy, even if a forest loses a tree — from natural death or manmade causes — it will still be a forest. And as Theodore Roosevelt, a great man of nature and of letters, said, “There are no words that can tell the hidden spirit of the wilderness, that can reveal its mysteries, its melancholy and its charms.”
The Internet, like a forest, should allow for mystery. Otherwise, where’s the fun in the searching?
When I look back on my nearly 19 years of classroom education in elementary, middle, high school, college, and grad school, I think I’ll remember my junior year AP U.S. history class in high school as my favorite. What I loved about it was what probably bored most other students in class: it was a data dump of historical facts and anecdotes; a pure, unadulterated stream of Americana. Mr. Friedberg would spend each class throughout the semester explaining persons, places, dates, key events, and political concepts from the Revolutionary War to the Clinton presidency, and I would gleefully take notes.
I had a buddy in this class who shared the same affection for the subject matter, and more importantly, the detailed note-taking thereof. We would compare notes outside of class and discuss what we were learning, so much so that after the semester we planned to create a website where we would maintain an archive of our notes in narrative form as a resource for other Web denizens, but also because we just enjoyed writing about it. I recently found one of the documents we created for this endeavor titled “Jacksonian Democracy,” which detailed the politics and people of the period between 1824 and 1848 that was defined by the attitudes, actions, and aftermath of Andrew Jackson.
Re-reading these notes from high school was a kick because they retrod the same narrative of Andrew Jackson: His Life and Times, a biography of the seventh president by H.W. Brands that I just finished reading. In my quest to read a biography of every U.S. president (eight down, 35 to go), I recently tackled John Quincy Adams, Jackson’s presidential predecessor, bitter rival, and polar opposite. I knew after reading the JQA biography (John Quincy Adams by Paul Nagel) that I would need to read Jackson next, so I could give “Old Hickory” a fair shake after having read about him from JQA’s perspective, which, unsurprisingly, isn’t very adoring.
What I knew about the tempestuous Tennesseean before this was what most other people knew: he turned a hardscrabble upbringing into a career as a soldier, famously defeating the British at New Orleans in the War of 1812 and parlaying his fame into the first man-of-the-people election in the young nation’s history, which ushered in a new era of democratic reform. But seeing that life story rendered in detail by Brands gives me a new (qualified) appreciation for the General.
Brands’ take on the man, who was left fatherless before birth, shows a young boy deprived of formal education, adequate adult supervision, and a decent standard of living, deprivations that conspired to create a pugnacious, immature, and defiant ruffian who often (and sometimes purposefully) got in over his head. One such instance occurred during the Revolutionary War when, as a 13-year-old courier, he was captured by the British and ordered to clean the boots of an English officer. When Jackson proudly refused, the officer gashed his hand and head, leaving him with lifetime scars and a hatred of all things British. He later acquired more wounds from the countless duels he either initiated or was compelled to engage in. Seriously, I lost count of all the duels he was in one way or another involved in.
A huge part of Jackson’s life and identity was his wife Rachel. They married in the 1790s under scandalous circumstances: Rachel had divorced an abusive knave named Lewis Robards and apparently shacked up with Jackson before the divorce was finalized. Jackson’s fierce loyalty to his friends and intense hatred for anyone who betrayed that loyalty or besmirched his or his wife’s honor were revealed in this situation, in another during his presidency, and throughout his life.
Jackson’s relentless defense of honor is what provided such a stark contrast with his predecessor Adams. Both men were children of the Revolution, though with extremely different upbringings. While Jackson was orphaned early on and as a teen fought British regulars in South Carolina, John Quincy was getting schooled at Harvard and traipsing around Europe with various Founding Fathers. JQA was self-loathing and depressed, which constantly stymied his intellectual ambitions; Jackson was a man of action, basically seeking out conflict and unabashedly fighting his way through court cases, wars, and political scandals, even while suffering through lifelong debilitating ailments. While Adams defended Jackson at times, the 1824 election imbroglio, its subsequent political skullduggery, and Adams’ Federalist leanings inevitably made him Jackson’s natural enemy.
There’s a lot not to like about Andrew Jackson. He was brash, bordering on unhinged, especially when dealing with an adversary. He could be annoyingly obstinate, like when he refused to honor or even acknowledge various Supreme Court decisions (mostly due to his ire for the federalist Chief Justice John Marshall). And, oh yeah, he owned a bunch of slaves. This, along with his involvement in the removal of Native American tribes, is usually the deal-breaker for people when considering his presidential greatness.
But his failings could also be interpreted as his strengths. His obduracy paid off for his democratic and anti-elite ideology in his fight against the banker Nicholas Biddle (yet another hated rival) and the National Bank. When faced with a tariff-induced constitutional crisis that was spearheaded by his former VP John Calhoun and his South Carolinian brethren, Jackson brought the hammer down on his home state in favor of preserving the Union. Add to all this the fact that he narrowly escaped becoming the first assassinated president due to two pistols misfiring at point-blank range. God must have loved Andrew Jackson.
He wasn’t a lovable guy, but he was important for his time. He was the first president not from Virginia or Massachusetts, or from the elite establishment that until then had essentially dictated the course of public policy without a whole lot of input from average citizens. Jackson carried to Washington the mantle of idealized agrarianism and equality for the common man that was established by Jefferson, and from which the Jacksonian brand of democracy was sowed for future generations.
About two years ago I stopped watching cable news altogether. Regardless of the channel, there is rarely anything on worth the time and energy it takes to get frustrated by the mostly non-news news stories being covered like Access Hollywood fluff pieces. But late last night as I was channel-surfing before turning in, I was chagrined to see this on Fox News.
I am loath to respond to anything Fox News does because it simply plays into their game, but consider this bait taken. To sum up, commentator Bob Beckel is upset that in last Sunday’s episode set in Vietnam, The Amazing Race had a clue marker at the site of the B-52 Memorial that contains the wreckage of an American bomber plane that was shot down during the Vietnam War. Additionally, the racers’ task for that leg’s Roadblock was to watch a performance of a socialist Vietnamese anthem and remember key lyrics that would be used for another task.
The B-52 Memorial in Hanoi, Vietnam.
The things the panel of commentators say in the video about this non-troversy are so unbelievably asinine and uninformed. Their main point, as far as it can be ascertained, is that the Amazing Race producers are anti-American for showing a socialist song and insensitive toward Vietnam veterans and civilians for acknowledging the existence of a downed American plane.
Beckel is so indignant that CBS hasn’t apologized, promising to “go after” CBS until they do. Dana Perino, former press secretary to George W. Bush, claims “of all the things people apologize for today, you would think this would be an obvious one, but they’re just being stubborn I guess.” Greg Gutfeld adds with a supercilious smirk: “It’s called Fox News Syndrome. It’s that Fox News is covering it and no one else is.”
Fox News has some kind of syndrome, all right. Just not one that makes its latest ginned-up rage piece an actual story.
I’m about 98 percent certain none of these commentators have ever seen The Amazing Race, let alone this specific episode, because if they had they would know that the show has been all over the world, using hundreds of locations as backdrops for tasks and challenges that employ local customs, trades, and people of all kinds. They would also know that the show honors the cultural heritage of the locations it uses, regardless of its nature. Using the socialist anthem in the show wasn’t an endorsement of socialism any more than using Stalin and Lenin impersonators in Russia on last season’s 8th leg was an endorsement of Bolshevism. The Race shows the host country’s history and culture for what it is, not for what the Greg Gutfelds or Bob Beckels wish it would be.
As Lester at DryedMangoez points out, the show has been to Vietnam before: in season 3, when a Vietnam vet racer poignantly returned to the country for the first time; and in season 10, when the racers stopped by the Hanoi Hilton, where John McCain was held as a POW. They have also been to Auschwitz, the Berlin Wall, the USS Arizona memorial at Pearl Harbor, and the “House of Slaves” in Senegal, a point of embarkation for the Atlantic slave trade.
As much as the show seeks to respect its more serious stops, it plows through every task and location so quickly (it’s a race, after all) that any attempt to honor the fallen comes off as rushed and ham-handed. But the show still goes to these places not to give in-depth history lessons, to make a political point, or to propagate Fox News’ brand of flag-waving American exceptionalism abroad, but because these places are interesting and because they give contestants a shot at winning money. I accept the superficial, TV-drama aspect of the show because it allows contestants and viewers like me to experience and enjoy faraway lands we wouldn’t be able to otherwise. I didn’t even know of the existence of the B-52 Memorial in Vietnam until I saw it on the Race, and I wouldn’t be surprised if these Fox commentators didn’t either – at least until some producer looking to grease Fox’s manufactured-outrage machine alerted them.
The Amazing Race might be just a profit-driven reality show, but it does more to illuminate and celebrate world cultures, exotic locales, and peculiar customs than any other reality show does or than Fox News ever cares to. I hope CBS continues to refuse to apologize, because it has done nothing wrong.
Someone on the Internet once said something to the effect of: “I’m not a writer; I write.” Writing, for example, is something you do, but it’s not who you are. You might really love writing and consider it integral to your life, but it isn’t your very essence–at least, it shouldn’t be.
I’m re-reading Tim Keller’s The Reason for God. In a section about the social consequences of sin, Keller argues that when societies and individuals cling too strongly to any belief or entity other than God, that ultimately broken belief will become the essence of their identity and will let them down. He quotes Jonathan Edwards’ The Nature of True Virtue, which argues that a human society fragments when anything but God is its highest love: “If our ultimate goal in life is our own individual happiness,” Keller summarizes Edwards, “then we will put our own economic and power interests ahead of those of others. Only if God is our summum bonum, our ultimate good and life center, will we find our heart drawn out not only to people of all families, races, and classes, but to the whole world in general.”
The question then becomes: What is your summum bonum? What is your utmost identity? Our recently endured presidential election season provided ample proof that many people cling so tightly to their political views that it basically becomes who they are instead of merely what they believe. When that happens, civil discourse becomes impossible and anything approaching compromise is deemed impure and even cowardly. Keller sees the problem with this: “If we get our very identity, our sense of worth, from our political position, then politics is not really about politics, it is about us. Through our cause we are getting a self, our worth. That means we must despise and demonize the opposition.”
I think it’s especially easy for religious people to fall into this trap, despite good intentions. It understandably becomes hard not to transfer the ardent passion attached to a deep-seated belief in God or whomever onto other less divine issues. But as Stephen Colbert said, the road connecting politics and religion runs both ways; if we insist on forcing our religious identity into our politics, our muddied and corrupt politics will come right back into our religion.
Walt Whitman was right: I contain multitudes. I act as a son, brother, friend, significant other; I study, I write, I read, I work, I create… I identify with all of these roles, but no single one of them defines me, because I don’t rely on any one of them to give my life meaning. I can argue with people who disagree with my views on politics or music or religion because my views aren’t me. They’re just the tools I use to try to make sense of life. They can be beautiful, inspiring, enraging, or intriguing–but they can’t be everything.
If someone made a movie about William Quantrill, he’d be sorta like Lt. Aldo Raine from Inglourious Basterds, except a Confederate instead of U.S. Army, probably not as funny, and killing civilians instead of Nazis. (Tarantino film coming in 3…2…) Originally a schoolteacher in Ohio, Quantrill toiled for a bit in low-paying jobs, his family saddled with debt after his father died. As a teenager he took up gambling in Salt Lake City and got handy with a knife and rifle before returning to Kansas, where he quickly turned to the life of a brigand, earning money through noble affairs like capturing runaway slaves and raiding cattle. It was during this time that his erstwhile anti-slavery views soured quickly toward Confederate sympathy.
At the war’s start in 1861, Quantrill joined up with Joel B. Mayes, a Cherokee chief and Confederate major who taught Quantrill the Native American-inspired guerrilla warfare techniques he’d later employ to a deadly degree. He fought for a while, but soon spun off his own guerrilla band of bushwhackers later known as the Missouri Partisan Rangers, aka “Quantrill’s Raiders.” The group made its infamous name in Lawrence, Kansas, a hotbed of Union activity, when it raided the town to avenge the deaths of some of the Raiders’ kin in a Union prison. They executed 183 men and boys from age 14 to 90, looting the town bank and making off to Texas, where the group split off into smaller companies.
Quantrill’s last stand came in Kentucky in 1865, when he and a few remaining Raiders were killed in a raid, meeting an ignominious end at the age of 27. But his legacy lived on through one of his ex-Rangers named Jesse James, who used Quantrill’s hit-and-run tactics in bank robberies to great “success.” There is also the William Clarke Quantrill Society, which is dedicated to “the study of the Border War and the War of Northern Aggression on the Missouri-Kansas border with an emphasis on the lives of Quantrill, his men, his supporters, his adversaries, and the resulting historical record.” In the South, the Civil War is sometimes known as the War of Northern Aggression; this may be my Yankeeness talking, but in Quantrill’s case the aggression was all his.
Up next in CCWN, the riled-up Robert Barnwell Rhett.
I just finished reading Hampton Sides’ Hellhound On His Trail: The Stalking of Martin Luther King, Jr., a recounting of the assassination of the famous civil rights leader through the perspectives of the people involved in the run-up to and aftermath of King’s slaying. I highly recommend this book for its extensive background on King’s assassin—the hermetic convict James Earl Ray—and its fast-moving report of the events in Memphis on April 4, 1968.
Meanwhile, I have been watching the 1987 miniseries Eyes On The Prize, which chronicles the Civil Rights movement from Brown vs. Board of Education to the Selma-Montgomery marches. It tells a gripping narrative of key events in the 1950s and 1960s, mostly in the South, through news footage and first-hand accounts by marchers, activists, politicians, and other figures involved in the struggle for freedom, for better or worse. It’s interesting learning about the development of the Civil Rights movement while reading about the MLK assassination, which in retrospect became the nadir of the movement and end of a transformational yet tumultuous chapter in civil rights history.
Watching the progression of the movement up close, via the documentary-style footage in Eyes On The Prize, has been fascinating and a bit distressing. The violence and unmitigated bigotry of the white communities that black citizens had to face every single step along the way never fails to bewilder me.
Maybe it’s my modern bias speaking here, but only one generation in the past, fire hoses and attack dogs and police brutality and miscarriages of justice met anyone—mostly black freedom fighters but also sympathetic white activists—who sought equal protection under the law. That troubles me greatly.
Those freedom fighters needed a hefty load of courage to face that persecution and risk of death for the sake of the Cause. Men and women like Rosa Parks, Martin Luther King, Malcolm X, Minnijean Brown, Medgar Evers, James Farmer, Stokely Carmichael, Andrew Goodman, Michael Schwerner, James Earl Chaney, and countless others risked life and limb (and often lost them) in an uphill battle for rights we take for granted today.
It makes me wonder what I as a white middle-class male would have believed or done if I were transported to 1960s Mississippi. Would I have linked arms in an anti-segregation march, or would I have been one of the townsfolk lining the street cursing out the marchers for upsetting the peace? More likely, I would have been somewhere in the middle—sympathizing with the pursuit of basic civil rights but not outwardly acting on or against that pursuit’s behalf. Moderation is key, the saying goes, but in this case it wouldn’t have been enough.
The people featured in Eyes On The Prize decided to fight for their lives and the lives of others but without resorting to violence, facing an opposition that was armed and very invested in keeping the status quo. Those men and women chose liberty over life. How many of us could make that choice?
I ask these questions because I’m trying to sort through them myself. I’ve written before about how the orthodoxies we have today may be considered antiquated or even pernicious to future generations looking back. With this in mind, I think it’s important not to judge previous times too harshly without fully understanding the context and realities within which they lived. Since what I know of the Civil Rights movement generally consists of the remnants of a few years of history courses, I hope I will continue to learn about it in order to better understand the struggle of the people it involved.