That’s me at our local No Kings rally back in June. It’s the energy I’m bringing to Fourth of July this year, what with the United States government having been taken over by orcs, goblins, and all manner of Mordor-worthy villainy. May we the people soon topple their treachery and an Aragorn-esque leader one day unite the forces of good against such reckless hate.
(Yes, Tolkien nerds, I’m aware “Forth Eorlingas” is a Rohirrim rallying cry and thus not an Aragorn thing, but I couldn’t turn down the title pun.)
In recent years, millennials, the former hip young things that once seemed so cutting edge when cast side-by-side with the out-of-touch baby boomers and the rather nondescript generation X, have become, well, a bit cringe. …
But, I’ll confess, being part of a generation that felt so progressive compared with its predecessors, bridging the gap between analogue and digital, felt significant, essential, and yes, bloody cool, actually. It’s a shock, then, to wake up one morning and realise you’ve been usurped.
The first thing I thought of while reading this? June George from Mean Girls:
I can’t for the life of me track it down, but a tweet from long ago stuck with me. It said, basically: cool doesn’t exist; it’s a made-up concept that makes people act dumb, and it’s pointless to chase after it.
Just be yourself, like what you like, and forget about trying to impress strangers on the internet or IRL—especially people younger than you.
‘Cause you know what’s cool? ~~A billion dollars~~ Not obsessing about what’s cool, what’s cringe, or whatever the latest Gen Z slang is. Embrace the freedom that aging and earnestness provide.
Gracy Olmstead is back with another excellent issue of her Granola newsletter, this time on mundanity, the mind, and AI:
While doing the mundane, we lose ourselves in process and place. The mundane roots us in the present, stubbornly refusing the demands of clock or calendar. It will take as much time as it requires. And so we pull at a thread of argument, uproot weed after weed, or sweep every nook and cranny until the room is clean. We sink into a new experience of time and place, in which everything diminishes but the now and here. Ironically (and sometimes, maddeningly) we may have to do it all again: Sit down to rewrite, hone, edit, and polish. Return to the nasty weeds that pop up day after day. Tackle the dust and grime of another week.
Yes, the mundane is not always pretty. What these experiences shape is not always a finished product that we can hold up and boast about. Sometimes, yes. But not always. What is always true is that these processes are shaping and honing us. They are showing us who we are, how to be, and what it means to think and live. The work of the mundane tethers us to place, to our bodies, to the people we love and live with, and—perhaps in a way I never realized before AI—to our minds themselves.
If the mundane elements of our lives show us who we are, how to be, and what it means to think and live, then what will become of us when we outsource that being, thinking, and living to AI or other ideologies? We sacrifice those essential elements of existence and become their opposite: nothing.
Because it’s the process of slogging through an argument—feeling out its contours and edges, remolding and reshaping them like a potter—that teaches us how to think. Strong arguments do not spring fully formed from the mind. They simmer and stew. They emerge half-formed, and have to be reshaped. Essays materialize when you start to write, and realize you did not yet know what you thought. In the process of verbalizing thoughts, there is room to grow, stretch, and challenge the mind. There is even room to change your mind. AI short circuits this opportunity—in giving us what we ask for, it in fact steals opportunities for growth. It cheats the process of becoming that the mundane offers.
This really spoke to me in relation to writing specifically, whether for this blog or Cinema Sugar. Some writers bemoan the writing process itself, slow and tedious and frustrating as it can be. “I love having written something” goes the trite phrase. And it’s indeed satisfying to finally arrive at the end product. But I also love being in the weeds of the thing. Thinking and rethinking, writing and rewriting, arranging and rearranging, rinse and repeat—that time spent with my hands in the metaphorical dirt, in the mundane, is where the real magic happens.
“Artificial intelligence” is not a technology. A chef’s knife is a technology, as are the practices around its use in the kitchen. A tank is a technology, as are the ways a tank is deployed in war. Both can kill, but one cannot meaningfully talk about a technology that encompasses both Sherman and santoku; the affordances, practices, and intentions are far too different to be brought into useful conversation. Likewise, in the hysterical gold rush to hoover up whatever money they can, the technocrats have labeled any and all manner of engineering practices as “AI” and riddled their products with sparkle emojis, to the extent that what we mean when we say AI is, from a technology standpoint, no longer meaningful. AI seems to be, at every moment, everything from an algorithm of the kind that has been in use for half a century, to bullshit generators that clutter up our information systems, to the promised arrival of a new consciousness—a prophesied god who will either savage us or save us or, somehow, both at the same time. There exists no coherent notion of what AI is or could be, and no meaningful effort to coalesce around a set of practices, because to do so would be to reduce the opportunity for grift.
So what is it? An ideology:
… A system of ideas that has swept up not only the tech industry but huge parts of government on both sides of the aisle, a supermajority of everyone with assets in the millions and up, and a seemingly growing sector of the journalism class. The ideology itself is nothing new—it is the age-old system of supremacy, granting care and comfort to some while relegating others to servitude and penury—but the wrappings have been updated for the late capital, late digital age, a gaudy new cloak for today’s would-be emperors. Engaging with AI as a technology is to play the fool—it’s to observe the reflective surface of the thing without taking note of the way it sends roots deep down into the ground, breaking up bedrock, poisoning the soil, reaching far and wide to capture, uproot, strangle, and steal everything within its reach. It’s to stand aboveground and pontificate about the marvels of this bright new magic, to be dazzled by all its flickering, glittering glory, its smooth mirages and six-fingered messiahs, its apparent obsequiousness in response to all your commands, right up until the point when a sinkhole opens up and swallows you whole.
“The American lawn is a thing, and it is American, deeply American,” Paul Robbins, an expert in environmental studies at the University of Wisconsin-Madison and the author of the book “Lawn People,” told me. “There becomes a kind of local social pressure to make sure you’re not letting down the neighborhood — you’re keeping up the property values. Those then become morally normative.”
This devotion has turned the U.S. into the undisputed global superpower of lawns. Around 40 million acres of lawn, an area almost as large as the state of Georgia, carpets the nation. Lawn grass occupies more area than corn. Each year, enough water to fill Chesapeake Bay is hurled collectively onto American lawns, along with more than 80 million pounds of pesticides, in order to maintain the sanitized, carpet-like turf. In aggregate, this vast expanse of manicured grass rivals the area of America’s celebrated national parks.
The typical suburban lawn is zealously mown, raked and bombarded with chemicals. Flowering plants that would typically appear in an untended meadow are sparse. For insects, reptiles, birds and many other creatures, these places are hostile no-go zones. Closely cut grass is neither habitat nor food for most insects.
Most of the houses around us are zealously mowed and bombarded with chemicals by landscaping companies, but not ours. We’ve surrendered to the dandelions, Creeping Charlie, wild violets, burdock, and other weeds because we simply don’t have the time or energy to fight them, nor the desire to use pesticides. Luckily our neighborhood isn’t fancy enough for that to matter much (though shoutout to the empty-nester two doors down who dotes on his pristine, carpet-like turf).
Would I love my lawn and garden areas to be as pristine as his? Absolutely. But the cosmetic appeal is rather fleeting compared to the costs in time, money, wasteful water use, and/or chemical exposure. I’d also love to transform at least part of our sizable lawn into a biodiverse garden, but that too takes an immense amount of work and dedication that we just don’t have in this time of life. So a weedy, grassy yard it is!
Continuing my unofficial series on problematic parenting clichés, there’s one I’ve heard a few times recently and must address:
“Little kids, little problems. Big kids, big problems.”
Setting the condescension aside, the idea is that all the challenging aspects of parenting babies and young children—e.g. diaper changes, loss of sleep, tantrums, potty training, keeping them from accidentally killing themselves—aren’t actually challenging compared to what parents of older kids and teenagers have to deal with: adolescent attitude, busy schedules, college applications, and tricky conversations about sex, drugs, technology, and so on.
Respectfully, this is a mound of malarkey.
Untruer words were never spoken
Obviously I’m slightly biased as the parent of young children. But as a former teenager myself, I’m clear-eyed about the challenges of that phase even if I haven’t yet been on the other side of it. So when I hear an older parent trot out that trite un-truism (which happened to me recently on two separate work calls), I’m inclined to diagnose them with early-onset gramnesia.
Which is understandable. If you’ve been out of this phase for a while, it’s easy to forget what the day-to-day is like. You can look back fondly on the cute pictures and innocent personalities without also feeling the toll of the daily grind that facilitates them. But for us currently in that stage, it’s a big problem if a nap gets skipped or a tantrum derails an outing or a car ride turns traumatic with a screaming toddler. Because all of those things directly affect our everyday life and psychological state.
Just go to any playground and look at the parents. While the ones with older kids (say, ages five and up) are reading or on their phones or otherwise checked out from the action, I am trailing my freshly minted two-year-old to make sure he doesn’t pick up garbage, try to put said garbage in his mouth, get bowled over by the bigger kids running around, or fall off a high spot on the playground. And this isn’t even overprotective helicopter parenting—it’s just life with a toddler. A joy and adventure, yes, but also constant.
Which is why I teared up at this reel from Oh Crap! author Jamie Glowacki, which validates what I already know to be true: that parenting almost always gets easier as kids get older.
If you think little-kid problems are small or insignificant compared to yours, then I hate to break it to you, but in the grand scheme of things no one besides you is concerned with your teen’s college search or team practice schedule or social media use.
Being a parent is hard. Period. Different stages present different joys and challenges—not big or small, just different. And if you ever want to gripe about them, no matter the age of your kids, I will validate your feelings and in solidarity send a ✊ or, more likely, a Katniss Everdeen salute. Because we parents always need the odds in our favor.
Ross Barkan ponders what kids of today lack compared to their 20th-century predecessors:
When I consider the geniuses of that era—or any, really, before the last ten years or so—I think of time. Talented children, until the incursion of the smartphone and immersive video games, had much of it.
One big reason for this:
Children could only be enchanted by gizmos and gadgets for so long. The television was stationary, rooted in the living room, and it might have only featured a few channels, depending on the decade. Movies, similarly, were confined to physical theaters. Even in my own childhood, in the 1990s and 2000s, video gaming was largely a social activity. I brought my friend over to play Nintendo Wii or we went to his house to battle in a Dragon Ball Z video game on the PlayStation 2. Unique among my peers, I didn’t own a video game console until I was a teenager, and this meant, to my benefit, I had a childhood free of such seductions.
I too did not own a video game console growing up, except a Game Boy (on which I did spend many maddening hours trying and failing to conquer the Toy Story game). That lack was something I lamented at the time but am grateful for today, because it meant video games weren’t constantly commandeering my time and attention. Instead they were a special occasion, something to be enjoyed with others. I have fond memories of having a Halo party with my youth group friends and playing Ready 2 Rumble Boxing with my uncles on a PlayStation rented from Blockbuster.
Barkan spotlights Brian Wilson of The Beach Boys as an example of the kind of genius who had an abundance of time to develop his talent. Then he asks what the Brian Wilsons of 2025 do with their weekends:
Brian was a preternaturally gifted child who deconstructed vocal harmonies on the radio and spent hours over his piano. A child today with such genius might tinker around with music but devote far more of his days to Minecraft, Fortnite, and MrBeast. The child might drown in a sludge bath of AI. The same could be true of the budding novelists, poets, and painters. All of these technologies are arrayed against dreams and imagination. The content—the YouTube, the video games, the TikTok videos—does all the imagining for you. The brain devolves into a vessel for passive consumption.
And that consumption happens (literally) right before their eyes:
For all the obsessing modern parents do over the fates of their children, they’re happy to toss out an iPad or a smartphone or a Nintendo Switch and let their boys and girls melt, slowly, in the blue light. A person close to me once suggested that wardens should start giving prisoners iPhones because there’s nothing that will more rapidly pacify an unruly and restless population. If iPhones were teleported back in time to the twentieth century, would we have a twentieth century?
Pacify, yes, but only temporarily: once you turn it off, it’s like trying to quash a prison riot.
A while back we severely curtailed our now-six-year-old’s screen time after finally getting sick of how it was negatively affecting his mood and behavior (and thus everyone else in the house)—not to mention time spent on creative endeavors. What used to happen almost every day after lunch plus some evenings is now maybe an hour on the weekend, and sometimes none at all. No iPad, no more YouTube or garbage shows, the N64 every once in a while. Putting the TV away was a big help in removing the temptation, but just as important was holding firm on the boundary. It didn’t take long for him to accept the new normal and find other things to do like coloring/crafts, reading, and listening to Yotos.
Barkan’s post is about kids, but it’s just as applicable to us grownups. I would benefit immensely from the same screen time limits imposed on my children—not because I’m a nascent genius but because I don’t want to melt in the blue light or drown in a sludge bath of AI either. I too want time enough at last.
We underwent several significant home improvement projects recently. I say “underwent” because we didn’t do the actual work but instead paid contractors who knew what they were doing.
One of those contractors was a local handyman who brought in his wife to help with the multi-day project. Their kids are grown but they enjoyed interacting with our young’uns.
In a moment when the boys were being particularly rambunctious, I asked if she missed this phase of having young kids.
“I’m glad we lived it,” she said.
In the moment I took that to be a polite way of saying “I’m grateful we went through it but also that it’s over.” Which was probably accurate to an extent. But I see it now as a richer sentiment: to be glad you got to experience something even though it was challenging, and that you really lived it—not just suffered through.
We have one of those all-in-one turntables that plays vinyl, CDs, Bluetooth, and the radio. One day my wife started putting on our local jazz station, WDCB 90.9 FM (“Chicago’s Home for Jazz”) and it’s been a nice burst of smooth vibes when we want a change from our usual rotation of kids music. I could always find something to play from my digital or vinyl collection of jazz records, but sometimes it’s nice to let serendipity take the wheel.
Black Bag. Felt great to see an honest-to-god movie in the theater with a delightfully twisty plot and inspired casting that made me feel as warm and fuzzy as the film’s lighting. Wouldn’t be surprised to find this on my best-of-2025 list.
The Demon of Unrest by Erik Larson. Turns out there was a lot of drama leading up to the Civil War…
Lincoln. Rewatched this after finishing The Demon of Unrest as a kind of Civil War bookend. Daniel Day-Lewis’s win for Best Actor might be the most deserving Oscar ever awarded.
The Pitt. Been watching this Max series that’s an unofficial ER reboot and my hat is off to anyone who chooses to become and remain an emergency nurse.
A Complete Unknown. I’m not a dyed-in-the-wool Dylan fan like many white dudes around my age and above, so perhaps that’s why I didn’t fall for this as hard as others, Chalamet’s excellent performance aside.
Parasite. Yes this is dramatic and tragic and twisted and all that, but it’s also so damn funny. “Leave it—free fumigation.” 💀💀💀
Mary Poppins Returns. No one can touch Julie Andrews’ singing voice, but Emily Blunt really nails the other Poppins vibes.
Richard Polt typecasting about why we need typewriters in our age of AI and authoritarianism:
When you choose to write with a typewriter, you are quixotically, nobly flying in the face of the assumption that good = fast, efficient, perfect, and productive. Type your gloriously imperfect text, expending inefficient time and energy — and declare that you still care about human work, and that the process of creation and understanding still matters more to you than the slick products of the machines. …
As for authoritarianism, it is happy to use digital technology to watch us, punish us, and entice us. A soft totalitarianism, with hard pain for those who aren’t pacified by easy consumption and pointless posturing, is becoming the new model of political control. …
Again, typewriters offer one humble but real form of resistance. As in the days of samizdat behind the Iron Curtain, even in “the land of the free” there is a need to find words without compromising with the digital systems that are increasingly under tyrannical control.
Tyrannies have always failed to contain lovers and writers. We must love to write, and write what we love — with the writing tools that we love.
I just finished reading Erik Larson’s latest book The Demon of Unrest: A Saga of Hubris, Heartbreak, and Heroism at the Dawn of the Civil War. It’s about the military and diplomatic machinations surrounding the Fort Sumter crisis, including South Carolina’s role in fomenting secession and Lincoln’s journey to Washington D.C. and the presidency.
I saved a couple passages that I enjoyed for various reasons. Here’s one featuring General Winfield Scott, who was in charge of defending D.C. and the Capitol building during the contested electoral count process in February 1861:
The throng outside grew annoyed at being barred from entry and began firing off obscenities like grapeshot. If words could kill, one observer wrote, “the amount of profanity launched forth against the guards would have completely annihilated them.” Much of this tirade was aimed at General Scott. It had no effect. He vowed that anyone who obstructed the count would be “lashed to the muzzle of a twelve-pounder and fired out of the window of the Capitol.” Scott would then “manure the hills of Arlington with the fragments of his body.”
Love that FAFO energy from Scott. There was also this bit about President Buchanan’s Secretary of War John Floyd:
By now the war secretary had become a deeply controversial figure and an embarrassment to President Buchanan, which was saying something, since the administration itself was widely considered to be an embarrassment. Floyd was deemed by many to be a paragon of corruption, and a traitor to boot.
He had become embroiled in a financial scandal dating to 1858 that resulted in $870,000 in federal funds—equivalent to over thirty-two million in twenty-first-century dollars—being looted from the U.S. Treasury and the Department of Interior.
An embarrassing, corrupt administration with a controversial cabinet member looting federal funds? History doesn’t repeat itself at all…
And this exchange between General Beauregard and Major Whiting, who was scrambling to prepare the Confederate contingent surrounding Fort Sumter:
The island’s batteries had been ordered to be “in readiness,” Whiting wrote, but all he saw was confusion. “We are ready, perhaps, to open fire, but we are not ready to support it,” he told Beauregard on Thursday, April 11. “For God’s sake have this post inspected by yourself, or some one else competent, before you open fire. I am alone here, as you know, and heretofore have been exclusively occupied with the construction of batteries.” One newly arrived contingent of men was “helter-skelter,” he complained; all were volunteers. “There are no regulars here at all.” Beauregard tried to calm him. “Things always appear worst at first sight when not perfect,” he wrote. “We cannot delay now.”
Some mindful leadership from Beauregard right there. Too bad he was a traitor!
Also wanted to shoutout this quote from Captain Abner Doubleday, who was part of the Union garrison defending Fort Sumter:
Doubleday led the first group to the guns in the casemates that faced the Iron Battery at Cummings Point on Morris Island, due south. “In aiming the first gun fired against the rebellion I had no feeling of self-reproach,” he wrote, “for I fully believed that the contest was inevitable, and was not of our seeking.” As Doubleday saw it, he was fighting for the survival of the United States. “The only alternative was to submit to a powerful oligarchy who were determined to make freedom forever subordinate to slavery.”
While working from home the other day I had my work laptop out at the dining table with my six-year-old nearby. Since I’m usually hidden away in my home office, this quickly piqued his curiosity. I let him type out a short email I had to send, then opened up Paint and showed him how to use the mouse to select a color and draw.
The result is below. If you squint you can make out his attempt at a smiley face in the lower left corner:
Glad to see MS Paint live on in the next generation…
According to research by Neil Patel, 59.2% of traffic to blogs is driven by SEO. It’s the biggest single driver of traffic by far.
Neil found that if your site is 10 or more years old, 44% of the pages on your site could be considered “irrelevant” by search engines. The more irrelevant pages your site has, the more it suffers in search ranking.
In other words, Google doesn’t understand what makes for good publishing on the web.
My question is: irrelevant how, and according to whom? Google’s almighty black box of an algorithm that has already changed in the time it took to write this sentence?
Imagine if your photo library was subjected to the same treatment. “Google thinks this cool photo I took is irrelevant, so I guess I’ll delete it even though it captures a meaningful moment in my life.”
I’ve been blogging for 18 years and not once in that time have I considered the SEO implications of my writing. I don’t even look at view stats. I suppose that could be considered a luxury since my blog is not a business and I don’t have a large enough readership to capitalize on. I’m also fully aware that my full archive of 1,200+ posts to date is important to no one but me.
Google is about now now now. That’s its business, but it’s none of mine. Blogging is about now and then: the experience of capturing what’s on your mind now and making connections with what’s come before.
I echo CJ’s advice:
Don’t worry about what Google wants. It’ll change tomorrow. As will Google’s dominance.
Post as much or as little as you want. It’s your place.
Post whatever keeps you interested and publishing for the long term.
Post whatever helps you build a stronger connection with your audience.
Looking back at your archives helps you re-discover connections you’ve forgotten. It helps your readers do the same.
I can’t remember where I saw the recommendation, but I decided to try The Only Plane in the Sky: An Oral History of 9/11 by Garrett Graff and found it a riveting read. Heavy, of course, but also very illuminating about how quickly and widely the September 11 attacks rippled beyond downtown Manhattan, affecting a lot of people in different ways and different places almost all at once.
I was about to turn 14 at the time. I saw the footage like everyone else and understood it to be a significant event, but I couldn’t have known all the details of the day that the book brings to life all these decades later.
For that reason I’m very grateful to Graff for this monumental work of oral history, which captures the kaleidoscopic nature of the crisis by weaving testimonies from the myriad people affected by the attacks, including:
people in the World Trade Center and Pentagon who managed to evacuate after the planes crashed (and even some who somehow survived the subsequent collapses)
firefighters and first responders at Ground Zero
people desperately waiting to find out whether their loved ones had survived
passengers on the hijacked planes, via transcripts of their calls and voicemails
air traffic controllers managing the unprecedented grounding of all aircraft across the United States
fighter pilots ordered to intercept Flight 93 and take it down by any means necessary, including crashing into it midair
Dick Cheney and White House staffers managing the crisis from an underground bunker
Congressional representatives and staffers scrambling away from the Capitol with reports of more hijacked airplanes on the way
staffers with President Bush in Florida when they got news of the attacks, then on Air Force One as they flew between military bases before heading back to D.C.
One recurring motif that really stuck out to me was how often life or death came down to sheer luck, both good and bad.
One man had to leave his desk high up in the World Trade Center to retrieve a guest in the lobby, which allowed him to escape after the crash and avoid certain death. Another woman was standing at the copier instead of at her desk when a plane struck, and thus survived when all her officemates nearby perished. And one firefighter fleeing one of the collapsing Twin Towers alongside a colleague turned one way and lived, while his colleague turned the other way and didn’t.
Call it luck or something else—we’re all a split-second away from death, often without knowing it. The Only Plane in the Sky honors those who were unlucky that day, and serves as a sobering reminder for the rest of us about the fragility of life and the extraordinary bravery of ordinary people.