Dwain Northey (Gen X)

https://www.cbsnews.com/news/naacp-travel-advisory-florida-says-state-hostile-to-black-americans/

Remember the good old days when there were only travel advisories and/or bans for what some would call third world countries? Well, now, because of the vile vitriol of one Governor Ron DeSantis, the state of Florida, a vacation destination, has received a travel advisory from the NAACP.

The wannabe future President has made the climate so venomous in Florida that anyone who is part of any minority group does not feel safe in the state. Black, Brown, LGBTQ+, these are all groups that are under attack in the Sunshine State. The majority Republican legislature and their fearful leader have passed laws that make almost everything a jailable offence, and the fact that the state has very loose gun laws and a stand-your-ground law makes it more dangerous than being a blonde female in Central America.

Florida residents are able to carry concealed guns without a permit under a bill signed into law by Republican Gov. Ron DeSantis. The law, which goes into effect on July 1, means that anyone who can legally own a gun in Florida can carry a concealed gun in public without any training or background check. Combine this with their ridiculous stand-your-ground law ('Florida's "Stand-Your-Ground" law was passed in 2005. The law allows those who feel a reasonable threat of death or bodily injury to "meet force with force" rather than retreat. Similar "Castle Doctrine" laws assert that a person does not need to retreat if their home is attacked.') and it gets really sketchy to go there.

This is on top of the Don't Say Gay rule and the new trans ruling that just passed.

“Florida lawmakers have no shame. This discriminatory bill is extraordinarily desperate and extreme in a year full of extreme, discriminatory legislation. It is a cruel effort to stigmatize, marginalize and erase the LGBTQ+ community, particularly transgender youth. Let me be clear: gender-affirming care saves lives. Every mainstream American medical and mental health organization – representing millions of providers in the United States – call for age-appropriate, gender-affirming care for transgender and non-binary people.

“These politicians have no place inserting themselves in conversations between doctors, parents, and transgender youth about gender-affirming care. And at the same time that Florida lawmakers crow about protecting parental rights they make an extra-constitutional attempt to strip parents of – you guessed it! – their parental rights. The Human Rights Campaign strongly condemns this bill and will continue to fight for LGBTQ+ youth and their families who deserve better from their elected leaders.”

This law makes it possible for anyone to simply accuse a parent of providing gender-affirming care and have their child taken from them, and this would include families traveling from out of state. This alone justifies a travel ban to the Magic Kingdom for families.

Oh, and I haven't even mentioned DeSantis's holy war with Disney, the largest employer in the state. I really hope the Mouse eats this asshole's lunch.

Well, that's enough bitching. Thanks again for suffering through my rant.

  • Argus Moon Shot

    Dwain Northey (Gen X)

    The Argus moon launch slipped into the world yesterday with all the thunderous cultural impact of a new McDonald’s value meal—briefly noticed, vaguely processed, and immediately overshadowed by whatever outrage, scandal, or geopolitical fever dream happened to be trending ten minutes later. Somewhere, a rocket pierced the sky, centuries of human curiosity strapped to its back, and the collective response was essentially, “Oh, neat… anyway.”

    And, just to be clear—this wasn’t some slick, logo-heavy, venture-capital-powered spectacle. This was a launch by NASA. You know, the same government agency that once made the Moon feel like the center of the universe. Not a corporate livestream with a countdown sponsored by a cryptocurrency exchange, not a billionaire’s side quest—good old-fashioned public science, funded by taxpayers, executed by engineers who probably still think in slide rules and orbital mechanics instead of brand partnerships.

    Sixty years ago, when humanity first decided to lob itself at the Moon, it was the event. Not an event—the event. Streets emptied, televisions glowed like sacred altars, and people gathered in hushed awe to watch a grainy broadcast that felt like science fiction clawing its way into reality. It wasn’t just a technological achievement; it was a statement. A declaration that we, as a species, could transcend gravity, politics, and perhaps even our own worst instincts—at least for a moment.

    Now? We multitask through it. A rocket launches, and we scroll past it on our phones while arguing about something else entirely. The Moon, once the ultimate destination, has become just another checkbox on humanity’s increasingly crowded to-do list—somewhere between “fix the economy” and “figure out what’s going on with that one app everyone suddenly hates.”

    Of course, it’s not entirely fair. The world today is a bit… noisy. Between economic uncertainty, political theater, and the general sense that everything everywhere is happening all at once, a moon launch struggles to compete. It’s hard to feel wonder when your attention span has been trained to refresh itself every six seconds. The Moon isn’t new anymore. It’s just… there. Waiting patiently while we rediscover it for the fifth or sixth time like a tourist who insists they’ve “found” Paris.

    And yet, beneath the collective shrug, something quietly extraordinary still occurred. Engineers, scientists, and dreamers—working under the banner of a public agency, not a corporate brand—spent years making that launch possible. Thousands of tiny, precise decisions stacked on top of one another until, at exactly the right moment, fire met fuel and gravity lost another argument. The miracle didn’t go away; we just got used to it.

    But let’s not pretend the story ends there. Because somewhere, inevitably, the narrative gears are already turning. It’s only a matter of time before the launch is retroactively claimed as the inevitable result of someone’s “tremendous leadership,” whether or not they could identify the rocket’s pointy end without assistance. The script practically writes itself: bold vision, unmatched foresight, possibly the greatest moon launch anyone has ever seen—people are saying it, many people.

    And that may be the strangest part of all. We’ve gone from a world where the Moon unified us, however briefly, to one where even a journey beyond Earth’s atmosphere risks becoming just another talking point in an endless, terrestrial argument. The stars are still there, indifferent and vast, but we insist on dragging them into our orbit.

    So the Argus launch came and went. A marvel of human ingenuity—public ingenuity—greeted with a collective nod and a quick return to whatever chaos was already in progress. Maybe that says less about the launch and more about us—about how wonder hasn’t disappeared so much as it’s been buried under everything else we’ve decided to care about.

    The rocket didn’t fail to inspire. We just forgot to look up.

  • SCOTUS Slippery Slope

    Dwain Northey (Gen X)

    The latest decision by the Supreme Court of the United States has the unmistakable feel of a legal Jenga move—one of those confident little tugs that seems harmless right up until the whole tower starts wobbling in ways nobody wants to be responsible for.

    The logic, at least on paper, is elegant: if something is “just talking,” then it falls neatly under the warm, protective umbrella of the First Amendment. No physical coercion, no direct action—just words. And words, as we all learned in civics class, are sacred. End of story. Curtain closed. Everyone go home.

    Except… not quite.

    Because once you establish that “just talking” is categorically protected—even in contexts where the intent is to fundamentally alter someone’s identity or well-being—you’ve opened a door that doesn’t politely stay on its hinges. It swings. Widely.

    Take psychotherapy. The entire field rests on the premise that words are not just harmless little puffs of air but powerful tools that can reshape thoughts, behaviors, and even a person’s sense of self. That’s the whole point. If words are powerful enough to heal, they are—awkwardly enough—powerful enough to harm. But under this new framing, where does that leave accountability? If a therapist’s words push someone toward destructive decisions, is that protected speech or professional malpractice? Apparently, we’re now supposed to believe it’s a philosophical gray area instead of, say, a lawsuit.

    Or consider financial advice. If your accountant gently “talks” you into a series of spectacularly bad, ethically questionable investments, are they offering protected speech or violating a fiduciary duty? After all, they didn’t physically move your money—perish the thought. They just… suggested things. Repeatedly. Persuasively. With charts.

    The absurdity here isn’t subtle—it’s baked into the premise. Society has long recognized that context matters. A random person shouting bad advice on a street corner is one thing. A licensed professional, operating within a framework of trust and ethical obligation, is another. We regulate doctors, lawyers, therapists, and financial advisors precisely because their words carry weight and authority. “Just talking” isn’t neutral in those settings—it’s the entire mechanism of influence.

    And that’s where the conversion therapy angle becomes especially strange. The ruling effectively says: you may not act physically to change someone’s identity, but you are perfectly free to try to talk them out of it—as long as you keep your hands to yourself. It’s a bit like saying you can’t push someone off a cliff, but you’re welcome to spend hours convincing them it’s a great idea to jump.

    What emerges is a kind of legal contradiction dressed up as consistency. On one hand, we acknowledge that certain practices are harmful enough to warrant prohibition. On the other, we carve out a massive exception for the exact mechanism through which those harms often occur. It’s harm—just… verbal harm. Which, conveniently, we’ve decided not to count the same way.

    To be fair, the court is grappling with a genuinely difficult boundary: how to protect free speech without endorsing harmful conduct. But by drawing the line so cleanly between speech and action, they may have simplified the problem to the point of distortion. In reality, the two are often inseparable—especially in professions built entirely on persuasion.

    The slippery slope concern isn’t that every bad conversation will suddenly become constitutionally protected chaos. Courts will still recognize fraud, malpractice, coercion, and other established exceptions. But the ruling does invite a wave of arguments from people eager to rebrand questionable conduct as “just speech.” Expect a parade of cases where defendants shrug and say, “I didn’t do anything—I just talked,” as if that settles the matter.

    So here we are: a legal landscape where words are simultaneously powerful enough to be feared, regulated, and monetized—and yet, in certain contexts, apparently too harmless to restrain. It’s a neat trick, really. And like all neat tricks, it works best until someone starts asking how it actually functions up close.

    Because if “just talking” is always protected, then we’re left with an uncomfortable implication: either words don’t matter nearly as much as we think—or they matter so much that pretending they don’t is going to cause problems in places far beyond this one case.

  • We are becoming the villain

    Dwain Northey (Gen X)

    For much of the 20th century, the United States cast itself—and was often cast by others—as the protagonist in the global story. In the grand narrative arc shaped by two world wars, the Cold War, and the spread of globalization, America played the role of the flawed but determined hero: stepping into conflicts late but decisively, positioning itself as a defender of democracy, and promoting an international order built on alliances, trade, and (at least rhetorically) shared values.

    Hollywood reinforced this identity. American films told stories where the U.S. saved the world from existential threats—Nazism, nuclear annihilation, rogue states, asteroids, aliens. Even when the country stumbled, the narrative often resolved with redemption. The hero might be bruised, even morally conflicted, but ultimately chose the right path. The world, in these stories, depended on America—and America, despite its imperfections, delivered.

    This self-image extended into the early 21st century, even as cracks began to show. The wars in Iraq and Afghanistan complicated the narrative. Global audiences became more skeptical. Yet even then, the dominant storyline—especially within the U.S.—remained that of a nation trying, however imperfectly, to do good or at least maintain order in a chaotic world.

    What’s changed in the 2020s is not just policy, but perception—and perception is what stories are made of.

    With the presidency of Donald Trump, America’s role in the global narrative has shifted in ways that feel almost cinematic in their symbolism. The language of alliances gave way to the language of transactions. Longstanding partnerships were questioned or strained. International institutions were dismissed or undermined. Decisions that once would have been framed as collective became unilateral, sometimes abrupt, often unpredictable.

    In storytelling terms, unpredictability is a trait more commonly assigned to antagonists than protagonists.

    The modern villain in global narratives is rarely defined by pure evil. Instead, they are powerful, self-interested, dismissive of norms, and willing to destabilize systems for their own ends. Increasingly, critics argue that this is how the United States is being written into the world’s story: not as the stabilizing force, but as the disruptive one.

    You can already see this shift emerging in media and cultural commentary. In international films, novels, and even journalism, the U.S. is more frequently portrayed as a chaotic superpower—volatile, inward-looking, and at times indifferent to the consequences of its actions abroad. The archetype is changing. The “hero who saves the day” is being replaced by the “former hero who has lost its way.”

    This transformation is not just about one leader, but about what that leadership represents. When a country appears to abandon the principles it once championed—cooperation, rule of law, democratic norms—it creates a narrative vacuum. And in storytelling, vacuums don’t stay empty; they are filled. Other nations step into the role of stabilizers. Meanwhile, the former hero risks becoming the cautionary tale.

    There’s also a deeper psychological shift at play. For decades, the idea of America as the “good guy” was not just exported—it was internalized globally. Undoing that perception doesn’t happen overnight, but once doubt sets in, it spreads quickly. In narrative terms, the audience begins to question the protagonist’s motives. And once the audience no longer trusts the hero, the story changes fundamentally.

    None of this means the transformation is permanent. Stories evolve. Characters fall and redeem themselves all the time. The United States still has immense cultural, economic, and political influence, and its identity on the global stage is not fixed. But the 2020s may well be remembered as the moment when the script flipped—when the country that once defined itself as the central hero of the global order began to be seen, in some corners of the world, as its antagonist.

    And if future filmmakers and writers look back on this era, they may not tell stories of America rushing in to save the day. They may tell stories of a world reacting to an America that had become unpredictable, powerful, and—at least for a time—something closer to the villain than the hero.

  • Alien Sci-Fi Mirror

    Dwain Northey (Gen X)

    Science fiction has always claimed to be about the future, but when it comes to aliens, it is often more honestly about the past—specifically, our past. Again and again, stories about extraterrestrials arriving on Earth follow a familiar script: they come, they conquer, they extract, they dominate. Whether framed as terrifying invasion or inevitable colonization, the narrative arc feels less like speculation about the unknown and more like a projection of what humanity has already done to itself.

    Consider iconic alien invasion stories like The War of the Worlds or films such as Independence Day. In both, technologically superior outsiders arrive with overwhelming force, treating Earth not as a place with intrinsic value, but as a resource to be harvested or an obstacle to be eliminated. These stories resonate because they tap into a deep, historically grounded fear—but that fear is not rooted in alien precedent. It is rooted in human behavior.

    History provides the template. The arrival of Spanish conquistadors in the Americas—figures like Hernán Cortés—was not a meeting of equals but a campaign of extraction and domination. Indigenous civilizations were dismantled, populations decimated, and entire cultures subordinated in pursuit of wealth and power. Similarly, the expansion of the United States across the continent involved systematic displacement, violence, and cultural erasure of Native American populations. These were not aberrations; they were expressions of a broader pattern: when humans encounter “new” territory, especially when backed by technological or organizational advantage, the result has often been conquest.

    Science fiction internalizes this history and replays it with a cosmic twist. Aliens become stand-ins for empire builders, their spacecraft the modern equivalent of caravels or cavalry. Even when the stories invert the power dynamic—casting humans as the underdogs—they rarely question the underlying assumption that contact between civilizations, especially unequal ones, leads to violence.

    But why do we assume this?

    Part of the answer lies in psychological projection. Faced with the unknown, humans tend to fill in the gaps with what they know best: themselves. We imagine alien motivations using human frameworks—competition, scarcity, expansion, domination—because those are the patterns we recognize. In doing so, we universalize behaviors that may, in reality, be contingent and culturally specific.

    There is also a narrative bias at work. Conflict drives storytelling. Peaceful coexistence, mutual curiosity, or non-interference—while entirely plausible—do not generate the same immediate tension as invasion and resistance. A species that has outgrown violence, or never developed it in the first place, presents a storytelling challenge. What is the plot if no one is trying to conquer anyone else?

    Yet some works have tried to break this mold. Films like Arrival imagine extraterrestrials not as conquerors but as communicators, beings whose intentions are complex, non-linear, and fundamentally non-hostile. Similarly, Close Encounters of the Third Kind portrays contact as awe-inspiring rather than apocalyptic. These stories suggest that our default assumptions are not inevitable—they are choices.

    If we take seriously the possibility that extraterrestrial life exists, we must also take seriously the possibility that it does not mirror us. Evolution on another planet could produce entirely different social structures, value systems, or modes of interaction. An advanced civilization might prioritize sustainability over expansion, observation over intervention, or cooperation over competition. It might have no concept of “taking” in the way humans historically have.

    In that light, the persistent image of hostile aliens begins to look less like a prediction and more like a confession. It reveals our anxieties about what happens when a more powerful force behaves the way we have behaved. The fear is not just that aliens would invade—it is that they would treat us the way we have treated others.

    Science fiction, at its best, does more than entertain; it holds up a mirror. When we imagine extraterrestrials as violent colonizers, we are not necessarily describing them—we are remembering ourselves. The challenge, then, is not just to imagine different kinds of aliens, but to imagine different versions of humanity: ones that do not assume conquest is the default outcome of encounter, and that can conceive of the unknown without immediately turning it into an enemy.

  • Donald versus NATO

    Dwain Northey

    There’s something almost quaint about NATO—a relic of a time when alliances were built not on vibes or vendettas, but on the mildly radical idea that preventing global catastrophe might be worth a standing group chat among nations. Conceived under Harry S. Truman, NATO has spent roughly seventy years doing the unglamorous work of not having World War III. No small feat. You’d think “decades without planetary annihilation” might earn at least a polite nod.

    But enter Donald Trump, who appears to view NATO less as a strategic alliance and more as an exclusive country club where some members are apparently late on their dues and therefore should be denied access to… collective security against existential threats. Because nothing says “stable global order” like treating mutual defense treaties the way one treats a gym membership you’re threatening to cancel out of spite.

    In this reinterpretation, Article 5—the cornerstone principle that an attack on one is an attack on all—is less a solemn commitment and more a negotiable perk, like valet parking or preferred tee times. Why honor decades of deterrence doctrine when you can instead ask, “But are they paying enough?” as though Russian tanks pause politely at borders to review balance sheets.

    And then comes the truly dazzling logic: if one decides to freelance a bit of international conflict—say, hypothetically, launching into a confrontation with Iran—surely the alliance exists to back that play, no questions asked. NATO, in this vision, is less “defensive pact” and more “on-call entourage,” ready to amplify whatever bold improvisation happens to be trending that week. It’s not about collective security anymore; it’s about collective validation.

    Of course, when those same allies hesitate—perhaps clinging stubbornly to the original concept of “defense” rather than “impromptu adventure”—the conclusion isn’t that maybe, just maybe, alliances aren’t designed to rubber-stamp unilateral decisions. No, clearly the problem is that NATO itself is defective. After all, if the orchestra won’t follow the conductor off a cliff, what good is the orchestra?

    So here we are, watching a seventy-year-old framework that helped stabilize half the planet being recast as dead weight because it refuses to function like a personal loyalty program. It’s an impressive feat: taking one of the most successful alliances in modern history and reducing it to a Yelp review about poor service and insufficient enthusiasm.

    But perhaps that’s the new doctrine: peace is overrated, alliances are transactional, and if the world won’t play along, well—there’s always the option of taking your ball and going home, declaring victory on the way out.

  • No Kings 3… RedHat Denial

    Dwain Northey (Gen X)

    The third installment of “No Kings”—because apparently once, twice, and now thrice is what it takes to gently remind certain people that crowns are not standard-issue American headwear—was, by any observable metric, a roaring success. The crowds were larger, louder, and more energized than before. Streets filled, voices carried, and the general vibe was less “quiet protest” and more “we’re going to need bigger sidewalks.”

    Naturally, this has led to the only logical conclusion from the right: absolutely no one showed up.

    Yes, despite the inconvenient presence of tens of thousands of humans occupying physical space, the official narrative seems to be that the event was a dismal flop. Empty. Deserted. Practically a ghost town—if you ignore the people, the signs, the chants, the photos, and, really, all available evidence. It’s a bold strategy, redefining “massive turnout” to mean “complete failure,” but when reality refuses to cooperate, why not just…replace it?

    And really, you have to admire the commitment. It takes a special kind of confidence to look at a sea of people and declare it a puddle. Or perhaps it’s not confidence so much as…nervousness. Because nothing says “we’re totally not worried” quite like urgently insisting that something very large is actually very small and definitely not worth paying attention to.

    Which brings us to the underlying truth peeking through all the denial: you don’t work this hard to dismiss something unless it’s making you uncomfortable. You don’t scramble to label it a failure unless, somewhere deep down, you recognize it as anything but.

    So yes, “No Kings Three” was a spectacular non-event—just ask the thousands of people who didn’t attend it. And if the reactions are any indication, the louder the denial gets, the clearer it becomes: someone, somewhere, is more than a little rattled.

  • More marking His Territory

    Dwain Northey (Gen X)

    There’s a certain line in democratic governance that most leaders, regardless of ego or ambition, have historically understood not to cross. It’s the line between public service and personal branding—between representing the state and imprinting oneself onto it. And few symbols are more sacred to that distinction than a nation’s currency.

    So when the idea emerges that Donald Trump would want his signature placed on U.S. currency, it doesn’t just raise eyebrows—it raises historical comparisons that most modern presidents have gone out of their way to avoid.

    American money has never been about the sitting president’s identity. It’s intentionally impersonal, institutional, and slow to change. The faces that appear on bills—Washington, Lincoln, Hamilton—are not there because they demanded it, but because history, over time, placed them there. Even the signatures that do appear belong to the Treasury Secretary and Treasurer, not the president. That’s by design. It reinforces a simple idea: the system matters more than the individual.

    And yet here we are, entertaining the notion that a president might want to leave a literal signature on the nation’s currency—as if the office were a brand to be stamped rather than a responsibility to be carried.

    Think about the contrast. Barack Obama didn’t demand his autograph on the dollar after navigating a financial crisis. Joe Biden didn’t suggest it during massive economic recovery efforts. Bill Clinton presided over a budget surplus without deciding the dollar needed a personal touch. Even towering historical figures like Franklin D. Roosevelt—who reshaped the federal government during the Great Depression—and John F. Kennedy, with all the cultural weight he carried, never treated U.S. currency as a canvas for self-immortalization.

    Why? Because there’s an unspoken understanding in democracies: the symbols of the state are not souvenirs for the leader.

    That’s what makes this idea feel less like a policy quirk and more like something pulled from a different governing tradition altogether. In authoritarian systems, leaders routinely place their names, faces, and symbols on everything—currency included—not just as identification, but as reinforcement. It’s a constant, everyday reminder of who’s in charge. Every transaction becomes a subtle act of acknowledgment.

    That’s not how the United States has traditionally operated. Here, the durability of institutions is supposed to outlast any one presidency. The dollar in your wallet shouldn’t feel like a campaign token.

    And that’s the deeper issue. It’s not about ink on paper—it’s about what that ink represents. A signature on currency suggests ownership, or at least a desire for permanence that bypasses the normal filters of history. It’s the difference between being remembered and insisting on being seen.

    In the end, the question isn’t just “how much more authoritarian can you get?” It’s why one would even want to test that boundary in the first place. Because once leaders start treating national symbols as personal billboards, the shift isn’t cosmetic—it’s philosophical.

    And philosophy, unlike currency, is much harder to take out of circulation.

  • No Kings 3

    Dwain Northey (Gen X)

    There’s something almost quaint about calling it “No Kings Three,” as if by the third installment we’re still pretending this is a one-off moment instead of what it clearly has become: a tradition of collective refusal. Not a holiday exactly—those tend to come with sales and themed cupcakes—but something closer to a civic ritual, where people show up in numbers so large they make aerial photography feel like a necessity rather than an artistic choice.

    Each time it happens, the same chorus emerges: this is historic. And to be fair, it is—again. That’s the inconvenient pattern. The first time, skeptics called it a fluke. The second time, they blamed momentum, novelty, maybe even good weather. By the third, the excuses start to sound like someone trying to explain away a tidal wave as an unusually enthusiastic ripple.

    Because the crowds don’t just return—they grow. Not incrementally, not politely, but in that uncomfortable, undeniable way that suggests something deeper is happening. Streets fill earlier. Signs get sharper. The tone shifts from curious participation to something more grounded, more resolved. People aren’t there just to witness anymore; they’re there because not showing up would feel like saying something they don’t believe.

    And what’s remarkable isn’t just the size—it’s the consistency. In an era where attention spans are allegedly shorter than a loading screen, here are hundreds of thousands—sometimes millions—of people repeating the same message, louder each time, as if testing whether sheer persistence can bend reality back toward something resembling accountability.

    Of course, there are always those who downplay it. The crowd estimates are questioned, the motivations dissected, the participants neatly categorized and dismissed. It’s the usual playbook: if you can’t ignore it, minimize it; if you can’t minimize it, redefine it. But numbers have a stubborn way of existing regardless of narrative preferences. When the same event keeps producing “historic” turnout, eventually the adjective stops being hype and starts looking like a baseline.

    “No Kings Three” isn’t just about rejecting a single figure or moment—it’s about the increasingly unfashionable idea that power should have limits, and that those limits are not self-enforcing. The crowds seem to understand this instinctively. They’re not waiting for permission, nor are they particularly interested in whether their presence is considered convenient.

    So yes, once again, it’s historic. Not because anyone planned it that way, but because people keep deciding—independently, repeatedly, and in ever-larger numbers—that it matters. And at some point, the real story isn’t that the crowds are historic.

    It’s that they won’t stop being historic.

  • Why can’t we just admit it?

    Dwain Northey (Gen X)

    It used to be comforting—almost quaint, really—to track the number of documented falsehoods told by Donald Trump during his first administration. There was a certain artisanal quality to it. Each lie carefully cataloged, lovingly preserved, like a vintage wine collection—if the wine were made of pure fiction and the label read “objective reality, but optional.”

    Back then, the tally crept upward with a kind of leisurely absurdity. Fact-checkers would gather, clipboards in hand, like birdwatchers spotting a rare species: “Ah yes, another one—note the plumage, completely detached from observable truth.” It was almost a civic hobby. Numbers like 10,000… 20,000… 30,000 lies were treated less like a crisis and more like a bizarre endurance sport. “Will he break his own record?” we wondered, as if he were an Olympic athlete in the category of Freestyle Fabrication.

    But now? Now we are witnessing something far more ambitious. The sequel has abandoned the slow-burn character development of the original and gone straight for blockbuster pacing. Why build a legacy lie by lie when you can industrialize the process?

In what feels like roughly fifteen minutes—though technically it’s been around fourteen or fifteen months—the volume of misinformation has surged with the efficiency of a startup that just discovered venture capital and decided truth is an unnecessary overhead expense. The first administration walked so the second could sprint wildly in the opposite direction of facts.

    There’s no longer time for fact-checkers to thoughtfully document each claim. They’re not analysts anymore; they’re emergency responders. Sirens blaring, they rush from one statement to the next, only to find three more have erupted behind them. At this point, the idea of “keeping count” feels adorably naive—like trying to count raindrops in a hurricane while the storm itself insists it’s sunny.

    And you have to admire the escalation strategy. Why merely repeat a lie when you can layer it, remix it, contradict it within the same sentence, and then deny ever saying it—all before lunch? It’s not just dishonesty anymore; it’s performance art. Post-truth jazz. Improvisational reality, where consistency is for amateurs and facts are more of a suggestion.

    What’s truly remarkable is the looming possibility that the first administration’s towering record—once thought unassailable—might soon be eclipsed. Not over four years, mind you. Not even close. We’re talking about a speedrun. A highlight reel. The “greatest hits” album of unreality, released before anyone even finished listening to the original discography.

    At this pace, historians may need to abandon traditional timelines altogether. Why bother with years or months when the more appropriate unit of measurement is the news cycle—or perhaps the hour? Future textbooks might read: “During a particularly productive Tuesday afternoon, the administration achieved what previously took weeks.”

    So here we are, watching history repeat itself—but faster, louder, and with even less regard for whether anyone is still pretending to keep track. The original lie count now stands not as a warning, but as a benchmark. A challenge. A number to beat.

    And judging by current performance, it won’t stand for long.

  • The great negotiator

    Dwain Northey (Gen X)

    It’s becoming increasingly clear that Donald Trump has unlocked a bold new diplomatic strategy: negotiations conducted entirely in his own head. It’s efficient, really—no translators, no inconvenient facts, and absolutely no risk of the other side contradicting him… unless, of course, they do so publicly, which keeps happening.

Take Iran, for example. According to Donald, talks are going “beautifully,” progress is being made, and deals are practically signed in invisible ink. Meanwhile, Iranian officials are out here politely (and repeatedly) saying, “We have no idea who this man is talking to.” It’s less “backchannel diplomacy” and more “imaginary friend summit.”

    One has to admire the confidence. Most people, when caught claiming conversations that never happened, might dial it back a bit. Not Donald. He doubles down, adds details, maybe throws in a “they respect me very much,” and carries on as if the rest of the world just missed the memo… or reality.

    At this point, the only logical conclusion is that somewhere, in a very luxurious room, Donald is seated across from an empty chair, nodding thoughtfully, interrupting himself, and declaring victory over a negotiation that exists exclusively in his own narrative. Frankly, it’s the smoothest foreign policy he’s ever run—no opposition, no complications, and best of all, no actual participants.

    The only problem? Reality keeps RSVPing “not attending.”