April 12, 2026

  • Artemis II and the Pale Blue Dot

    Artemis II was a nine-day lunar flyby mission with a crew of four astronauts, launched on April 1, 2026. It was the first crewed NASA-led Artemis flight and the first human journey beyond low Earth orbit since Apollo 17 in 1972. 

    During their lunar flyby, the crew set a new record for the farthest distance humans have traveled from Earth, reaching 252,756 miles (406,771 km) and surpassing Apollo 13’s previous record of 248,655 miles (400,171 km).

    Friday, they splashed down safely in the Pacific Ocean.

    Artemis II astronauts Jeremy Hansen, Christina Koch, Victor Glover, and Reid Wiseman are seen onstage Saturday at Ellington Field at Johnson Space Center in Houston.  – Ronaldo Schemidt/AFP/Getty Images

    “Victor, Christina and Jeremy, we are, we are bonded forever, and no one down here is ever going to know what the four of us just went through … And it was the most special thing that will ever happen in my life.” – Reid Wiseman

    This is the kind of story that’s easy to file under ‘space news’ – but for entrepreneurs, investors, and leaders, it’s also a case study in how fast the frontier moves when compounding technology meets long‑term conviction.

    As we move forward, we’ll talk more about the emerging business landscape around space (from connectivity and Earth‑observation data to in‑orbit manufacturing, commercial stations, logistics, and even space‑based energy). But today’s piece is really about something more fundamental: marking a milestone on the path and widening our sense of where we are and what’s possible.

    From Humble Beginnings …

    To appreciate how far we’ve come, it helps to look back at the early days of space travel. In 1977, Voyager 1 launched into space. Just over a dozen years later, it had traveled farther than any human-made object before it – approximately 6 billion kilometers from Earth. At that point, at Carl Sagan’s urging, Voyager 1 was commanded to turn around and take one last photo of the Earth … a pale blue dot.

    The resulting photo is impressive precisely because it shows so little in so much.

    A photo showing one blue pixel – the Earth – taken by Voyager 1 in 1990

    “Every saint and sinner in the history of our species lived there – on a mote of dust suspended in a sunbeam.”  – Carl Sagan

    Earth is in the far-right sunbeam – a little below halfway down the image. This image (and the ability to send it back to Earth) was the culmination of years of effort, technological advancement, and the dreams of mankind.

    Carl Sagan’s Pale Blue Dot speech is still profound and moving. Invest three minutes to watch and listen.

    Carl Sagan via YouTube

    Here’s the transcript:

    Look again at that dot. That’s here. That’s home. That’s us.

    On it everyone you love, everyone you know, everyone you ever heard of, every human being who ever was, lived out their lives.

    The aggregate of our joy and suffering, thousands of confident religions, ideologies, and economic doctrines, every hunter and forager, every hero and coward, every creator and destroyer of civilization, every king and peasant, every young couple in love, every mother and father, hopeful child, inventor and explorer, every teacher of morals, every corrupt politician, every “superstar,” every “supreme leader,” every saint and sinner in the history of our species lived there – on a mote of dust suspended in a sunbeam.

    The Earth is a very small stage in a vast cosmic arena.

    Think of the rivers of blood spilled by all those generals and emperors so that, in glory and triumph, they could become the momentary masters of a fraction of a dot. Think of the endless cruelties visited by the inhabitants of one corner of this pixel on the scarcely distinguishable inhabitants of some other corner, how frequent their misunderstandings, how eager they are to kill one another, how fervent their hatreds.

    Our posturings, our imagined self-importance, the delusion that we have some privileged position in the Universe, are challenged by this point of pale light. Our planet is a lonely speck in the great enveloping cosmic dark. In our obscurity, in all this vastness, there is no hint that help will come from elsewhere to save us from ourselves.

    The Earth is the only world known so far to harbor life.

    There is nowhere else, at least in the near future, to which our species could migrate. Visit, yes. Settle, not yet. Like it or not, for the moment the Earth is where we make our stand.

    It has been said that astronomy is a humbling and character-building experience. There is perhaps no better demonstration of the folly of human conceits than this distant image of our tiny world. To me, it underscores our responsibility to deal more kindly with one another, and to preserve and cherish the pale blue dot, the only home we’ve ever known.

    How powerful a statement from a grainy pixel.

    … To New Heights

    Today, we have people living in space and posting videos from the ISS, and we have high-resolution images of galaxies near and far. Artemis II shows we’re going back to the moon – and that’s only the beginning. We also recently talked about the other new goals and explorations already on the proverbial docket.

    We take for granted the scale of the technological phase shift. The smartphone in your pocket has more computing power than the systems that first took us to the moon – and it has for decades.

    As humans, we’re wired to think locally and linearly. We evolved to live our lives in small groups, to fear outsiders, and to stay in a general region until we die. We’re not wired to think about the billions and billions of individuals on our planet, or the rate of technological growth – or the minuteness of that all compared to the vastness of space.  

    However, today’s reality necessitates that we think about the world, our impact, and what’s now possible for us.

    We’ve created better, faster ways to travel, instantaneous communication networks across vast distances, and megacities. Our tribes have gotten much bigger – and with that, our ability to enact massive change has grown as well. 

    Space was the proving ground for many of today’s breakthrough technologies. Now, similar waves are building in AI, medicine, genetic engineering, robotics, and even ‘world‑building’—not just in virtual environments, but in how we design cities, companies, and economies. As leaders, our job is to spot these trajectories early, place disciplined bets, and build systems that can adapt as the frontier moves.

    It’s hard to comprehend the scale of the universe and the scale of our potential – but that’s exactly why it’s worth exploring. The view from a ‘pale blue dot’ reminds us that most of what feels urgent today won’t matter in a decade, but the systems we build and the bets we make will. This week, ask yourself: where are you still thinking locally and linearly in a world that rewards global, exponential thinking? 

    Onwards!

  • When Does Helpful Become Too Much? Rethinking Our Relationship with AI

    I went to dinner with my good friend John Raymonds and our sons this week. John is a deep thinker and an experienced entrepreneur. Unsurprisingly, the conversation turned to AI. I thought I’d share some of the things we examined and discussed.

    John and me at a Texas steakhouse with our sons.

    As we all admitted to using AI more often and for more things, what stood out wasn’t our agreement, but rather the tension around our use cases. While we’re all very bullish on AI (excited, even), we kept circling two questions: “How much AI is too much?” and “How important is it to preserve your own voice (and thinking) in the wake of AI content generation?” Neither had clean answers, yet both felt increasingly important and relevant.

    How much AI is too much?

    When you see AI everywhere, all the time, you might have passed the point of diminishing returns (or you might be passionate about the discovery of something as important as fire, electricity, or the Internet).

    The real issue is that “too much AI” is not about volume of usage but about when AI starts to (a) complicate your process or (b) dilute your voice. Ultimately, you must consciously decide where the line is for you.

    Here’s an example we talked about, stemming from Zach and me writing this weekly commentary together. I’ve been experimenting with something I call “content pillars.” That is where I combine various sources (including research, articles, notes, recordings, etc.) on a subject, run them through various AI tools with layered prompts, and distill everything into a multi-faceted outline that provides a more complete, multi-dimensional view of the material and its meaning.

    The goal is to get a better sense of the big picture (and make it easier to spot patterns, overlaps, contradictions, tensions, and agreements). I include most of the process steps in the content pillar. The result is dense, sometimes overwhelming, but undeniably rich. I used to do a lot of that in my head. This consolidates all of that in one place, making it easy to save for reference or reuse. To me, this was a step forward.
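
    As a rough illustration of the layered-prompt idea (this is purely a sketch – `call_model`, the layer prompts, and the pipeline shape are my assumptions, not the actual tools or prompts used):

    ```python
    # Sketch of a layered-prompt "content pillar" pipeline.
    # call_model is a stand-in for whatever AI tool handles each layer;
    # here it just labels its output so the flow is visible.

    def call_model(prompt: str, material: str) -> str:
        """Placeholder for an AI call; real usage would invoke an actual model."""
        return f"[{prompt}]\n{material}"

    def build_content_pillar(sources: list[str]) -> str:
        """Combine sources, then distill them through layered prompts."""
        combined = "\n\n".join(sources)  # research, articles, notes, recordings...
        layers = [
            "Summarize the key claims",
            "Surface patterns, overlaps, and contradictions",
            "Distill into a multi-faceted outline",
        ]
        result = combined
        for prompt in layers:  # each layer consumes the previous layer's output
            result = call_model(prompt, result)
        return result

    pillar = build_content_pillar(["Article A on AI adoption", "Interview notes"])
    print(pillar.splitlines()[0])  # prints "[Distill into a multi-faceted outline]"
    ```

    The design choice worth noticing: each layer feeds on the output of the one before it, so the pillar keeps accumulating structure (and length) with every pass – which is exactly how a tool built to simplify thinking can grow into something that needs its own tooling to consume.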

    My son pushed back.

    At some point, he argued, the process becomes so complex that it requires its own layer of AI just to consume it. The time investment balloons. What started as a tool to simplify thinking and streamline our process can quietly become something that complicates it.

    He’s not wrong.

    But I don’t think that invalidates the process … or the end result. Beyond the output, there’s value in building and using these systems (exploring, experimenting, and stretching a different kind of mental muscle). The process itself becomes the product, at least in part.

    Still, the question lingers: if the tool designed to accelerate us begins to slow us down, where’s the line?

    I won’t really attempt to answer that here. However, I will note that I originally created the process to augment, automate, and extend parts of my work. Over time, I refined the process to the point where I wanted to share it. That’s when I was confronted by a stubborn truth I’ve battled many times: a process designed for you won’t necessarily please others.

    What I found is that very few people want information in the quantity, velocity, depth, or breadth that I would choose. In fact, it became clear to me that I wasn’t even the real audience anymore. As I continued to build these content pillars, they expanded as I began to view each pillar as a new, richer data set to feed the machine (rather than producing something I’d want to consume myself or share with others).

    But the point remains: if you design something to satisfy a machine, it shouldn’t surprise you that it doesn’t satisfy a human.

    How much does voice matter?

    The second tension was more subtle and subjective.

    Something shifts as AI becomes more embedded in writing, editing, and content generation. It doesn’t take a literary genius to see or feel it.

    Sentences smooth out. Paragraphs tighten. Structure improves. But sometimes, the voice flattens. Most of us can tell when something is written by AI, even if we can’t always tell when something is written with the help of AI.

    Yet, even tools like Grammarly optimize toward familiarity. They rely on proven patterns, common phrasing, and widely accepted “good writing.” The result is predictably better writing… but also just predictable writing.

    Of course, there’s a tradeoff.

    AI enables depth. It helps us see angles we might have missed, incorporate ideas we wouldn’t have found, and build more comprehensive pieces. It has also been vital in catching when we’re making assumptions or claims without backing them up. Our writing becomes more informed, more structured, and often more valuable to the reader.

    But at what cost?

    I see it firsthand. My son spends extra time pulling our writing back toward something that feels like us (restoring tone, rhythm, personality). It’s deliberate work, and it can be frustrating.

    Our previous rhythm was relatively painless, but we’d also plateaued in the caliber and tone of our articles.

    So the question becomes: Is added value worth a diluted voice? Or is voice itself part of the value we’re trying to create?

    Could we spend extra time improving our prompts so that our voice is more carefully curated? If the voice is there but we didn’t write it, is it still our article?

    Different generations, different instincts

    What became clear over dinner wasn’t just disagreement—it was a difference in posture.

    John and I are leaning in hard. There’s a kind of curiosity that borders on recklessness. We’re exploring, testing limits, and integrating AI into everything we can. Not because we have to, but because we want to see what’s possible. John even built a niche AI app recently, just to prove he could.

    There’s joy in that.

    Our sons, on the other hand, seem to play a different role. Not resistant or disengaged (they both use these tools extensively), but more measured. More aware of the tradeoffs and the nature of their parents. More willing to question whether efficiency is always the goal.

    If we are accelerating, they are steering.

    Perhaps it’s a result of them being so close to us that they end up playing “defense.” And maybe that balance matters more than either side being “right.”

    We hear it all the time: too much of a good thing becomes a bad thing.

    But AI complicates that idea. Because it’s not just a tool — it’s a multiplier of output, of speed, of ideas … and of noise.

    So how do we know when we’ve crossed the line?

    That is a question worth sitting with.

    Maybe it’s not about a universal threshold. Maybe it’s more personal, more situational. Maybe the better question isn’t “how much is too much?” but:

    • Is this helping me think more clearly, or just more quickly?
    • Is this enhancing my voice, or replacing it?
    • Am I using the tool, or adapting myself to fit the tool?

    There may not be definitive answers yet.

    But the act of asking — of pausing long enough to notice how these tools are shaping not just what we produce, but how we think — might be the most important habit we can build right now.