What It Means to Be a Builder in the Age of AI (Part 1)
On foundations, leverage, and what it really means to build now.
This is part one of a three-part essay on what it means to build in the age of AI. Parts two and three, publishing this week, move from the near-term shifts already underway to the longer-term implications of where this is headed.
Last month, three different people asked me the same question on TikTok: if I were eighteen today, knowing what we know now about AI, would I still go to college, and would I still study computer science?
I’ve been getting versions of this question for a while, and I understand why. A lot of people know me from social media. Some people see me here on Substack. I talk often about AI and building, and that naturally leads to curiosity about who I am, how I got here, and whether I’m speaking from experience or just surfing a wave.
What’s interesting is that the question has started coming from two directions.
Some people are asking as students, trying to decide what to study and how to prepare. Others are asking as founders, managers, and hiring leads, trying to understand how roles are changing and what they should actually be optimizing for when they build teams.
I was asked the same question again on my birthday, December 24th, via DM, and I had already planned to collect my thoughts and write about it during the holiday break.
A couple of days later, Andrej Karpathy shared a post on X that caught my attention. Karpathy is one of the most respected engineers of our generation, someone who has helped define modern machine learning and software practice. In the post, he wrote that he had never felt this far behind as a programmer.
He described the profession as being actively refactored, with the human contribution becoming more sparse and intermittent. He compared the moment to being handed a powerful, unfamiliar tool with no manual, and said the shift is already reshaping the profession.
“Roll up your sleeves to not fall behind.”
I wish I could say I was surprised by that framing. I wasn’t.
I’ve been building with AI tools for several years now, well before they were reliable or fashionable. I’ve been an early adopter as a practitioner, using them to make myself a better engineer and a better builder. Things like agents, prompts, protocols, context management, skills, workflows, and IDE integrations. These are not abstract concepts to me. They are the tools I use, the ones I write about, and the ones I talk about publicly.
So when Karpathy articulated the feeling so plainly, it didn’t read like a revelation. It read like recognition.
I’m bringing him up because when someone like Karpathy says this out loud, the question about studying computer science stops being a purely personal choice. It becomes a question about what this work is turning into.
And that forces a second question right behind it. If the best engineers in the world feel the ground shifting, what should everyone else be optimizing for, both as learners and as employers?
I don’t think there’s a clean, universal answer. Anyone offering one is probably oversimplifying something that’s still unfolding.
What I can offer is my own path. Not as a prescription, and not because it generalizes cleanly, but because it’s the only place I can speak from honestly.
Before I try to answer whether I’d still go to college, study computer science, or how I think about hiring today, it’s worth explaining how I got here. What I was optimizing for at the time. What I didn’t know yet. And what ended up mattering more than I expected.
AI is what brings this question up. Evolution and preparation are what it’s really about. Roles are changing. Skills are shifting. The work itself is being reshaped. And the way you position yourself, whether you are learning or hiring, is becoming the real differentiator.
So let me start at the beginning and work my way toward an answer.
The Old Arc of Computer Science
For most of my life, the arc of computer science was obvious. Progress was relentless. The surface area kept expanding. There was always more to learn, and the hierarchy of skill made sense. If you wanted to become a better engineer, you studied the details. Languages, protocols, syntax, systems, tools. You learned what was true, what was brittle, what was elegant, what was dangerous. You accumulated competence slowly, with scars.
Up until about two years ago, my answer would have been an emphatic yes. You should absolutely study computer science. If you enjoy puzzles, math, problem solving, critical thinking, building things, and taking on hard challenges, it was one of the most powerful skill sets you could acquire. It gave you leverage and range and a kind of quiet authority in a world increasingly built out of invisible rules.
Building felt like a superpower.
The Blinking Cursor
When I tell people I fell in love with computers at five years old, they usually don’t believe me. This was the late 80s. Computers weren’t lifestyle products. They were strange and intimidating.
The first computer I remember clearly was at my uncle’s house on Long Island, and the scene is lodged in my head the way certain childhood memories stay lodged: dim light, a kind of hush, the feeling that you’re awake at the wrong hour. The computer was on. The monitor glowed with a fuzzy, cinematic green. A blinking cursor sat there like a dare, like something I wasn’t supposed to touch.
It was Christmas Eve, and also my birthday. I should have been doing normal kid things. Instead I lay on the floor in front of the monitor and started pressing keys. Not with a plan, just with the instinct that something was waiting on the other side of the screen. My uncle showed me a few commands, some little combinations I could type that made the machine respond. I don’t remember what they were and it doesn’t matter. What mattered was the feeling: the computer wasn’t entertainment, it was a door. The cursor was a handle.
My family kept pulling me away for dinner, presents, dessert, and I kept running back.
Finding a Way In
I grew up in a modest household. We didn’t have much money, and as obsessed as I was with computers, we couldn’t afford one at home. So I used the ones at school. I’d sign myself up for every computer lab session I could, lingering whenever I was allowed, finding excuses to stay a little longer. That was how I got my fix.
By the time I finally had a computer of my own, around twelve, it felt less like receiving a device and more like being handed a private room inside the world. A place that was vast and open, but finally accessible. It felt like the keys were already there. I just didn’t know where they were yet. Now I had the time to look.
The following year, an uncle of mine, who ran a computer sales and repair business, asked if I wanted anything from a computer show he was going to. I told him I wanted books. He came back with a C++ book.
I had no formal programming background and no sense of how fast or slow I was supposed to move. I worked through it at my own pace, learning fundamentals, experimenting, and trying to get the machine to do what I wanted. Conditionals. Loops. The basic idea that logic could turn into behavior.
That was the first time the cursor stopped feeling like a dare and started feeling like an invitation. I wasn’t just pressing keys anymore. I was beginning to understand how things fit together.
Finding My People
In ninth grade, when I was fifteen, I watched Hackers, and something clicked. It wasn’t just curiosity anymore. It felt like a door opening.
I started reading 2600: The Hacker Quarterly. I wandered into parts of the internet I didn’t fully understand. I downloaded scripts and programs that promised disruption: the digital equivalent of firecrackers. I was a script kiddie in the most literal sense. I didn’t really know what I was doing, but I knew what it gave me.
It expanded my world. I grew up in a small, unassuming suburb on Long Island, and suddenly there was a much bigger universe to explore. One where geography mattered less, where ideas traveled faster than people, and where curiosity felt like an advantage instead of a distraction.
I had a few friends who shared the fascination. We weren’t building companies or thinking about careers. We were learning how this world worked: its language, its culture, its norms. We were figuring out how people who understood machines thought, moved, and saw the world.
Looking back, a lot of it was naive and silly. But it mattered. For someone whose world had been pretty small, it was awakening in the only way that really counts: it made my universe feel expandable.
Flash, or the First Time It All Clicked
By seventeen, I was making websites with HTML and CSS. Then I discovered Flash.
Flash was a turning point because it let me combine things that had felt separate until then. Code and design. Logic and motion. It wasn’t just about making something work, it was about how it moved, how it felt, how people experienced it. I picked up ActionScript because I had to. There was something I wanted to build, and that was the tool in the way.
I started building websites for bands, beginning with my own. That eventually led to doing early web work for Taking Back Sunday, the last Flash project I ever shipped.
Building wasn’t something I thought of as a skill I was acquiring. It was how I showed up in the world, a way of thinking that reduced things to problems and solutions, and a habit of turning ideas into useful things.
Why Software Felt Too Obvious
Around that time, I was intentionally keeping a lot of doors open. I was playing in a touring band, working at a small software agency on Long Island, and moving through college without rushing to commit.
I didn’t start in computer science. I first moved between political science and graphic design, not because I lacked direction, but because I wasn’t ready to narrow my world too early.
Software, oddly enough, didn’t feel like the obvious answer. It was the thing I did for enjoyment. The place where time disappeared. Turning it into a profession felt almost too easy, like confusing comfort with conviction.
What’s easy to miss is that even while I was exploring academically, I was learning at a very real pace. At the software company, I worked on production software for municipalities, systems people actually depended on. I could see how decisions became outcomes.
I found the work deeply engaging. I was growing quickly. And still, it didn’t feel inevitable.
At that age, I assumed that whatever I committed to would need to feel hard in a different way.
Software felt obvious. And that made me hesitate.
Learning How to Think
Eventually, I made a choice. I transferred to Stony Brook to study computer science. It was the strongest program on Long Island, and I knew it would force clarity.
Up to that point, most of my relationship with computing had been intuitive. As a kid, it was curiosity. In my early work, it was instinct and experimentation. I could make things work, but I didn’t always understand why they worked, or how they would behave when conditions changed.
At Stony Brook, the work forced me to think differently.
Math and computer science trained me to reason in terms of correctness instead of confidence. Either something worked or it didn’t. Either a solution was valid or it wasn’t. There was no room for hand-waving or vibes.
Algorithms taught me how to think step by step, how to reason about efficiency, tradeoffs, and scale. Compilers and operating systems taught me how abstraction really works, and why building anything meaningful requires time, discipline, and a solid foundation underneath it.
That way of thinking didn’t stay in the classroom. Over time, I learned to step back from individual problems and see the systems underneath them, along with the constraints that actually determined what was possible.
I gravitated toward a small group of people in the applied math, computer science, and physics programs who operated with that same clarity. Not louder. Not more performative. Just more precise. Conversations with them were grounding. Ideas had to stand up on their own.
Being around that recalibrated me. It showed me how much of what we call intelligence later in life is often a mix of drive, presentation, and stamina. Those things matter. But they are not the same as the deeper kind, the kind that holds under scrutiny and keeps working when conditions change.
Computer science was hard, but not in the way I expected. The difficulty wasn’t endurance. It was precision. And in committing to it, I got something I didn’t know I was asking for: a framework for thinking that I still rely on today.
Learning How to Ship
Computer science taught me how to think clearly. Engineering taught me how to apply that thinking in the real world, with other people, under constraints that didn’t care about elegance.
I still remember the first pull request I ever opened. Until then, my code had mostly been mine. Suddenly, someone else had to read it, understand it, and decide whether it belonged. Correctness wasn’t enough anymore. Clarity mattered. Context mattered. I had to explain not just what I did, but why.
Reviewing someone else’s code flipped that perspective again. Now I was responsible for more than my own work. I had to learn how to give feedback that improved something without rewriting it in my own image. How to disagree without blocking progress.
Then came the first time something I built actually scaled.
The system worked. Then it got used. Load increased. Edge cases appeared. Behavior changed in ways I hadn’t anticipated. Problems stopped being purely technical and started involving users, timing, and tradeoffs. That was when it really clicked that engineering isn’t just applied computer science. The system includes people, expectations, and entropy.
This became real for me early on as a mobile engineer at Booz Allen Hamilton, building native iOS and Android applications. Shipping real products meant balancing performance, reliability, security, and user experience at the same time. There was rarely a single correct answer. There was only what worked well enough, for the right reasons, under the right constraints.
As I took on more responsibility, the work expanded again. Building software was no longer just about what I could produce myself. It became about building teams, reviewing work, setting standards, and helping other engineers succeed. I had to learn how to multiply my impact without becoming a bottleneck.
Looking back, this is what I needed computer science for.
It gave me rigor. It gave me a way to reason about systems, correctness, and foundations. Engineering taught me judgment. Together, they let me build things that could survive contact with the real world.
Why I Build
I’ve always been a builder. My family spent nearly a century in construction, building homes on Long Island. I grew up around the idea that you can take nothing and use your talents to turn it into something of value.
For me, the materials were abstract. Logic. Language. Systems. I used more brain than muscle.
The deeper reason was simple. I wanted to turn ideas into real things. I wanted to bring something into the world that did not exist before. At the time, computer science felt like the most reliable way to do that.
That instinct has never really changed. Builders build because it is part of who they are. They are uncomfortable leaving ideas unformed. They feel incomplete until something imagined becomes tangible. It is not about titles or tools. It is about the need to make, to shape, to bring a vision into the world and see it hold up.
Everything else came later. Product. Business. Leadership. Starting my own studio.
But the throughline was always the same. I was building.
Why I’m Telling You All This
I want to be clear about why I’m sharing this context.
This is not an “AI changed my life last week” story. I’ve lived through multiple eras of building. Web. Mobile. VR. I’ve watched tools rise and disappear. I’ve watched entire ecosystems collapse. More than once, I’ve had to rebuild my sense of identity around a new set of languages, tools, and technologies.
After Booz Allen Hamilton, I became head of product at a Techstars-backed startup as one of the first employees. We built live sports experiences for FOX, BBC, NFL, FIFA, and the NCAA. We were early in social VR for live entertainment, early enough to watch Facebook and Oculus move into the space and reshape it entirely. That work didn’t vanish. Much of what we built was later acquired by COSM. When people experience COSM today, they’re touching work that traces back to that era.
After that, I joined Big Human as head of product, inside the same environment that incubated Vine and HQ Trivia, working with startups and Fortune 500 companies to bring products to market.
Not long after, I left and started my own branding and software studio, which I’ve owned and run for the last five years.
That sentence compresses a long and demanding chapter into a single line, which doesn’t really do it justice. What matters for this story is how my role changed over time.
Running the studio meant leading product strategy, managing projects, selling, hiring, managing teams and clients, absorbing risk, and doing the many unglamorous things solo founders end up doing.
I never billed myself as a hands-on engineer at my studio. I hired people who were better than me, which is how it should be. I led systems, architecture, and technical direction, and I could step in when needed, but my day-to-day reality shifted.
Over the last decade, I’ve barely coded.
Development felt like a luxury.
And that’s what makes what happened next matter.
Where This Is Going
I’ve spent this time walking through my background because the moment we are in now does not exist in isolation. It sits on top of decades of assumptions about what it means to build, to learn, and to take this work seriously.
For most of my career, building required scarcity. Time mattered. Skill accumulation mattered. The distance between an idea and something real was long enough that many ideas never survived it. That distance shaped education, careers, and the way we measured competence.
That distance has changed.
In the next part, I want to talk about what it felt like when it did, and why that change matters. Not just for people already working in software, but for anyone trying to decide what to study, what to practice, and what to optimize for now.
I’ll describe how AI changed my relationship to building in a very practical way, how it shifted the bottleneck upstream, and why the people seeing the most leverage today tend to combine systems thinking, design sensibility, product judgment, and a real understanding of how humans actually use what we build.
That shift has implications for how we think about computer science and software development, and for what kind of foundation still holds when tools accelerate this quickly. That’s where the real answer to the original question begins.
In part two, I’ll explain what happened when AI gave me back a kind of building power I hadn’t felt in years, and why that experience reshaped how I think about computer science, software development, and preparation for what’s coming next.