I write about tech, power, culture, and staying human in a world that keeps trying to automate you.
I founded Daring Ventures to invest in software powering our human edge in the era of Big Noise.
Occasionally contrarian. Always real. Join the thousands* of readers who are tired of empty jargon, credential cosplay, and AI sludge.
So there’s a NY Mag piece making the rounds right now and you’ve probably seen it. I know I have. No less than a dozen people sent it to me.
It’s the one about how AI is ripping through higher education like termites through drywall. One kid at Columbia (of course 🙄) bragged about using ChatGPT to write 80% of his essays. Including the one he used to apply.
Another used it to write a paper about how the father of critical pedagogy argued that learning is what makes us truly human.
Welcome to college, where the final exam has become a ceremonial duel between rival language models and the humans stand off to the side like nervous seconds, clutching the handkerchiefs of tradition, pretending this still means something.
The student pastes in a prompt. The machine replies. The professor nods gravely. And both parties pretend that a real exchange just occurred as if this has anything to do with actual learning.
And sure, it’s funny. Is it concerning? Yes, if we can’t adapt.
Is it about time? Duh.
This isn’t just kids cheating. It’s the end of a system that forgot what it was for.
A system that replaced learning with credentialing and curiosity with compliance, and is now shocked to discover that no one's playing along anymore.
For someone like me, who has had a long, weird, borderline pathological relationship with school… I’ve got complicated feelings about it.
I’m telling you, me and education are the worst on-again, off-again relationship you know.
I’ve talked about it before.
I was the smart kid until I wasn’t. I was the golden child until I flunked out. I got into Columbia by way of Santa Monica College and a thousand quiet humiliations. I’ve loved school. I’ve loathed school. I’ve felt seen by it. I’ve felt erased by it.
So yeah, it’s ironic that I’m defending education now. But not the old version. That one’s dead. I’m here to write about what’s next.
Because the thing no one’s saying out loud is this: education might actually matter more now than it ever did.
Not as credential. But as crucible.
Let’s be honest: for most of history, knowing stuff was a cheat code. If you could spout facts, drop references, and write a passable five-paragraph essay, you got sorted to the top. School wasn’t about becoming capable. It was about performing capability.
Knowing things isn’t rare anymore. It isn’t special. It’s just noise.
And I should know. Because I lived and breathed this noise.
Contrary to what my grad year and compulsive use of SPF 45 moisturizer might suggest, I’m old. Ish. I came up in the golden era of “gifted and talented” programs.
That weird educational wrinkle where they’d take a bunch of precocious kids and tell them they were better than everyone else because they knew who Dionysus was and could do long division without crying.
This knowledge-as-cheat-code game? I was built for it. I crushed Continental Math League. I lived for the Knowledge Bowl buzzer.
I memorized castle layouts and area codes and Greek demigods the way other kids knew everything about Dan Marino and Brett Favre.
I hung around with kids who flexed around other kids (but especially grownups) by doing weird shit like wandering around mumbling a memorization of pi out to like a zillion digits—yes, I knew someone who really did this.
Also, plot twist: he later became my TA when I was at Columbia. Someone I grew up with and competed against was grading my work as a Columbia PhD candidate, fresh off four years of undergrad at Yale, while I was still an undergrad. The version of me from five years earlier would have had an existential, shame-fueled crisis and just dropped the class. The version of me at Columbia just thought, “huh, small world,” and appreciated that he was actually a great TA. The power of personal growth!
Anyway, back then, I was this precocious little shit who was told as much by adults every single day because of my suspiciously advanced vocabulary and knack for using words like “verisimilitude” in everyday conversation.
It wasn’t intelligence exactly. It was performance. It was intellectual cosplay. And I executed with such flair that it kept a lot of people transfixed even after I flunked Algebra I.
But alas, it all came crashing down.
I hit the limits of my cheat codes. The tricks stopped working.
I went from top of the class to academic probation to dropout.
It turns out that knowing a lot of things isn’t the same as knowing how to do anything.
It also doesn’t help you build. Or lead. Or endure.
These are the shifts AI is accelerating.
The edge now is in making meaning from the noise. In seeing what doesn’t quite fit. In choosing what matters.
This is where humans still win. Not in storing facts, but in noticing. In connecting. In asking the kind of questions that don’t fit neatly inside a rubric.
You can teach a machine to pass a test. You can fine-tune a model to echo wisdom. But the gut-check moment when someone says something that just feels true? That’s still ours.
AI can flood the zone with polished content that’s fluent, structured, and even insightful. But it doesn’t care. It doesn’t notice when something’s off, or when a contradiction is worth sitting with instead of solving.
Discernment is what happens when judgment meets context. It’s what lets you say, “That doesn’t sit right,” even when the spreadsheet says it should. It’s why some people make great decisions in uncertain situations, and others cling to the prompt like it’s gospel.
You see it in the founder who spots a gap in a saturated market because something feels unfinished. Honestly, you see it in anyone who’s lived long enough to recognize that truth doesn’t always look efficient.
None of this is luddism, by the way. Highlighting the importance of discernment isn’t anti-AI. It’s how you turn AI into leverage. It’s the real edge when machines can generate everything but perspective.
I’ve said it before and I’ll say it again: predictive models cannot handle stuff that is, literally by definition, unpredictable.
That’s what we need to teach. Not just what to know, or even how to think. It’s about learning how to trust your lens. How to develop it. How to be responsible for it. How to filter what’s knowable and decide what’s worth knowing.
Nuance, subjectivity, uncertainty. You know, human stuff.
The most dangerous thing we did to education has absolutely nothing to do with letting AI into the classroom.
No, it was pretending the classroom was the point.
For decades, we turned education into a prestige delivery system. A sorting hat for the professional caste system. It stopped being about capability and started being about credential.
The degree wasn’t a proxy for potential. It was a performance, and everyone knew the script. Get the grades, say the right things, flash the right resume, and you got waved through.
If you couldn’t afford the right prep or couldn’t code-switch into Ivy dialect, tough luck. The whole thing was a baroque little aristocratic obstacle course dressed up as meritocracy.
So of course students are handing the baton to ChatGPT now. The game was always about playing it safe, not getting curious. Efficiency became a virtue. Original thought was a liability.
But maybe the collapse of that credential circus is exactly what needed to happen. Because now we can stop pretending it worked. Now we can build something real.
Hasn’t the death of higher ed been predicted a zillion times before?
Yes, but that’s not what I’m saying.
This isn’t the end of education. It’s the end of its weakest form.
We’re not mourning the collapse of a golden age—we’re watching American higher ed shuffle on like the final tour of a certain psychedelic jam band: half the original spirit gone, what’s left lip-syncing through bloated anthems in stadiums sponsored by credit card companies.
The ideals? Fractured. The soul? Long sold.
Internal power struggles backstage while the audience, mostly boomers high on nostalgia among other things, forks over obscene prices for merch and memories in hopes they can inspire their kids to get on the bus.
The golden age ended years ago. Now we’re just sweeping up the last of the confetti and institutional incense, presented by a venture-backed CPG brand.
And good riddance. Because what’s coming next might actually be worth showing up for.
Now we get to teach people how to think with nuance. Now we get to reward weird connections and stubborn curiosity. Now we get to tell the truth: that learning is supposed to be uncomfortable sometimes.
That ambiguity is a feature, not a bug. We get to say that it’s okay not to know. It’s our superpower.
Our future isn’t in being the best at remembering things, or the fastest at doing math, or the most efficient at spitting out an answer.
The human edge is finding meaning in the mess. That’s where the good stuff lives.
It’s where it always has lived. There’s just infinitely more information to synthesize now.
There’s just way, way more mess.
If we want education to survive, not as a credential dispenser but as a force that actually builds people, we’ve got to change the blueprint.
That means reimagining everything: learning modalities, assignments, evaluations, even the basic assumptions about what disciplines are and who gets to draw the lines between them.
No more filler prerequisites and siloed tracks. No more 101-level PowerPoints on topics you could learn faster from a YouTube rabbit hole. Everyone’s got a creative agency, a call center, and an ops team in their browser now. You want to learn marketing? Go market something. You want to study writing? Build an audience. The world already gave you the tools. School should teach you how to use them with judgment, not rehearse some outdated playbook.
Imagine coursework built around thesis-level complexity from day one. Not in a punitive, sink-or-swim way. In a way that forces you to think across boundaries. Assignments that don’t ask for answers, but for hypotheses worth testing.
This shift isn’t just about content. It’s about form. Evaluation needs to evolve away from rubrics and regurgitation, toward originality, synthesis, and self-awareness. We should be teaching students how to build frameworks, not just fill them in.
Aren’t the Institutions to Blame?
In a word? Yep.
And look, I say this as someone who’s benefitted from the academy. My dad’s a tenured professor. I grew up in the shadow of that system, and I know how vital it is to protect spaces for critical thought, for unpopular ideas, for deep, slow inquiry. For people who teach foreign language pedagogy and other weird, niche shit. Someone’s gotta do it.
But I also know that tenure isn’t a get-out-of-relevance-free card. Institutions have a duty to evolve, not just persist. And right now, (at least) half the problem isn’t venture capital; it’s the academy itself, with its hopeless insularity, its groupthinky myopia, its yucky credential fetishization, and its potentially fatal allergy to accountability.
So let’s fix that. Let’s build an academy worthy of the tools we have and the questions we face. One that doesn’t pretend to compete with AI on speed or output. One that uses it to make education more abstract, more interdisciplinary, and way more real.
This isn’t just about education.
It’s about the creeping complacency showing up everywhere around us. It’s in our companies, in our content, and increasingly in our decisions and who we trust to make them.
It's about the people who look at AI and think, "Well, it’s been a hell of a run." As if the endgame is to be gently automated into oblivion, sipping Soylent while the models do the thinking and everyone spends their UBI on OnlyFans or the metaverse (probably both).
Uh, newsflash: You are only obsolete if you act like it.
But seriously, go ahead. Have an LLM do all your work. Treat learning like latency.
And sure, you might be gaining some modest, incremental leverage in the short term. But life isn’t a short-term game.
In the long run, you’re not gaining any leverage at all. You’re giving it away.
This is the moment to get sharper, not softer. To become more human, not less.
Because in a world of infinite simulacra, the question isn’t whether you’ll use AI. The question is whether it’s using you.
The future won’t be about proving what you know.
You’ll need to prove you’re not just a wrapper on ChatGPT.
*might be a little less than that ;)