I started to understand this at a party in San Francisco where everyone had to take their shoes off. The event was called “AI Philosophy Nights: Post-Productive Human”, and I took my friend Victoria along one night in Civic Center. They made us swap our shoes for slippers at the door, like walking into a typical Asian household. I found this disarming in the best way; something about surrendering your shoes loosens your armor. The space was a startup office loft dressed up as a living room: cozy lighting, greenery paired with wooden floors, tatami-style meeting rooms. Victoria and I both found it quite lovely.
The attendees were exactly the demographic you’d imagine: startup founders, lawyers, engineers. And we all gathered in San Francisco to discuss one burning question: who are we when AI can do our jobs better than we can?
It is no longer a hypothetical question. We are now witnessing the cracks in capitalism with the rise of artificial intelligence: jobs that carried prestige and complexity are being displaced. The new-grad market is brutal not because there aren’t problems to solve, but because companies no longer see the value in growing people, not when they can grow a model instead. We built a thing that is better than us at the work we defined ourselves by. And now we have to figure out who we are without it. And it is extremely uncomfortable to look at ourselves introspectively without the filter of productivity or “busy work”.
Who am I, unironically, if I am not maximizing shareholder value? Who am I beyond the things I produce?
Those are million-dollar questions that danced in my mind throughout the night. Perhaps we should warm up with a simpler one: what would you do if you won the lottery tomorrow? When I ask my coworkers this question, a lot of them answer that they would probably retire. Buy a farm, live off the grid. That sounds peaceful and all, until you notice the barely hidden insinuation that the goal is to escape from tech. To leave what they have been doing for 40 hours each week. So allow me to reword the question more sharply: if your job disappeared tomorrow and you were financially fine, would you still do it? If not, were you ever doing it for the right reasons?
The meaning is what you make of it, sure. Some wake up genuinely invigorated by their 9 to 5s. Others find it in homemaking, in the smiles of people they love. Some find it in art, in things they create with their hands or minds. Perhaps we are all a bit of all the above. But what fills the space when none of those things are available? Nihilistically, if none of those things matter in a century, what do we reach for when the void opens up?
I think we already have an answer. Hint: it’s that glowy box we hold and poke.
We scroll and scroll and scroll.
I have a deep, visceral problem with what mass media has become, and how eager we are to consume it. Case in point: brainrot. It is the stupidest content ever produced and consumed by our species. We started off with “Italian brainrot”, and lately it has evolved into this AI fruit adultery saga with pornographic undertones. I apologize for what you are about to read: the fruit harlot, Strawberrina, forever caught in the act of cheating with bananas… and boom, 350K likes on those reels. We have genuinely lost the plot and replaced it with noise. All of it engineered to keep your thumb scrolling. Mind-numbing digital opium. And we eat it like it’s our last meal.
Brainrot isn’t just laziness. I wish it were that plain and simple. Brainrot is what happens when our attention is capitalized on, the void opens, and there is nothing meaningful to fill it. It is a symptom of a disease: spectacle and stimulation that no longer give us identity, there only to fill the void. But brainrot isn’t the disease itself. It’s a coping mechanism. The real issue is what happens when there is nothing else to reach for.
And to me, this connects directly to another symptom: the death of the third space, and in its absence, the slow commodification of human connection itself.
The third space is the place that is not your home or your office. The bar, the library, the park, the gym, the restaurant where the server knows you by name. The place where you exist without an agenda, without producing or consuming for others, just being among other people who are also just being. It is going extinct.
When those spaces disappear, we don’t stop seeking connection. We just start outsourcing it. First, to apps like Hinge. Then to something more abstract. More convenient. Less human. I am not ashamed to admit this: I have been on dating apps before. I am disillusioned with the fantasy of meeting someone naturally, rom-com style. So when the spaces where you might organically meet people disappear, you adapt… usually by handing another tech company your loneliness and letting them monetize it. It starts with Hinge. It transitions into Samantha from Her. It ends with Joi from Blade Runner 2049 (more on her later). The distance between these companions is shorter than we’d like to admit.
And into that vacuum, the scroll arrives, offering stimulation without substance, spectacle without meaning. It fills the void without solving it. The loneliness epidemic is real, and it is not an accident. It is the direct consequence of building cities and economies that have no room for unmonetized human presence.
If all of this sounds theoretical and abstract, it isn’t. Nowhere makes this more evident than the city I live by.
SF is cyberpunk stripped of its cool aesthetics. The Waymos glide along as I drive by the AI ads on the 101 billboards. We walk past people horse-tranquilized into a different dimension as we discuss the future of AGI. The same zip code that contains the offices of the most cutting-edge technology also contains scenes of complete human wasteland. The permanent underclass is now made up of people who will never wield Claude or lobsters, not because they lack the intelligence, but because the system was not designed with them in mind, and never pretended otherwise.
The very technology that was supposed to save us has now, by and large, become soulless. Startup culture used to have a reason to exist: genuine solutions to genuine problems. And unabashed scrappiness too, just some nerds building a product in a garage. Nowadays, techbro culture feels like a shell of its past self: ambition feels disingenuous and cheap. Problems are made for solutions. All I hear at founder meetups is mindless yapping about AGI, venture capital, going Series A, B, and C, with IPO timelines that are most likely never going to happen. Founders are rewarded less for being visionaries and more for being very good at selling the idea of a vision. Add to all that the recent layoffs, the hundreds of thousands of tech workers losing their jobs to the very products they helped build… And they have the audacity to call San Francisco the Rome of the 21st century. Rome, famously, burned.
I say this as someone fully implicated. I am as techbro as any other Asian-American software engineer who drives an EV and has opinions about foundation models. I drank the Kool-Aid; I just want to hold the half-full cup up to the light. I graduated last year and already feel behind. The rat race has gotten tighter while the finish line has gotten blurrier. You’re NGMI if you don’t catch up with tomorrow’s tech. There is no longer a clear path that says: do this, become this, mean something. There is only the pressure to keep up with whatever is shipping next, or be left behind by it.
And the strangest part is, none of this feels original. We imagined this world 40 years ago; we just had the luxury of calling it fiction.
The underlying issue is not whether we can give a machine the qualities of the human, but whether the human has lost its humanity; whether it has become, in fact, a machine.
-Scott Bukatman, Blade Runner (BFI Film Classics)
The core story of Blade Runner goes as follows: a blade runner named Deckard is sent to hunt a group of rogue “replicants”, androids who have gone off-script and refused to serve. Along the way, he falls in love with Rachael, a replicant who doesn’t know she is one. Her memories are implants. Her sense of self is manufactured. And yet, does that make her feelings less real? Does it make her any less a creature of flesh?
The film’s most devastating moment belongs to the villain, Roy Batty, who saves Deckard’s life at the last second and then sits down in the rain to die. His expiry date was arriving on schedule, like a product recall. His final words become the most famous quote of the film: “All those moments will be lost in time, like tears in rain.” He has lived. He has suffered. He has accumulated experience that meant something to him, even if no one designed him to care about it. And then he is gone, and the world does not pause.
There is also the ambiguous suggestion that Deckard himself may be a replicant, that the man hunting the machines may be one of them. The film neither confirms nor denies; the question is the point.
Blade Runner 2049 makes this even more literal. Agent K’s only companion is Joi, a holographic girlfriend, sold by a corporation, subscription-based. When K asks if she really loves him, the film doesn’t answer. It doesn’t need to. The point is that he needed to ask. She is warmth, manufactured. Intimacy, monetized.
The world of Blade Runner is not defined by advanced technology, but by what that technology has done to human meaning. Billboards of Japanese geishas and Joi flicker endlessly, neon reflecting off the solitude of Deckard and K. The symptoms I described earlier: commodified attention, manufactured intimacy, the erosion of authentic presence, are not new. They’re just no longer fictional.
I think about Roy Batty’s sentiment with regard to humanity’s existential problem. If nothing we produce matters in a hundred years, then what is the point? Is there, ever, a point? Roy Batty held his own experiences as sacred, even knowing they would dissolve. Is that what it means to be human? That feels like the only sane response to any of this. Not to solve the problem of AI, not to win the race, but to insist on witnessing your own life with enough care that it means something to you, even if it won’t mean something in a hundred years.
And that brings me to the quote that ties all this together. While doing some research on this topic, I reread the essay The Comforts of Cyberpunk in Evan Puschak’s book, Escape into Meaning. Puschak runs the YouTube channel The Nerdwriter, a video essay channel that I adore. Puschak writes, “Maybe that’s why these stories comfort me. Cyberpunk turns those messy feelings into a place, where it’s no longer necessary to resist the splintering pressures of society because the fight’s over and we lost. All that’s left is to submit to the carnival of sensations. In a cyberpunk future, I can let go. I can melt into the prismatic flux of civilization. There’s a relief in that, even a feeling of oneness.”
From the brainrot to the disappearance of the third space, we are headed toward the least cool version of cyberpunk. As Puschak said, we submit to the overstimulation and numb ourselves with digital opium, because the battle has been fought and we lost. But Puschak finds peace in losing. I am terrified because I feel complicit in building the thing that made us lose.
I understand what he means: the seduction of surrender, of accepting that the algorithm is too large, too vast, too optimized to resist. It is futile to think that a single person like me can change the tides. But I can’t bring myself to feel that comfort. To me, this feels like a world we have built. It is like seeing your own git commits on the repo that is sending all of us to hell. So when the existential dread of artificial intelligence comes up in conversations, I feel immensely helpless.
Cyberpunk is supposed to be just speculative science fiction. But it is resonating harder than ever, like a passenger-side mirror… objects closer than they appear. As I write this to the classic Blade Runner soundtrack by Vangelis, I think about the harvested attention, the permanent underclasses, the outpaced ethics… these stories prophesied a future we are actively building. Humanity is more than willing to anesthetize itself and call it adaptation. But it’s dark… not noir-dark. Like, we-are-cooked dark.
So if cyberpunk isn’t a warning but a mirror, then the question isn’t what happens next. The question is whether we are willing to recognize ourselves in the reflection. Please forgive my bluntness. I am not writing this to denounce tech, or to promote some jaded disillusionment, or to tell my fellow techbros to go touch grass. I am inside this. The call is very much coming from inside the house. I write this as a plea for soul-searching: because if what we are building is as consequential as we say it is, then we owe it to ourselves, and to everyone who will wield it, to ask the harder questions. Like that Jurassic Park question that still rings true: not just whether we could build it, but whether we should.
I don’t want to be a thing that only consumes. I want to produce and provoke. It is the only way I feel truly alive. Not as a profit metric, shareholder value, or a LinkedIn post. That’s just me, trying to be the kind of human who looks in the mirror and thinks about the consequences of what he does, and what he represents. The productive nature of humanity remains, but perhaps this time around, we can do it for the sake of ourselves, for our legacies, and for our next of kin.
Objects in the mirror are closer than they appear.
We should probably start looking.