In the mid-1990s, when Java and its virtual machine started making the rounds outside of academia, the reaction from the Serious Software People™ was swift and predictable: this is a toy. You can’t do anything real with it. You certainly can’t run production systems on it. And you definitely - absolutely - cannot trust a garbage collector to manage memory for you.
I was an early-career developer at that time, and I was excited about Java. My computer science coursework had been all C and C++ (both languages requiring manual memory management), but in my senior year, that one cool professor (there’s always one) taught a class on Java, and I loved it.
I loved it because I could build faster.
But then I graduated and took a job doing C++. My colleagues were Serious Software People™. These folks had built careers wrestling with malloc and free, tuning memory allocation down to the byte.
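If you never lived through that era, here’s a minimal sketch of the ritual, in the spirit of that period’s C++ (the Record type, field sizes, and values are invented purely for illustration):

```cpp
#include <cstdio>
#include <cstdlib>
#include <cstring>

// A hypothetical record type, invented for this example.
struct Record {
    char name[32];
    int  value;
};

int main() {
    // Every allocation was a contract: this memory is yours now.
    Record* r = static_cast<Record*>(std::malloc(sizeof(Record)));
    if (r == nullptr) return 1;  // allocation could fail, and you checked, every time

    std::strncpy(r->name, "widget", sizeof(r->name) - 1);
    r->name[sizeof(r->name) - 1] = '\0';  // strncpy won’t terminate a full buffer for you
    r->value = 42;
    std::printf("%s = %d\n", r->name, r->value);

    std::free(r);  // forget this line and you leak
    r = nullptr;   // touch r after the free and you crash, on a good day
    // In Java, these last two lines simply don’t exist: the collector
    // reclaims the object once nothing references it.
    return 0;
}
```

Multiply that bookkeeping across every object in a large codebase and you can see why people built entire careers on getting it right.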
For the vast majority of developers at that time, efficient memory use wasn’t just good hygiene - it was a survival skill. RAM was expensive. CPUs were slow. Cache misses were the devil.
In that world, trusting a virtual machine to manage memory for you was like flying a plane with your eyes closed and hoping the autopilot knew what turbulence was.
And yet: Java won. Not everywhere, not for everything - but it reshaped the field. It opened the door to enterprise software, Android, big data, and more. We stopped spending our lives chasing segmentation faults and started doing…other things. Bigger things. More ambitious things.
Here’s the twist, though: Java didn’t win despite being a higher-level abstraction. It won because it was one. And that was the uncomfortable part.
Because if you were one of those engineers who had spent years mastering the fine art of pointer arithmetic and memory profiling, Java’s abstractions felt like an erasure of your value. Your hard-won skills weren’t just unnecessary - they were almost a liability. The thing that had defined you as a software engineer was being swept away by a garbage collector.
Sound familiar?
We’re hearing a lot of the same fear now with generative AI. It’s going to write all the code. It’s going to replace junior developers. It’s going to make real engineers obsolete. It’s too dumb to trust, too weird to debug, and too generic to do anything important.
Maybe. But probably not.
Because the lesson of Java - and really of every leap in abstraction before and since - is that the thing we’re scared of losing is rarely the thing that actually makes us valuable. Manual memory management wasn’t our gift to software. It was the tax we paid for working close to the machine.
What Java did was remove that tax. It let us focus on system design, on algorithms, on architecture, on user experience. It allowed more people to build software, which meant more ideas got built. And the engineers who adapted to that shift - who learned to live without malloc - ended up working on more interesting problems.
That’s the path in front of us now. AI coding assistants are not magic, and they’re not coming for your job tomorrow. What they are coming for is the boring stuff. The boilerplate. The repetitive code generation. The tedious edge-case handling. The parts of our job that feel like memory management used to feel - necessary, but soul-crushing.
So the question is not “Will AI take my job?” The question is “What part of my job will AI take - and what does that free me up to do instead?”
What will we build when we don’t have to write yet another controller class by hand? What systems become thinkable when the scaffolding is free? What ideas have we shelved because we didn’t have the bandwidth to explore them?
What happens to our ambition when the constraints shift?
The developers who succeeded in the post-VM world weren’t the ones who clung to manual memory management - they were the ones who welcomed the shift and rode the wave upward. They let go of a hard-won skill not because it didn’t matter, but because it was no longer the best place to spend their energy.
That’s where we are now, again. AI isn’t replacing us. It’s relieving us - of repetition, boilerplate, tedium.
And in their place: space to think bigger.
The next era of software won’t be written by AI. It’ll be written with it - by those of us willing to let go of what no longer defines us and reach for what might.