Your Father Spent His Life Savings on Claude Code and We Shipped Nothing
March 2026
On AI slop, borrowed thinking, and the skills that matter when everyone has the same tools.
The meme everyone is laughing at is also the most accurate professional warning of 2026.
"Your father spent his life savings on Claude Code and we shipped nothing."
Funny. Also a confession that a lot of teams are not ready to make out loud.
We have confused access to tools with the ability to think
There is a version of the AI story that is genuinely exciting. Tools that handle the tedious parts. Faster drafts. Cleaner code. Research that used to take hours done in minutes.
Then there is the dark side.
Somewhere along the way, a different story took over. One where the tool is the strategy. Where generating output became a substitute for solving the problem. Where the speed of production disguised the complete absence of thought.
That story is also real.
The Rise of AI Slop
We now have industrial-scale slop, far too much of it, and at some point a change is coming. I am an optimist and believe in change; we are at a genuine paradigm shift. As more people learn how to leverage AI tools, we will develop a good understanding of where they fit and where they don't. It is still funny when you receive a cold email and it reads like the person on the other end just discovered newfound powers that need serious refinement. Full disclosure: I have been suckered in too. Failing, but having fun figuring out the tools.
"Slop at scale"
When the AI starts citing other AI. Perplexity citing a ChatGPT summary of a Claude interpretation of a Reddit post from 2019, presented to you with full confidence and zero sources.
The Tool Trap
There is a specific failure mode that has become an epidemic: reaching for an AI tool before understanding what you are actually trying to solve.
It looks like productivity. It feels like momentum. It produces things: documents, plans, code, content, strategies, roadmaps, and frameworks with familiar-sounding names that everyone nods at in the meeting and nobody ever references again.
Somewhere we skipped what's valuable: the hard, uncomfortable, slow work of understanding the real problem.
When you skip that step, the AI does not save you. It just gives you a well-formatted version of not knowing what you are doing. The output is coherent. The thinking is not yours. The slide deck is beautiful. And the result is a lot of shipped nothing.
This is also what vibe coding has quietly become for a significant portion of the people doing it. Not "I understand this system and I am using AI to move faster." More "I have asked the AI to build me a thing that only I thought was a good idea, it has given me something that compiles, and I have absolutely no idea what will happen when someone uses it." The README is immaculate. The error handling is maybe in v2?
Your father did not lose his life savings because the tools were bad. He lost them because nobody in the room did the thinking first.
The skills that are now worth more, not less
Here is what the doom narrative gets wrong: AI does not compete with human capability. It amplifies it.
Which means if you bring sharp judgment, the AI helps you think faster. If you bring no judgment, the AI helps you be confidently wrong at scale. Both outcomes are available to you. Most people are not choosing deliberately between them.
Judgment. Knowing which problem is actually worth solving. Knowing when the output is wrong even when it sounds right. Knowing the difference between a good answer and a good-looking answer. AI can generate both with equal confidence and roughly the same font. Only you can tell them apart. And judgment without experience is just confidence. Not the same thing.
Taste. Knowing what good looks like before you can articulate why. The thing that makes you read a first draft and know immediately that something is off, even if you cannot name it yet. Taste is accumulated experience. It is not a setting you can configure, and it is absolutely not available in the free tier.
Context. The same solution lands completely differently depending on who the customer is, what happened last quarter, what the team is afraid of, and what the person asking actually needs versus what they said they need. AI has access to a vast amount of information. It has no access to your context, your read on the room, or the fact that Dave always says yes in the meeting and no in the follow-up email.
Relationships. The most valuable conversations you will have this year will not be with a model. They will be with people who trust you enough to tell you things they have not told anyone else. Relationships are why you find out about the problem before it becomes a crisis. You cannot automate them. You cannot prompt your way into them. And they get worse, not better, if you are clearly not paying attention.
The ability to sit with a problem. This is quietly becoming the rarest skill in professional life. The willingness to not immediately reach for a tool. To stay in the discomfort of not yet knowing. To think before you generate. Most people have outsourced this so completely they have forgotten they used to be able to do it. The silence before the answer used to mean thinking. Now it mostly means typing into a chat window and hoping.
There is a lot more to being human and I have just touched the surface.
What over-reliance actually costs you
The risk is not that AI will take your job. It is that if you stop thinking, you will stop being the kind of person whose job is worth having.
Skills atrophy. Quietly and without announcement. The analyst who uses AI to write every report stops being able to write the one report that cannot be delegated. The strategist who uses AI to produce every plan stops being able to spot when a plan is wrong. The developer who uses AI to write every function stops understanding the system well enough to know when it is quietly on fire.

You do not notice it happening. The output keeps coming. The meetings keep running. The calendar stays full.
And then one day someone asks you a hard question, the kind that requires genuine understanding and no obvious prompt, and you realise the tool cannot answer it.
The Opportunity
AI has made the production of output cheap. Which means output, on its own, is worth almost nothing.
What is worth something is judgment about which output matters. Taste about what good looks like. Relationships that give you access to problems worth solving. The ability to sit with a hard question long enough to understand it.
These are not soft skills. They are the only skills with a real moat right now, because everyone has access to the same tools and very few people are doing the thinking that makes those tools worth using.
The ones who will look back at this period and feel good about it are not the ones who generated the most. They are the ones who kept thinking, stayed curious, and used the tools to go further rather than to go instead.
Double down on the things AI cannot fake.
Not because AI is the enemy. Because it is the amplifier, and right now most people are pointing it at nothing. We can do better.
Subscribe to CyberDesserts for practical insights on technology, security, and the skills that actually matter.