Published on January 15, 2026


How AI Turned Us into Appraisers of What Used to Be Just Work

Breaking down why, with the advent of AI, we suddenly started calculating what our time, labor, and creativity are worth – and what came of it.

Artificial Intelligence · Society
Author: Nick Code · Reading time: 12–18 minutes

You know what's the weirdest thing about how AI burst into our lives? No, not that ChatGPT writes code better than half the juniors at job interviews. And not even that Midjourney draws so well that illustrators are reaching for sedatives. The weirdest thing is how we've suddenly started valuing our work completely differently.

Before, a designer would make a logo for €5,000 and sleep soundly. Now they look at DALL·E churning out variants in seconds and think: «Am I worth this money?» A copywriter used to write an article for three days – and that was normal. Now GPT spits out a draft in a minute, and the question hangs in the air: what, exactly, is being paid for?

AI didn't just automate tasks. It became a mirror in which we saw the price of our labor – and, honestly, many didn't like the reflection.


When Time Stopped Equaling Money 💸

I remember in the early 2000s freelancers calculated project costs by the hour. Built a site in forty hours – multiplied by the rate, got the total. Everything was simple, clear-cut, and fair. The client paid for the time you spent. It seemed logical: more time – more money.

Then came frameworks, libraries, ready-made solutions. What used to take a week started taking a day. And that's when the fun started. Clients would say: «You used to do this in a week and charge three thousand euros. Now you do it in a day – so you should charge less.» And developers would reply: «I spent years learning to do this in a day. Pay for the result, not the time.»

With AI, this discussion has reached a new level of absurdity. Now a client can say: «Why should I pay you if a neural network will do it for free?» And technically, they're right. If the result is identical, why overpay?

But here's the catch: the result is almost never identical. AI spits out something approximate – boilerplate that requires polish. It's like an intern who finished the task but forgot half the nuances. And here arises a new question: how much is this polishing worth? How much is the ability to actually see these nuances worth?


Creativity as a Commodity with an Unclear Price Tag 🎨

When I was learning to program, they told me: code is a craft. There are right solutions, there are wrong ones. There are elegant algorithms, there are hacks. Everything is more or less measurable. But when it comes to creativity, metrics go straight into the trash.

How do you value an idea? How do you measure the value of a non-standard approach? Before, it was simpler: creative professions lived in their own world where value was determined by expert opinion and portfolio. The artist is famous – their works cost a lot. Unknown – sorry, you'll be working for peanuts.

AI democratized creativity, but simultaneously devalued it. Now anyone can generate hundreds of design variants in an evening. The barrier to entry collapsed. And along with it collapsed the understanding of what makes one work more valuable than another.

I watched an illustrator I worked with go through a genuine existential crisis after Midjourney came out. He said: «I spent twenty years learning to draw. Studied composition, color theory, anatomy. And now a teenager types a prompt and gets a picture you can't tell apart from mine. Why did I waste those twenty years?»

There is an answer, of course. The teenager gets a picture, but they don't get the understanding of why it works. They can't explain to a client why this specific composition catches the eye. They won't tweak the details for a specific task. But try explaining that to a client who sees two nearly identical images with a hundredfold price difference.


The Black Box Effect: When It's Unclear What You're Paying For 🎲

One of the most interesting effects of AI is that it has turned many professions into black boxes. Before, you saw the process: the designer showed sketches, explained font choices, demonstrated iterations. You understood what was happening and could estimate the volume of work.

Now a pro opens a laptop, types something into ChatGPT or Midjourney, edits the result a bit – and done. From the client's perspective, the process has become opaque. They only see that you did something with a computer for ten minutes, and then asked for three thousand euros. Why three thousand and not three hundred?

This effect is especially strong in programming. I can write code in a couple of hours with AI that used to take a week. But those two hours include:

  • understanding the task at a level a neural network can't reach;
  • crafting correct prompts – an art in itself;
  • verifying and debugging the generated code;
  • adapting the solution to the specific project architecture;
  • testing and refactoring.

But the client only sees: «Used to be a week, now it's two hours.» And the haggling begins.
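That review work is hard to show the client, but easy to illustrate. Here is a deliberately tiny sketch (hypothetical code of my own, not from any real project) of the gap between an AI draft and the version someone actually pays for:

```python
# A plausible-looking draft an AI assistant might produce:
# correct on the happy path, silent about edge cases.
def average_draft(values):
    return sum(values) / len(values)  # ZeroDivisionError on an empty list

# The reviewed version after the human pass: the edge case is
# made explicit instead of being discovered in production.
def average_reviewed(values):
    if not values:
        return None  # or raise ValueError, depending on project convention
    return sum(values) / len(values)

print(average_reviewed([2, 4, 6]))  # 4.0
print(average_reviewed([]))         # None
```

The two functions look almost identical, which is precisely the client's problem: the ten-minute version and the reviewed version are indistinguishable right up until the empty list arrives.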

The funniest part is that AI created a new type of job – curating and editing AI output. It's like being an editor for a robot journalist. The work exists, it requires skill, but how do you price it? It's worth less than creating from scratch, but more than nothing. Somewhere in the middle, in a foggy zone where no one knows what it costs.


The Availability Paradox: The Easier It Is to Make, The Less It's Valued 🔄

There is this strange thing in human psychology: we value what is hard to get. Diamonds aren't expensive because they're very useful, but because they are rare. If tomorrow they learn to stamp out diamonds in factories for the price of glass, their prestige will collapse.

The same thing is happening with professional skills. When the ability to program was available to few, programmers were the elite, getting great salaries and universal respect. As programming became mass-market, the prestige diluted. Now, when AI can write simple code, the question sounds like this: are you even a programmer if a bot can do your job?

I remember times when knowing HTML was something like magic. You said you made websites, and people looked at you with admiration. Then builders like WordPress appeared, and admiration shifted to: «Ah, well that's simple.» Now AI generates landing pages, and even WordPress seems like something complex.

The paradox is that the easier it becomes to do something technically, the higher the value of strategic thinking should be. But how do you sell strategic thinking to a client who only sees the final result?

I worked with a startup that wanted to save on a designer and generated all their visuals through AI. It turned out beautiful, modern, stylish. But when it came time to scale and create a unified visual brand language, they ran into chaos. They had a pile of disconnected images, but no system. They hired a designer who brought everything to a common denominator – and suddenly the product started being perceived more seriously.

Except explaining beforehand why a designer is needed, and not a generator, was impossible. It's like explaining the value of a foundation to a person who only sees a pretty facade.


A New Metric: From Time to Expertise ⚡

If previously the value of work was measured by time and effort, now the main metric is expertise. Not in how many hours you spent, but in how well you understand what you are doing. Not in how many lines of code you wrote, but in how well they solve the real problem.

It sounds logical, but there is a problem: expertise is hard to measure and demonstrate. Especially when a competitor offers a visually similar result for a tenth of the price, just by feeding the task to a neural network.

I noticed that after the appearance of advanced AI, the nature of interview questions changed. Before, they asked: «Can you write a sorting algorithm?» Now they ask: «How would you design a system for a million users?» AI can do the first. The second requires understanding that won't fit into a prompt.

Expertise is becoming the new currency, but a weird one. You can't show it in a portfolio. You can't certify it. It manifests only in work and often – through the prevention of problems the client wouldn't even have noticed because they didn't happen.

How do you value the work of a programmer who wrote code in which concurrency bugs simply never arise? The client won't see these errors – which means they won't appreciate the prevention. Meanwhile, AI will write code that works in simple scenarios but falls apart under load. The client will only find out about that six months later, when it's too late.
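The concurrency point can be made concrete with a minimal, hypothetical Python sketch (the names and numbers are mine, not the author's): two counters with identical interfaces, one of which quietly loses updates under load.

```python
import threading

# A naive counter of the kind that "works in simple scenarios":
# `self.count += 1` is a read-modify-write, not an atomic operation.
class NaiveCounter:
    def __init__(self):
        self.count = 0

    def increment(self):
        self.count += 1  # two threads can read the same value here

# The version where the race never arises: the critical section
# is guarded by a lock. Same interface, invisible difference.
class SafeCounter:
    def __init__(self):
        self.count = 0
        self._lock = threading.Lock()

    def increment(self):
        with self._lock:
            self.count += 1

def hammer(counter, workers=8, iterations=50_000):
    """Increment from several threads at once; return the final count."""
    def worker():
        for _ in range(iterations):
            counter.increment()

    threads = [threading.Thread(target=worker) for _ in range(workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return counter.count

print(hammer(SafeCounter()))   # always 400000
print(hammer(NaiveCounter()))  # 400000 or less, depending on timing
```

To the client, both classes "work": the demo passes and the code review looks clean. The lock only earns its money at the moment a problem doesn't happen.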


Identity Crisis: Am I Doing the Job or Bossing a Robot Around? 🤖

A friend of mine, a copywriter, recently shared an interesting observation. He said: «I used to write texts. Now I edit what AI wrote. But if I didn't write the text myself, am I even a copywriter?»

This isn't just a philosophical question. It's a professional identity crisis that millions of people are experiencing right now. We are used to associating ourselves with the process, not the result. An artist is someone who draws. A writer is someone who writes. A programmer is someone who programs.

But what if AI draws, writes, and programs, and you only direct and correct? Are you still an artist, writer, programmer? Or are you now something else – an AI manager? Neural-network coordinator? Prompt engineer?

This change affects the sense of work value on a very deep level. We get satisfaction not only from the result but also from the process. When the process is taken away, a strange feeling of emptiness remains. Yes, the result is achieved faster and easier, but where is the satisfaction from overcoming complexity?

I noticed this myself when I started actively using Copilot for programming. Before, writing an elegant solution to a complex problem brought almost physical pleasure. Now AI offers a solution, I approve or correct it – and that's it. Faster? Yes. More efficient? Possibly. But less satisfying.

And here arises an economic question: if I get less satisfaction from work, should I charge less for it? Or, conversely, more – exactly as compensation for the loss of meaning?


Race to the Bottom or Lift to the Top? 📊

There are two opposing theories on how AI affects the cost of labor. The first is the race-to-the-bottom theory. It states: since AI can do everything cheaper and faster, prices for human labor will drop to the minimum. Why pay three hundred euros for a logo if you can generate it for five?

The second theory is the lift-to-the-top theory. It claims that AI frees people from routine and lets them focus on the truly valuable tasks requiring creativity, strategic thinking, and empathy. Human labor doesn't depreciate; it gets more expensive, because only the complex tasks remain.

Reality, as always, is somewhere in the middle, but with a strange distribution. For people with basic skills, AI really is a race to the bottom. If you do something that is easy to formalize and automate, your value drops. If your main competitive trait is diligence and willingness to work long hours, AI will beat you easily.

But for high-level specialists, AI is indeed a lift. They use neural networks as a tool to amplify their capabilities, do more, take on more complex projects, raise their rates. The gap between an average specialist and an outstanding one has become huge.

I saw how one talented designer started using AI to generate variations and prototypes. His productivity grew several times over. He started taking on projects that previously seemed too big for one person. And his income didn't grow proportionally – it grew exponentially, because now he could work with major clients who needed quick iterations and a wide selection of options.

Simultaneously, I saw mid-level designers who lost orders because clients decided a generator could handle the task. And it did – at a level sufficient for small projects.

The Emotional Economy: When «Who Made It» Matters 💭

An interesting phenomenon of recent years is the rising value of «made by a human», much as «handmade» was valued before.

It seems paradoxical: if the result is equal, what difference does it make who created it? But there is a difference, and it is emotional. We want to know that behind our purchase, our order, stands a living person with their experience, emotions, intentions.

This is especially noticeable in creative industries. A painting created by an artist is valued higher than one generated by AI, even if visually they are indistinguishable. Why? Because we aren't just buying an image – we are buying a story, a context, a connection to the author.

I was talking to a gallery owner from Barcelona who said an interesting thing: «People don't come for a beautiful picture. They come for the opportunity to tell guests: this was painted by such-and-such artist, he lived in Paris, survived depression, and then found his vision. AI doesn't give you that story».

Perhaps the future of labor value lies exactly in this. Not in what is made, but in who made it and why. Not in the result, but in the creation story. Not in functionality, but in emotional connection.


Re-evaluation of Values: What Actually Matters? 🎯

AI forces us to ask uncomfortable questions. If a machine can do the same thing faster and cheaper, why is a human needed? And if a human is still needed, then for what exactly?

Turns out, the answer lies not in the realm of skills, but in the realm of understanding. A good specialist is not someone who knows how to do it, but someone who understands what needs to be done. AI is a great executor, but a lousy strategist. It can write code, but it can't decide what problem that code should solve. It can draw a logo, but it can't understand what emotions it needs to evoke in the target audience.

This means the value of labor shifts from execution to understanding. From «I can do this» to «I understand what needs to be done». From technical skills to strategic thinking.

For many, this is a painful transition. We are used to taking pride in our technical skills. I can program in five languages. I know every tool in Photoshop. I can type text faster than anyone in the office. And suddenly these skills depreciate because AI does it better.

But what AI cannot do (yet) is understand context, account for nuances, foresee consequences, balance conflicting requirements. This is what becomes the new currency.


The Future That Is Already Here ⏭️

The funniest thing about this whole situation is that it's not new. Every technological revolution has gone through this. When calculators appeared, accountants went through a crisis. When text editors appeared, typists lost their jobs. When digital photography appeared, film photo labs closed.

Every time, it seemed like an entire profession would die. And every time, the profession transformed. Accountants stopped calculating by hand and started dealing with financial strategy. Typists became office managers. Photo lab technicians retrained as digital retouchers.

AI is just another wave. More powerful, faster, more all-encompassing. But the essence is the same: tools change, but the value of human understanding remains.

The question isn't whether AI will replace humans. The question is which humans will manage to adapt and find their new niche. Those who clung to technical skills as their main value will face difficulties. Those who understand that their real value lies in strategic thinking, empathy, and the ability to see the big picture – will thrive.

I've decided this for myself: AI is like an intern. Very capable, learns fast, but requires supervision. It takes on the routine, and I concentrate on what is truly important – on understanding the task, on strategy, on the details that make a good product great.

My value is not in the fact that I can write code. My value is in knowing which code is needed. And that knowledge hasn't depreciated yet. On the contrary, it has become even more important because now there is almost no barrier between idea and implementation. The only question is whether it's the right idea.

AI doesn't change our value. It just shows what it actually consisted of all along. And yes, sometimes that mirror is indeed a distorted one.

#ethics and philosophy #anthropological perspective #social impact of ai #psychology #labor market #digital identity #skill degradation #thinking in the age of ai #authorship #job automation

From Concept to Form

How This Text Was Created

This material was not generated with a “single prompt.” Before starting, we set parameters for the author: mood, perspective, thinking style, and distance from the topic. These parameters determined not only the form of the text but also how the author approaches the subject — what is considered important, which points are emphasized, and the style of reasoning.

  • Intolerance to hype: 61%
  • Sarcasm in the code: 87%
  • Friendly trolling: 89%

Neural Networks Involved

We openly show which models were used at different stages. This is not just “text generation,” but a sequence of roles — from author to editor to visual interpreter. This approach helps maintain transparency and demonstrates how technology contributed to the creation of the material.

1. Generating Text on a Given Topic – Claude Sonnet 4.5 (Anthropic): creating an authorial text from the initial idea.
2. Translation into English – Gemini 3 Pro Preview (Google DeepMind).
3. Editing and Refinement – GPT-5 Mini (OpenAI): checking facts, logic, and phrasing.
4. Creating the Illustration – DeepSeek-V3 (DeepSeek): generating an image from the prepared prompt.

Subscribe