The AI augmentation fallacy goes something like this:
“AI won’t take your job but someone using AI might.”
Instead, I posit that for the vast majority of jobs, the effects of AI will play out something like this:
“AI won’t take your job, but it will take your ability to charge a premium for it.”
In other words:
“AI won’t eat your job but it will take a nice chunky bite out of your salary.”
The question of whether AI will take your job is one of the biggest distractions of our time.
The question that really matters is whether humans will still command a skill premium once they are augmented by machines.
And if so, which skills will preserve that premium and which ones will not.
The debate on AI's effect on jobs is repeatedly derailed by the question of replacement through automation, when what really drives our engagement with our careers is owning skills that command a skill premium.
And AI is perfectly poised to attack not your job, but the ability to charge a skill premium for it.
In particular, this post works through three key ideas:
Every job is a bundle of tasks.
Some of these tasks require specialization, some don’t, but they still remain part of the bundle as the cost of unbundling and delegating those tasks may be high.
Every new wave of technologies (including the ongoing rise of Gen AI) attacks this bundle.
New technology may substitute a specific task – for instance, intelligent scheduling tools may fully substitute a task previously performed by a human.
New technology may also complement a specific task – for instance, improvements in AI provide diagnostic support to doctors and radiologists, enhancing their ability to perform the task.
First-order thinking suggests that technology-as-substitute (automation) is bad, and that technology-as-complement (augmentation) is good.
However, to understand whether technology has a net-positive or a net-negative impact on a job, it’s important to think through the effects of augmentation on the job bundle described above.
Depending on the effects on the bundle, the skill premium (the ratio of skilled workers' wages to unskilled workers' wages) for that job category may increase, decrease, or remain the same.
There are four possible ways in which the job bundle gets rebundled.
First, there is no shift in value from the job bundle.
AI augmentation helps the worker but there is no substantial improvement in their ability to capture value either through productivity improvements or through levelling up to include higher value tasks in the job bundle.
Outcome: There is no effect on the skill premium, i.e. technology exerts neither upward nor downward pressure on your salary as a skilled worker.
I believe the effects of Generative AI will skew less in this direction and more in the other three directions below.
Second, there is a shift in value towards the job bundle.
AI augmentation migrates value towards a specific job bundle, largely because the worker benefits from productivity improvements (more of the same) or because they level up to include higher value tasks in the job bundle (more of different).
Outcome: Your ability to charge a skill premium increases i.e. augmented by technology, you can now charge a higher salary for the same skills.
Third, commoditization – one of the lesser-understood effects of augmentation – plays out when one or both of two factors come into play.
Outcome: Your ability to charge a skill premium decreases i.e. as others in your job category increasingly use technology, your ability to sustain your salary decreases.
This scenario is very likely with AI, in general, and Gen AI, in particular, owing to its ability to commodify knowledge and commoditize knowledge-oriented tasks.
Finally, the end game of augmentation could very well be automation (substitution).
This may play out because of data-driven learning effects.
As workers increasingly use augmenting tech, they generate data needed for model training and fine-tuning, which constantly improves model accuracy and specialization. Eventually, the model becomes specialized enough to fully substitute the task.
Outcome: Your job itself gets substituted.
Let’s get back to the central argument of this post.
AI won’t eat your job but it will eat your ability to charge a premium for it.
Below, we explore three factors that drive erosion of the skill premium.
Workers who perform specialized tasks and command a skill premium over unskilled workers will lose that skill premium if AI enables low-skilled workers to perform at par with the high-skilled ones.
As I explain in Slow-burn AI: When augmentation, not automation, is the real threat:
When technology augments skilled work and enables historically low-skilled workers to perform at par with historically high-skilled workers, such augmentation makes workers more substitutable and eventually commoditizes the job.
AI augmentation makes high-skilled knowledge workers relatively more substitutable by lowering the skill barrier to achieve the same performance.
Augmentation improves productivity and output but does so to a greater extent for those with lower skills than for those with higher skills. When this plays out, the worker becomes more substitutable as the skill becomes more commoditized.
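This compression can be made concrete with toy numbers. The sketch below is purely illustrative: the wage figures are invented, and it assumes wages roughly track output.

```python
def skill_premium(skilled_wage: float, unskilled_wage: float) -> float:
    """Skill premium: the ratio of the skilled wage to the unskilled wage."""
    return skilled_wage / unskilled_wage

# Before augmentation: a wide performance (and wage) gap.
before = skill_premium(100, 50)    # premium of 2.0

# After augmentation: both workers improve, but the lower-skilled
# worker gains far more, so the gap compresses.
after = skill_premium(120, 100)    # premium of 1.2

print(f"before: {before:.1f}, after: {after:.1f}")
```

Both workers are more productive after augmentation, yet the premium falls from 2.0 to 1.2: the skilled worker is better off in absolute terms but worse off in relative terms.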
As explained above, skill commoditization is the first effect of AI, but that in itself isn’t unique to AI. Previous technology cycles also eroded the skill premium on specialized tasks.
AI is unique in its impact on learning advantages.
Consider the axe wielding example I take in Slow-burn AI:
Up until the start of the twentieth century, axe-wielding was a high-skilled job. To be any good at logging, you needed to perfect the angle of the axe swing, as well as the grip on the axe, through years of practice that developed both muscle and muscle memory.
The invention of the chainsaw changed all of that. Low-skilled loggers, who lacked the knowledge or the muscle memory to perform well, could now perform at a much higher level of effectiveness.
The chainsaw was a technology that commoditized axe-wielding as a skill. But axe-wielding, arguably, has diminishing returns to learning once you’ve reached a certain level of skill. There is only so much further improvement achievable through learning beyond that point.
A lot of knowledge work is quite different. Knowledge work requires you to constantly keep learning.
Take healthcare as an example. The average physician goes through 5-8 years of medical school and needs to continue honing her craft over the course of a long career. Law is another field where learning advantages command a skill premium.
The skill premium here is attached not merely to skill specialization but to the ability to grow and defend that premium through continuous learning.
AI attacks this skill premium because of its ability to learn at scale:
AI compounds commoditization by continuous absorption of skills.
The more successful the AI is at augmentation, the more often it is used, the more data it captures to get trained, and the more effective it gets at future augmentation.
Take healthcare as an example:
AI commodifies and absorbs proprietary knowledge and learning advantages that workers (e.g. doctors) may have developed in a specific industry.
As AI augmentation plays out, doctors constantly train the model, driving more of their skill into it.
In China, PingAn Good Doctor started out as a telehealth platform on which AI-augmented doctors treated patients. Several years of model training later, the AI model now beats human doctors at many tasks.
The AI has moved on from augmentation to substitution.
With large language models (LLMs), the scope of knowledge that can be commodified and absorbed into a model increases further, increasing the potential for similar commoditization to play out across a larger range of jobs.
Much like Google Maps ‘absorbs’ a cabbie’s navigation skills and productizes them to enable anyone to navigate effectively, LLMs ‘absorb’ knowledge, potentially enabling ‘lower-skilled’ or ‘less-informed’ actors to perform tasks that were previously handled by ‘highly-skilled’ actors.
This is the second way AI attacks the skill premium.
Here’s a common trope in narratives about the impact of AI on work:
AI will always need humans in the loop.
Well, what do humans excel at?
A bunch of things, actually. Original thought, soulful (and not just creative) expression, relationships, and managerial goal-orientation, to name a few.
To actually get things done, you need one or more of the above. And technology is rarely good at substituting these. That’s why humans are needed in the loop.
Except that managerial goal-orientation may not be the sole domain of humans for long.
Enter AI agents.