The question is... is this based on existing capability of LLMs to do these jobs? Or are companies doing this on the expectation that AI is advanced enough to pick up the slack?
I have observed a disconnect in which management is typically far more optimistic about AI being capable of performing a specific task than are the workers who currently perform that task.
And to what extent is AI-related job cutting just an excuse for what management would want to do anyway?
I do not see anything in this study that accounts for the decline in economic activity. Is it AI replacing the jobs, or is it that companies are not optimistically hiring, which disproportionately impacts entry-level jobs?
Agree. Compared with the high cost of full-time hires for entry-level software jobs (total comp + onboarding + mentoring), investing in AI and seeing whether that gap can be filled is a far less risky choice in the current economic climate.
If the AI bet doesn't pay off 6-12 months in, you just stop spending money on it: cancel or don't renew contracts and move some teams around.
For full-time entry-level hires, we typically don't see meaningful positive productivity (where they produce more than they cost) for 6-8 months. Additionally, entry-level hires take time away from senior folks, reducing their productivity. And if you need to cut payroll costs, it's far more complicated, and worse for morale, than just cutting AI spend.
So given the above, plus an economy that seems pre-recessionary (or already in recession, according to some leading indicators), it seems best to wait or hire very cautiously for the next 6-8 months at least.
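To make the cost argument concrete, here's a toy break-even sketch. Every number is a made-up illustrative assumption, not data from the thread or the study; the point is only the shape of the trade-off (ramping junior vs. flat, cancellable AI spend):

```python
def junior_net_value(months, monthly_cost=10_000, mentor_drag=2_000,
                     ramp_months=7, full_output=14_000):
    """Cumulative (output - cost) for a junior hire who ramps linearly
    to full productivity over `ramp_months`. All figures are made up."""
    total = 0
    for m in range(1, months + 1):
        output = full_output * min(m, ramp_months) / ramp_months
        total += output - monthly_cost - mentor_drag
    return total

def ai_net_value(months, monthly_cost=1_000, monthly_output=3_000):
    """Cumulative value of an AI subscription: flat cost, flat output,
    cancellable at any time, so the downside is capped."""
    return (monthly_output - monthly_cost) * months

# Under these assumptions the junior is cash-flow negative for months,
# while the AI spend is positive (and easy to cut) from month one.
print(junior_net_value(6), ai_net_value(6))
```

Even if the junior eventually becomes far more valuable than any tool, the first-year cash-flow picture looks like this under almost any plausible parameters, which is why a risk-averse manager defers the hire.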
Even then why hire a junior dev instead of a mid level developer that doesn’t need mentoring? You can probably hire one for the same price as a junior dev if you hire remotely even in the US.
Or H1B / outsourcing replacement. There are data points showing tech companies hiring thousands of foreign workers while laying off domestic employees. That should factor into these analyses of displaced junior developers.
Exactly this. 2023Q1 was when the interest rate hikes from the previous year really kicked in with full force. It was the first hiring market I ever saw in well over a decade where the employers were firmly in the driver's seat, even for seniors.
I can imagine that there were a decent number of execs who tried ChatGPT, made some outlandish predictions, and based some hiring decisions on those predictions, though.
This paper looks kinda trashy - confusing correlation with causation and clickbaity.
An alternate explanation is that, even if AI does not have any chance of entirely replacing an employee, not even junior hires, in the hands of competent seniors AI does substantially improve their productivity and eliminates whole classes of tasks that were traditionally where juniors cut their teeth.
So companies reduce junior hiring because their work is relatively less valuable, and they can meet their goals by shuffling existing resources. When they can't do that, they go for seniors, since the immediate bang for the buck is higher (while, of course, depleting the global pipeline that actually produces seniors in the long run, in a typical tragedy of the commons).
to me it is just market pressure to exploit the high stress on laborers atm. the level of uncertainty today is only a problem if you don't have a ton of existing capital, which everyone in charge does. so they are insulated and can treat people poorly without repercussions. in a market that prefers short term profits, they will then do this in order to make more money right now.
companies must do this, 'cause if they don't then their competition will (i.e. the pressure)
of course, we can collectively decide to equally value labor and profit, as a symbiotic relationship that incentivizes long term prosperity. but where's the memes in that
One issue we're running into at my job: we're struggling to find entry-level candidates who aren't lying about what they know by using an LLM.
For the tech side, we've reduced behavioral questions and created an interview that allows people to use cursor, LLMs, etc. in the interview - that way, it's impossible to cheat.
We have folks build a feature on a fake code base. Unfortunately, more junior folks now seem to struggle a lot more with this problem.
We came up with some simple coding exercises (about 20 minutes total to implement, max) and asked candidates to submit their responses when applying. Turns out one of the questions regularly causes hallucinated APIs in LLM responses, so we've been able to weed out a large percentage of cheaters who didn't even bother to test the code before submitting.
The other part is that you can absolutely tell during a live interview when someone is using an LLM to answer.
This is the big question. It could be any combination of the following and it likely depends on the company/position too:
- Generative AI is genuinely capable enough to replace entry level workers at a lower cost.
- The hype around Generative AI is convincing people who make hiring decisions that it is capable enough to replace entry level workers at a lower cost.
- The hype around Generative AI is being used as an excuse to not hire during an economic downturn.
This was close to my first thought as well. I don't think we're far enough along the LLM adoption curve to actually know how it will affect the business case and thus employment long term. In the last couple of years of LLM/AI honeymoon, the changes to accommodate the technology may obscure direct and second order effects.
I've yet to see any company show an actual, measurable revenue increase from employing AI. So I doubt they have created the internal analysis structures to track what exactly they're using AI for.
The question of reasoning doesn't really matter once you show it's happening.
When you constrict the market like they have done, you naturally get distortions, and the adversarial nature of the market fails to perform economic calculation potentially leading to grave consequences. Even at this point, whipsaws would be quite destructive. I know people who have abandoned their careers due to lack of job availability for the foreseeable future. They were searching for years with no recovery.
When you destroy a pipeline of career development for short-term profit which is possible because of the decoupled nature of money-printing/credit facility, decisions made are psychologically sticky. When there is no economic benefit for competency, the smart people leave for a market where this is possible.
The smart people I know right now are quietly preparing for socio-economic collapse as a result of runaway money-printing. When you have a runaway dangerous machine, you can either step back and let it destroy itself (isolating yourself), or you can accelerate the breakdown of its dependent cycles. Many choose the former, since the latter carries existential risk for no real short-term benefit, even though it would result in the fewest casualties.
70% of the economy covers the white-collar market, which will be gone soon, not because the jobs can be replaced by AI, but because the business leaders in consolidated industries all decide to replace workers, becoming the instruments of their own demise through deflationary economics.
I am confused about how to feel about the data the paper is based on. If you look at the paper, the data description is:
"Our primary data source is a detailed LinkedIn-based resume dataset provided by Revelio Labs ...
We complement the worker resume data with Revelio’s database of job postings, which tracks recruitment activity by the firms since 2021 ...
The final sample consists of 284,974 U.S. firms that were successfully matched to both employee position data and job postings and that were actively hiring between January 2021 and March 2025.3 For these firms, we observe 156,765,776 positions dating back to 2015 and 245,838,118 job postings since 2021, of which 198,773,384 successfully matched with their raw text description."
They identified 245 million job postings from 2021 forward in the United States? I mean, the U.S. population is something like 222 million for the 18-65 age group (based on Wikipedia: 64.9% of a 342 million total population).
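Running the quoted figures, the scale concern is easy to make precise. The population numbers are the ones from Wikipedia cited above; the posting count is from the paper:

```python
# Back-of-the-envelope check on the posting count vs. population.
us_population_m = 342          # total U.S. population, millions
working_age_share = 0.649      # share aged 18-65
working_age_m = us_population_m * working_age_share

postings_m = 245.8             # job postings in the dataset, millions
print(round(working_age_m))                  # people aged 18-65, millions
print(round(postings_m / working_age_m, 2))  # postings per working-age person
```

So the dataset contains more than one posting per working-age American over roughly four years, which is not crazy on its face, since one vacancy is often posted multiple times (and re-listed by recruiting firms, as noted below).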
And they find a very small percentage of firms using generative AI:
"Our approach allows us to capture firms that have actively begun
integrating generative AI into their operations. By this measure, 10,599 firms, about 3.7 percent of our sample, adopted generative AI during the study period."
Maybe I am wildly underestimating just how much LinkedIn is used worldwide for recruiting? As a tech person, I'm also very used to seeing the same job listing re-listed by what seems to be a large number of low-effort "recruiting" firms on LinkedIn.
I think for trying to figure out how generative AI is affecting entry-level jobs, I would have been much more interested in some case studies. Something like: find three to five companies (larger than startups? 100+ employees? 500+?) that have decided to hire fewer entry-level employees by adding generative AI into their work as a matter of policy. Then maybe circle back from the case studies to this larger LinkedIn dataset and tie the case-study information into the LinkedIn data somehow.
> For these firms, we observe 156,765,776 positions dating back to 2015 and 245,838,118 job postings since 2021, of which 198,773,384 successfully matched with their raw text description."
I'm obviously misreading this somehow. How do you have 156m positions dating back to 2015, but far more than that number in a smaller timeframe?
I think it is just poorly worded. From another point in the paper:
"Our analysis draws on a new dataset that combines LinkedIn resume and job-posting data from Revelio Labs. The dataset covers nearly 285,000 U.S. firms, more than 150 million employment spells from roughly 62 million unique workers between 2015 and 2025, and over 245 million job postings."
I guess we can read that as saying the authors identified 62 million workers who held 150 million positions over the 2015-2025 time window.
I'm still deeply skeptical about the underlying data. The 62 million represents a huge percentage of employed people in the U.S. in any of the years 2015-2025. This source shows 148 million/yr to 164 million/yr employed over that timeframe.
On the other hand, I also saw estimates saying LinkedIn has approximately 30% of the U.S. workforce with a profile on the platform. Which is wild to me.
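Putting those two figures together (both from the comments above, not from the paper):

```python
# How big a slice of the workforce is 62M unique workers?
unique_workers_m = 62                     # unique workers in the dataset
employed_low_m, employed_high_m = 148, 164  # annual U.S. employment range

share_vs_peak = unique_workers_m / employed_high_m
share_vs_trough = unique_workers_m / employed_low_m
print(f"{share_vs_peak:.0%} to {share_vs_trough:.0%}")
```

Since the 62 million counts anyone who held a position at some point over the decade, not workers in a single year, a 38-42% share of any one year's employment is high but roughly consistent with the ~30% LinkedIn-penetration estimate.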
Interesting. However just because this is true right now doesn't mean it will be true going forward. Unique to the current moment is that there are simultaneously (1) high interest rates and a challenging economy (2) a narrative that AI adoption should enable cutting junior roles. This could lead to companies that would anyway be doing layoffs choosing to lay off or not hire juniors, and replace with AI adoption.
To really test the implied theory that using AI enables cutting junior hiring, we need to see it in a better economy, in otherwise growing companies, or with some kind of control (though not sure how this would really be possible).
>Unique to the current moment is that there are simultaneously (1) high interest rates and a challenging economy (2) a narrative that AI adoption should enable cutting junior roles.
I'm not disputing your point, but I'm curious: given that the main headline measures we tend to see about the US economy right now involve the labour market, how do you establish the counterfactual?
When there are downturns in tech, companies squeeze out junior people. This happens often. After 2008 a whole cohort of top talent from software engineering schools were lucky to get lower paying QA jobs and only a few landed software development positions. There were chief economists for banks writing about the underemployed generation (generalized to all white collar) and how they can’t get started or have the same opportunities.
I think we might be seeing this now but headlines get more clicks with AI taking our jobs.
Generative AI may automate some entry-level tasks, but young professionals are not just “replaceable labor.” They bring growth potential, adaptation, and social learning. Without frameworks to manage AI’s role, we risk undermining the very training grounds that prepare the next generation of experts.
I agree that high turnover is a real constraint. That’s why the answer isn’t “10 years of apprenticeship” but designing scaffolds that combine learning with contribution in a shorter timeframe. Things like short rotations, micro-credentials, or mentorship stipends let juniors add value while they’re still on the job. Even if they leave after a few years, the investment isn’t wasted — both sides still capture meaningful returns.
How do they bring more “growth potential” than a mid-level developer with 3-5 years of experience? The average tenure of a developer is 2-3 years. I expect that to increase slightly going forward as the job market continues to suck. But why would I care about the growth of the company when my promotion criteria are based on delivering quarterly or yearly goals? Those goals can be met much more easily by paying slightly more for a mid-level developer who doesn't do negative work, both directly and by taking time away from the existing team.
You’re absolutely right that mid-level hires buy immediate productivity. But “growth potential” isn’t just romanticism — it’s an investable trajectory. With the right project design, feedback loops, and domain exposure, juniors can grow into “multipliers” — people who combine technical skills with adaptability or domain expertise. That’s a kind of return you rarely get from simply adding another mid-level hire. In practice, resilient organizations balance both: mid-levels for immediate throughput, and juniors for long-term strength.
Of course I meant 3-5 years of experience not 35 years of experience. :) I just edited it.
You’re not “investing” in anyone if their tenure is going to be 2-3 years with the first one doing negative work.
And why should juniors stay? Because of salary compression and inversion: HR determines raises, but the free market determines comp for new employees, so it makes sense for them to jump ship to make more money. I've seen this at every company I've worked for, from startups to mid-size companies to boring old enterprise companies to BigTech.
Where even managers can’t fight for employees to get raises at market rates. But they can get an open req to pay at market rates when that employee leaves.
And who is incentivized to care about “the organization” when line-level managers and even directors are incentivized to care about the next quarter to the next year?
I hear you — salary compression and inversion, along with short tenure, are very real structural problems. It’s understandable that managers and even directors end up focused only on the next quarter.
My broader point is that when these short-term incentives dominate, organizations (and societies) lose the capacity to build for the long term. That’s exactly why governance frameworks matter: they help create safeguards against purely short-term dynamics — whether in HR policy or in AI policy.
We had some marketing folks give us a company-wide demo of ChatGPT and some other gen AI tools, showing us how cool they are and how quickly they can now make stylish and sophisticated pitch decks and marketing materials.
And the entire time I'm watching this I'm just thinking that they don't realize that they are only demonstrating the tools that are going to replace their own jobs. Kinda sad, really. Demand for soft skills and creatives is going to continue to decline.
LLMs are good at creating single use documents, like a pitch deck used for one prospective customer (never to be used again). But for long lived documents, on which future work builds atop, the bar is higher and the value of LLMs is more grey.
And then the customers use gen AI to summarize the same pitch decks/marketing materials so they don't have to look at them. Let's cut out the middle man and just send the prompt instead.
Those marketing people may also be engaging in small talk about sports teams on the phone with customers, doing sushi and karaoke with customers, procuring coke and hookers for customers... LLMs can't yet do that. Unless the customer is also an LLM of course.
In the late 90s you were considered a prodigy if you understood how to use a search engine. I had so many opportunities simply because I could find and retain information.
So LLMs have solved this. Knowing a framework or being able to create apps is not a marketable skill any longer. What are we supposed to do now?
It’s the soft skills that matter now. Being well liked has always been more important in a job than being the best at it. We all know that engineer who knows they are hot shit but everyone avoids because they are insufferable.
Those marketing people don’t need to spend a week on their deck any longer. They can work the customer relationship now.
Knowing how to iterate with an LLM to give the customer exactly what they need is the valuable skill now.
but i guess to me the question is: if you're management, do you expect your workers to do more/work faster (like a TAS in a way)? or do you expect to replace your workers entirely?
I personally think we're still a ways from the latter...
A new Harvard study (62M workers, 285k firms) shows that firms adopting generative AI cut junior hiring sharply while continuing to grow senior roles, eroding the bottom rungs of career ladders and reshaping how careers start.
What was the incentive for companies to train juniors into seniors in the past, post job-hopping era? Curious to know if that incentive has warped in the past two decades or so as someone who's starting their career now.
Cheap labor. It doesn't take that much to train someone to be somewhat useful, in many cases. The main educators are universities and trade schools, not companies.
And if they want more loyalty, they can always provide more incentives for juniors to stay longer.
At least in my bubble it's astonishing how it's almost never worth it to stay at a company. You'd likely get overlooked for promotions and salary rises are almost insultingly low.
There is no incentive now because the social contract is broken and there is too much mobility. Best you can do is find a supportive boss / a company that provides training opportunities.
In 10 years, where do the senior devs come from? Real question. With fewer entry-level jobs now, it seems like in 10 years there won't be seniors to hire.
Even if we grant this is going to be a problem, it makes no sense for any individual company to do anything about it. Why take on the cost of training a junior when they can bail in a few years? This is especially true if you're not a big tech company, which puts you at risk of having your junior-turned-senior employees poached by big tech.
And most of my friends and colleagues would take a fully remote role that pays half what a big tech, five-days-in-office role pays. Add in an extra week of PTO and you have a great pitch to devs.
You can incentivize people to stay with things other than salary.
Salary plays a part, of course, but there are a lot of other aspects that make staying at a job worthwhile.
In 10 years, the management (or "leadership" if you like the taste of boot) responsible for doing the cutting will have moved on to something else, with no consequences for them.
Junior devs eventually will have been brought up with agentic coding, etc. Hopefully whatever the "new way" becomes is how they'll be taught.
Currently, part of the problem is the taboo around using AI coding in undergrad CS programs. And I don't know the answer. But someone will find the right way to teach new and better ways of working with and without generative AI. It may just become second nature to everyone.
While agentic coding can make you productive, it won't teach you to deeply understand the source code, algorithms, or APIs produced by AI. If you can't thoroughly audit any source code created by an AI agent, then you are definitely not a senior developer.
This is just not true. I have witnessed people who would have been called dabblers or tinkerers just a few years ago become actual developers by using Cursor. They ask a few key questions about engineering best practices when they get stuck and really internalize them. They read the code they are producing and ask the assistant questions about their codebase. They are theorycrafting using AI, then implementing and testing. I have witnessed this with my own eyes, and as AI has gotten better they have also been getting more knowledgeable. They read the chains of thought and study the outputs. They have become real developers with working programs on their GitHub. AI is a tool that teaches you as it is used, if you put in the effort. I understand many folks are 'vibe coding' and not learning a single thing, and I don't know if that's the majority or the minority, but the assertion that all people learn nothing from using these tools is false.
You're talking about people who put in a significant non-trivial effort to thoroughly understand the code produced by the AI. For them, AI was just one path to becoming proficient developers. They would have gotten there even before the AI boom. I was not talking about such highly-motivated people.
As an occasional uni TA, I'm leaning toward banning LLM for easy coursework while allowing it on more difficult & open-ended ones.
Pretty sure it's a self-destructive move for a CS or software engineering student to pass foundational courses like discrete math, intro to programming, algorithm & data structure using LLM. You can't learn how to write if all you do is read. LLM will 1-shot the homework, and the student just passively reads the code.
On more difficult and open coursework, LLMs seem to work pretty well at assisting students. For example, in the OS course I teach, I usually give students a semester-long project on writing an x86 32-bit kernel from scratch with simple preemptive multitasking. LLMs definitely make difficult things much more approachable; students can ask the LLM "dumb basic questions" (what is a pointer? an interrupt? a page fault?) without fear of judgement.
But due to the novelty & open-ended nature of the requirement ("toy" file system, no DMA, etc), playing a slot machine on LLM just won't cut it. Students need to actually understand what they're trying to achieve, and at that point they can just write the code themselves.
I hand-wrote code in the late 00s. Java, assembly, C. The graders gave us some grace since we couldn't test, but you were expected to be pretty accurate. Hell, one quiz was just 20 identical pages on which we iterated through the Tomasulo algorithm.
Agentic coding is like leading and instructing a team of a bunch of very dumb but also very smart junior devs. They can follow instructions to the T and have great memory but lack common sense and have no experience. The more experienced and skilled their leadership, the better chance of getting a good result from them, which I don’t think is a good job (yet?) for an entry level human SWE.
My guess is that at some point the code itself, in a language humans easily comprehend, will become superficial as we delegate more and more of the logic to AI development. Perhaps in the near future AIs will write things at a much lower level by default, and the entire act of programming as we know it goes away.
Kind of like that meme or how two AIs talking to each other spontaneously develop their own coding for communication. The human trappings become extra baggage.
This is the same reason they force you to do the math by hand in undergrad and implement functions that are already in the standard libraries of most languages. Because you don't know anything yet, and you need to learn why the more automated stuff works the way it does.
Not even a little bit. Where I work we regularly churn through kids just out of college and most of them don't have Clue One how to operate anything on their computer.
Yeah, growing up in the 80s or 90s might have had you uniquely well-positioned to be "good with computers", because "the computer that has games and the internet" was (in some sense) the same as "the computer that adults are supposed to use for work".
That's not true anymore in the smart phone / tablet era.
5-10 years ago my wife had a gig working with college kids and back then they were already unable to forward e-mails and didn't really understand the concept of "files" on a computer. They just sent screenshots and sometimes just lost (like, almost literally) some document they had been working on because they couldn't figure out how to open it back up. I can't imagine it has improved.
Might have been the case before. But these days, kids are brought up on locked-down content-focused machines (e.g. ipads). They struggle with anything harder than restarting an app.
When my little cousin was three and already knew how to use the phone by himself people were claiming he was gonna be a tech wizard and everybody was talking about digital natives. But when he got to high school he didn't know how to turn a computer on. How useful is it to be god tier at getting results from LLMs, if you have zero clue if the result you got is any good?
I suppose the idea is that those junior developers who weren't hired will spend 10 years doing intensive, unpaid self-study so that they can knock on the door as experienced seniors by that time.
Are you serious? How on earth are these people going to eat or pay rent for 10 years? As well, most companies would laugh you out the door if you were applying a senior role without any experience working in the role.
I'm not laughing at all. I'm definitely not making fun of those who may be affected by this. My sarcasm was directed at people or companies planning to implement such ideas.
> In 10 years where do the senior dev's come from?
From company interns. Internships won't go away; there will just be fewer of them. For example, some companies will turn down interns because they do not have the time to train them due to project load.
With AI, now employed developers can be picky on whether or not to take on interns.
They will be promoted, but they won't have the requisite experience. We'll have people in the highest positions with the usual titles, but they will be severely underqualified. Enshittification will ensue.
"Seniority-bias" is the wrong word. It's "skill-bias", or better yet, "tacit-knowledge bias".
The reason why old hands (who have been in the game long enough to be promoted to something approximating "Senior") are so deadly with AI is because they know all the traps/pitfalls to watch out for. Sure, the AIs may become good enough in 5-10 years to have learned all these pitfalls, but make hay while the sun is shining!
"Senior EMs" who are pure HR managers/non-coders will be priced out of their jobs in an AI agentic world, as (surprise!) you can't report AI agents to HR. Juniors who can use AI effectively will thrive and climb the ranks rapidly.
The pessimistic reading is well-represented, so here's another: AI changes the definition of "entry-level", but it doesn't eliminate the class of professional labor that experienced workers would rather not do.
Until AI can do literally everything we can, that class of work will continue to exist, and it'll continue to be handed to the least experienced workers as a way for them to learn, get oriented, and earn access to more interesting problems and/or higher pay while experienced folks rest on their laurels or push the state of the art.
In the future, there will be two kinds of companies:
1. Those that encourage people to use AI agents aggressively to increase productivity.
2. Those that encourage people to use AI agents aggressively to be more productive while still hiring young people.
Which type of company will be more innovative, productive, and successful in the long run?
How much cheaper are younger developers overall? If you look at the delta between junior and mid-level developers in most industries (i.e. enterprise dev, not BigTech or adjacent), it's really not that much.
It's a down market and you might be better able to retain them if the market turns up again (or maybe not, I'm old and mostly just want a remote gig where I have time to parent my child).
In an up market it’s going to be worse. HR is still going to hold the line on raises and not keep up with the market.
I’m old too - 51. But I consistently tell young people to chase money. Target BigTech or adjacent companies, “grind LeetCode”, avoid startups and Monopoly money “equity”, etc.
One person I mentored as an intern in 2021, and again when they came back the next year, is now 25 years old and making $220K (as a Solution Architect, not a developer), and I couldn't be happier for them. They make the same as I make now. But I've already raised two (step)kids, bought and sold the big house in the burbs, etc., and love working remotely.
I told them to do whatever it takes to be seen, promote themselves, build a network internally and with clients and make all the money they can.
I don't understand how you can want innovation and productivity in a world with a rapidly increasing population. We need fewer and fewer people while producing more and more. Where am I wrong?
That world was 30 years ago. In 2025 world average total fertility rate is 2.2, which is a shade above replacement rate (2.1). And 2.2 is a 10% drop since 2017 alone (when it was 2.46).
Because life expectancy is higher, the population will continue to increase. But not "rapidly".
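For what it's worth, the figures quoted above check out arithmetically:

```python
# Checking the quoted fertility figures: 2.46 (2017) -> 2.2 (2025),
# against a replacement rate of 2.1.
tfr_2017, tfr_2025, replacement = 2.46, 2.2, 2.1

relative_drop = (tfr_2017 - tfr_2025) / tfr_2017
print(f"{relative_drop:.1%}")   # relative drop since 2017
print(tfr_2025 > replacement)   # still a shade above replacement
```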
Depends what kind of people you need. For innovation you generally need exceptional people, and the probability of finding such a person is proportional to the total number of people available.
Kinda feels like a lot of companies think they can be option 1 because someone else will be option 2, and then they'll hire the young people away after they become experienced.
Any job that doesn't creatively generate revenue will be systematized and automated as soon as possible. AI agents are just an acceleration factor for this
The tragedy of the commons: companies acting in their self interest at the expense of the industry by drying up the workforce pipeline. The next generation will pay, like when America stopped producing hardware.
It has been like this for decades by now, and your precious government loves this because they also consist of old people who hate the younger generations. Any and every time the government intervenes it is to stomp down on the youth and nothing else.
I heard an argument that the valuation of AI and AI-adjacent firms only makes sense if 1) companies grow their aggregate top-line revenues by multiple trillions of dollars, or 2) the valuation amounts to the potential savings from gutting knowledge-based employment across an economy.
Of course in the long run a chronically underemployed economy will have little demand for products and services, but that is beyond the scope of companies who, in general, are focused on winning short term and zero-sum market capture. However I believe that while a billion dollar valuation is a market and strategy problem, a trillion dollar valuation is a political problem - and I would hope that a mandate of broad gainful employment translates to political action - although this remains to be seen.
I wonder how these findings would hold up if the analysis could control for the widespread corporate strategy of replacing domestic junior roles with dedicated offshore teams?
It’s really hard to adjust for economic factors here. I agree with the skepticism in this thread, as job numbers were revised downward heavily in both 2024 and 2025 (into negative territory in some months), indicating a poor economic situation.
What happens when companies refuse to hire, even though there is an obvious need? It has to lead to reduced growth. If the majority of companies do this, I would think it would lead to a severe deflationary cycle.
If a few companies do this it is probably fine, but it's more interesting if most companies do this. It seems that it would be akin to a self-inflicted depression with severe deflation.
This makes me think the conversation around AI and jobs is too focused on total employment, when the real story is how it shifts opportunities within companies. If juniors are getting fewer chances to enter, that could create long-term bottlenecks for talent growth.
If entry-level roles are shrinking, how should companies rethink talent development?
Without the traditional “bottom rungs,” how do we grow future seniors if fewer juniors ever get the chance to start?
When were companies ever thinking about talent development, especially for SWE? We had some loose "mentorship" roles but IME most folks are left to their own devices or just learn by bandwagoning things from reddit.
I think open source contributions/projects will still be a way to gain verifiable experience.
Other than that, I guess developing software in some capacity while doing a non-strictly software job - say, in accounting, marketing, healthcare, etc. This might not be a relevant number of people if 'vibe coding' takes hold and the fundamentals are not learned/ignored by these accountants, marketers, healthcare workers, etc.
If that is the case, we'd have a lot of 'informed beginners' with 10+ years of experience tangentially related to software.
Edit: As a result of the above, we might see an un-ironic return to the 'learn to code' mantra in the following years. Perhaps now qualified 'learn to -actually- code'? I'd wager a dollar on that discourse popping up in ~5 years time if the trend of not hiring junior devs continues.
I'm looking forward to the weird inflective trend of "organic" programs, "humane" dev treatment, and software development taking a long time being seen as a mark of quality rather than stagnation or worry. :)
I'm half-joking, but I wouldn't be surprised to see all sorts of counterpoint marketing come into play. Maybe throw in a weird traditional bent to it?
> (Pretentious, douche company): Free-range programming, the way programming was meant to be done; with the human touch!
All-in-all, I already feel severely grossed out any time a business I interact with introduces any kind of LLM chatbot shtick and I have to move away from their services; I could genuinely see people deriving a greater disdain for the fad than there already is.
The plan seems to be to hope that AI will be able to replace the senior ICs in the near future. They're certainly gutting the ranks of management today in a way that presupposes there will be far fewer ICs of all levels to manage soon.
That's much longer than a quarterly earnings report away, which makes it "somebody else's problem" for the executives pushing these policies. There's no reason to expect these people to have a long-term strategy in mind as long as their short-term strategy gives them a golden parachute.
Job market is not obligated to ensure sustained supply of talented individuals, so, I think, short term they'd just keep chasing unicorns. Long term, governments will be forced to financially incentivize young hires. Or something terrible happens and everything rolls back to 1945.
presumably this also means the relative value of seniors is now increasing, as the pipeline to replace them is smaller.
its like how the generic "we take anyone" online security degree has poisoned that market -- nothing but hordes of entry level goobers, but no real heavy hitters on the mid-to-high end. put another way, the market is tight but there are still reasonable options for seniors.
Agree, increased value and demand for seniors. But how will the market solve the generation of new seniors if juniors are getting fewer opportunities?
Take the software development sector as example: if we replace junior devs by AI coding agents and put senior devs to review the agent's work, how will we produce more seniors (with wide experience in the sector) if the juniors are not coding anymore?
Who cares? This is a once in a lifetime opportunity to finally gatekeep software engineering the way lawyers and finance professionals do with their fields! Enjoy the windfall in 5 years!
The economic turmoil in the US is hollowing out the entry level jobs, AI is just the cover companies are using. The constant tariff changing means that companies have to be very pessimistic in their long term planning, as any assumptions they make can be turned on their head with no notice.
> the largest effects in wholesale and retail trade
Hard for me to believe that AI in its current state is hollowing out junior shop assistant and salesperson roles. Either those jobs were already vulnerable to "dumb" touchscreen kiosks or they require emotional intelligence and embodied presence that LLMs lack.
Salary wise, we are in a race to the bottom.
Massive outsourcing of jobs to India and similar countries, where wages are much lower than those of comparable US employees, coupled with AI replacing entry-level roles.
On a longer time scale you have humanoid robots potentially coming as well. Self-driving cars and trucks are going to torpedo trucking and low-pay positions such as Uber/taxi driver. The wealth is getting centralized amongst a couple dozen high-tech companies.
All in all I am pretty negative on the prospects for young people as they enter the workforce and additionally for older tech people as they are currently operating in an environment where losing your job very possibly means the end of the road.
Another way to look at it is that legacy jobs have no future therefore there is no point bringing in the next generation into a dying system.
Another way to look at it is that hiring is fine, and that the vain entitled generation we all suspected was going to emerge feels that a job should absolutely be available to them, and immediately.
Another way to look at it is that journalism has been dead for quite a while, and writing about the same fear-based topics like “omg hiring apocalypse” is what makes these people predictable money (along with other topics).
Another way to look at it is that we raised a generation of narcissistic parents and children that have been going “omg grades”, “omg good college”, “omg internship”, “omg job” for so long that these lamentations feel normalized. A healthy dose of stfu was never given to them. Neurotic motherfuckers.
Today they're admitting AI is hollowing out entry-level jobs. The reality is that it can and will replace mid-level and eventually even quite senior jobs.
Why?
It's already doing a lot of the load-bearing work in those mid-level roles too now, it's just a bit awkward for management to admit it. One common current mode of work is people using AI to accomplish their work tasks very quickly, and then loafing a bit more with the extra time. So leaders refrain from hiring, pocket the savings, and keep a tight lid on compensation for those who remain.
At some point they'll probably try to squeeze the workforce for some additional productivity, and cut those who don't deliver it. Note that the "ease" of using AI for work tasks will be a rationale for why additional compensation is not warranted for those who remain.
It's pretty clear this is happening.
The question is... is this based on existing capability of LLMs to do these jobs? Or are companies doing this on the expectation that AI is advanced enough to pick up the slack?
I have observed a disconnect in which management is typically far more optimistic about AI being capable of performing a specific task than are the workers who currently perform that task.
And to what extent is AI-related job cutting just an excuse for what management would want to do anyway?
I do not see anything in this study that accounts for the decline in economic activity. Is it AI replacing the jobs, or is it that companies are not optimistically hiring, which disproportionally impacts entry level jobs?
Agree. Compared to the high cost of full-time hires for entry-level software jobs (total comp + onboarding + mentoring), investing in AI and seeing if that gap can be filled is a far less risky choice in the current economic state.
6-12 months in, if the AI bet doesn't pay off, then just stop spending money on it: cancel/don't renew contracts and move some teams around.
For full-time entry hires, we typically don't see meaningful positive productivity (their cost being less than what they produce) for 6-8 months. Additionally, entry-level hires take time away from senior folks, reducing their productivity. And if you need to cut payroll costs, it's far more complicated, and worse for morale, than just cutting AI spend.
So given the above, plus an economy seemingly pre-recession (or already in one, according to some leading indicators), it seems best to wait or hire very cautiously for the next 6-8 months at least.
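The ramp-up argument above can be put in rough numbers. This is a back-of-the-envelope sketch with entirely made-up figures (monthly cost, ramp length, and post-ramp output are illustrative assumptions, not data from this thread):

```python
# Back-of-the-envelope break-even for a junior hire, assuming the hire
# produces roughly nothing while ramping and a steady output afterwards.
# All dollar figures and durations below are illustrative assumptions.

def junior_net_value(monthly_cost, ramp_months, steady_output, horizon_months):
    """Net value of a junior over a horizon: post-ramp output minus total cost."""
    productive_months = max(0, horizon_months - ramp_months)
    return productive_months * steady_output - horizon_months * monthly_cost

# Assumed: $10k/mo fully loaded cost, 7-month ramp, $14k/mo output after ramp.
for horizon in (6, 12, 24):
    print(horizon, junior_net_value(10_000, 7, 14_000, horizon))
```

Under these assumed numbers the hire is deeply negative at 6 and 12 months and only approaches break-even around two years, which is roughly the "wait 6-8 months" logic stated above, plus the tenure concerns raised elsewhere in the thread.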
Even then why hire a junior dev instead of a mid level developer that doesn’t need mentoring? You can probably hire one for the same price as a junior dev if you hire remotely even in the US.
A lot of big tech companies are being very opportunistic and reducing hiring/laying off under the guise of A.I., but really it's the weak economy.
Or H1B / outsourcing replacement. There are data points showing tech companies hiring thousands of foreign workers while laying off domestic employees. It should factor into these analyses of displaced junior developers.
The usual outsourcing capitals of the world also report considerable job losses.
Exactly this. 2023Q1 was when the interest rate hike from the previous year really kicked in with full force. It was the first hiring market I ever saw in well over a decade where the employers were firmly in the driver's seat even for seniors.
I can imagine that there were a decent number of execs who tried ChatGPT, made some outlandish predictions, and based some hiring decisions on those predictions, though.
This paper looks kind of trashy: clickbaity, and confusing correlation with causation.
An alternate explanation is that, even if AI does not have any chance of entirely replacing an employee, not even junior hires, in the hands of competent seniors AI does substantially improve their productivity and eliminates whole classes of tasks that were traditionally where juniors cut their teeth.
So companies reduce junior hiring because their work is relatively less valuable, and they can meet their goals by shuffling existing resources. When they can't do that, they go for seniors since the immediate bang for the buck is higher (ofc, while depleting the global pipeline that actually produces seniors in the long run, in a typical tragedy of the commons).
to me it is just market pressure to exploit the high stress on laborers atm. the level of uncertainty today is only a problem if you don't have a ton of existing capital, which everyone in charge does. so they are insulated and can treat people poorly without repercussions. in a market that prefers short term profits, they will then do this in order to make more money right now.
companies must do this, 'cause if they don't then their competition will (i.e. the pressure)
of course, we can collectively decide to equally value labor and profit, as a symbiotic relationship that incentivizes long term prosperity. but where's the memes in that
One issue we're running into at my job: we're struggling to find entry-level candidates who aren't lying about what they know by using an LLM.
For the tech side, we've reduced behavioral questions and created an interview that allows people to use cursor, LLMs, etc. in the interview - that way, it's impossible to cheat.
We have folks build a feature on a fake code base. Unfortunately, more junior folks now seem to struggle a lot more with this problem.
We came up with some simple coding exercises (about 20 minutes total to implement, max) and asked candidates to submit their responses when applying. Turns out one of the questions regularly causes hallucinated APIs in LLM responses, so we've been able to weed out a large percentage of cheaters who didn't even bother to test the code before submitting.
The other part is that you can absolutely tell during a live interview when someone is using an LLM to answer.
This is the big question. It could be any combination of the following and it likely depends on the company/position too:
- Generative AI is genuinely capable enough to replace entry level workers at a lower cost.
- The hype around Generative AI is convincing people who make hiring decisions that it is capable enough to replace entry level workers at a lower cost.
- The hype around Generative AI is being used as an excuse to not hire during an economic downturn.
Could even still be other things too.
This was close to my first thought as well. I don't think we're far enough along the LLM adoption curve to actually know how it will affect the business case and thus employment long term. In the last couple of years of LLM/AI honeymoon, the changes to accommodate the technology may obscure direct and second order effects.
I've yet to see any company show an actual, factual revenue increase from employing AI. So I doubt they have created the internal analysis structures to assess what exactly they're using AI for.
The question of reasoning doesn't really matter once you show it's happening.
When you constrict the market like they have done, you naturally get distortions, and the adversarial nature of the market fails to perform economic calculation potentially leading to grave consequences. Even at this point, whipsaws would be quite destructive. I know people who have abandoned their careers due to lack of job availability for the foreseeable future. They were searching for years with no recovery.
When you destroy a pipeline of career development for short-term profit which is possible because of the decoupled nature of money-printing/credit facility, decisions made are psychologically sticky. When there is no economic benefit for competency, the smart people leave for a market where this is possible.
The smart people I know right now are quietly preparing for socio-economic collapse as a result of runaway money-printing. When you have a runaway dangerous machine, you can either step back and let it destroy itself (isolating yourself), or you can accelerate the breakdown of its dependent cycles. Many choose the former, since the latter carries existential risk for no real benefit in the short term, but the latter would result in the fewest casualties.
70% of the economy covers the white-collar market, which will be gone soon, not because the jobs can be replaced by AI, but because the business leaders in consolidated industry all decide to replace workers becoming the instrument of their own demise through deflationary economics.
How does one concretely prepare for socio-economic collapse, beyond taking a long position in canned goods?
Is it happening?
I am confused about how to feel about the data the paper is based on. If you look at the paper, the data description is:
"Our primary data source is a detailed LinkedIn-based resume dataset provided by Revelio Labs ...
We complement the worker resume data with Revelio’s database of job postings, which tracks recruitment activity by the firms since 2021 ...
"The final sample consists of 284,974 U.S. firms that were successfully matched to both employee position data and job postings and that were actively hiring between January 2021 and March 2025. For these firms, we observe 156,765,776 positions dating back to 2015 and 245,838,118 job postings since 2021, of which 198,773,384 successfully matched with their raw text description."
They identified 245 million job postings from 2021 forward in the United States? I mean, the U.S. population is like 222 million for the 18-65 age group (based on Wikipedia, 64.9% of 342 million total population).
And they find a very small percentage of firms using generative AI:
"Our approach allows us to capture firms that have actively begun integrating generative AI into their operations. By this measure, 10,599 firms, about 3.7 percent of our sample, adopted generative AI during the study period."
Maybe I am wildly underestimating just how much LinkedIn is used worldwide for recruiting? As a tech person, I'm also very used to seeing the same job listing re-listed by what seems to be a large number of low-effort "recruiting" firms on LinkedIn.
I think for trying to figure out how generative AI is affecting entry-level jobs, I would have been much more interested in some case studies. Something like find three to five companies (larger than startups? 100+ employees? 500+?) that have decided to hire fewer entry-level employees by adding generative AI into their work as a matter of policy. Then maybe circle back from the case studies to this larger LinkedIn dataset and tie the case study information into the LinkedIn data somehow.
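The scale objection above can be checked quickly. A minimal sketch using only the figures quoted from the paper and the commenter's Wikipedia-based working-age estimate (the 64.9% share is that estimate, not an official statistic):

```python
# Sanity check on the paper's reported scale, using only figures quoted above.
us_population = 342_000_000          # assumed total US population
working_age_share = 0.649            # commenter's Wikipedia-based 18-65 share
working_age = us_population * working_age_share  # ~222M

job_postings_2021_2025 = 245_838_118  # postings reported by the paper
sample_firms = 284_974                # firms in the paper's final sample

print(f"working-age population: {working_age / 1e6:.0f}M")
print(f"postings per working-age person: {job_postings_2021_2025 / working_age:.2f}")
print(f"avg postings per firm over ~4 years: {job_postings_2021_2025 / sample_firms:.0f}")
```

That works out to roughly one posting per working-age American over about four years, and on the order of 860 postings per sampled firm, which is only plausible if re-listed and duplicate postings (as mentioned above) are each counted separately.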
> For these firms, we observe 156,765,776 positions dating back to 2015 and 245,838,118 job postings since 2021, of which 198,773,384 successfully matched with their raw text description."
I'm obviously misreading this somehow. How do you have 156m positions dating back to 2015, but far more than that number in a smaller timeframe?
I think it is just poorly worded. From another point in the paper:
"Our analysis draws on a new dataset that combines LinkedIn resume and job-posting data from Revelio Labs. The dataset covers nearly 285,000 U.S. firms, more than 150 million employment spells from roughly 62 million unique workers between 2015 and 2025, and over 245 million job postings."
I guess we can read that as saying the authors identified 62 million workers who held 150 million positions over the 2015-2025 time window.
I'm still deeply skeptical about the underlying data. The 62 million represents a huge percentage of employed people in the U.S. in any of the years 2015-2025. This source shows 148 million/yr to 164 million/yr employed over that timeframe:
On the other hand, I also saw estimates saying LinkedIn has approximately 30% of the U.S. workforce with a profile on the platform. Which is wild to me.
Ah okay, positions (actual jobs) vs job postings. Makes sense. Thanks!
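The skepticism about the 62 million figure can be quantified against the employment range cited above:

```python
# Is 62M unique workers plausible as a LinkedIn-derived sample?
# The 148-164M/yr employment range is the one cited in this thread.
unique_workers = 62_000_000
employed_low, employed_high = 148_000_000, 164_000_000

coverage_low = unique_workers / employed_high   # vs the high-employment years
coverage_high = unique_workers / employed_low   # vs the low-employment years
print(f"coverage of employed population: {coverage_low:.0%}-{coverage_high:.0%}")
```

That is a 38-42% slice of any single year's employed population, above the ~30% LinkedIn-profile estimate mentioned above, though the gap narrows once you account for the ten-year window accumulating workers who were not all employed simultaneously.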
You're misreading, 156m positions, and 245m job postings. A single position can have multiple job postings created for it.
Interesting. However just because this is true right now doesn't mean it will be true going forward. Unique to the current moment is that there are simultaneously (1) high interest rates and a challenging economy (2) a narrative that AI adoption should enable cutting junior roles. This could lead to companies that would anyway be doing layoffs choosing to lay off or not hire juniors, and replace with AI adoption.
To really test the implied theory that using AI enables cutting junior hiring, we need to see it in a better economy, in otherwise growing companies, or with some kind of control (though not sure how this would really be possible).
>Unique to the current moment is that there are simultaneously (1) high interest rates and a challenging economy (2) a narrative that AI adoption should enable cutting junior roles.
I'm not disputing your point, but I'm curious: given that the main headline measures we tend to see about the US economy right now involve the labour market, how do you establish the counterfactual?
When there are downturns in tech, companies squeeze out junior people. This happens often. After 2008 a whole cohort of top talent from software engineering schools were lucky to get lower paying QA jobs and only a few landed software development positions. There were chief economists for banks writing about the underemployed generation (generalized to all white collar) and how they can’t get started or have the same opportunities.
I think we might be seeing this now but headlines get more clicks with AI taking our jobs.
Interest rates are not high. Higher than the post-2008 era, yes. Not historically high, though.
Isn't this just economic uncertainty, the aftermath of the hiring spree following COVID, and macroeconomic conditions (interest rates, etc.)?
It still feels too early to predict the outcome, winds may change again.
Generative AI may automate some entry-level tasks, but young professionals are not just “replaceable labor.” They bring growth potential, adaptation, and social learning. Without frameworks to manage AI’s role, we risk undermining the very training grounds that prepare the next generation of experts.
That's a lot to invest in someone at a large comparative loss, in a world where employees don't last more than a couple years before job hopping.
I agree that high turnover is a real constraint. That’s why the answer isn’t “10 years of apprenticeship” but designing scaffolds that combine learning with contribution in a shorter timeframe. Things like short rotations, micro-credentials, or mentorship stipends let juniors add value while they’re still on the job. Even if they leave after a few years, the investment isn’t wasted — both sides still capture meaningful returns.
How do they bring more “growth potential” than a mid-level developer with 3-5 years of experience? The average tenure of a developer is 2-3 years. I expect that to increase slightly going forward as the job market continues to suck. But why would I care about the growth of the company when my promotion criteria are based on delivering quarterly or yearly goals? Those goals can much more easily be met by paying slightly more for a mid-level developer who doesn’t do negative work, both directly and by taking time away from the existing team.
You’re absolutely right that mid-level hires buy immediate productivity. But “growth potential” isn’t just romanticism — it’s an investable trajectory. With the right project design, feedback loops, and domain exposure, juniors can grow into “multipliers” — people who combine technical skills with adaptability or domain expertise. That’s a kind of return you rarely get from simply adding another mid-level hire. In practice, resilient organizations balance both: mid-levels for immediate throughput, and juniors for long-term strength.
Of course I meant 3-5 years of experience not 35 years of experience. :) I just edited it.
You’re not “investing” in anyone if their tenure is going to be 2-3 years, with the first year spent doing negative work.
And why should juniors stay? Because of salary compression and inversion: HR determines raises, but the free market determines comp for new employees, so it makes sense for them to jump ship to make more money. I’ve seen this at every company I’ve worked for, from startups to mid-size companies to boring old enterprise companies to BigTech.
Even managers can’t fight for employees to get raises at market rates, but they can get an open req to pay market rates when that employee leaves.
And who is incentivized to care about “the organization” when line-level managers and even directors are incentivized to care about the next quarter to the next year?
I hear you — salary compression and inversion, along with short tenure, are very real structural problems. It’s understandable that managers and even directors end up focused only on the next quarter.
My broader point is that when these short-term incentives dominate, organizations (and societies) lose the capacity to build for the long term. That’s exactly why governance frameworks matter: they help create safeguards against purely short-term dynamics — whether in HR policy or in AI policy.
We had some marketing folks give us a company-wide demo of ChatGPT and some other gen-AI tools, showing us how quickly they can now make stylish and sophisticated pitch decks and marketing materials.
And the entire time I'm watching this I'm just thinking that they don't realize that they are only demonstrating the tools that are going to replace their own jobs. Kinda sad, really. Demand for soft skills and creatives is going to continue to decline.
Dev jobs too.
LLMs are good at creating single use documents, like a pitch deck used for one prospective customer (never to be used again). But for long lived documents, on which future work builds atop, the bar is higher and the value of LLMs is more grey.
And then the customers use gen AI to summarize the same pitch decks/marketing materials so they don't have to look at them. Let's cut out the middle man and just send the prompt instead.
Developers always automate themselves out of a job
Those marketing people may also be engaging in small talk about sports teams on the phone with customers, doing sushi and karaoke with customers, procuring coke and hookers for customers... LLMs can't yet do that. Unless the customer is also an LLM of course.
I have the opposite expectation actually
In the late 90s you were considered a prodigy if you understood how to use a search engine. I had so many opportunities simply because I could find and retain information.
So LLMs have solved this. Knowing a framework or being able to create apps is not a marketable skill any longer. What are we supposed to do now?
It’s the soft skills that matter now. Being well liked has always been more important in a job than being the best at it. We all know that engineer who knows they are hot shit but everyone avoids because they are insufferable.
Those marketing people don’t need to spend a week on their deck any longer. They can work the customer relationship now.
Knowing how to iterate with an LLM to give the customer exactly what they need is the valuable skill now.
but i guess to me the question is: if you're management, do you expect your workers to do more/work faster (like a TAS in a way)? or do you expect to replace your workers entirely?
I personally think we're still a ways from the latter...
Experienced that too. They're basically forced to give those demos
A new Harvard study (62M workers, 285k firms) shows firms adopting generative AI cut junior hiring sharply while continuing to grow senior roles — eroding the bottom rungs of career ladders and reshaping how careers start.
What was the incentive for companies to train juniors into seniors in the past, post job-hopping era? Curious to know if that incentive has warped in the past two decades or so as someone who's starting their career now.
Same as always?
Cheap labor. It doesn't take that much to train someone to be somewhat useful, in many cases. The main educators are universities and trade schools, not companies.
And if they want more loyalty they can always provide more incentives for juniors to stay longer.
At least in my bubble it's astonishing how it's almost never worth it to stay at a company. You'd likely get overlooked for promotions and salary rises are almost insultingly low.
There is no incentive now because the social contract is broken and there is too much mobility. Best you can do is find a supportive boss / a company that provides training opportunities.
Companies are organisms
How does that answer the question?
In 10 years, where do the senior devs come from? Real question. Seems like with fewer entry-level jobs now, in 10 years there won't be seniors to hire.
Even if we grant this is going to be a problem, it makes no sense for any individual company to do anything about it. Why take on the cost of training a junior when they can bail in a few years? This is especially true if you're not a big tech company, which puts you at risk of having your junior-turned-senior employees poached by big tech.
Give your juniors reasons to stay at your company? It's not hard if the company cares at all.
>It's not hard if the company cares at all.
It's pretty hard for a non-big tech company to pay big tech level salaries.
And most of my friends and colleagues would take a full-remote role that pays half of what big tech, 5 days in office, pays. Add in an extra week of PTO and you have a great pitch to devs.
I'll believe it when I see it reflected in applicant resumes. (east coast tech firm)
You can incentivize people to stay with things other than salary. Salary plays a part, of course, but there are a lot of other aspects that make staying at a job worthwhile.
In 10 years, the management (or "leadership" if you like the taste of boot) responsible for doing the cutting will have moved on to something else, with no consequences for them.
Junior devs eventually will have been brought up with agentic coding, etc. Hopefully whatever the "new way" becomes is how they'll be taught.
Currently, part of the problem is the taboo around using AI coding in undergrad CS programs. And I don't know the answer. But someone will find the right way to teach new/better ways of working with and without generative AI. It may just become second nature to everyone.
While agentic coding can make you productive, it won't teach you to deeply understand the source code, algorithms, or APIs produced by AI. If you can't thoroughly audit any source code created by an AI agent, then you are definitely not a senior developer.
This is just not true. I have witnessed people who would have been called dabblers or tinkerers just a few years ago become actual developers by using Cursor. They ask a few key questions about engineering best practices when they get stuck and really internalize them. They read the code they are producing and ask the assistant questions about their codebase. They are theorycrafting using AI, then implementing and testing.
I have witnessed this with my own eyes, and as AI has gotten better they have also been getting more knowledgeable. They read the chains of thought and study the outputs. They have become real developers with working programs on their GitHub. AI is a tool that teaches you as it is used, if you put in the effort. I understand many folks are 'vibe coding' and not learning a single thing, and I don't know if that's the majority or the minority, but the assertion that all people learn nothing from use of these tools is false.
You're talking about people who put in a significant non-trivial effort to thoroughly understand the code produced by the AI. For them, AI was just one path to becoming proficient developers. They would have gotten there even before the AI boom. I was not talking about such highly-motivated people.
Yeah and when you account for the amount of investment that has gone into the current generation of LLMs... it makes zero financial sense.
Some people dont want to hear that, but...
who are these mythical jr devs and where do i find them
i just want devs who actually read my pr comments instead of feeding them straight into an llm and resubmitting the pr
As an occasional uni TA, I'm leaning toward banning LLMs for easy coursework while allowing them on more difficult & open-ended work.
Pretty sure it's a self-destructive move for a CS or software engineering student to pass foundational courses like discrete math, intro to programming, and algorithms & data structures using an LLM. You can't learn how to write if all you do is read. An LLM will 1-shot the homework, and the student just passively reads the code.
On more difficult and open coursework, LLMs seem to work pretty well at assisting students. For example, in the OS course I teach, I usually give students a semester-long project on writing from scratch an x86 32-bit kernel with simple preemptive multitasking. An LLM definitely makes difficult things much more approachable; students can ask it "dumb basic questions" (what is a pointer? an interrupt? a page fault?) without fear of judgement.
But due to the novelty & open-ended nature of the requirement ("toy" file system, no DMA, etc), playing a slot machine on LLM just won't cut it. Students need to actually understand what they're trying to achieve, and at that point they can just write the code themselves.
Exactly. When I was in school, even in the '80s, it was common to have to hand-write a program during an exam. You had to know stuff in the old days.
I hand-wrote code in the late 00s. Java, assembly, C. The graders gave us some grace since we couldn't test, but you were expected to be pretty accurate. Hell, one quiz was just 20 identical pages on which we iterated through the Tomasulo algorithm.
Agentic coding is like leading and instructing a team of a bunch of very dumb but also very smart junior devs. They can follow instructions to the T and have great memory but lack common sense and have no experience. The more experienced and skilled their leadership, the better chance of getting a good result from them, which I don’t think is a good job (yet?) for an entry level human SWE.
My guess is that at some point the code itself, in a language humans easily comprehend, will become superfluous as we delegate more and more of the logic to AI development. Perhaps in the near future AIs will be writing things at a much lower level by default, and the entire act of programming as we know it goes away.
Kind of like that meme where two AIs talking to each other spontaneously develop their own encoding for communication. The human trappings become extra baggage.
This is the same reason they force you to do the math by hand in undergrad and implement functions that are already in the standard libraries of most languages. Because you don't know anything yet, and you need to learn why the more automated stuff works the way it does.
> Junior devs eventually will have been brought up with agentic coding
But if they're not hired...?
Kids are always the best with technology. The generation in high school right now will be god tier at getting results from LLMs.
Not even a little bit. Where I work we regularly churn through kids just out of college and most of them don't have Clue One how to operate anything on their computer.
Yeah, growing up in the 80s or 90s might have had you uniquely well-positioned to be "good with computers", because "the computer that has games and the internet" was (in some sense) the same as "the computer that adults are supposed to use for work".
That's not true anymore in the smart phone / tablet era.
5-10 years ago my wife had a gig working with college kids and back then they were already unable to forward e-mails and didn't really understand the concept of "files" on a computer. They just sent screenshots and sometimes just lost (like, almost literally) some document they had been working on because they couldn't figure out how to open it back up. I can't imagine it has improved.
Very senior teachers and educators have been noting this observation for a decade-plus now. Computer literacy has tanked.
Might have been the case before. But these days, kids are brought up on locked-down content-focused machines (e.g. ipads). They struggle with anything harder than restarting an app.
When my little cousin was three and already knew how to use the phone by himself people were claiming he was gonna be a tech wizard and everybody was talking about digital natives. But when he got to high school he didn't know how to turn a computer on. How useful is it to be god tier at getting results from LLMs, if you have zero clue if the result you got is any good?
How are high school kids that barely know the basics of the topic going to get “god tier” results from LLMs?
I can see how that would seem logical, but imo the issue with it not following the normal trend is the inconsistent and inaccurate results of LLMs.
I think a large chunk of people are naively assuming exponential growth and no longer needing senior devs in 10 years.
There will still be plenty of seniors, they will be in the 40s and 50s. The problem you speak of is much further out, once those seniors retire.
I suppose the idea is that those junior developers who weren't hired will spend 10 years doing intensive, unpaid self-study so that they can knock on the door as experienced seniors by that time.
Are you serious? How on earth are these people going to eat or pay rent for 10 years? Besides, most companies would laugh you out the door if you applied for a senior role without any experience working in the role.
> Are you serious?
No, I was being sarcastic.
Laugh all you want, for some people it’s more real than you think.
I'm not laughing at all. I'm definitely not making fun of those who may be affected by this. My sarcasm was directed at people or companies planning to implement such ideas.
> In 10 years where do the senior dev's come from?
From company interns. Internships won't go away, there will just be fewer of them. For example, some companies will turn down interns because they do not have the time to train them due to project load.
With AI, now employed developers can be picky on whether or not to take on interns.
From the bread line, after they've all been displaced by AI, if you happen to need one for God knows what reason. That's what the CEOs are hoping, anyway.
Seniors who are 30 years old will still be around, and by then they will make more money.
In 10 years a lot of the senior developers might be cut as well due to strongly improved AI.
They will be promoted, but they won't have the requisite experience. We'll have people in the highest positions with the usual titles, but they will be severely underqualified. Enshittification will ensue.
"Seniority-bias" is the wrong word. It's "skill-bias", or better yet, "tacit-knowledge bias".
The reason why old hands (who have been in the game long enough to be promoted to something approximating "Senior") are so deadly with AI is because they know all the traps/pitfalls to watch out for. Sure, the AIs may become good enough in 5-10 years to have learned all these pitfalls, but make hay while the sun is shining!
"Senior EMs" who are pure HR managers/non-coders will be priced out of their jobs in an AI agentic world, as (surprise!) you can't report AI agents to HR. Juniors who can use AI effectively will thrive and climb the ranks rapidly.
The pessimistic reading is well-represented, so here's another: AI changes the definition of "entry-level", but it doesn't eliminate the class of professional labor that experienced workers would rather not do.
Until AI can do literally everything we can, that class of work will continue to exist, and it'll continue to be handed to the least experienced workers as a way for them to learn, get oriented, and earn access to more interesting problems and/or higher pay while experienced folks rest on their laurels or push the state of the art.
In the future, there will be two kinds of companies:
1. Companies that stop hiring juniors and bet on AI to fill the gap.
2. Companies that keep hiring and training junior developers.
Which type of company will be more innovative, productive, and successful in the long run? The ones in group 2 that also aggressively prune expensive, complacent seniors might even win in the short run!
Young people are cheap and they love AI!
How much cheaper are younger developers overall? If you look at the delta between junior and mid-level developers in most industries (i.e., enterprise dev, not BigTech or adjacent), it's really not that much.
It's a down market and you might be better able to retain them if the market turns up again (or maybe not, I'm old and mostly just want a remote gig where I have time to parent my child).
In an up market it’s going to be worse. HR is still going to hold the line on raises and not keep up with the market.
I'm old too - 51. But I consistently tell young people to chase money. Target BigTech or adjacent companies, "grind LeetCode", avoid startups and Monopoly-money "equity", etc.
One person I mentored as an intern in 2021, and again when they came back the next year, is now 25 years old and making $220K (as a Solution Architect, not a developer), and I couldn't be happier for them. They make the same as I make now. But I've already raised two (step)kids, bought and sold the big house in the burbs, etc., and love working remotely.
I told them to do whatever it takes to be seen, promote themselves, build a network internally and with clients and make all the money they can.
[dead]
I don't understand how you expect innovation and productivity in a world with a rapidly increasing population. We need fewer and fewer people while producing more and more people. Where am I wrong?
Recently there was a splash in the news about the US death rate projected to exceed the birth rate by 2031.
Many of the largest countries are experiencing similar declines, with fewer and fewer countries maintaining large birth rates.
I don't see developed countries as having a "rapidly increasing population" problem.
Practically the whole planet is experiencing population decline now. The poster you're replying to is basing his argument on an obsolete worldview.
This isn't accurate. Birth rates are declining but population is still increasing in most countries.
Birth rates are of course the only thing that matters. If your car runs out of gas, it will still roll for a short stretch, but that's irrelevant.
I don't know what you're saying.
>Practically the whole planet is experiencing population decline now.
That's objectively false.
Yeah, it felt like we went from 7B to 8B really quickly, but I guess it's over now.
> in a world with rapidly increasing population
That world was 30 years ago. In 2025 world average total fertility rate is 2.2, which is a shade above replacement rate (2.1). And 2.2 is a 10% drop since 2017 alone (when it was 2.46).
Because life expectancy is higher, the population will continue to increase. But not "rapidly".
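A quick back-of-the-envelope check of the figures quoted above (TFR of 2.46 in 2017, 2.2 in 2025, replacement rate 2.1):

```python
# Sanity-check the quoted fertility-rate figures.
tfr_2017, tfr_2025, replacement = 2.46, 2.2, 2.1

drop = (tfr_2017 - tfr_2025) / tfr_2017
print(f"Drop since 2017: {drop:.1%}")                  # ~10.6%, so "a 10% drop" holds
print(f"Above replacement by: {tfr_2025 - replacement:.2f}")  # 0.10, i.e. a shade above
```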
According to some cutting-edge research[1], we are already below the replacement rate worldwide.
[1]: https://www.youtube.com/watch?v=f7_e_A_vFnk
Depends what kind of people you need. For innovation you generally need exceptional people, and the probability of finding such a person is proportional to the total number of people available.
Kinda feels like a lot of companies think they can be option 1 because someone else will be option 2, and then they'll hire the young people away after they become experienced.
...I was afraid someone would point out this potentially slimy scenario.
Any job that doesn't creatively generate revenue will be systematized and automated as soon as possible. AI agents are just an acceleration factor for this
Source?
The tragedy of the commons: companies acting in their self-interest at the expense of the industry by drying up the workforce pipeline. The next generation will pay, like when America stopped producing hardware.
This is cause for government intervention.
It should be, but the government loves AI - incidentally it’s also the one thing currently propping up the economy.
It has been like this for decades by now, and your precious government loves this because they also consist of old people who hate the younger generations. Any and every time the government intervenes it is to stomp down on the youth and nothing else.
I heard an argument that the valuation of AI/AI-adjacent firms only makes sense if 1) companies grow their aggregate top-line revenues by multiple trillions of dollars, or 2) the valuation amounts to the potential savings from gutting knowledge-based employment across an economy.
Of course in the long run a chronically underemployed economy will have little demand for products and services, but that is beyond the scope of companies who, in general, are focused on winning short term and zero-sum market capture. However I believe that while a billion dollar valuation is a market and strategy problem, a trillion dollar valuation is a political problem - and I would hope that a mandate of broad gainful employment translates to political action - although this remains to be seen.
I wonder how these findings would hold up if the analysis could control for the widespread corporate strategy of replacing domestic junior roles with dedicated offshore teams?
It's really hard to adjust for economic factors here. I am in agreement with the skepticism in this thread, as job numbers got revised downward heavily in both 2024 and 2025 (to negative in some months), indicating a poor economic situation.
The new career path for devs has them start with prompt engineering. I can't wait for the chickens to come home to roost.
What happens when companies refuse to hire, even though there is an obvious need? It has to lead to reduced growth. If the majority of companies do this, I would think it would lead to a severe deflationary cycle.
Won't the lunch of the companies that refuse to hire/grow be eaten by other companies that don't?
They burn out employees until a failure mode is reached.
If a few companies do this it is probably fine, but more interesting if most companies do this. It seems that it would be akin to a self inflicted depression with severe deflation.
From your lips to Blackrock's ears.
I'm sorry, I don't understand.
This makes me think the conversation around AI and jobs is too focused on total employment, when the real story is how it shifts opportunities within companies. If juniors are getting fewer chances to enter, that could create long-term bottlenecks for talent growth.
If entry-level roles are shrinking, how should companies rethink talent development? Without the traditional “bottom rungs,” how do we grow future seniors if fewer juniors ever get the chance to start?
When were companies ever thinking about talent development, especially for SWE? We had some loose "mentorship" roles but IME most folks are left to their own devices or just learn by bandwagoning things from reddit.
I think open source contributions/projects will still be a way to gain verifiable experience.
Other than that, I guess developing software in some capacity while doing a non-strictly-software job - say, in accounting, marketing, healthcare, etc. This might not be a relevant number of people if 'vibe coding' takes hold and the fundamentals are not learned/are ignored by these accountants, marketers, healthcare workers, etc.
If that is the case, we'd have a lot of 'informed beginners' with 10+ years of experience tangentially related to software.
Edit: As a result of the above, we might see an un-ironic return to the 'learn to code' mantra in the following years. Perhaps now qualified 'learn to -actually- code'? I'd wager a dollar on that discourse popping up in ~5 years time if the trend of not hiring junior devs continues.
I'm looking forward to the weird inflective trend of "organic" programs, "humane" dev treatment, and software development taking a long time being seen as a mark of quality rather than stagnation or worry. :)
I'm half-joking, but I wouldn't be surprised to see all sorts of counterpoint marketing come into play. Maybe throw in a weird traditional bent to it?
> (Pretentious, douche company): Free-range programming, the way programming was meant to be done; with the human touch!
All-in-all, I already feel severely grossed out any time a business I interact with introduces any kind of LLM chatbot shtick and I have to move away from their services; I could genuinely see people deriving a greater disdain for the fad than there already is.
The plan seems to be to hope that AI will be able to replace the senior ICs in the near future. They're certainly gutting the ranks of management today in a way that presupposes there will be far fewer ICs of all levels to manage soon.
That's much longer than a quarterly earnings report away, which makes it "somebody else's problem" for the executives pushing these policies. There's no reason to expect these people to have a long-term strategy in mind as long as their short-term strategy gives them a golden parachute.
Job market is not obligated to ensure sustained supply of talented individuals, so, I think, short term they'd just keep chasing unicorns. Long term, governments will be forced to financially incentivize young hires. Or something terrible happens and everything rolls back to 1945.
presumably this also means the relative value of seniors is now increasing, as the pipeline to replace them is smaller.
its like how the generic "we take anyone" online security degree has poisoned that market -- nothing but hordes of entry-level goobers, but no real heavy hitters on the mid-to-high end. put another way, the market is tight but there are still reasonable options for seniors.
then again we live under capitalism
Agree, increased value and demand for seniors. But how will the market solve the generation of new seniors if juniors are getting fewer opportunities?
Take the software development sector as example: if we replace junior devs by AI coding agents and put senior devs to review the agent's work, how will we produce more seniors (with wide experience in the sector) if the juniors are not coding anymore?
Who cares? This is a once in a lifetime opportunity to finally gatekeep software engineering the way lawyers and finance professionals do with their fields! Enjoy the windfall in 5 years!
That’s a problem years away, so… don’t think about it?
The economic turmoil in the US is hollowing out entry-level jobs; AI is just the cover companies are using. The constant tariff changes mean that companies have to be very pessimistic in their long-term planning, as any assumptions they make can be turned on their head with no notice.
Entry-level jobs are also hard to get when there is a hiring slowdown; and in the US there is a hiring slowdown.
Is this a case of "correlation does not imply causation?"
> the largest effects in wholesale and retail trade
Hard for me to believe that AI in its current state is hollowing out junior shop assistant and salesperson roles. Either those jobs were already vulnerable to "dumb" touchscreen kiosks or they require emotional intelligence and embodied presence that LLMs lack.
Salary-wise, we are in a race to the bottom. Massive outsourcing of jobs to India and similar places, where wages are much lower than those of comparable US employees, coupled with AI replacing entry-level roles.
On a longer time scale you have humanoid robots potentially coming as well. Self driving cars and trucks are going to torpedo trucking and low pay positions such as Uber // taxi driver. The wealth is getting centralized amongst a couple dozen high tech companies.
All in all I am pretty negative on the prospects for young people as they enter the workforce and additionally for older tech people as they are currently operating in an environment where losing your job very possibly means the end of the road.
What if new entry-level jobs are just more about using AI?
Would similar research in other countries also return similar results?
The US is going through a lot of upheaval, which whether you think is positive or negative, is unique, and a confounding factor for any such research.
These studies would be more meaningful if this was a time of economic boom.
Entry-level jobs get "hollowed out" in a stagnant economy regardless of "AI".
AI = not hiring because there's no new work, but spinning it as "AI". Markets are hungry for any utterance of the word AI from the CEO.
so ridiculous. but we've collectively decided to ignore the BS as long as we can scam each other and pray we're not the last ones holding the bag.
Pretty much lol.
You have to somehow have the discipline to avoid getting caught up in the noise until the hype starts to fade away.
soon there will be entire new industries of "Bullshit Jobs" to keep the masses occupied
they'd rather pay people to sit in a room pressing a button every hour than have them loitering around on UBI
either that or in the pod
Would the entire tracking, advertisement and general enshittification industry already count as bullshit jobs?
[dead]
Another way to look at it is that legacy jobs have no future therefore there is no point bringing in the next generation into a dying system.
Another way to look at it is that hiring is fine, and that the vain entitled generation we all suspected was going to emerge feels that a job should absolutely be available to them, and immediately.
Another way to look at it is that journalism has been dead for quite a while, and writing about the same fear-based topics like “omg hiring apocalypse” is what makes these people predictable money (along with other topics).
Another way to look at it is that we raised a generation of narcissistic parents and children that have been going “omg grades”, “omg good college”, “omg internship”, “omg job” for so long that these lamentations feel normalized. A healthy dose of stfu was never given to them. Neurotic motherfuckers.
The second point has some truth to it from what I've been hearing from talking to senior folks, taking on new grads within the finance sector.
Today they're admitting AI is hollowing out entry-level jobs. The reality is that it can and will replace mid-level and eventually even quite senior jobs.
Why?
It's already doing a lot of the load-bearing work in those mid-level roles too now, it's just a bit awkward for management to admit it. One common current mode of work is people using AI to accomplish their work tasks very quickly, and then loafing a bit more with the extra time. So leaders refrain from hiring, pocket the savings, and keep a tight lid on compensation for those who remain.
At some point they'll probably try to squeeze the workforce for some additional productivity, and cut those who don't deliver it. Note that the "ease" of using AI for work tasks will be a rationale for why additional compensation is not warranted for those who remain.
Lol this is so far from reality. Where have you actually seen this fella?
I'm tired of reading all these claims with no primary evidence to support them.