In investment banking, junior analysts populate financial models and review due-diligence documents. Standardized procedures, mostly. But somewhere in the repetition, over months and years, something else accumulates: the instinct for when numbers don't add up, when a deal structure feels wrong, when a client's story has a gap. That instinct develops from having done the standardized work long enough, under supervision, to build a feel for its edges.
This is the pipeline that's thinning.
Burning Glass Institute data shows that between 2018 and 2024, the share of jobs requiring three years of experience or less dropped sharply across AI-exposed fields: software development from 43% to 28%, data analysis from 35% to 22%, consulting from 41% to 26%. Revelio Labs found that U.S. entry-level job postings fell 35% between January 2023 and June 2025. Companies aren't necessarily hiring fewer people. They're skipping new graduates and hiring experienced workers instead.
The labor market implications are real and immediate. But underneath them, something else is happening. The work those junior employees did also served as the mechanism through which organizations learned what they didn't know. Junior workers processed exceptions, made recoverable mistakes, escalated edge cases. Institutions discovered what their formal processes missed through exactly this kind of friction.
Economist Enrique Ide formalized this in a working paper on automation and intergenerational knowledge transfer. His central finding is counterintuitive: improvements in entry-level automation can increase output immediately while reducing long-run growth, even without reducing total employment. When AI handles entry-level tasks, experts operate more independently, which removes the interaction through which tacit knowledge diffuses to the next generation. Novices get reallocated away from the most productive practitioners. Best practices stop spreading.
Beane and Anthony, writing in Organization Science, documented what they call "Inverted Apprenticeship." Senior practitioners use AI and robotic systems to develop their own expertise while becoming less dependent on juniors. Surgical residents sidelined from complex procedures. Junior analysts distanced from senior decision-makers. The technology teaches the expert. The junior watches from further away.
The dynamic persists even in responsible-looking deployments. BCG observes that augmentation diffuses faster than substitution, precisely because humans stay in the loop to manage context and edge cases. Encouraging, on the surface. Over time, though, being kept in the loop only for exceptions gradually strips away the pattern recognition that comes from seeing the whole workflow operate. The person who only ever sees failures loses their sense of what normal looks like.
Michael Polanyi's old observation holds: we can know more than we can tell. You can document a procedure. You cannot document the instinct that tells a practitioner when the procedure doesn't apply. The apprenticeship pipeline emerged as a byproduct of how work was organized. Nobody designed it as a training program. And nobody is building a replacement for it, because most organizations didn't realize the function existed. They saw entry-level labor. They optimized it away. What disappeared was the institutional feedback loop that told them what they were getting wrong.