Macro And Financial Transmission Channels: Labor Displacement And Private Credit
Sources: 1 • Confidence: Medium • Updated: 2026-04-11 19:08
Key takeaways
- James van Geelen argues aggregate labor metrics (e.g., software job postings up YoY) may mask composition shifts toward AI/ML roles and that datasets like JOLTS do not cleanly measure white-collar displacement by occupation mix.
- The “Citrini scenario” Substack post spread widely enough that sell-side researchers and economists reported clients asking about it, and it became a major market talking point.
- James van Geelen argues the AI capability curve has continued accelerating rather than following a sigmoid plateau and leveling off.
- James van Geelen interprets Anthropic’s release of prepackaged AI tool suites as a method to close the user capability gap by providing ready-made workflows that prompt new use cases.
- James van Geelen argues AI monetization is uncertain due to a race with sharp price compression while providers still need sufficient paying customers to earn ROI on heavy compute spend.
Sections
Macro And Financial Transmission Channels: Labor Displacement And Private Credit
- James van Geelen argues aggregate labor metrics (e.g., software job postings up YoY) may mask composition shifts toward AI/ML roles and that datasets like JOLTS do not cleanly measure white-collar displacement by occupation mix.
- James van Geelen argues private credit is less susceptible to bank-run dynamics due to more permanent capital structures, but regulatory changes affecting treatment on life insurer balance sheets are a key incremental risk.
- James van Geelen states many high-value AI use cases are economically framed as substituting for tasks currently performed by paid human labor.
- James van Geelen posits AI disruption could stress private credit via defaults in disrupted industries and among high-FICO white-collar borrowers, and he notes Apollo had already reduced software lending (around early 2025) as software risk emerged.
- James van Geelen expects the transition window for AI-driven labor disruption to be closer to 5–15 years, noting the article relied on an aggressive three-year extrapolation.
- James van Geelen suggests productivity-led disinflation and wealth creation could expand government fiscal capacity to stabilize disruption outcomes, conditional on policymakers preparing a monitoring and response framework.
Narrative Contagion And Reflexivity In Markets
- The “Citrini scenario” Substack post spread widely enough that sell-side researchers and economists reported clients asking about it, and it became a major market talking point.
- The scenario piece was motivated as a narrative to connect year-to-date market moves including bond rallies and selloffs in software, fintech, and private equity.
- Joe Weisenthal and James van Geelen suggest market reactions to viral AI scenarios and rebuttals indicate elevated uncertainty and stress among investors regarding AI impacts.
- James van Geelen cites Paul Krugman drawing an analogy between the “War of the Worlds” panic occurring during the Depression and viral AI fears resonating in a broader climate of anxiety.
Capability And Cost Trajectories As Discontinuity Drivers
- James van Geelen argues the AI capability curve has continued accelerating rather than following a sigmoid plateau and leveling off.
- A rebuttal noted the world is short on GPU/wafer capacity, while van Geelen counters that algorithmic and infrastructure improvements could still expand effective compute and sustain capability gains.
- James van Geelen claims AI agent autonomy on intellectually complex tasks rose from about two minutes to roughly 8–16 hours over about two years.
- James van Geelen argues inference cost per cognitive task fell roughly 10–30x over the past year, making previously uneconomical tasks economical within a few quarters.
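The two quantitative claims above imply growth rates that can be made explicit. A minimal back-of-envelope sketch, assuming exponential growth and taking the midpoint of the quoted 8–16 hour range; all input figures come from the claims above, not from independent data:

```python
import math

# Claimed figures (illustrative only): autonomy rose from ~2 minutes to
# roughly 8-16 hours (midpoint ~12 hours) over ~2 years.
start_minutes = 2.0
end_minutes = 12 * 60.0
months = 24.0

# Under exponential growth, doubling time = elapsed time / number of doublings.
doublings = math.log2(end_minutes / start_minutes)
autonomy_doubling_months = months / doublings

# Cost side: a 10-30x annual decline implies a cost "half-life" of
# 12 * ln(2) / ln(annual_factor) months.
cost_halflife = {f: 12 * math.log(2) / math.log(f) for f in (10, 30)}

print(f"implied autonomy doubling time: {autonomy_doubling_months:.1f} months")
print(f"cost half-life at 10x/yr: {cost_halflife[10]:.1f} months, "
      f"at 30x/yr: {cost_halflife[30]:.1f} months")
```

The implied doubling time of roughly three months, and a cost half-life of two to four months, are what make the "economical within a few quarters" claim arithmetically coherent, conditional on the underlying figures being right.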
Enterprise Adoption: Intensity, Packaging, And Procurement Inertia
- James van Geelen interprets Anthropic’s release of prepackaged AI tool suites as a method to close the user capability gap by providing ready-made workflows that prompt new use cases.
- James van Geelen notes enterprise reaction to agentic AI may lag capability gains because large organizations do not change vendors or systems quickly.
- James van Geelen argues enterprise displacement risk is better modeled by intensity of AI use inside existing platforms than by adoption breadth S-curves.
- James van Geelen says “agentic AI” remained mostly a buzzword during early budget resets and only saw a major perceived capability jump by late November.
Monetization Constraint And Price Compression For Model Providers
- James van Geelen argues AI monetization is uncertain due to a race with sharp price compression while providers still need sufficient paying customers to earn ROI on heavy compute spend.
- James van Geelen argues AI capability improvements are economically constrained by the need for vendors to find paying customers and demonstrate ROI on large upfront compute and training spend.
- James van Geelen claims Minimax is roughly comparable to top models while being about 90% cheaper.
- James van Geelen believes markets are pricing AI-related companies on the assumption that compute capacity will keep expanding to meet demand, though the pace of expansion is uncertain.
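The price-compression constraint in the bullets above reduces to simple arithmetic: if per-task prices fall by some factor while compute spend is fixed, usage volume must grow by at least that factor just to hold revenue flat. A minimal sketch (the function name and break-even framing are illustrative, not from the source):

```python
# Back-of-envelope: with prices falling 10-30x per year (figures from the
# claims above), volume must grow by at least the same multiple before any
# incremental return on new compute spend is possible.
def required_volume_growth(price_compression: float,
                           revenue_growth_target: float = 1.0) -> float:
    """Volume multiple needed when per-task price falls by `price_compression`x
    and revenue is targeted to grow by `revenue_growth_target`x."""
    return price_compression * revenue_growth_target

for compression in (10, 30):
    print(f"{compression}x price compression -> "
          f"{required_volume_growth(compression):.0f}x volume growth "
          "just to hold revenue flat")
```

This is why, on van Geelen's framing, capability gains alone do not resolve the monetization question: demand volume has to outrun price compression for providers to earn a return on upfront compute.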
Watchlist
- James van Geelen argues private credit is less susceptible to bank-run dynamics due to more permanent capital structures, but regulatory changes affecting treatment on life insurer balance sheets are a key incremental risk.
- Joe Weisenthal argues policy response is a major wild card, and the discussion asserts there is little substantive discussion in Washington, D.C. about AI’s real economic impacts despite widespread private-sector concern.
Unknowns
- What specific evaluation or dataset supports the claimed increase in agent autonomy from minutes to 8–16 hours, and does it generalize across task domains and real enterprise environments?
- What evidence supports the magnitude and timing of the claimed 10–30x inference cost decline per cognitive task, and what is the relevant unit of analysis (per token, per task, per successful outcome)?
- To what extent will physical compute constraints (GPU/wafer supply and power) bind progress versus being offset by algorithmic/infrastructure efficiency improvements?
- How fast is “intensity of AI use” actually rising inside major enterprise suites, and how does that translate into measurable labor substitution versus augmentation?
- Are labor market aggregates masking white-collar displacement via occupational composition effects, and if so, in which occupations and income cohorts is it emerging first?