Rosa Del Mar

Daily Brief

Issue 40 2026-02-09

U.S.-China AI Competition and Diffusion

10 min read
General
Sources: 1 • Confidence: Medium • Updated: 2026-02-09 16:44

Key takeaways

  • There is vigorous debate about whether the U.S. and China are in a new Cold War, with deep trade and supply-chain interdependence as a key complication.
  • State-level AI regulation is a major risk vector, with roughly 1,200 bills being tracked across all 50 states, creating fragmentation pressure.
  • Public stated opinions about AI are more negative than revealed preferences, with many people reporting panic while continuing to use AI products.
  • The outcome of open-source versus closed-source AI is unresolved, and both may coexist at scale.
  • Current shortages in AI infrastructure inputs (e.g., GPUs, data center capacity) are likely to trigger replication/buildout that reduces per-unit costs over time.

Sections

U.S.-China AI Competition and Diffusion

  • There is vigorous debate about whether the U.S. and China are in a new Cold War, with deep trade and supply-chain interdependence as a key complication.
  • In Washington, bipartisan sentiment over roughly the past decade has shifted toward treating China as a more serious geopolitical foe, including in AI.
  • AI frontier development is framed as primarily a U.S.-versus-China race, with a strategic focus on which country’s AI proliferates globally.
  • DeepSeek’s release surprised Washington observers due to its perceived quality, its ability to run on smaller local hardware, and its open-source release.
  • Since the ‘DeepSeek moment’ less than a year ago, multiple Chinese AI companies have effectively caught up to the frontier, increasing competitive pressure on U.S. players.
  • China is actively competing on AI software with several major model efforts, including DeepSeek, Qwen, and Kimi, plus additional large tech players.

Regulatory Fragmentation And Preemption Tensions

  • State-level AI regulation is a major risk vector, with roughly 1,200 bills being tracked across all 50 states, creating fragmentation pressure.
  • California’s SB 1047 passed both legislative houses but was vetoed by the governor.
  • There is a critique that the proposed federal moratorium on state AI regulation was politically infeasible and overreached in restricting states’ ability to regulate.
  • The EU AI Act is asserted to have significantly hampered AI development in Europe and contributed to major companies not launching leading-edge AI features there.
  • SB 1047 would have imposed downstream legal liability on open-source AI developers for future third-party misuse of their models years after release.
  • A federal legislative attempt to impose a moratorium on state-level AI regulation was pursued but failed when a last-minute deal collapsed.

Adoption And Pricing Shifts

  • Public stated opinions about AI are more negative than revealed preferences, with many people reporting panic while continuing to use AI products.
  • Leading AI companies with compelling products are experiencing an unprecedented revenue takeoff rate driven by real customer demand.
  • AI adoption can scale faster than past general-purpose technologies because the internet and smartphones already provide global distribution.
  • Enterprise AI monetization is trending toward usage-based pricing where customers buy tokens as metered units.
  • Hyperscaler cloud competition is enabling usage-based AI pricing by making advanced model access broadly available with low fixed costs for startups.
  • AI startups are experimenting with value-based pricing, including charging a share of labor replacement value or marginal productivity uplift from human-AI collaboration.
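The three monetization models above can be contrasted with a minimal sketch. All rates, usage figures, and function names below are hypothetical illustrations, not figures from the brief:

```python
# Illustrative comparison of seat-based, token-metered, and value-based
# AI pricing. Every number here is an assumed example.

def seat_based(seats: int, price_per_seat: float) -> float:
    """Classic SaaS: a flat monthly fee per user, independent of usage."""
    return seats * price_per_seat

def token_metered(tokens: int, price_per_million: float) -> float:
    """Usage-based: customers buy tokens as metered units."""
    return tokens / 1_000_000 * price_per_million

def value_based(labor_value_replaced: float, share: float) -> float:
    """Value-based: charge a share of the labor value the AI replaces."""
    return labor_value_replaced * share

# A hypothetical 50-person team, moderate usage, modest automation value.
print(seat_based(50, 30.0))              # 1500.0
print(token_metered(400_000_000, 2.50))  # 1000.0
print(value_based(20_000.0, 0.10))       # 2000.0
```

The point of the contrast: seat-based revenue scales with headcount, token-metered revenue scales with consumption (and passes through compute costs), and value-based revenue scales with measured productivity uplift.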

Model Capability Diffusion And Open Source

  • The outcome of open-source versus closed-source AI is unresolved, and both may coexist at scale.
  • Leading AI application companies are evolving beyond 'GPT wrappers' by orchestrating many models, training domain-specific models, and switching to open-source when cloud token economics are unattractive.
  • Smaller models tend to catch up to frontier-model capabilities within roughly six to twelve months.
  • State-of-the-art open-source models accelerate the spread of AI know-how by making systems easier to study, teach, and replicate.
  • A Chinese open-source model (Kimi) is claimed to replicate GPT-5-level reasoning on benchmarks while being small enough to run locally on one or two MacBooks.
  • The AI industry is expected to segment into a small number of frontier ‘supercomputer’ models and a high-volume cascade of smaller embedded models.

Cost Curve And Compute Supply Dynamics

  • Current shortages in AI infrastructure inputs (e.g., GPUs, data center capacity) are likely to trigger replication/buildout that reduces per-unit costs over time.
  • Hyperscalers are already building their own AI chips, and multiple big tech companies are pursuing internal chip programs.
  • AI chips are expected to become cheap and plentiful within about five years due to competition from NVIDIA rivals, hyperscalers, and Chinese suppliers.
  • AI running on GPUs is partly path-dependent: GPUs were designed for graphics rather than AI-specific computation.
  • Purpose-built AI accelerators designed from scratch could be more economically efficient than full GPUs.
  • The price of AI is expected to fall rapidly, driven by collapsing per-unit input costs and high demand elasticity.
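The cost-curve claim above can be made concrete with Wright's law, under which unit cost falls by a fixed percentage with every doubling of cumulative production. The 20% learning rate below is an assumed illustration, not a forecast from the brief:

```python
import math

def wrights_law(initial_cost: float, cumulative_units: float,
                initial_units: float, learning_rate: float) -> float:
    """Unit cost after cumulative output grows, falling by
    `learning_rate` for every doubling of cumulative production."""
    doublings = math.log2(cumulative_units / initial_units)
    return initial_cost * (1 - learning_rate) ** doublings

# Assuming a 20% learning rate, three doublings of cumulative
# accelerator output cut unit cost to roughly half the starting level.
cost = wrights_law(100.0, 8.0, 1.0, 0.20)
print(round(cost, 2))  # 51.2
```

Under this (assumed) curve, the buildout triggered by today's shortages is itself the mechanism that drives per-unit costs down, which is why high demand elasticity matters: cheaper units induce more usage, more cumulative production, and further cost declines.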

Watchlist

  • State-level AI regulation is a major risk vector, with roughly 1,200 bills being tracked across all 50 states, creating fragmentation pressure.
  • AI-focused chip startups are building new architectures, but outcomes may range from independent success to acquisition by large companies that can scale manufacturing and distribution.
  • China is working hard to catch up on AI chips, and a reported U.S. understanding is that DeepSeek’s next version is being required to train only on Chinese chips to stimulate the domestic ecosystem (with Huawei highlighted).
  • There are active discussions in Washington, D.C. about a next attempt to land a federal approach to AI regulation and federal leadership over a 50-state issue.

Unknowns

  • What are the audited and comparable revenue, retention, and margin metrics behind the claimed “unprecedented revenue takeoff” for leading AI companies?
  • How prevalent and stable are $200–$300/month consumer AI tiers (conversion, churn, and long-run ARPU distribution)?
  • What is the real-world and independently verified performance/cost parity timeline between frontier models and smaller/open models?
  • Is the claim that Chinese models have “effectively caught up to the frontier” since the DeepSeek moment supported by consistent benchmark and deployment evidence?
  • Will chip supply and pricing actually become “cheap and plentiful” within five years, and what portion of AI workloads will move to hyperscaler in-house silicon versus merchant GPUs?

Investor overlay

Read-throughs

  • State-level AI regulation fragmentation can raise compliance and liability overhead, especially for open-source distribution, creating uneven operating conditions across the U.S.
  • Token-metered pricing and high-priced consumer tiers may shift monetization away from seat-based SaaS, changing revenue quality and margins for AI product companies.
  • Current shortages in GPUs and data center capacity may drive replication and new architectures, reducing per-unit compute costs over time and compressing advantage windows for frontier models.

What would confirm

  • Movement toward a federal AI framework that preempts or harmonizes state rules, or evidence of accelerating state-level bill passage creating materially different requirements by state.
  • Audited, comparable revenue, retention, and margin disclosures showing durable uptake of token-metered pricing and measurable adoption of $200–$300/month consumer tiers.
  • Observable easing of GPU and data center constraints alongside lower unit pricing, plus increased deployment of hyperscaler in-house silicon or new accelerators that change workload economics.

What would kill

  • Sustained absence of enforceable state-level AI rules, or successful broad federal preemption that reduces fragmentation and liability uncertainty.
  • Evidence that high-priced consumer tiers have low conversion or high churn, or that token metering fails to support durable margins versus pass-through compute costs.
  • Compute supply remains constrained with no meaningful per-unit cost declines, or merchant GPU dominance persists without credible alternatives, preventing the forecast cost-curve improvement.

Sources