ai-economics
5 items · chronological order
Can AI Replace Humans for Market Research?
$100M Series A announcement dressed up as a trend piece. CVS's "95% accuracy" claim is backtested against known answers; the real test is predicting unknown findings, which nobody has shown. Digital twins for market research are a cost/speed optimization, not a new form of intelligence. The simulation of hard-to-reach populations (chronic disease patients modeled from sparse data) is where overconfidence becomes actively dangerous.
Bits In, Bits Out
Hoel argues writing is the canary domain for AI capability: six years in, LLMs have produced efficiency gains and slop, not a quality revolution. The Amazon book data is compelling (average quality worse, the top 100 unchanged), but the extrapolation from writing to all domains is structurally weak: verifiable domains like code and math behave differently from taste-dependent ones. Best articulation of the "tools, not intelligence" thesis, but it cherry-picks the domain where AI is least able to show measurable gains at the top end.
Americans' Electricity Bills Are Up. Don't Blame AI.
AI data centers are scapegoats for electricity price increases driven by decades of deferred grid investment, transformer supply shortages, and fossil-fuel dynamics. The real insight is buried: an industry bigwig admits AI gives utilities a pretext to win regulatory approval for capex they should have committed years ago. The "blame the shiny new thing for costs that were always coming" pattern maps directly to enterprise IT budgets.
Meet the A.I. Prospectors Tapping a Billion-Dollar Gusher
Profile piece that's functionally a PR placement for Cloverleaf (PE-backed, $300M fund), but it reveals a genuine new commodity class: "powered land." The real story isn't the wildcatter romance; it's that every AI API call now sits on top of a real estate and energy intermediation stack that extracts margin at each layer. The Insull parallel (grid-connected beats on-site) is the structural bet worth tracking; SMRs are the wild card that could break it. The economics are conspicuously opaque: no cost basis, no margin data, just big exit numbers.
Oracle and OpenAI End Plans to Expand Flagship Stargate Data Center
Nvidia paid $150M to a data-center developer to ensure its GPUs, not AMD's, fill the expansion, making Nvidia an infrastructure intermediary, not just a chip vendor. The deeper signal: OpenAI's "often-changing demand forecasting" suggests even the largest buyer of training compute is uncertain about its forward requirements, cracking the infinite-linear-scaling thesis. Cooling failures taking buildings offline in winter are the first concrete evidence of operational fragility at hyperscale AI density.