AI Automation Signals · Updated Apr 3, 2026

Where is AI automation heading?

Think of this page like construction permits for AI automation. Before a building goes up, permits spike. Before AI replaces tasks in an industry, developers start downloading the tools to build those systems. We track 57 automation tools across 10 industries.

How AI Automation Reaches the Labor Market

Developers download tools

Every tool on this page automates a specific category of work. browser-use automates clicking through websites. faster-whisper automates transcription. docling automates document processing. When download counts spike, it means more teams are building systems to perform these tasks without a human.

We currently track 57 tools across 10 industries, with 12 flagged as surging.

Tasks get automated, not jobs

A paralegal's job includes document review, legal research, deadline tracking, and client communication. AI tools are automating the first two tasks right now — but not the last two. This is why employment numbers often lag tool adoption: the job still exists, but it's shrinking. One paralegal can now do what three did before.

Productivity absorbs the impact — until it doesn’t

When AI automates 30% of a role's tasks, companies don't immediately cut 30% of headcount. Instead, remaining workers become more productive. This is the productivity absorption phase — output stays the same (or grows) with fewer hours of human labor per unit of work.

The inflection point comes when the economics shift: when it's cheaper to restructure teams around AI-augmented workflows than to keep headcount flat. That's when employment numbers start to move. The gap between tool adoption growth and employment decline on this page is, roughly, a measure of how long the productivity absorption phase lasts in each industry.

Important caveat: Historical technology transitions suggest a 2–10 year lag between tool adoption and measurable employment effects. For AI, this timeline is unknown. Downloads are a signal of builder intent, not a guarantee of labor market impact — many tools are exploratory, and organizational adoption lags developer experimentation significantly.

The task composition determines vulnerability

Not all jobs within an industry are equally exposed. The key variable is what percentage of the job's tasks are automatable with current tools. A job that's 80% document review and 20% relationship management is more vulnerable than one that's 20% document review and 80% relationship management — even though both sit in the same BLS employment category.
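The task-mix arithmetic above can be sketched in a few lines of Python. This is a hypothetical scoring function: the task names, weights, and the set of automatable tasks are illustrative assumptions, not data from this page.

```python
# Assumed for illustration: which task categories current tools can automate.
AUTOMATABLE_TASKS = {"document review", "legal research"}

def exposure_score(task_weights: dict) -> float:
    """Share of a job's time spent on tasks current tools can automate."""
    total = sum(task_weights.values())
    automatable = sum(w for task, w in task_weights.items()
                      if task in AUTOMATABLE_TASKS)
    return automatable / total

# The two jobs from the paragraph above, as hypothetical task mixes:
job_a = {"document review": 0.8, "relationship management": 0.2}
job_b = {"document review": 0.2, "relationship management": 0.8}

print(exposure_score(job_a))  # 0.8 — more vulnerable
print(exposure_score(job_b))  # 0.2
```

Both jobs would sit in the same BLS employment category, yet the scores differ fourfold, which is the point of the paragraph above.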

What this page measures

We track the leading edge of this chain. Package downloads are a proxy for how many teams are actively building automation for specific task categories. By the time these tools are in production and affecting headcount, the signal will have appeared here months earlier, just as construction permits precede new buildings.

The industries where tool adoption is growing fastest and employment is already declining are furthest along this chain. Industries where tools are surging but employment hasn't moved yet may be in the productivity absorption phase — and worth watching closely.

Tier 1 · Anthropic Research

Theoretical Capability vs. Observed Exposure

The gap between what AI could automate and what it actually handles today is enormous. Blue shows the share of tasks LLMs could theoretically perform; red shows measured usage from Claude API traffic. The discrepancy suggests we are still early in the diffusion phase, even in high-exposure categories like Office & Admin and Computer & Math. Importantly, high exposure does not automatically predict displacement. In sectors with elastic demand, high AI exposure may predict expansion — more hiring at higher wages — as productivity gains make previously unviable work economically feasible.

Figure: Radar chart comparing theoretical AI capability (blue) against observed AI coverage (red) across 22 occupational categories. Computer & Math shows the widest gap, at roughly 94% theoretical vs. 33% observed.

Source: Massenkoff & McCrory (2026), “Labor market impacts of AI: A new measure and early evidence”

AI Automation by Industry

Click any industry to see the full breakdown: tool adoption trends, employment data, and which specific tools are driving growth.

Each industry follows one of three paths: Reduce, Amplify, or Expand.

Now · Automation tools surging, employment already shifting
Next · Tools building fast, labor impact not yet visible
Later · Early-stage adoption, slower timeline to impact

All Tracked Tools

Click column headers to sort. Surging badges flag tools with sustained rapid growth.

Industry · Tools · Surging · Monthly Usage · Monthly Change · 3-Month Trend
Software & IT · 13 · 6 · 300.9M · +15.1% · +15.6%
Legal & Compliance · 6 · 1 · 52.9M · +18.3% · +9.3%
Finance & Insurance · 8 · 1 · 42.1M · +6.4% · +12.6%
Healthcare · 4 · 0 · 7.9M · -29.9% · +34.3%
Creative & Media · 9 · 2 · 24.3M · -11.9% · +12.4%
Customer Service & Support · 5 · 0 · 12.4M · -10.9% · -3.5%
Education & Training · 3 · 0 · 46.7M · +8.1% · +5.1%
Manufacturing & Logistics · 5 · 0 · 52.8M · -3.5% · +4.0%
Sales & Marketing · 4 · 0 · 25.1M · -0.9% · +7.5%

What Happens When Workers Get More Productive?

AI-driven productivity gains lead firms down three paths — often simultaneously. Click each to see the research.

AI makes workers more productive

Reduce

Research

Employment for 22–25 year olds in AI-exposed occupations declined ~16% since late 2022, while experienced workers remained stable or grew.

Brynjolfsson, Chandar & Chen (2025) — Stanford Digital Economy Lab

Corroborated by ADP Research (2025) payroll data showing hiring slowdowns concentrated in entry-level positions within AI-exposed industries.

Example

IBM replaced several hundred back-office HR positions with AI agents, consolidating routine processing that previously required large junior teams.

Amplify

Research

Customer support agents using AI resolved 14–15% more issues per hour, with gains up to 35% for the least experienced workers. Microsoft/Accenture field experiments showed ~26% increase in completed pull requests for developers.

Brynjolfsson, Li & Raymond (2025) — Quarterly Journal of Economics

Why gains are uneven: Gans (2026) formalizes “jagged intelligence,” where AI boosts productivity on some tasks while degrading it on similar-looking ones, making local calibration critical.

Example

A 50-person support team handles the volume that previously required 70 people, with faster resolution times and higher customer satisfaction scores.

Expand

Research

This is Jevons Paradox applied to labor: efficiency doesn't shrink demand; it grows it. When AI dramatically reduces the cost of work, previously unviable projects become viable. The net effect depends on two conditions:

A: Demand elasticity

Does cheaper output create more demand?

B: High task exposure

Enough AI-augmented tasks to change the cost equation?

Agentic coding example: A project needs 50 engineers but the ROI doesn't justify it — so the company hires 0. AI agents make it a 10-engineer problem. Now the company hires 10. Net: +10 engineers, not -40.
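The net-hiring arithmetic in that example can be written out explicitly. A minimal sketch, using the 50- and 10-engineer figures from the text; the 5× productivity multiplier is an assumption chosen to match them.

```python
def net_hiring(engineers_needed_before: int, productivity_multiplier: float,
               viable_before: bool, viable_after: bool) -> int:
    """Net change in hiring once AI shrinks the team a project requires.

    A project only generates hires if its ROI clears the bar ("viable").
    """
    before = engineers_needed_before if viable_before else 0
    after = (round(engineers_needed_before / productivity_multiplier)
             if viable_after else 0)
    return after - before

# ROI didn't justify 50 engineers, so 0 were hired. AI makes it a
# 10-engineer problem that now clears the bar: net +10, not -40.
print(net_hiring(50, 5.0, viable_before=False, viable_after=True))  # 10
```

The sign of the net effect flips entirely on the viability flags, which is why demand elasticity (condition A above) matters as much as the productivity multiplier itself.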

Historical precedent: ATMs didn't eliminate bank tellers — they made branches cheaper to operate, so banks opened more branches, and total teller employment rose for decades (Bessen, 2015). Spreadsheets didn't eliminate accountants — they made analysis affordable, expanding the market for accounting services.

Note: expansion is a net effect, not frictionless. Some workers will be displaced at individual firms even as the sector grows overall.

Explore the full demand elasticity framework

Example

A marketing agency uses AI to cut production costs 40%, then expands from 3 service lines to 7, hiring specialists for the new offerings. A mid-sized company starts a software project that was previously too expensive, hiring 10 AI-augmented engineers for work that would have required 50 before.

The Productivity J-Curve

Most firms do all three at once. And it takes time — measured productivity often dips before it rises as firms invest in AI tools, reorganize workflows, and retrain workers before reaping the gains. The same pattern played out with electricity and computers.

The automation tools tracked above are the leading indicator. The BLS data tells you which path each industry is taking.

How to Read This Data

TL;DR

Think of this like construction permits for AI automation. Before AI replaces tasks in an industry, developers download the tools to build those systems. We track Python and JavaScript package downloads as a leading indicator of where AI automation is heading. When industry-specific tools grow faster than general AI infrastructure, it signals a shift from “people using AI” to “AI doing the work.”

Package downloads do not equal production use. The signal is in relative growth rates across industries, not absolute numbers.

The construction permits analogy

Before a building goes up, construction permits spike in that neighborhood. This page works the same way: before AI replaces tasks in an industry, developers start downloading the tools to build those automation systems. We track both Python and JavaScript package downloads as a leading indicator of where AI automation is heading.

What is the Automation Acceleration Index?

The AAI compares how fast industry-specific automation tools are growing versus general AI infrastructure (like the OpenAI or Anthropic SDKs). When the AAI is above 1.0, it means the tools that automate specific jobs are growing faster than the underlying AI platform — a signal that we're moving from “people using AI” to “AI doing the work.”
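As a rough sketch, the ratio described above can be computed like this. The helper names and example growth figures are assumptions; the actual AAI methodology may weight or smooth the series differently.

```python
def growth_rate(prev: float, curr: float) -> float:
    """Fractional growth between two monthly download counts."""
    return (curr - prev) / prev

def aai(industry_downloads: tuple, infra_downloads: tuple) -> float:
    """AAI > 1.0: job-specific automation tools outpacing the platform."""
    return growth_rate(*industry_downloads) / growth_rate(*infra_downloads)

# Assumed figures: industry tools grew 18% month-over-month while general
# AI infrastructure (e.g. the OpenAI or Anthropic SDKs) grew 12%.
print(aai((100.0, 118.0), (100.0, 112.0)))  # ≈ 1.5
```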

What counts as “surging”?

A tool gets flagged as surging if it shows 3 or more consecutive months of greater than 20% month-over-month growth, or if its recent growth rate has at least doubled compared to the prior quarter. These are the tools gaining adoption fastest — and the industries they serve are worth watching.
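That rule translates directly into Python. The 20% threshold and 3-month window come from the text above; treating "prior quarter" as a 3-month average of month-over-month growth is an assumption about the implementation.

```python
def is_surging(monthly_growth: list) -> bool:
    """monthly_growth: month-over-month growth rates, oldest first (0.2 = 20%)."""
    # Rule 1: three or more consecutive months above 20% MoM growth.
    if len(monthly_growth) >= 3 and all(g > 0.20 for g in monthly_growth[-3:]):
        return True
    # Rule 2 (assumed interpretation): recent quarter's average growth
    # at least doubled versus the quarter before it.
    if len(monthly_growth) >= 6:
        recent = sum(monthly_growth[-3:]) / 3
        prior = sum(monthly_growth[-6:-3]) / 3
        if prior > 0 and recent >= 2 * prior:
            return True
    return False

print(is_surging([0.05, 0.25, 0.22, 0.31]))              # True — rule 1
print(is_surging([0.04, 0.05, 0.05, 0.10, 0.11, 0.09]))  # True — rule 2
```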

Community signals

GitHub stars and issues, plus StackOverflow question volume, provide supplementary context alongside download data. Stars indicate developer interest, issues reflect active development and bug reports, and SO questions show how many people are trying to use a tool. These are shown as compact indicators on each tool but are not factored into the AAI calculation.

Important caveats

Package downloads do not equal production use. CI/CD pipelines, Docker builds, and dependency resolution inflate counts. The signal is in the relative growth rates across industries, not the absolute numbers. New tools with small user bases can show extreme growth percentages. And correlation is not causation — rising tool adoption doesn't prove job displacement, but it does indicate where investment and capability are concentrating.
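The "relative growth rates, not absolute numbers" point can be made concrete: normalize each download series to month-over-month growth so industries of very different sizes become comparable. The figures below are invented for illustration.

```python
def mom_growth(series: list) -> list:
    """Month-over-month growth rates for a monthly download series."""
    return [(curr - prev) / prev for prev, curr in zip(series, series[1:])]

big = [300e6, 345e6]   # a large industry: +15% on a 300M base
small = [2e6, 2.3e6]   # a tiny industry: +15% on a 2M base

# On absolute counts these look nothing alike; on growth they are identical.
print(mom_growth(big)[0], mom_growth(small)[0])  # 0.15 0.15
```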

Data sources: pypistats.org (Python packages), npm registry (JavaScript packages), Bureau of Labor Statistics CES (employment data), GitHub API (stars & issues), Stack Exchange API (question volume). Updated monthly.