islam.ninja

We Are the Ore


On extraction, enclosure, and the mine that eats its own foundation

Most people think AI is too powerful. They worry it will think for us, decide for us, eventually replace us.

They have the threat backwards.

The more unsettling possibility is not that AI is too intelligent. It’s that AI is an extraction machine — and we are what’s being mined.


The Cost That Kept Value Buried

There is a basic fact about economic value that most people never encounter: most of it never gets used.

Not because it doesn’t exist. Because extraction is expensive.

Nineteenth-century prospectors walked past gold deposits every day. The ore was there, the value was real, but if pulling it out cost more than it returned, it stayed in the ground. Value is not just a property of things. It is a relationship between what something is worth and what it costs to recover it.

This principle is older than capitalism. It explains why fishing villages ignored deep-sea beds for centuries, why ancient civilizations farmed only flood plains while highlands went untouched, why the same patch of earth could be worthless in 1800 and priceless in 1900 — not because the earth changed, but because the extraction technology did.

AI is an extraction technology. What it has changed is not what exists. It is what can be reached.


What Was Too Expensive Before

The value AI unlocks is not in the obvious places. Faster writing and cheaper code reviews are just productivity gains — the same kind spreadsheets gave accountants. The real extraction happens at the threshold where costs previously crossed into loss.

A company with a decade of customer support conversations had a goldmine of behavioral insight. Product managers knew it. They just couldn’t process fifty thousand transcripts by hand. The ore existed. The extraction cost was prohibitive. So the knowledge stayed buried — not in a database, but effectively underground.

A developer maintaining a million-line legacy codebase carried immense institutional knowledge in the architecture itself. Implicit. Queryable only through expensive human hours. Not worth surfacing. Left inert.

In each case, the value was not created by AI. It was recovered. The mine had always been there. What changed is the machinery.

This is what most productivity research struggles to measure. It is built to find speed improvements on existing tasks, not value unlocked from tasks that were previously impossible to attempt at all. Those are different phenomena, and conflating them obscures what AI actually is.

Those examples involve knowledge companies already owned. That’s the mild version. The deeper extraction runs across the public record — the writing, judgment, and code that no single entity owns. No one thought to put a fence around it, because no one imagined it could be mined.


Who Owns the Deposit

Physical mining required land rights. You could not extract ore from ground you did not own or license. The law was adversarial by design, because extraction without ownership is just theft at scale.

AI mining requires no such thing.

The deposit is human-generated knowledge — every piece of writing ever published, every judgment ever encoded in a decision, a review, a line of code. Vast, scattered, and structurally unowned. The people who created it receive no royalties. The communities that maintained it get no extraction fees. The value was produced collectively and is being captured privately — which is not a market outcome. It is a seizure that arrived before anyone had language for it.

This is contested. Publishers, authors, and regulators are pushing back — lawsuits filed, disclosure rules drafted, collective licensing proposed. But those efforts are responding to an extraction that was already underway — the legal frameworks are being built on a mine that is already operating. The law is a slow institution trying to catch a fast machine.

And the ore is not homogeneous. A Wikipedia editor maintaining articles in a minority language, a novelist whose backlist was scraped without a licensing conversation, a content moderator whose judgment calls trained a model’s sense of harm — these are different grades of deposit bearing different costs of extraction. The damage concentrates where the least power exists to resist it.

Every time this pattern has appeared in history — shared land turned to private property, shared knowledge turned to intellectual capital, shared attention turned to monetizable surface — economists and historians have reached for the same word: enclosure. It describes the moment a commons gets a fence around it. The encloser captures the value. The people who built and maintained the commons bear the cost. The damage only becomes fully visible once extraction is complete, when the land is barren and the fishery gone and the aquifer dry.

What is being enclosed is the cognitive commons — the accumulated output of human thought, everything written, argued, coded, and judged over centuries. It took that long to build. The infrastructure now extracting it was built in a decade.

That asymmetry is the real measure of what is happening.


The Self-Consuming Mine

There is a specific failure mode when the extracted resource is also the ecosystem that produces it.

Strip a river of fish faster than the population regenerates and you don’t get diminishing returns. You get collapse. The mine does not run out gradually. It tips.

AI is building this dynamic in cognitive territory, in two directions simultaneously.

The first: it is extracting the incentive to produce ore at all. When competent writing, code, and analysis can be generated at near-zero cost, the economic return on producing those things with effort collapses. Writers earn less. Developers get commoditized. The human judgment that made the training data valuable becomes harder to sustain economically. The conditions that produced the deposit are being destroyed by the extraction process.

The second: it is contaminating the deposit with synthetic fill. Models trained on AI-generated output learn from imitation of imitation — each generation a photocopy of a photocopy, the resolution dropping with every pass. The signal degrades. Authentic human judgment — the high-grade ore — becomes progressively harder to distinguish from synthetic substitutes. The mine dilutes itself.
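The photocopy dynamic has a well-studied statistical analogue, often called model collapse. Here is a minimal sketch of it, in a toy Gaussian world rather than a real model (the setup is my own illustration, not anything from this essay): each generation fits a distribution to the previous generation's output, then emits only high-probability draws, the way a model favors likely completions over rare ones. The spread of the data, standing in for rare, high-grade human judgment, shrinks with every pass.

```python
import random
import statistics

def synthetic_generation(samples, n):
    """Fit a normal distribution to the samples, then emit n 'likely'
    outputs: draws are clipped to the fit's central region, the way a
    model favors high-probability completions over rare ones."""
    mu = statistics.fmean(samples)
    sigma = statistics.stdev(samples)
    out = []
    while len(out) < n:
        x = random.gauss(mu, sigma)
        if abs(x - mu) <= 2.0 * sigma:  # discard the tails
            out.append(x)
    return out

random.seed(42)
n = 2000
# Generation 0: "human" data with genuine variety (spread ~1.0).
data = [random.gauss(0.0, 1.0) for _ in range(n)]

spreads = [statistics.stdev(data)]
for _ in range(15):
    data = synthetic_generation(data, n)
    spreads.append(statistics.stdev(data))

# Each pass loses the tails, so the spread decays generation over
# generation: the photocopy-of-a-photocopy effect.
print(f"generation 0 spread:  {spreads[0]:.2f}")
print(f"generation 15 spread: {spreads[-1]:.2f}")
```

In the toy version the loss is easy to see because nothing replenishes the tails. The essay's point is that the real pipeline has the same shape: once synthetic output dominates the training pool, the rare, distinctive material is exactly what disappears first.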

Together, these produce a sequence no one in the industry states clearly: AI extracts value from human cognition, which reduces the economic incentive to produce high-quality human cognition, which degrades the quality of future training data, which degrades the quality of future extraction.

The mine is eating its own foundation.

The deposit is not geological. It is not self-replenishing by default. It is the accumulated effort of people operating under economic conditions that the extraction is actively dismantling. Whether it is finite is the question no one running the machinery is paid to ask.


The mine is open and the machinery is extraordinary. What it cannot do is put the ore back.


