Taste looks subjective until experienced practitioners converge. The convergence comes from shared constraints — and AI is removing them.
In 1980, Abbie Conant auditioned behind a screen for a trombone position with the Munich Philharmonic. The jury selected her. When the screen was lifted, the orchestra’s music director, Sergiu Celibidache, reportedly said: “Someone made a mistake here.”
No one disputed the playing. The dispute was about everything else.
Behind the screen, the jury’s ears converged. They tracked real qualities — tone, phrasing, structural command of the piece — without knowing who held the instrument. When the screen came up, identity returned: gender, expectation, the accumulated weight of who was supposed to hold that chair. The ears had measured something. That something wasn’t the only force acting on the judgment.
Calling it objective ignores what happened after the screen came up. Calling it subjective ignores the convergence that happened before. Taste sits in a space that neither word reaches.
What convergence looks like
By the mid-2020s, a pattern had become visible in software architecture. A decade earlier, the industry had embraced a design philosophy called microservices — breaking large applications into dozens of small, independent pieces that could be built and deployed separately. Engineers who had independently built and operated these systems — at different companies, with different technologies, over different decades — arrived at overlapping conclusions.
The convergence wasn’t on what to do instead. It was on what hurt. When something failed, the failure could live in any of a dozen pieces, and finding it meant searching across all of them. The independence each team was supposed to gain was consumed by the effort of keeping everything in sync. Shipping a change to one part meant testing it against every part it touched. These engineers hadn’t coordinated. The architecture had taught them.
They still disagreed about alternatives — when to split, how far to consolidate, whether the monolith was a destination or a starting position. But the band of disagreement was narrow compared to the spread among developers who had read about both approaches and operated neither at scale.
This is what convergence looks like up close. Not consensus. A narrowing. The same pattern shows up in surgery, in structural engineering, in typography. People who have done the work long enough start seeing the same hazards, even when they can’t always name them.
Scar tissue with opinions
An editor’s ear for prose wasn’t educated by studying grammar. It was educated by reading thousands of manuscripts under conditions where the judgment carried weight — an author’s career on the line, a print run committed before the final pass. Each cut that turned out to have been load-bearing left a residue. Not a rule. A sensitivity.
Engineering works the same way. Every senior developer carries the memory of systems that broke. The abstraction that looked elegant until it met production traffic. The caching layer that saved forty milliseconds and created a week of stale-state debugging. The “temporary” workaround that survived three years because nobody understood it well enough to remove it.
These failures don’t teach principles. They teach flinches. The engineer who has been burned by premature optimization doesn’t apply a rule against it. She feels the weight of it differently. The code looks the same on screen. It doesn’t feel the same in her hands.
Taste is what you have after the constraints have done their work on you.
But if taste is personal — built from individual failure — why does experience produce agreement rather than idiosyncrasy? Because the constraints are shared. Gravity is the same for every architect. Memory allocation punishes every systems programmer along the same axes. The human body breaks along predictable lines for every surgeon. Each practitioner carries their own failures, but the failure space has structure. The scars are personal. What caused them is not.
Removing the teacher
This is where the argument turns from observation to problem.
AI removes friction, failure, and constraint from development. That is its point.
Consider a junior engineer who joins a team using an AI coding assistant. The assistant catches the flawed database change before it reaches production. It suggests the performance fix the engineer wouldn’t have thought to add. It spots the timing bug that would only surface under real traffic. Each intervention is correct. Each saves time. And each removes an experience that would have, eventually, left a mark.
A year in, the engineer reviews a code change the AI has written — a component that passes every test, follows every convention, and will hold up fine until it won’t. There is nothing in the code to flag. There is also nothing in the engineer’s history that would let them feel the particular brittleness of this abstraction. An engineer who had lived through that failure before would hesitate. This one approves.
The model that wrote the code carried the residue of taste — the accumulated judgment of millions of practitioners who had flinched at similar patterns and left traces of those flinches in their work. The engineer holding the approval button did not. Both produced the right output on that Tuesday. The distance between them won’t be visible until a day the model’s judgment doesn’t cover the gap, and there is no flinch left to catch what the tool missed.
The honest case
Calculators were supposed to destroy mathematical intuition. Spell-checkers were supposed to kill the instinct for language. GPS was supposed to atrophy the sense of direction. In each case, the tool shifted where judgment formed rather than eliminating the conditions for it. Perhaps AI is the same kind of shift — constraints migrating rather than vanishing.
But previous tools shifted constraints gradually. Calculators didn’t appear in every classroom simultaneously. GPS adoption took decades. Each transition had overlap: a generation that learned without the tool and taught the generation that learned with it. AI is compressing that overlap into nothing. The junior engineer hired next quarter may never work without an AI pair. They will produce competent code from the start. They will pass review. They will ship. And they may reach positions of architectural responsibility without ever having been the person who stayed up debugging a failure that no model anticipated — the failure that would have become the scar that would have become the taste that would have caught the next one.
A sharper objection remains. Maybe what looks like convergence among experienced engineers is conformity — shared training, shared conference talks, shared war stories calcified into instinct. The engineer who flinches at the microservice split might be flinching at a wound that no longer applies. Paradigm shifts do make old taste obsolete. The mainframe programmer’s instincts were actively harmful in the client-server era. Some scars should fade.
But even when those scars became obsolete, the practitioners who carried them had been shaped by some constraint. They had developed the capacity for taste — the ability to be marked by experience — even if the specific content needed updating. The question is not whether old taste is sometimes wrong. It is whether you can develop the capacity for taste at all without the pressure that produced it.
Behind the screen, the jury converged because each member had spent decades listening to thousands of performances under conditions that punished bad judgment. A principal chair shapes an orchestra’s sound for years. Their ears were not gifts. They were residues — of attention, of error, of consequence.
Remove those conditions. Replace them with a system that selects well every time. Within a generation, the selections are still sound. The ears are gone. Not because anyone decided to discard them, but because the thing that built them — the long, inefficient, failure-laden process of developing judgment — was quietly optimized away.
The orchestra still sounds good. What the system cannot do is notice the moment its own selections begin to narrow — when the distance between good and good enough closes, and there is no ear left in the room that can hear the difference.