Posted on Mar 3, 2026

Semantic Ablation


Origin: Term coined by Claudio Nastruzzi (The Register, February 2026), developed and extended in my writings


Definition

Semantic ablation refers to the process by which a text submitted to a generative model to be “polished” does not improve but instead loses its substance. The generative model identifies zones of high semantic density — passages containing singular formulations, rough edges, imperfections that characterize human thought in its particularity. These zones are then replaced by the most probable, most expected, most statistically average word sequences.

What was raw thought with rough edges becomes, for the reader, a smooth surface without semantic or structural friction. This is often the moment when one “senses” that content has been processed by generative AI: something clean but emptied of its originality, in the legal sense of the term — the expression of free and creative choices that reflect the author’s personality.


In my writings

Semantic ablation is the result of an optimization objective, and the phenomenon can be measured. When a text is submitted to successive cycles of AI processing, its lexical diversity is progressively reduced, like a page photocopied endlessly until it no longer resembles the original. Generative models tend to reduce the entropy of the original text.
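The claim above can be made concrete with two standard text statistics: the type-token ratio (distinct words over total words) and the Shannon entropy of the word distribution. A minimal sketch, using hypothetical example sentences of my own to stand in for an original and a "polished" text:

```python
from collections import Counter
import math

def lexical_diversity(text: str) -> float:
    """Type-token ratio: distinct words divided by total words."""
    words = text.lower().split()
    return len(set(words)) / len(words) if words else 0.0

def word_entropy(text: str) -> float:
    """Shannon entropy (in bits) of the word frequency distribution."""
    words = text.lower().split()
    total = len(words)
    counts = Counter(words)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Illustrative fragments (not drawn from any real corpus): the second
# reuses the same few high-frequency words, as a smoothed text would.
original = "the idea keeps its rough edges its singular turns its odd friction"
smoothed = "the idea is clear and simple and the text is clear and simple"

# A more homogeneous text scores lower on both measures.
print(lexical_diversity(original), lexical_diversity(smoothed))
print(word_entropy(original), word_entropy(smoothed))
```

Running the same measurement after each cycle of AI rewriting would trace the "photocopy" curve described above: both numbers drifting downward as the text converges on the statistically average.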

When millions of people use the same models to write, draft, rephrase, and imagine, a movement of convergence is set in motion without users realizing it. What constitutes the originality of a writing style is marginalized by the statistical weight of homogeneous mass production. This is not merely an aesthetic loss: it is an erosion through standardization that progressively reconfigures the very imaginaries these tools claim merely to assist.

This loss is not only aesthetic and semantic. It becomes attentional: see attentional ablation.

