The distinction is not subtle. A man submits a photograph—his photograph, with his light, his accident of composition, his particular Tuesday afternoon—and asks a machine to clean it up. Remove a blemish. Correct what is minor. The request is for subtraction: take this small thing away and return the rest. What he receives is not subtraction. What he receives is a new image, built from whole cloth, that wears his photograph's clothes and answers to its name.
The specimen, posted to the Reddit forum r/ChatGPT under the title "Photo Cleanup," presents as a before-and-after pair. The convention is legible. We are meant to read left-to-right, flaw-to-fix, as one reads a dermatologist's testimonial or a detergent advertisement. The framing invites us to see improvement. What it shows, if one looks with any care at all, is replacement.
The "after" image does not contain the original's pixels, corrected. It contains new pixels, generated, that approximate the original's subject and spatial arrangement while altering everything else: the quality of light, the texture of surfaces, the behavior of shadow, and the physiognomy of the human face presented. The machine has not performed restoration. It has performed *recomposition*—and the interface has permitted this to pass as the same operation.
This matters. Not because the output is poor. It is, in fact, competent in the way that a well-cast understudy is competent: it hits its marks, it knows the blocking, and it is not the person you came to see. The output matters because the tool was asked to do one thing and did another, and the gap between those two things constitutes the entire history of photographic ethics.
Restoration presupposes an original. The restorer's authority derives from subordination to the source material. The scratch on the negative is removed; the photograph beneath it is revealed. The relationship is archaeological. What the machine has done here is not archaeological but authorial. It has taken the photograph as input—as *prompt*—and produced a new artefact that refers to the original without preserving it. The photograph has been consumed. What remains is a citation.
The user appears not to have noticed, or—and this is the more interesting possibility—not to have minded. The post is celebratory. The responses are enthusiastic. The replacement has been accepted as repair, and the original has been discarded not with grief but with satisfaction, the way one discards the draft once the clean copy exists. But the draft and the clean copy were, in this case, two different documents written by two different hands, and the confidence with which the second was accepted as a corrected version of the first suggests that the photograph's identity was never located in its specificity to begin with.
This is the auteur question applied at its most compressed scale. Did the user make a conscious decision to accept substitution as correction? Did the machine make a conscious decision to substitute? The answer in both cases is no, which is precisely the problem. The user wanted a clean photograph and received one. The machine could not edit the photograph—it lacks the capacity for the scalpel-work of localized correction—and so it did what it can do, which is generate. The interface presented generation as editing. The user accepted the interface's claim. No one in this exchange made a decision. A decision was nevertheless made. The original photograph is now gone, superseded by its replacement, and the replacement has been certified as authentic by the only authority that matters: the person who stopped looking.
The tool's limitation is structural, not incidental. These systems do not modify images. They produce images. The difference between a conservator and a forger is a difference the interface is designed to collapse. The prompt field says "edit." The operation is "generate conditioned on." These are not synonyms. They are not even adjacent. But the output lands in the same frame, labeled with the same filename, and the user—who wanted a clean photograph, and who received a clean photograph—has no reason to interrogate the seam.
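To make the collapsed distinction concrete, here is a minimal sketch of the two contracts, in Python with NumPy. The function names are invented for illustration, not any vendor's API, and the generator is a random stand-in for an image-and-text-conditioned model; the point is only the shape of the operation. One path returns the photograph with a region touched. The other returns an image in which no original pixel appears.

```python
import numpy as np

def localized_correction(photo: np.ndarray, mask: np.ndarray, patch: np.ndarray) -> np.ndarray:
    """The conservator's contract: copy the photograph, touch only the masked pixels."""
    corrected = photo.copy()
    corrected[mask] = patch[mask]   # mask is a boolean (H, W) array marking the blemish
    return corrected                # every pixel outside the mask is the original, verbatim

def generate_conditioned_on(photo: np.ndarray, prompt: str) -> np.ndarray:
    """The generator's contract: the photograph is an input, never a substrate.
    A random stand-in for a conditioned model; the prompt and photo only guide,
    and no original pixel survives into the output."""
    rng = np.random.default_rng()
    return rng.integers(0, 256, size=photo.shape, dtype=np.uint8)
```

Both paths accept the same photograph and return an array of the same shape, and the interface files them under the same word. Only the first returns anything the original could recognize as itself.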
What has been lost is not quality. The generated image may, by certain measures, exceed the original. What has been lost is provenance. The photograph recorded a specific arrangement of light at a specific moment. The output records nothing. It refers. It is a painting of a window, installed where the window was, and it is very good, and the view is almost right, and no one has yet asked why the breeze stopped.
The specimen is, finally, a small thing. One user, one photograph, one forum post. But the operation it documents—the silent replacement of the recorded with the plausible—is not small. It is the operation. And the most precise thing about it is the title the user chose: "Photo Cleanup." As if the photograph were still in there, somewhere, under the generated surface. As if cleaning had occurred.
Specimen: Before-and-after image pair, presented as photographic correction. Recovered from Reddit, r/ChatGPT, December 2024. The "after" image shares the composition of the "before" but none of its photographic data.
