The Corner

Cass Report AI Images: Patients Aren’t Poster Children

A person walks past an image of a National Health Service worker displayed on hoardings outside a temporary field hospital at St. George’s Hospital in London, January 8, 2022. (Henry Nicholls/Reuters)

The Cass Report is a devastating exposé.

I’ve covered the Cass Report, a landmark independent review commissioned by the U.K.’s National Health Service on gender-related medical care for minors and young adults. The Cass Report is a devastating exposé: It confirms that previous studies on the topic are of “poor quality” and there is “very limited evidence on the longer-term outcomes” associated with medicalized transition. Hilary Cass, who led the review, is firm in her introduction: “The reality is that we have no good evidence on the long-term outcomes of interventions to manage gender-related distress.” But perhaps the significance of the Cass Report is what it does, not what it says; the NHS has already implemented some of the recommendations and initiated investigations into clinics that offered gender-related treatments.

Unsurprisingly, the activists who chanted “trust the science” lacked well-conducted studies and evidence to justify their support for “gender-affirming care”; the people who couldn’t provide robust arguments resorted to demonizing anyone who doubted their conclusions. Given the absence of data to refute the substance of the Cass Report, activists are trying to discredit the review because of its images.  

It has been confirmed that the Cass Report includes at least two images generated by artificial intelligence. The cover art shows a child from the shoulders down leaning against a wall, but the fingers are weirdly melded together. Later in the document, there is an image of a teenager with pink and yellow hair; the eyes are a bit odd, and the buttons on the jacket don’t quite match. 404 Media verified that the image of the teenager was AI-generated and sourced from Adobe Stock images.

Activists seized the opportunity to invalidate the entire report. In its coverage of the images, Pink News wrote that “artificial intelligence has the potential to . . . reinforce discrimination and stereotypes associated with the LGBTQ+ community.” Alejandra Caraballo, an instructor at Harvard, wrote on Twitter that “This cover photo of the Cass Review is AI generated. Once again, they need to create a fictionalized idea of trans people because they refuse to actually meet with us or see us as people.” Caraballo added that “The Cass Review couldn’t be bothered to talk to actual trans people so they used AI generated images to stand in for trans people.”

Of course, using “fake” images does not mean that the researchers never talked to real people. There are practical reasons why a damning report on gender-related medical care should avoid using photos, especially ones that depict minors — whether or not those kids had been harmed by gender-related treatment. Trust in medicine depends in part on confidentiality; using photos with identifiable information raises serious ethical concerns about privacy. The Cass team told 404 Media that “the report uses Adobe stock images — some real and some AI” and “in selecting images the Review was conscious of the sensitive and contentious nature of the subject matter and made effort not to use images that could identify any individuals.” Indeed, the extreme politicization of gender-related treatments could easily render an identified person either a hero or a victim subject to exploitation by activists across the political spectrum.

I don’t think the Cass Report needed images other than relevant graphs and charts, but I can understand that the researchers who compiled a several-hundred-page document thought that stock photos might contribute a bit of aesthetic value by breaking up the lengthy text. They could have been a bit more careful and selected cover art that didn’t have creepy fingers. Still, the images do not invalidate the research and analysis. Ultimately, a high-profile publication doomed to be controversial for its findings — like the Cass Report — on a sensitive and divisive topic has an obligation to avoid turning patients into poster children, and that can be accomplished in part through AI-generated images.

Abigail Anthony is the current Collegiate Network Fellow. She graduated from Princeton University in 2023 and is a Barry Scholar studying Linguistics at Oxford University.