
Washington Hall at the U.S. Military Academy at West Point. (Rose L. Thayer/Stars and Stripes)

A West Point cadet was dismissed from the Army after he pleaded guilty to using artificial intelligence to create a fake nude photo of a woman and then threatening to publicly release the image if she did not provide him an actual nude photo of herself.

Cadet Cayden Cork, 20, pleaded guilty to extortion and indecent conduct Feb. 10 at the U.S. Military Academy at West Point, N.Y., according to Army court records. Military Judge Col. Trevor Barna sentenced him to be reprimanded, to forfeit all pay and allowances, to be confined for 10 days and to be dismissed from the service. The sentence was consistent with the terms of a plea agreement, according to a West Point spokesperson.

The court credited Cork with 10 days of confinement so he did not spend any time in custody, according to the spokesperson.

Capt. Anthony Williamson, a prosecutor with the Army Office of Special Trial Counsel, said that personal responsibility is not diminished because a crime was committed using AI and that the military justice system can adapt to the ever-evolving technology landscape.

“When service members use these emerging tools to commit serious crimes against fellow service members, the Army will act to protect victims and uphold good order and discipline,” he said.

Cork’s dismissal reflects “the seriousness of his misconduct,” Williamson said.

The last time a cadet was dismissed following a court-martial conviction was in 2024, when Cadet Tyjaha Batiste pleaded guilty to taking videos of people’s private areas without consent.

Cork, of Groveland, Fla., took a publicly available photo of a woman in July and August 2024 and altered the image with AI, according to his charge sheet. He then sent the photo to her in a text message asking, “is this you” and “how accurate is this?”

He then in September 2024 threatened to publicly release the fake image if she did not send a photo of herself. The woman’s name is redacted from court documents.

The ease with which anyone can use AI to create fake images, sometimes known as “deepfakes,” has become a growing concern. AI can use an innocuous photo of someone to generate a nude or intimate image.

The FBI first reported an uptick in extortion-type crime using AI-generated fake images in 2023.

Congress last year passed a federal law criminalizing the publication of fake intimate images without permission. The law also makes it illegal to threaten publication of these images “for the purpose of intimidation, coercion, extortion or to create mental distress.”

Rose L. Thayer is based in Austin, Texas, and she has been covering the western region of the continental U.S. for Stars and Stripes since 2018. Before that she was a reporter for Killeen Daily Herald and a freelance journalist for publications including The Alcalde, Texas Highways and the Austin American-Statesman. She is the spouse of an Army veteran and a graduate of the University of Texas at Austin with a degree in journalism. Her awards include a 2021 Society of Professional Journalists Washington Dateline Award and an Honorable Mention from the Military Reporters and Editors Association for her coverage of crime at Fort Hood.
