The claim circulating on Russian platforms that a Ukrainian soldier allegedly confessed to killing a child near Sudzha is fabricated or, at the very least, unverified. The message was posted on a Facebook account that, according to available data, previously belonged to Ukrainian soldier Artur Yakovitskyi. His family has reported him missing – according to open sources, he disappeared near the settlement of Sudzha on 28 February 2025. On 28 March, a suspicious post containing the alleged confession to the murder appeared on his Facebook page. On 1 April, a screenshot of this post began circulating actively on pro-Russian Telegram channels. The image of the “murdered teenager” attached to the post was likely generated by artificial intelligence, as indicated by several AI detection tools.

Russian news sites are spreading claims that a Ukrainian soldier has confessed to his comrades that he killed a child near Sudzha. The stories include a screenshot of a Facebook post allegedly made by a member of the Ukrainian armed forces in which he repents and asks the boy’s parents to give their son a “proper burial”. The accompanying image shows the body of a teenager dressed in a military uniform that resembles Russian camouflage.

Screenshot – rostov.tsargrad.tv

The screenshot attributed to the Ukrainian soldier was widely distributed by propaganda Telegram channels. According to the Osavul project, the first appearance of the screenshot and the claim of a “confession” took place on 1 April 2025 in the Telegram channel “Odessa Za Pobedu!” under the hashtag #подписчикипишут (messages from subscribers).

StopFake was able to find a similar publication on Facebook. The picture of the boy and the “confession” were posted on the evening of 28 March 2025 from an account named “Artur Yakovitskyi”. Minutes later, another post appeared saying “I don’t care what happens to me”. Under this post, a user named “Valia Oseledchuk” commented, claiming to be the biological aunt of the real account owner, Artur Yakovitskyi. According to her, he is missing and someone has probably gained unauthorised access to his Facebook account – possibly people acting in Russia’s interests. She urged people not to share or comment on the fake posts. StopFake decided to investigate further to verify the information.

Screenshot – facebook.com

Indeed, since March, various Facebook groups dedicated to the search for missing persons have featured posts from family members asking for help in locating Artur Vasylovych Yakovitskyi, born in 1999 – examples can be found here, here, here and here. These posts state that he is a Ukrainian serviceman who went missing near the settlement of Mala Loknya, close to Sudzha, at the end of February 2025.

Back to Artur Yakovitskyi’s Facebook profile: the account was created in January 2025. It currently shows only three posts – two dated 28 March and one dated 4 January (a profile photo showing a group of soldiers in Ukrainian uniforms). Artur Yakovitskyi can be identified in this image – his appearance matches photos previously shared by his relatives. In addition, comments made on the account before February 2025 suggest that the page indeed belonged to him.

Artur Vasylovych Yakovitskyi is also listed as a missing person in the database of the Ukrainian Ministry of Internal Affairs. The official entry confirms that he disappeared on 28 February 2025.

Screenshot – wanted.mvs.gov.ua

It is highly likely that third parties gained access to Artur’s phone and Facebook account after his disappearance and used them to post fake content.

As for the image of the “killed boy” allegedly posted by the Ukrainian soldier, it is most likely fake and AI-generated. The photo shows a teenager dressed in what appears to be Russian military camouflage. On closer inspection, however, the uniform only vaguely resembles Russian equipment and doesn’t match it exactly. In addition, the texture of the fabric appears unnatural and there are facial distortions – signs typical of AI-generated images. StopFake has previously analysed such visual anomalies in its report How AI-Generated Content Distorts Truth About War and Russian War Crimes: Detecting and Analyzing AI-Generated Images.

Screenshot – facebook.com

StopFake also analysed the image using several AI detection tools. Hive Moderation gave the image an 87.9% chance of being AI-generated. Other services, such as SightEngine and Undetectable, rated the likelihood as ‘low’, possibly due to the poor quality of the image.

Screenshot – hivemoderation.com

For context: Ukrainian forces launched an incursion into parts of Russia’s Kursk region, including the town of Sudzha, in early August 2024. By mid-March 2025, however, under Russian pressure, Ukrainian troops were forced to withdraw from Sudzha and its surroundings back across the border into Ukraine. Following the withdrawal, the Kremlin intensified propaganda narratives accusing Ukrainian forces of war crimes in the previously occupied territory. As analysts at the Institute for the Study of War (ISW) have noted, these claims are aimed at discrediting the Ukrainian military and undermining international support. Such forgeries, including fabricated ‘confessions’ by Ukrainian soldiers to atrocities or crimes, are not new – they are used systematically in Russian information campaigns to create the image of a ‘cruel Ukrainian soldier’. Content of this kind is disseminated especially actively after tactical retreats by Ukrainian troops in order to undermine their credibility at home and abroad.

The Sudzha example is just one episode in a wider series of coordinated disinformation attacks. Fake confessions, compromised social media accounts and AI-generated images are increasingly being used as tools in Russia’s information warfare.