Child sexual abuse victim begs Elon Musk to remove links to images of her abuse

BBC News Investigations

A victim of child sexual abuse has begged Elon Musk to stop links offering images of her abuse being posted on his social media platform X.
“Hearing that my abuse – and the abuse of so many others – is still being trafficked and traded here is infuriating,” says “Zora” (not her real name), who lives in the United States and was first abused more than 20 years ago.
“Every time someone sells or shares child abuse material, they are directly fuelling the original, horrific abuse.”
X says it has “zero tolerance for child sexual abuse material” and that tackling those who exploit children remains “an absolute priority”.
The BBC found images of Zora while investigating the worldwide trade in child sexual abuse material, estimated to be worth billions of dollars by Childlight, the Global Child Safety Institute.
The material was part of a cache of thousands of similar photos and videos offered for sale on an X account. We made contact with the trader via the Telegram messaging app, which led us to a bank account linked to a person in Jakarta, Indonesia.
Zora was first abused by a family member. A collection of images of her abuse has become infamous among paedophiles who collect and trade such content. Many other victims face the same situation, as abuse images continue to circulate today.
Zora is angry that the trade continues to this day.
“My body is not a commodity. It has never been, and it will never be,” she said.
“Those who distribute this material are not passive bystanders – they are complicit perpetrators.”
Monitoring the X accounts
Images of Zora’s abuse were originally only available on the so-called dark web, but she now has to live with the reality that links to them are openly promoted on X.
Social media platforms are trying to rid themselves of this illegal material, but the scale of the problem is enormous.
Last year, the US National Center for Missing and Exploited Children (NCMEC) received more than 20 million mandatory reports from technology companies about incidents of child sexual abuse material (CSAM) – illegal images and videos – on their platforms.
NCMEC tries to identify victims and perpetrators, and then contacts the police.
We approached the “hacktivist” group Anonymous, whose members are trying to fight the trade in child abuse material on X. One of them told us the situation was as bad as ever.
They pointed us to a single account on X. It used a head-and-shoulders photo of a real child as its avatar. There was nothing obscene about it.
But the words and emojis in the account’s bio made clear that its owner was selling child sexual abuse material, and there was a link to an account on the Telegram messaging app.

The trader appeared to be based in Indonesia, and was offering “VIP packages” – collections of abuse images and video files – for sale to paedophiles around the world.
The Anonymous activist had been working to report the trader’s multiple accounts on X so they could be taken down by the platform’s moderation systems. But whenever an account was deleted, he told us, a new one would appear to replace it.
The trader appeared to have overseen more than 100 near-identical accounts. The activist told us that when he contacted the trader directly on Telegram, the trader replied that he had thousands of videos and images for sale.
“I have a baby. Kids Young 7-12,” he wrote in messages to the activist, seen by the BBC. He also said that some of the content showed the rape of children.
We contacted the trader ourselves.
He provided links to samples of the material, which we did not open or visit. Instead, we contacted experts at the Canadian Centre for Child Protection (C3P) in Winnipeg, who work alongside police and are legally permitted to view such content.

“The Telegram account was, for want of a better term, a taster pack – essentially a collage of the material he had of all the different victims,” said Lloyd Richardson, the C3P’s director of technology. “When we looked at all the different images in the collages, I would say there were thousands.”
Among the files were images of Zora.
Her abuser in the United States was prosecuted and imprisoned many years ago, but not before images of the abuse had been shared and sold around the world.
Zora told us: “I have tried over the years to move past my history and not let it define my future, but perpetrators and stalkers always find a way to view this filth.”
As she has grown older, stalkers have discovered Zora’s identity, contacting and threatening her online. She says she feels “bullied by a crime that stole my childhood from me”.
Tracking down the trader
To identify the trader selling photos of Zora, we posed as a buyer.
The trader sent us his bank details and an online payment account, both of which listed the same name as the account holder.
The Anonymous activist had discovered that this name was also linked to two money transfers and another bank account.
We found a man with the same name as the one listed on the accounts, at an address on the outskirts of the Indonesian capital, Jakarta.
A producer working in the city for the BBC World Service visited the address and confronted a man there who, when presented with the evidence, said he was shocked.
“I don’t know anything about it,” he said.
The man confirmed that one of the bank accounts was his, and said it had been set up for a transaction linked to a single mortgage. He said he had not used the account since, and that he would contact his bank to find out what had happened. He denied any knowledge of the other bank account or the money transfers.
We cannot know for certain whether, or to what extent, he may be involved, and so we are not naming him.

The way Zora’s images were marketed is a method used by hundreds of traders around the world, our investigation found.
Posts on X use various hashtags familiar to paedophiles. The images that appear on the platform are often taken from known child abuse imagery, but may be cropped so that they are not obscene.
Elon Musk said that removing child sexual abuse material was his “absolute priority” when he took over X, then known as Twitter, in 2022.

Social media platforms in general, not just X, could do far more to stop criminals repeatedly posting in this way, says Lloyd Richardson of the C3P.
“It’s great that we can send a takedown notice [to social media platforms] and they remove the account, but that’s the bare minimum.”
The problem is that users can be back on the platforms within days with a new account, he said.
X told us it has “zero tolerance” for child sexual exploitation. “We are continually investing in advanced detection so we can quickly take action against content and accounts that violate our rules,” a spokesperson said.
The platform told us it works “closely with the National Center for Missing and Exploited Children (NCMEC) and supports law enforcement efforts to prosecute these heinous crimes”.
Telegram said: “All channels are moderated, and more than 565,000 groups and channels related to the spread of CSAM have been banned so far in 2025.”
The platform said it had more than a thousand moderators working on the issue.
“Telegram proactively monitors public content on the platform and removes objectionable material before it can reach users or be reported,” a spokesperson said.
When we told Zora that her photos were being traded on X, she had this message for the platform’s owner, Elon Musk: “Our abuse is being shared, traded and sold on the app you own. If you would act without hesitation to protect your own children, please do the same for the rest of us. The time to act is now.”
If you are affected by any of the issues raised in this report, help and support is available via the BBC Action Line.