Researchers warn of coordinated network amplifying child sexual abuse material on X


European researchers say they have found a coordinated network selling and distributing sexually explicit images of children on the social media platform X.

The organisation Alliance4Europe found at least 150 accounts sharing child sexual abuse material (CSAM) on the platform over a four-day period in July. According to the report's estimates, the coordinated network began operating around May 17.

Researchers believe the network's content was shared with "millions" and that "business continued untouched" on X, which is owned by billionaire Elon Musk.

The report comes after a US court last week revived part of a lawsuit against X that accuses the company of negligence for allegedly failing to promptly report a video containing explicit images of young boys to the National Center for Missing and Exploited Children (NCMEC) in the United States.

In the European Union, meanwhile, debate continues over how to tackle the flood of child sexual abuse material online while respecting people's digital privacy rights.

Amplified posts drove users to buy child sexual abuse content

According to the analysis, the criminal network flooded specific hashtags associated with child sexual abuse content, which was then amplified or spread using new accounts. These accounts commented on or reposted the content to boost engagement.

The hashtags served as "hubs" of child sexual abuse material, "making it easy to discover other flooded hashtags and new CSAM accounts," the report found.

Some posts were extremely graphic. They included videos depicting children being subjected to sexual abuse or explicitly posed, the report found.

Many of the shared posts included links to external forums or chat groups, dating sites, or sites selling folders of child sexual abuse content.

One of the amplified pages was linked to an active Bitcoin wallet address that had received $660 (€573) across 23 transactions, which, according to the report, "may confirm" that people are paying for access to the content.

"New accounts are constantly being created"

After the researchers reported two of the network's first posts to X, they said, the platform began quickly removing pieces of the content.

X also began blocking users it believed to be minors from accessing the content. Researchers found this hindered, but did not stop, the operation.

"New accounts are constantly being created, which even indicates some automation, providing constant access to CSAM content," the report said.

Researchers said X's "whack-a-mole" approach to removing illegal content may actually facilitate the spread of these posts and complicates the collection of evidence.

"Zero tolerance for child sexual exploitation"

Euronews Next contacted X for comment on Alliance4Europe's findings and to ask whether the company has made any recent changes to how it handles CSAM on its platform. We did not receive an immediate reply.

However, in June, X's safety team said in a post that it has "zero tolerance for child sexual exploitation in any form."

The platform said it had rolled out new CSAM hash-matching technology, allowing its team to "quickly and safely compare media content" without sacrificing user privacy.

Hashing is a technique in which an algorithm creates a unique digital fingerprint of a file in a computer system; comparing that fingerprint against fingerprints stored in a database is known as hash matching.
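As a rough illustration of the principle only (not X's actual system, whose implementation has not been disclosed), exact hash matching can be sketched in a few lines of Python, using a hypothetical database of known fingerprints:

```python
import hashlib

# Hypothetical database of fingerprints of known files (hex digests).
# In a real system this would hold hashes supplied by clearinghouses.
known_hashes = {
    hashlib.sha256(b"example-known-file-bytes").hexdigest(),
}

def fingerprint(data: bytes) -> str:
    """Create a fixed-length digital fingerprint (hash) of a file's bytes."""
    return hashlib.sha256(data).hexdigest()

def matches_known(data: bytes) -> bool:
    """Hash matching: compare a file's fingerprint against the database."""
    return fingerprint(data) in known_hashes

print(matches_known(b"example-known-file-bytes"))  # True: fingerprint is in the database
print(matches_known(b"some-other-media"))          # False: unknown file
```

Note that an exact cryptographic hash like SHA-256 changes completely if even one byte of the file differs; production media-matching systems typically rely on perceptual hashing, which is designed so that re-encoded or slightly altered copies of an image still match.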

When X finds child sexual abuse content, it works "quickly" to suspend the account and report it to NCMEC in the US. In 2024, X said, it sent 686,176 reports to the center and suspended 4.5 million accounts for CSAM.

This led to 94 arrests and one conviction, the platform added. NCMEC confirmed these figures as the latest available in a statement to Euronews Next.

"We have invested in advanced technologies and tools to prevent bad actors from distributing, searching for, or engaging with exploitative content across all forms of media on X," the company said in June.
