
On August 18, 2025, the content creator and influencer Anyme (alias Anyme023), 22 years old, announced on Instagram that he is putting his social media career on hold. Beyond the news about one of the most-followed French streamers, the pause raises questions about how platforms regulate creators' work in France and the EU: mental health, the burden of moderation, transparency of partnerships, and the application of the relevant texts (DSA, SREN) under the watch of ARCOM.
The Facts
On August 18, 2025, the creator Anyme (alias Anyme023), whose real name is Maxence Turlot (22 years old), announced on Instagram that he is putting his social media career "on hold." The message specifies neither a duration nor a reason. He has several million followers across TikTok, Twitch, YouTube, and Instagram. During the first half of 2025, he was among the most-watched French-speaking streamers on Twitch, with peaks of over 100,000 live viewers according to public data from his streams. The announcement comes as he was expected to take part in the GP Explorer in early October 2025.
The story matters to the general interest because it raises questions about the health of creators, the moderation practices of platforms, the transparency of sponsored content, and the legal framework in France and the European Union.

Intensive Production and Continuous Exposure
Streaming platforms like Twitch and short-video platforms like TikTok rely on engagement algorithms that reward frequency and duration of publication. This model encourages an intensive pace of creation, especially live, where the audience interacts in real time. Several social science studies describe forms of exhaustion linked to audience pressure, algorithmic uncertainty, and monetization that depends on visibility. Creators simultaneously bear a moderation burden (managing the chat, filtering comments), a task for which they often rely on tools and volunteer teams.
Among young audiences, who are very present on these services, parasocial relationships reinforce the expectation that a creator will be almost permanently available. The pause of a leading figure thus exposes the fragile balance between the model's appeal and its human costs.
FR-EU Framework: DSA, ARCOM, and SREN
The Digital Services Act (DSA) imposes obligations on online platforms for transparency, reporting, and removal of illegal content, as well as accessible appeal procedures. Services exceeding 45 million monthly users in the EU are classified as very large online platforms (VLOP) and are subject to independent audits and systemic risk assessments (disinformation, threats to child protection, potential impacts on mental health).
In France, the SREN law (May 21, 2024) designates ARCOM as the digital services coordinator for the application of the DSA, alongside entities such as the CNIL (personal data) and the DGCCRF (commercial practices). Users can contact ARCOM in cases of alleged non-compliance. Twitch is not classified as a VLOP at the European level; it falls under the standard regime of the DSA, which nevertheless requires transparency reports, notification mechanisms, and reasoned decisions in case of moderation.
Key Points:
- Reporting and notice-and-action: possibility to report clearly illegal content and the right to be informed of the decision.
- Transparency: regular publication of figures (moderation, appeals) and elements on the applicable rules.
- Appeals: internal complaint channels, certified out-of-court dispute settlement, and, as a last resort, the courts.
Moderation: Responsibilities and Limits
On its Safety Center, Twitch describes policies against harassment and hateful conduct, automoderation systems, and a gradation of sanctions. In practice, the burden also falls on creators: stream settings, chat management, handling raids. Legal experts note that platforms are subject to a reinforced obligation of means: acting promptly after notification, documenting the measures taken, and notifying the user concerned. Media sociologists highlight the value of shared moderation (a dedicated team, public rules, alert tools) and stress the need for minimal training of creators on these issues.
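To make the idea of automoderation with a gradation of sanctions concrete, here is a minimal sketch of a keyword-based chat filter with an escalating sanction ladder. It is purely illustrative: the patterns, thresholds, and sanction names are hypothetical, and real systems (including Twitch's AutoMod) rely on far richer, context-aware classification.

```python
import re
from collections import defaultdict

# Hypothetical blocked-term patterns; a real moderation system uses
# much larger, context-aware rule sets and machine-learned classifiers.
BLOCKED_PATTERNS = [re.compile(p, re.IGNORECASE)
                    for p in (r"\bslur1\b", r"\bslur2\b")]

class ChatModerator:
    """Toy auto-moderator with a graduated sanction ladder:
    first hit warns, second times out, third and beyond bans."""

    def __init__(self):
        # Strike count per user, starting at zero.
        self.strikes = defaultdict(int)

    def check(self, user: str, message: str) -> str:
        """Return the action to take for this message."""
        if any(p.search(message) for p in BLOCKED_PATTERNS):
            self.strikes[user] += 1
            if self.strikes[user] == 1:
                return "warn"
            if self.strikes[user] == 2:
                return "timeout"
            return "ban"
        return "allow"
```

In this sketch, a clean message returns "allow", while repeated violations by the same user escalate from "warn" to "timeout" to "ban", mirroring the graduated approach described above.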
Sponsorship and Transparency: The Law of Commercial Influence
Since the law of June 9, 2023 regulating commercial influence, any content promoting a product must carry a clear label, such as "Advertisement" or "Commercial Collaboration," and be backed by a contract. This obligation was supplemented by the ordinance of November 6, 2024. Some practices are prohibited (e.g., promotion of cosmetic surgery); others are strictly regulated (financial products, gambling). The DGCCRF can sanction violations, with brands being co-responsible. For young audiences, the readability of these labels is a matter of trust and media literacy.

Charitable Events: Civic Role and Safeguards
Charity marathons like ZEvent have demonstrated the streaming ecosystem's ability to mobilize donations and massive audiences. These formats help structure communities and raise awareness for associations. They cannot replace regulation, however: organizers and platforms remain bound by common law, including combating harassment, respecting dignity, and fulfilling moderation obligations. Internal charters and partnerships with specialized NGOs can usefully complement the legal framework.
Citizen Effects: Information, Radicalization of Exchanges, Youth
Information. Live streaming can serve as a supplementary medium for explanations, culture, and popularization. It offers interactivity, but real-time verification is difficult and errors spread quickly. The traceability of sponsored content and the clarification of moderation rules contribute to the quality of information.
Radicalization of exchanges. The direct address and the dynamics of the chat foster affinity communities. Research on parasocial interactions shows that they can strengthen adherence, but also harden oppositions when contradiction is limited. The DSA aims to prevent these systemic risks through evaluation and mitigation measures.
Youth. Adolescent and young adult audiences benefit from spaces for socialization and mutual aid, but they remain exposed to cyberbullying, sleep disorders, and excessive use. The SREN-DSA pair provides safeguards; their effectiveness requires media education policies involving schools, families, and communities.
Public Policy Paths
- Prevention of exhaustion: encourage sustainable rhythms (public schedules, breaks), promote co-moderation, and give creators access to professional counseling services.
- Access to appeals: make the ARCOM desk operational for notifications and DSA appeals, with standardized information and clear deadlines.
- Advertising transparency: generalize the integrated labeling "commercial collaboration" in live interfaces and open consultable archives of sponsored content.
- Opening to researchers: facilitate secure access to anonymized data (chats, moderation measures) to evaluate the effects of the systems.
- Media education: integrate modules on parasocial relationships, live advertising, and moderation in EMI courses.
- Support for charitable events: condition sponsorships on anti-harassment charters, team training, and minimal public reporting.
Issues and Continuation
Anyme's pause takes place in a context where online creation carries real weight in the public space. It raises a simple question: do platforms and public authorities have suitable tools to protect users, both creators and audiences, without hindering innovation? The answer lies in the application of the existing texts (DSA, SREN, the "influencers" law) and in cooperation between platforms, authorities, and communities.
For reference:
- DSA: obligations of transparency, risk assessment, and appeals.
- ARCOM: coordinator for the application of the DSA in France.
- SREN: law no. 2024-449 of May 21, 2024.
- "Influencers" law: June 9, 2023, supplemented on November 6, 2024.
- Twitch: public moderation rules and transparency reports.
- ZEvent: notable charitable role, complementary to law.