Jean Pormanove’s death live on Kick: what we know, who’s accountable, how to fix it

Death live on Kick: contested moderation, with the DSA and Arcom at the forefront. The platform promises safeguards. Adil Rami denies any involvement, according to his lawyer.

The death of Jean Pormanove, 46, live on the Kick platform during the night of August 17 to 18 in Contes (Alpes-Maritimes), has shaken hundreds of thousands of internet users and reignited a sensitive public debate: what was known, who should have acted, and when? While the autopsy rules out third-party involvement, the case tests platform moderation, the DSA framework, and our role as spectators.

Established Facts

During the night of August 17 to 18, 2025, Jean Pormanove, whose real name was Raphaël Graven, died at age 46 in Contes (Alpes-Maritimes) during a livestream on the Kick platform. The Nice prosecutor's office indicates that the autopsy revealed no internal or external traumatic injury, and rules out third-party involvement. The probable causes are therefore medical or toxicological; additional analyses (toxicology, anatomical pathology) are underway.

Earlier videos, widely circulated, show the deceased being insulted, slapped, strangled, and hit by paintball shots. These scenes were presented as "challenges" or "staging". A second man, nicknamed Coudoux, who appears visibly vulnerable, also features in them. The courts have opened an investigation, seized equipment, and interviewed several people.

Clara Chappaz, Minister Delegate for Artificial Intelligence and Digital Affairs, denounces an "absolute horror" and refers the matter to Arcom (the French regulator of media services and platforms). Kick promises to review its moderation rules for France and to cooperate with the authorities.

Why This Case Shocks

Monetizing Vulnerability

Turning real or staged suffering into paid entertainment creates a market where the integrity of others becomes an audience variable. Donation tiers, subscriptions, paid "challenges": the more a sequence shocks, the more attention it captures, the more it earns. This buzz logic aligns the economy with the exposure of vulnerability, a mechanism reminiscent of the film 'Death Watch' (1980).

Consent or Coercion?

In these formats, consent is often invoked. However, ethics distinguishes between informed consent and consent under pressure (economic, psychological, social). Saying "yes" to survive or gain visibility does not guarantee a free choice. Two simple questions: who gains what, and who bears the human risk?

Platforms: From Rule to Reality

In the European Union, the Digital Services Act (DSA) requires platforms to:

  • Assess and mitigate systemic risks (violence, violations of dignity, harassment, health effects);
  • Offer effective reporting procedures and reasoned responses;
  • Publish transparency reports and give authorities access to certain data;
  • For very large platforms, undergo independent audits and offer a recommender option that does not rely on profiling.

In France, Arcom is the Digital Services Coordinator: it oversees the application of the DSA, issues warnings where necessary, can impose sanctions, and cooperates with the European Commission. The SREN law (May 21, 2024) complements this framework (transparency, enhanced reporting, account-suspension penalties after conviction).

The key question is not limited to "could these videos legally remain online?" It is rather: "was everything done, early enough, to prevent the harm?"

In practice, a design responsibility is required: real-time detection of violence, distress, and humiliation; the ability for a human operator to cut a stream; duration safeguards such as livestream limits and mandatory breaks; clear bans on certain challenges (strangulation, point-blank shooting, substance ingestion); and traceability between staged scenes and real ones.

A Chain of Responsibilities, Beyond Individuals

A livestream is an ecosystem. Besides the hosts, there are co-streamers, managers, technicians, communities, and sometimes sponsors. At what point does inaction become moral complicity? When a humiliating challenge is profitable, everyone benefits (visibility, income) except the exposed person. Responsibility does not stop at the medical cause of death: it implicates those who profited from the content.

The Public, a Decisive Actor

The public is not a bystander. Donations, tips, subscriptions, comments, shares: each gesture encourages and normalizes. Authorities have pointed to the role of massive audiences (up to 200,000 people connected) in the escalation. A simple principle: do not feed the machine; report instead of relaying; refuse to spread traumatic clips.

Adil Rami denies any involvement with Kick and this content. Through his lawyer, he maintains he was never a viewer, participant, or promoter, despite his live shout-out to 'Naruto'. He says he cares about JP and offers his condolences, but his appearances with the collective continue to fuel doubts about his role.

Kick, Stake, and the Attention Economy

Kick was launched in 2022 by Ed Craven and Bijan Tehrani, also co-founders of Stake, an online casino that notably accepts cryptocurrencies. This relationship fuels the idea of a model where shocking engagement pays. Kick claims more lenient moderation and a generous revenue share. Hence the challenge: what ethical guarantees exist when attention is the main currency?

Dignity and Disability: Red Lines

Exposing a vulnerable or disabled person to amuse or shock offends dignity and non-discrimination. The argument that "everything is scripted" is not enough: staged violations can cause real harm, both to those involved and to spectators, through the trivialization of violence, desensitization, and imitation.

Informing Without Harming in the Face of a Live Death

Journalistic and civic ethics dictate: do not republish images of the death; describe rather than show; contextualize rather than chase buzz. Individuals are presumed innocent as long as the investigation is open. However, an evaluation of systemic responsibility (platforms, algorithms, the attention economy) is needed now.

Concrete Steps to Act

For Spectators

  • Do not fund escalation: avoid donations conditioned on violence, refuse degrading challenges, stop sharing shocking clips.
  • Systematically report content using platform tools, and alert the competent authorities when life or physical integrity is at stake.
  • Adopt digital-hygiene reflexes: who pays what? who benefits? what is the human risk?

For Creators and Co-Streamers

  • Code of conduct: no challenge that harms dignity or health, right to withdraw without pressure, and emergency stop at any time.
  • Operational safeguards: impose a maximum livestream duration, plan mandatory breaks, and designate a well-being referent to ensure compliance with the rules; ban dangerous challenges outright (strangulation, point-blank shooting, substance ingestion, prolonged deprivation).
  • Traceability of "staged" vs real scenes: disclaimers are not enough; the platform must be able to cut.

For Platforms

  • Proactive real-time detection (violence, distress, humiliation, endangerment) with a human kill-switch.
  • Responsible creator pathway: verifications, training, anti-fatigue thresholds (long livestreams), alert channel for teams and the public.
  • Transparency: publish rule enforcement reports, communicate on responses to reports and on delistings.
  • Regulatory cooperation: comply with the DSA (risk assessment/mitigation, data access for authorities), under the aegis of Arcom.

For Public Authorities and Schools

  • Enhanced media education: mechanics of attention, live economy, peer pressure, consent.
  • Better-equipped emergency procedures: direct platform-authority-emergency-services contacts for risky livestreams.
  • Regular evaluation of whether the framework (DSA, SREN) fits the specificities of continuous, cross-border livestreaming.

The tragic end of a fragile man. How far will the voyeurism of an increasingly sadistic and mercantile humanity go?

Start from the Facts, Analyze the Mechanisms

Stay as close as possible to the established facts: a death without an identified traumatic cause, ongoing investigations, questions about moderation. Then analyze the incentives and power dynamics that make this type of content possible and monetizable (economic model, algorithms, communities, audiences).

Caution is required on individual criminal responsibility as long as the investigation is open. Systemic responsibility, by contrast, can and should be debated now: platforms, the attention economy, service design, and audience behavior.

This article was written by Christian Pierre.