An image of the late actor Alain Delon has been viewed more than 2 million times on social networks. An AFP journalist recently noted that scammers used it to lure internet users to an online casino.

An account on X dedicated to fighting online scams revealed this information. The video exploits the identity of the recently deceased actor, featuring his voice and image.
An imitation of the actor in the video says, “If you’re watching this video, I’m already dead. I’ll give you 100,000 euros if you can’t win in my online casino.”
AFP journalists recently noted that the fake ad has since been deactivated. However, alternative videos featuring the actor and redirecting to the casino remain online.
Meta reminded users that publishing advertisements with misleading images of public figures, a practice intended to scam people, violates the platform’s rules.
The social media giant explained that it created models specifically to detect viral content using images of celebrities to better combat such issues.
Such faked content generated with AI, or artificial intelligence, is known as deepfakes. Deepfakes are proliferating on the internet, fueling a wave of disinformation that has affected many public figures, such as the pop star Taylor Swift, and raising concern ahead of electoral deadlines like the American presidential election.
Elon Musk faced a barrage of criticism after sharing a deepfake last July on X (formerly Twitter) with his 192 million followers.
It showed Kamala Harris, with a voice mimicking the Democratic candidate, describing President Joe Biden as failing and accusing him of not knowing how to run the country.
The death of Alain Delon at the age of 88 sparked a wave of reactions and tributes around the globe. The actor’s passing made the front page of numerous foreign dailies, from Italy to Japan to the United States.