
Disturbing Uses of AI Voice Cloning

By: Zerlina Tang

As AI technology becomes more advanced than ever, people are using it to recreate the voices of real people, living and dead. However, it is often being used poorly, giving people voices they never wanted or asked for.

At the same time, some creators posting history videos are gaining followers through this technology, and more people are becoming interested in history as a result. Cory Bradford, a TikToker with over 1 million followers who creates historical videos, said that while he normally avoids using AI in his own uploads, those who do are likely attempting to increase engagement, especially on a platform where the audience skews younger. Viewers are, in fact, more willing to watch videos about history presented this way, and audiences show a positive inclination to learn more and research the topics on their own.

However, others are using AI to make deeply disturbing videos about deceased or missing children, with an AI voice narrating the child's own story. "Hello, my name is James Bulger. If my mom turned right, I could have been alive today. Unfortunately, she turned left," a childlike voice says over an image of the boy. The British two-year-old was abducted by two ten-year-old boys, who tortured and killed him. At the time, his mother was paying for groceries.

“To use the face and a moving mouth of a child who is no longer here, who has been brutally taken away from us, there are no words,” Denise Fergus, James’s mother, told the Daily Mirror. She later told the newspaper that the posts were “disgusting.” Fortunately, TikTok has removed the videos of James narrating his own death, deeming them too disturbing for its audience. According to TikTok’s guidelines, the advancement of AI can make it more difficult to distinguish between fact and fiction, posing both societal and individual risks, and users who share synthetic or manipulated media depicting realistic scenes must include a disclaimer stating that the content is not real. But people continue to make these videos. Madeleine Beth McCann, another missing child, has also been given a voice by AI. Most disturbingly, Anne Frank has appeared in videos advertising baby clothes before describing scenes from the Holocaust. “Are you still looking for beautiful and self-designed baby clothes? Then go to the link in my bio and let yourself be surprised. And now I’ll tell you the story of Anne Frank,” the young girl “says” in German.

In the end, this use of AI technology is widely seen as deeply disrespectful to the deceased, and many are calling for it to stop.
