By: Grace Liu
Recently, on social media platforms, especially TikTok, A.I. has been used to recreate the voices of dead or missing children. While these posts can raise awareness, they can also spread misinformation and distress grieving families.
In one TikTok video, an image of James Bulger, a 2-year-old who was abducted and murdered, narrates his own story. "If my mom turned right, I could have been alive today.
Unfortunately, she turned left," the child's voice says, explaining that if his mother had turned right, she would have seen her son being led away by the two 10-year-olds who tortured and killed him.
James’ mother, Denise Fergus, told the Daily Mirror that the posts were “disgusting.” “To use the face and a moving mouth of a child who is no longer here, who has been brutally taken away from us, there are no words,” she said.
Other videos have been made about Madeleine Beth McCann, Peter Connelly, and Gabriel Fernandez, all children who went missing or were abused and killed.
TikTok has been removing these videos, but thousands of people had already viewed them before they were taken down, and some remain available on YouTube.
A.I. software can mimic children's voices, but the voices are computer generated, not based on recordings of the real children.
The content creators who post these videos often aim for large numbers of views and comments; according to research, emotional content is shared more widely. These videos drew many comments: some people left messages and tributes to the children, while others criticized the creators for posting the material.
Hany Farid, a professor at the University of California at Berkeley, noted that such posts drive engagement on social media because they are "morbid and sensational." Farid said the images of the children fall into "the general category of deep fakes, now more commonly being called generative AI."
Cory Bradford, a history TikToker, said he avoids using A.I. in his posts, and that those who do use it are most likely seeking attention and views. He said the technology to recreate realistic historical images is not "quite there" yet.
“It's getting there,” Bradford said. “And it will be there soon. And we all have to deal with the consequences of what that’s going to lead to.”
Link to article: https://eb18600f7bb2916037f5ee8e636ce199.cdn.bubble.io/f1691947548972x218641736927466460/AI%20social%20media%20videos%20depict%20missing%2C%20dead%20children%20narrating%20their%20stories%20-%20The%20Washington%20Post.pdf