
AI Storytelling Gone Too Far

By: Alan Chen

Content creators on TikTok have been making AI-generated storytelling videos, and it’s hard to tell the difference between what was made by AI and what wasn’t. Some of this content includes inappropriate topics, such as racism, discrimination, profanity, and violence. People may have made these videos to seem “cool.” Platforms such as TikTok have banned harmful AI content.

Sometimes when you’re on a platform such as Instagram, you find a post that seems strange. It shows a person talking, but what they’re saying is odd or inappropriate, or their face looks off. These videos are made with the help of AI, or artificial intelligence. This type of content is often called AI storytelling.

Alisha Aurora is a 17-year-old TikToker. She first saw AI storytelling in a video of a famous basketball player telling his life story. She said, “I actually thought it was real. It’s hard to draw the line between what’s real and what’s fake.” It is hard to tell what is real and what is fake, but if you look carefully enough, you can spot the difference. This means people can impersonate others, for good reasons or for bad.

AI storytelling uses videos of fake people who look and act like real, living people. They can feature AI-generated avatars that look almost exactly like real people. The storytelling can cover a variety of things, from fake people telling fictional stories to important historical figures telling their own.

One AI story with more than a million views features the famous deceased rapper Tupac Shakur telling his story. Another video shows a boy talking about a Fortnite game gone wrong. Many million-view videos contain disturbing or inappropriate content, such as s*x, racist stereotypes, and other harmful material. There is a lot of inappropriate content on platforms like TikTok, Instagram, Snapchat, and others.


Source: CBC News
