

Social media app TikTok removes Islamic State propaganda videos

  • Writer: sn pubs
  • Oct 23, 2019
  • 2 min read

Updated: Oct 16, 2024



Social media app TikTok has taken down accounts that posted propaganda videos for the Islamic State group in the latest scandal to hit the popular platform. TikTok, which is owned by Chinese firm ByteDance, claimed some 500 million users globally last year, making it one of the most popular social apps.


The videos featured corpses being paraded through streets and Islamic State fighters with guns, according to the Wall Street Journal, which first reported the story on Monday.


The Journal said the posts were from about two dozen accounts, which were identified by social media intelligence company Storyful. “Content promoting terrorist organizations have absolutely no place on TikTok,” the company said in a statement emailed to AFP.


“We permanently ban any such accounts and associated devices as soon as identified, and we continuously develop ever-stronger controls to proactively detect suspicious activity,” it said.

The TikTok platform, which allows users to create and share 15-second videos, is particularly popular with teenagers. “Unlike other platforms, which are centred around users’ friends or communities, TikTok is based on engaging with a never-ending stream of new content,” said Darren Davidson, the editor-in-chief of Storyful.


“The ISIS postings violate TikTok’s policies, but the sheer volume of content makes it difficult for TikTok to police their platform and root out these videos,” he said.


The app has been marred by controversy in recent months. In April, TikTok was briefly banned by an Indian court over claims it was promoting pornography among children.


The app is banned in neighbouring Bangladesh and was hit with an enormous fine in the United States for illegally collecting information from children. The company has denied the allegations, saying it abides by local privacy laws.


Like any other social media platform, TikTok carries the risk of inappropriate behaviour, especially where minors are involved. Its parent company, ByteDance, does work to remove videos that violate the Terms of Use, but that can only go so far. In recent months, many social media sites have taken steps to protect young people on their platforms; YouTube, for example, has disabled comments on videos made by minors to shield them from predators.


Hence, I feel only time will tell how well TikTok can fend off these threats and step up measures to make the app a safer environment. While TikTok can remove videos that violate its guidelines, the sheer number of videos makes it almost impossible to filter content completely, so I feel users should be discerning about the content they consume and refrain from watching clips with harmful material. Likewise, users who are too young should have parental consent before creating any videos themselves, and parental supervision over the videos they watch.


Reference:


Lynn Tan

3 Justice


