May 28, 2024

TikTok Under Watchdog Lens for Providing Inaccurate Information

UK communications regulator Ofcom has launched a probe into TikTok for allegedly supplying “inaccurate” information about parental controls on the platform. The watchdog is now investigating whether the popular Chinese-owned video-sharing platform has violated the Communications Act 2003.

Ofcom reportedly served TikTok with a legal request for information regarding its family pairing system. The information TikTok provided made its way into an Ofcom report published on December 14. However, the regulator says it has “reason to believe that the information it provided was inaccurate”.

Technical Glitch, Claims TikTok

TikTok, however, blamed the issue on a technical glitch that may have resulted in Ofcom being supplied with inaccurate data in response to its request for information. The company added that it had detected the technical issue several weeks earlier and had already notified Ofcom.

It was TikTok’s move to inform the watchdog that led to the probe in the first place, TikTok claimed.

Ofcom scrutinized Snap, Twitch, and TikTok while compiling its child safety report. All three companies were asked to provide information on how they comply with legal requirements to protect minors from videos that could harm their “physical, mental, or moral development”.

The regulator found that, despite the child protection measures in place, children remain vulnerable to encountering such harmful videos on all three platforms.

It was when Ofcom asked TikTok about its parental control system that the inaccuracy issue arose.

Known as “Family Pairing”, the parental control system was first introduced in 2020. The feature lets parents link their accounts with their kids’ and manage settings such as screen time, content filtering, privacy, and direct messages. Paired users under 18 can deactivate the pairing, but their parents are notified when they do.

“The available evidence suggests that the information provided by TikTok in response to the notice may not have been complete and accurate.” – Ofcom

However, the watchdog added that it might update the report if TikTok provides it with more accurate information.

Ofcom Findings Indicate Alarming Shortcomings of Child Safety Measures

According to the regulator’s child safety report, more than 20% of children aged 8 to 17 have an online profile that lists their age as 18 or over. Additionally, a third of children under 15 have accounts with a user age of 16 or older.

Highlighting these alarming statistics, Ofcom questioned whether a policy of simply asking users to declare their age at sign-up, without verifying it, is adequate.

While users must be at least 13 to use TikTok, Twitch, or Snapchat, younger kids can easily join by faking their age.

The platforms have deployed methods to detect underage users, including both human moderators and artificial intelligence. However, the available data is insufficient to determine the actual number of underage users, the report said.

The report also pointed out how OnlyFans, a platform known for its explicit content, stands in stark contrast: users must pass facial age estimation, ID checks, and other verification steps before they are granted access.

Ofcom expressed concern over Twitch offering open access to all content, meaning anyone, including children, can view any video on the platform – even those rated mature. While content warnings are in place, they can simply be dismissed.

Unlike TikTok and Snap, Twitch doesn’t offer parental control features. Instead, the platform’s terms and conditions require parents to supervise their children in real time.

While the state of things doesn’t look very promising, the firms will soon be required to comply with the recently passed Online Safety Act, which is intended to protect children from harmful content on social media.

