TikTok is not addictive, social media giant tells politicians
TikTok has denied the platform is addictive, telling members of an Oireachtas committee that users must make a “conscious choice” to scroll.

Several social media and tech companies, including TikTok, Snapchat, Microsoft, Google and Meta, which owns Facebook and Instagram, appeared before a committee meeting on the safety of children online on Thursday.

Several committee members remarked on how they had previously heard from children and teenagers about spending hours scrolling on apps, particularly TikTok. However, Richard Collard, the app’s minor safety public policy lead, denied the platform was addictive.

“We wouldn’t agree with the term ‘addictive’. That doesn’t mean we don’t take the wellbeing of children incredibly seriously,” he said.

Asked about TikTok’s autoplay feature, Collard said it is not on by default, adding that users “have to make that conscious choice to move to another [video]”. Autoplay, also known as auto-scroll, is a feature that automatically advances the user to the next video.

Sinn Féin TD Ruairí Ó Murchú told the company representatives their products were “utterly addictive”.

“It’s very difficult to accept that any of you are doing what you need to do ... my understanding of this business is you keep people on as long as possible, that’s how you generate money, and on some level, you don’t really care how that happens.”

Several committee members raised a US case in which both Meta and YouTube were found liable for deliberately designing addictive products.
Both were ordered to pay damages earlier this year after a US jury found the platforms had caused harm to a young woman.

Both Dualta Ó Broin, Meta’s head of public policy in Ireland, and Ryan Meade, public policy manager at Google, which owns YouTube, said they disagreed with the verdict and would be lodging an appeal.

Acknowledging that TikTok had reached a settlement with the woman, TikTok’s Susan Moss said the company would rather “spend our time and focus our efforts improving safety rather than in the courts”.

The committee heard that 100 million videos are uploaded to TikTok each day, about one per cent of which, or one million, are “violative”. It also heard that platforms were using artificial intelligence to detect and remove the vast majority of content that does violate rules.

Separately, the committee was told that an age verification system rolled out at app-store level, rather than through each individual social media company, would be “very flawed”.

Chloe Setter, child safety public policy manager at Google, which owns the Play Store used on Android phones, said she felt it was a “cynical attempt to shift responsibility” from social media companies, after the proposal was outlined by Meta’s Ó Broin.

Ó Broin had told the TDs and Senators that such a move would “make sense”, as it would allow for once-off verification.

“Service-level responsibility already exists, it’s enshrined under the Digital Services Act, and I cannot help but feel this is a sort of cynical attempt to shift responsibility elsewhere, to put the responsibility on to the app store,” Setter said.

Such a move would deny the responsibility that each individual platform has under law, she said, “to protect and know their users”, adding that they are “best placed” to do this.

Setter argued that such a system would be ineffective as it would not cover websites, adding there would also be privacy issues concerning the sharing of age data “across every app”.

Moss, on the other hand, said the proposal was a “prudent idea”.

Regarding the idea of a ban on social media for those aged under 16, Ó Broin said young people in Australia, where such a ban exists, are accessing “other apps” that might not have the same level of safety restrictions or regulation.