TikTok to diversify its ‘For You’ feed, let users pick the topics they want to avoid
[QUOTE="Sarah Perez, post: 4555"] TikTok [URL='https://newsroom.tiktok.com/en-us/an-update-on-our-work-to-safeguard-and-diversify-recommendations']announced[/URL] this morning it’s taking a new approach to its “For You” feed — the short-form video app’s main feed, powered by its algorithmic recommendations. The company [URL='https://techcrunch.com/2020/06/18/tiktok-explains-how-the-recommendation-system-behind-its-for-you-feed-works/']already detailed[/URL] how its algorithm works to suggest videos based on users’ engagement patterns within its app, but admits that too much of a particular content category can be “problematic.” The company now says it’s working to implement new technology to interrupt “repetitive patterns” on its app and is also developing a tool that would allow users to have a say in the matter by letting them pick which topics they want to avoid. The company explains in its announcement that “too much of anything, whether it’s animals, fitness tips, or personal well-being journeys, doesn’t fit with the diverse discovery experience we aim to create.” However, TikTok isn’t diversifying its algorithm because people are complaining of seeing one too many cute puppy videos — it’s doing so because regulators are cracking down on tech and questioning the harmful impacts of unchecked recommendation algorithms, particularly when it comes to teen mental health. [URL='https://techcrunch.com/2021/09/30/facebook-grilled-in-senate-hearing-over-teen-mental-health/']Facebook[/URL] and [URL='https://techcrunch.com/2021/12/08/instagrams-adam-mosseri-senate-hearing-teen-safety/']Instagram[/URL] execs, along with those from [URL='https://techcrunch.com/2021/10/26/tiktok-snap-youtube-hearing-congress/']other social platforms[/URL], have been hauled into Congress and questioned about how their apps have been directing users to dangerous content — including to topics like pro-anorexia and eating disorder content, for example. 
TikTok's announcement mentions the types of videos that could be harmful if viewed in excess, including "extreme dieting or fitness," "sadness," and "breakups"-themed videos. While a user who shows interest in videos of this nature may find them engaging, the algorithm isn't yet smart enough to know that repeatedly directing the user to more of the same could actually do harm. This problem is not limited to TikTok, of course. Across the board, it's becoming clear that systems designed only to increase user engagement through automated means will do so at the expense of users' mental health. While Congress is currently most interested in how these systems impact young people, some studies, [URL='https://www.wired.com/story/not-youtubes-algorithm-radicalizes-people/']though debated[/URL], have indicated that unchecked recommendation algorithms may also play a role in [URL='https://www.nytimes.com/interactive/2019/06/08/technology/youtube-radical.html']radicalizing[/URL] users who could be drawn to extreme views. TikTok says it will also test new ways to avoid recommending a series of similar videos when users watch and engage with content in these potentially harmful categories, though it offered only examples of the types of videos it would limit, not a full list. In addition, the company said it's developing technology that will help it recognize when a user's "For You" page isn't very diverse. While the user may not be watching videos that actually violate TikTok's policies, the company said that viewing "very limited types of content…could have a negative effect if that's the majority of what someone watches, such as content about loneliness or weight loss." Another planned feature would let people direct the algorithm themselves by choosing words or hashtags associated with content they don't want to see in their "For You" feed.
This would be in addition to TikTok's existing tools for flagging videos you don't like, such as tapping "Not Interested." To be clear, TikTok's announcement today only lays out a roadmap of its plans; it is not the actual launch of these changes and features. Rather, it's an attempt to hold off regulators from further investigations into its app and its potentially harmful effects. Its strategy was likely informed by the types of questions asked both during [URL='https://techcrunch.com/2021/10/26/tiktok-snap-youtube-hearing-congress/']its own Congressional hearing[/URL] and those of its rivals. TikTok notes that the actual implementation could take time and iteration before it gets things right. "We'll continue to look at how we can ensure our system is making a diversity of recommendations," the company noted. [/QUOTE]