Twitter to start hiding comments from suspected 'trolls' in conversations

Daniel Fowler
May 16, 2018

Twitter has a troll problem.

It actually sounds pretty simple at first: according to a Twitter blog post Tuesday, the company will simply organize conversations on Twitter differently based on "behavioral signals" created to root out trolls in "communal areas" of the social network.

This new, automated approach is meant to hide possibly annoying or abusive tweets or replies from public conversations and search.

To weed out unhealthy contributions to conversations, the platform's algorithms and human reviewers will look for certain signals, including how often a user is blocked by people they interact with, whether they have created multiple accounts from a single IP address, and whether the account is closely related to others that have violated the company's terms of service. The effort builds on Twitter's latest initiatives, which involve working with outside researchers to determine what constitutes a healthy conversation on the service.

"Now, we're tackling issues of behaviors that distort and detract from the public conversation in those areas by integrating new behavioral signals into how Tweets are presented," the company wrote in the blog post.

Flagged content will remain on Twitter but will only be shown if people click on "Show more replies" or choose to see everything in their searches. While these accounts apparently make up less than one percent of Twitter accounts, the platform maintains that this portion of users still significantly affects the online experience.
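To make the mechanism concrete, here is a minimal, purely hypothetical sketch of how signals like these could be combined to rank replies. The signal names, weights, and threshold below are invented for illustration; Twitter has not published its actual model, and the real system also involves human review.

```python
# Hypothetical sketch of ranking replies by behavioral signals.
# All field names, weights, and thresholds are illustrative assumptions,
# not Twitter's actual implementation.
from dataclasses import dataclass

@dataclass
class AccountSignals:
    block_rate: float       # fraction of interactions that end in a block
    accounts_on_ip: int     # accounts created from the same IP address
    linked_violations: int  # closely related accounts with terms-of-service violations

def troll_score(s: AccountSignals) -> float:
    # Weighted sum of signals; higher means more likely to be disruptive.
    # Weights here are arbitrary placeholders.
    return (5.0 * s.block_rate
            + 0.5 * max(0, s.accounts_on_ip - 1)
            + 1.0 * s.linked_violations)

def partition_replies(replies, threshold=1.0):
    # Replies scoring below the threshold stay visible in the conversation;
    # the rest go behind a "Show more replies" control. Nothing is deleted.
    visible = [r for r in replies if troll_score(r["signals"]) < threshold]
    hidden = [r for r in replies if troll_score(r["signals"]) >= threshold]
    return visible, hidden
```

The key design point the article describes is reflected here: content is demoted rather than removed, so a reader who clicks through still sees everything.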

Harvey and Gasca said that Twitter is now factoring in many new signals, most of which are not visible externally.

"These signals will now be considered in how we organise and present content in communal areas like conversation and search," they said.


"People contributing to the healthy conversation will be more visible in conversations and search", it said.

Twitter says that abuse reports were down 8% in conversations where this feature was being tested.

But it adds: "Our work is far from done. This technology and our team will learn over time and will make mistakes."

"We want to take the burden of the work off the people receiving the abuse or the harassment," Dorsey said in a briefing with reporters. "There will be false positives and things that we miss; our goal is to learn fast and make our processes and tools smarter."

Twitter executives Harvey and Gasca said that the initiative is part of an ongoing attempt "to improve the health of the public conversation on Twitter".
