Facebook says it will start removing posts on its site that it believes could spark violence.
The move is a response to criticism that the spread of rumors and false stories on its platform has led to people suffering physical harm in countries around the world, Dow Jones reported.
Facebook’s approach to misinformation has focused on suppressing the reach of such content on the platform — without entirely scrubbing it.
The social networking company has faced questions about being a source for false information that can inflame societal tensions.
As for policing, the company says it will rely on local organizations to decide whether specific posts contain false information and could lead to physical violence.
If so, the posts will be removed.
A Facebook spokeswoman said the company will begin in a couple of global hotspots, first in Sri Lanka and later in Myanmar, two countries where some people and groups have used Facebook to spread rumors that ultimately have led to physical violence.
The company has yet to decide who its partners will be and what the criteria are for becoming one.
Earlier this month, India’s government rebuked the Facebook-owned messaging service WhatsApp for allowing rumors and false reports to circulate on its service after a series of deadly attacks on victims mistakenly accused of kidnapping children, according to Dow Jones.