Facebook, Others Create Terrorist Content Database


Facebook, Microsoft, Twitter, and YouTube are creating a joint database of digital fingerprints for violent terrorist imagery that they have removed from their platforms.

The companies announced Monday that they would share these fingerprints with one another, with the goals of identifying potential terrorist content more efficiently and curbing its spread online.

“There is no place for content that promotes terrorism on our hosted consumer services,” the companies said in a joint blog post. “When alerted, we take swift action against this kind of content in accordance with our respective policies.”

The companies will begin by adding to the database images and videos that represent the most “egregious” terrorist material, content most likely to violate all of the companies’ content policies. Other participating companies can use the information to identify similar content on their platforms, review it against their own policies, and decide whether to remove the material.
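In outline, the workflow is hash-and-match: one platform removes a piece of content, contributes its fingerprint to the shared database, and other platforms check new uploads against that set. The sketch below illustrates the idea only; the function names are hypothetical, and it uses SHA-256 (which matches only byte-identical files) as a stand-in, whereas production systems use perceptual hashes such as Microsoft’s PhotoDNA that survive re-encoding and resizing.

```python
import hashlib

def fingerprint(media_bytes: bytes) -> str:
    """Return a hex digest serving as the content's digital fingerprint.

    SHA-256 is a simplification: it only matches byte-identical files,
    while real fingerprinting tools match visually similar media.
    """
    return hashlib.sha256(media_bytes).hexdigest()

def flag_if_known(media_bytes: bytes, shared_hashes: set[str]) -> bool:
    """Check an upload against the shared database of removed content.

    A match only flags the item for review against the platform's own
    policies; it does not automatically remove it.
    """
    return fingerprint(media_bytes) in shared_hashes

# One platform removes a video and contributes its fingerprint...
shared_hashes = {fingerprint(b"<removed video bytes>")}

# ...another platform checks new uploads against the shared set.
print(flag_if_known(b"<removed video bytes>", shared_hashes))    # True
print(flag_if_known(b"<different video bytes>", shared_hashes))  # False
```

Note that, as the article describes, a database hit is only a signal: each company still applies its own definition of terrorist material before removing anything.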

One day after the database was announced, the European Commission pushed U.S. companies to do more to curb hate speech online. The commission published a report Tuesday finding that only 40 percent of the roughly 600 posts flagged to Silicon Valley companies as perceived hate speech were reviewed within 24 hours, and that about a quarter of those posts were removed from the sites.

Facebook, Microsoft, Twitter, and YouTube will decide independently which of the materials posted to their platforms should be added to the database by using their own definitions of terrorist material. The companies plan to remove any personally identifiable information attached to the content before putting it in the database.

“Throughout this collaboration, we are committed to protecting our users’ privacy and their ability to express themselves freely and safely on our platforms,” the companies stated. “We also seek to engage with the wider community of interested stakeholders in a transparent, thoughtful and responsible way as we further our shared objective to prevent the spread of terrorist content online while respecting human rights.”

About Morgan Lynch
Morgan Lynch is a Staff Reporter for MeriTalk covering Federal IT and K-12 Education.
