Facing criticism that it is not doing enough to counter extremist messaging, Facebook likes to say that its automated systems remove the vast majority of prohibited content glorifying the Islamic State group and al-Qaida before it is ever reported.
But a whistleblowers' complaint shows that Facebook itself has inadvertently provided the two extremist groups with a networking and recruitment tool: dozens of pages named after them.
The social media company appears to have made little progress on the issue in the four months since the Associated Press detailed how pages that Facebook auto-generates for businesses are aiding Middle East extremists and white supremacists in the United States.
On Wednesday, US senators on the Committee on Commerce, Science and Transportation will question representatives of social media companies, including Monika Bickert, who heads Facebook's efforts to stem extremist messaging.
The new details come from an update of a complaint to the Securities and Exchange Commission that the National Whistleblower Center plans to file this week.
One page's "political ideology" is listed as "I love Islamic State." It features the IS logo inside the outline of Facebook's famous thumbs-up icon.
In response to a request for comment, a Facebook spokesperson told the AP: "Our priority is detecting and removing content posted by people that violates our policy against dangerous individuals and organizations, in order to stay ahead of bad actors. Auto-generated pages are not like normal Facebook pages, as people can't comment on or post to them, and we remove any that violate our policies. While we cannot catch every one, we remain vigilant in this effort."
Facebook has a number of functions that auto-generate pages from content posted by users. The updated complaint scrutinizes one function that is meant to help with business networking: it scrapes employment information from users' pages to create pages for businesses. In this case, it may be helping the extremist groups because it allows users to "like" the pages, potentially providing a list of sympathizers for recruiters.
The new filing also found that users' pages promoting extremist groups remain easy to find with simple searches using the groups' names.
The researchers uncovered one page for "Mohammed Atta" featuring an iconic image of the al-Qaida operative, one of the hijackers in the September 11 attacks. The page lists the user's employer as al-Qaida and his education as "Bin Laden University" and "Afghanistan Terrorist School."
Facebook has had mixed success in limiting the spread of extremist content. In March, it expanded its definition of prohibited content to include US white nationalist and white separatist material as well as that of international extremist groups.
It says it has banned 200 white supremacist organizations and 26 million pieces of content related to global extremist groups such as IS and al-Qaida.
However, it is unclear how well those measures are working if the company is still having trouble ridding its platform of supporters of well-known extremist organizations.
But as the report points out, plenty of material slips past the blocks, and some of it is created automatically.
The AP story in May highlighted the auto-generation problem, but the new content identified in the report indicates that Facebook has not solved it.
The report also said that researchers found several pages referenced in the AP story were removed more than six weeks later, on June 25, the day before Bickert was questioned at another congressional hearing.
The auto-generation problem was flagged in the initial SEC complaint, filed by the center's executive director, John Kostyack, which alleges the social media company has exaggerated its success in combating extremist messaging.
"Facebook would like us to believe that its magical algorithms are somehow scrubbing its website of extremist content," Kostyack said.
"Yet those very same algorithms are auto-generating pages with titles like 'I Love Islamic State,' which are ideal for terrorists to use for networking and recruiting."