Folha will resume posting its content on Facebook this Tuesday (6). The newspaper's page, which currently has 5.5 million followers, had not been updated since February 8, 2018. With the reactivation, readers gain another path to the publication's content.
Upon its departure, Folha announced that it would no longer post on the social network because it believed the platform was acting to reduce the visibility of professional journalism, allowing messages containing fake news to proliferate and mix with serious information. Since then, Facebook has taken a series of countermeasures.
The company has restricted the flow of fake news and hate speech, removing pages and providing warnings and additional information in posts with factual issues (more in the timeline below).
One of the most emblematic actions in this regard came in early June, when the platform revised the free pass it had given politicians for their posts.
The change took place under the guidance of a group of independent experts created by Facebook itself a year ago – digital law attorney Ronaldo Lemos, a Folha columnist, is one of its members. It was also this committee that recommended maintaining the suspension of former US president Donald Trump, a recommendation the company accepted.
In Brazil, in March 2020, Facebook and Instagram deleted a post by President Jair Bolsonaro (no party) for the first time, alleging he had broken community rules.
Shortly after, in July of last year, fake accounts linked to the Bolsonaro family and to the office of the President of the Republic were dismantled for using duplicate profiles and fictitious identities. The reason for the removal was not fake content or misinformation, but the accounts' behavior – some were even operated by real people, who nonetheless created fictitious profiles to amplify the distribution of content.
From 2017 to 2020, the network reported removing 714 profiles in Brazil for being part of “influence operations, coordinated to manipulate or corrupt public debate aimed at a strategic objective.”
Facebook CEO Mark Zuckerberg said a year ago in Germany that 35,000 people were assigned to review online content and implement security measures.
These teams and automated technologies, equipped with artificial intelligence, suspend more than a million fake accounts a day, he said. According to the network, in the first quarter of 2021 around 1.3 billion fake accounts were deleted as soon as they were created.
There have also been changes in the business model of journalism.
The company is now allowing the so-called paywall (charging for digital content) to be adopted for content published in Instant Articles, a tool designed for fast loading of articles from news sites.
Reports in this optimized version used to be open to any reader. Today, it is possible to “close” them and display an invitation to subscribe. Folha pioneered the use of this mechanism in Brazil, having launched its paywall in 2012.
Since 2018, Facebook has also adopted measures to protect intellectual property rights in publishers' content and has launched programs to encourage professional journalism.
The company claims to have invested US$600 million (R$3 billion) since then in support mechanisms for the news industry and says it plans to allocate US$1 billion (R$5 billion) over the next three years.
One of these projects is Brazil's Comprova, funded with support from the Facebook Journalism Project. The fact-checking consortium, of which Folha is a member along with other news outlets, has been active since the 2018 general elections and ran an additional edition during the pandemic.
Around the world, Facebook had previously supported similar programs, such as Cross-Check, a collaborative online false-information verification project ahead of the 2017 French presidential election, and Verificado, in Mexico.
In 2020, amid the worsening of the novel coronavirus pandemic, Facebook ramped up its disinformation controls and expanded the fact-checking partnership. It then began removing bogus health and anti-vaccination content that broke its terms of use and could cause real harm to people offline.
According to the Reuters Institute's Digital News Report 2020, considered the world's largest study of trends in journalism and new technologies, Facebook (24%) and WhatsApp (35%) are today the main platforms for distributing fake content in Brazil. This year's edition, released in June, corroborates the data. Last year's research team interviewed 80,000 people in 40 countries.
In response, the network said it had applied false-information labels to around 50 million posts worldwide. “We have directed more than 2 billion people to health authority resources through the Covid-19 information center,” Facebook said in a statement.
The platform also cited financial aid to journalists and organizations to fight the pandemic and funding for editorial training by the ICFJ (International Center of Journalists), for example.
Folha was one of those selected in Latin America – in total, 44 projects from news outlets in 12 countries in the region received US$2 million (R$10.1 million) from the coronavirus support program. Similar initiatives have taken place in the United States, Canada, Asia and Africa.
Another measure to encourage professional journalism has been the creation of Facebook News, a space on the social network dedicated exclusively to journalism. Launched in the United States, the tool began operating this year in the United Kingdom and Germany as well – no date has been set for a launch in Brazil.
Finally, the American company's most recent announcement, made last May, concerns tighter control over the spread of disinformation.
In the coming months, users will be notified through new formats when a profile or page repeatedly shares fake news. Repeat offenses will bring stiffer penalties for profiles, including reduced reach and warnings to new followers that the profile has already propagated fake content.
Before, only pages and groups were penalized by the algorithm. With the change, a profile may be seen less by followers and friends.
Assessing the veracity of pages, as already happens with posts, remains the responsibility of third parties – there are 80 partner fact-checking organizations.