A JOINT COMMUNIQUE BY PEOPLE’S ACTION FOR FREE AND FAIR ELECTIONS (PAFFREL) AND HASHTAG GENERATION TO FACEBOOK INC.
The below is a joint communique by People’s Action for Free and Fair Elections (PAFFREL) and Hashtag Generation, setting out observations from the 2019 Presidential Election with a view to the impending 2020 General Election.
The communique was shared with Facebook on the 23rd of April 2020.
In the lead up to Sri Lanka’s 2019 Presidential Election, People’s Action for Free and Fair Elections (PAFFREL) observed that despite the rapid advent of socio-political discourse within social media platforms in recent years, they remained a domain that was not actively monitored for violations of national election laws. Sri Lanka has 6.2 million active social media users (6 million of whom are on Facebook). Therefore, PAFFREL identified social media monitoring, with an emphasis on Facebook, as an indispensable activity to ensure that these platforms were not used to disrupt and interfere with the democratic process of a national election.
In this context, PAFFREL partnered with Hashtag Generation, a youth-led online community organization, to implement a multi-pronged strategy aimed at identifying violations of election law, disinformation, and hate speech on Facebook in the lead up to, during, and in the immediate aftermath of the Presidential Election as a targeted intervention. As part of this intervention, PAFFREL oversaw a “newsroom” managed by Hashtag Generation as a seven-days-a-week operation that ran for the duration of the campaign period and for seven days after the conclusion of the election.
In the newsroom, PAFFREL and Hashtag leveraged a range of tools, including Watchdog Sri Lanka’s trilingual social media monitoring technology to carry out real-time monitoring of election law violations (with a focus on sponsored content on social media), false news, and ethno-religious rhetoric on key social media platforms with a particular emphasis on Facebook. Hashtag Generation deployed a team of monitors that engaged in collation, analysis and verification of data generated to enable PAFFREL to report potential violations daily to the Election Commission of Sri Lanka (EC) and to enable evidence-based advocacy. PAFFREL managed, coordinated and provided independent verification of all cases reported to its Complaint Management Unit by Hashtag Generation and forwarded all credible reports to the EC. Concurrently, PAFFREL and Hashtag Generation escalated posts that the monitors identified as violations of election law, false news, and hate speech directly to Facebook for follow-up action through a Trusted Partner Channel.
As Sri Lanka stands on the cusp of another election cycle, we call on Facebook, as the most popular social media platform in the country, to improve the implementation of commitments made during the previous election, to ensure greater electoral integrity, and to limit the misuse of its platform during Sri Lanka’s Parliamentary Elections.
OUR RECOMMENDATIONS FOR ACTION
This experience and our direct engagement with Facebook during the Presidential Election cycle have helped us identify three areas where we strongly recommend improvements to ensure a social media landscape that lends itself to greater electoral integrity.
Our recommendations are specific in nature, as they are meant to be actionable by Facebook within the current electoral cycle and do not go beyond Facebook’s current product offerings in relation to elections. While we believe that all social media platforms have wide-ranging responsibilities towards consumers and markets that go beyond their current engagement in the Sri Lankan market, this communique does not attempt to address these concerns. The recommendations recorded in this joint communique have been raised on numerous occasions in forums and discussions with both the Election Commission and Facebook in the aftermath of the Presidential Election, and are simply documented and reiterated here. No new suggestions that could not practicably be implemented at this time have been included.
Our key recommendations to Facebook (elaborated below) to be in place prior to parliamentary elections are:
- Declare a maximum response time commitment and an average response time for content reported as potential violations of Facebook’s community standards during the election cycle;
- Introduce the false news disclaimer product so that it appears in a timely and uniform fashion across the Sri Lankan market on fact-checked false news, and boost third-party fact-checker content through Facebook advertising to reach a wider audience;
- Audit and report back to trusted partners on the effectiveness of the Ad Library tool, ensuring that the features promised in the past election cycle perform better than they did then and are fully functional during the upcoming parliamentary election cycle.
The above are drawn from the key lessons learnt from our joint intervention. Our suggestions to Facebook, identified under three categories, aim to minimize the use of the platform for disruptive purposes during the impending election cycle, including the dissemination of content that violates election-related laws and regulations as well as Facebook’s own community standards.
REPORTING VIA THE TRUSTED PARTNER CHANNEL
In the lead up to the presidential election, our dedicated team of monitors, following a rigorous internal verification process, reported 828 posts determined to be in violation of election law and/or Facebook’s community standards via the trusted partner channel. Facebook removed only 90 (i.e. 10.87%) of the total reported content during the campaign period, which spanned roughly five weeks from 7 October to 13 November 2019.
While we understand that monitors from Facebook or third parties may, at times, disagree with our interpretations, we do not believe that there was adequate staffing to ensure that such a review process was conducted in a timely manner. Evidence of this was that some reports took over 60 days to receive a response through Facebook’s trusted partner escalation channel.
The time taken by Facebook to respond to the reports ranged from under 24 hours to over two months.
Some responses, including the response shown in the screengrab above, came well after the conclusion of the election, thus diluting the significance of the action taken by Facebook. Moreover, numerous reports remain, to date, at the escalation/pre-evaluation stage.
We recommend that response times be improved so that such review processes meaningfully contribute to upholding Facebook’s community standards during the upcoming election cycle. We therefore request that Facebook clarify its key performance indicators and the expected maximum response time for the teams that monitor content reported through its trusted partner channel.
FALSE NEWS AND THIRD-PARTY FACT-CHECKING
Facebook has not made a commitment to remove content containing false news per se under its community standards. However, the same community standards commit Facebook to removing false news content that violates its policies in other categories, such as spam, hate speech, or fake accounts.
During the campaign period that preceded the presidential election, our monitors identified, recorded, and reported a high number of posts containing mis/disinformation, including those that were part of targeted viral misinformation campaigns. One such organized campaign was targeted at the Millennium Challenge Corporation’s proposed Sri Lanka Compact. This misinformation campaign was kick-started by a video posted by a private biology tutor who is a popular social media figure; the video was widely shared before it was taken down by Facebook or removed by the owner. This particular video seemingly triggered the production and dissemination of a large volume of content containing misinformation, some of which was debunked by fact-checking websites recognized by Facebook and the International Fact-Checking Network. Despite the potential of these posts to suppress voters or compromise electoral integrity, they remained on Facebook for the duration of the campaign period, including as paid-for advertisements.
Additionally, our monitors recorded posts containing misinformation that were targeted at politicians, individuals, and groups identified by Facebook as “protected.” However, inconsistencies in definitions and the lack of clarity on Facebook’s specific stance on false news that may violate the platform’s other community standards meant that the monitors could not directly escalate these cases to Facebook for review.
Facebook’s community standards also underline a commitment to reduce the distribution of content rated as false by independent third-party fact-checkers. Measures taken by Facebook globally include a “false news screener/disclaimer,” which had been rolled out at the global level prior to last year’s election in Sri Lanka. However, this was not enabled for Sri Lanka during the campaign period. Some false news content remained available for viewing without the disclaimer throughout the election period, including content rated as disinformation by the two independent fact-checkers recognized by Facebook in Sri Lanka (AFP Fact Check and Fact Crescendo Sri Lanka). Furthermore, Facebook’s Ad Library tool also demonstrated that advertising of posts fact-checked as ‘False’ by the third-party fact-checkers continued throughout the election cycle, reaching users without a disclaimer. Following is a screengrab from a post containing misinformation in relation to MCC’s proposed Sri Lanka Compact, as classified by Fact Crescendo (Figure 2).
A post-election review done by our monitors revealed that posts remained available for viewing with no disclaimer even in the immediate aftermath of the election. In what could be viewed as a positive development, a screener/disclaimer has since been put up over the content mentioned earlier (Figure 3).
We recommend that the false news disclaimer and ‘downranking’ should be rolled out in a more systematic and timely manner during the next election cycle in Sri Lanka. We further believe that the disclaimer redirecting the viewer to the analysis done by the fact checker is best shown in the local language of the post in question.
Our monitors also reported that the analyses done by the authorized fact checkers debunking false news content drew significantly less engagement compared to false news content that was subjected to review (Figure 4). Therefore, we encourage Facebook to provide “ad credit” for the independent fact checkers to boost their content to a wider audience.
THE AD LIBRARY TOOL AND POLITICAL ADVERTISING
The “newsroom” remained operational 24/7 during the blackout period, over which the online monitors reported a total of 871 sponsored posts (Facebook advertisements), using the Facebook Ad Library tool to identify active ads. This real-time reporting of sponsored posts on social media that contravened election laws resulted in nearly 600 ads (68%) being stopped by Facebook. The majority of these posts were removed/stopped within 3–12 hours of reporting.
During the blackout period, the monitors reported that the Issue, Electoral or Political tab on Facebook’s Ad Library tool, which helps differentiate political advertisements from non-political advertisements, failed to capture a number of sponsored advertisements containing underlying political messages. As a result, the monitors had to manually screen advertisements posted by political pages that the Ad Library tool had inaccurately classified as non-political, thereby delaying the reporting process during a highly time-sensitive period. This remains an issue to date, with our monitors reporting inconsistencies in the Ad Library tool’s ability to recognize political content (Figure 5).
Furthermore, numerous advertisements that appeared on the newsfeeds of our monitors were not immediately captured as live advertisements by the Ad Library tool (Figure 6), making it impossible for them to report these advertisements to the EC due to the lack of a sharable link.
Due to the large candidate pool expected to contest the upcoming parliamentary election, we expect Facebook advertising to be used more widely, including for disruptive purposes such as voter suppression through misinformation. Therefore, we request that Facebook fully enforce the policy requiring all ads related to politics to be run by an advertiser who has completed the authorization process and to be labeled with the disclaimer, allowing greater transparency in campaign financing. According to Facebook’s Business Help Center, all advertisers running ads about elections or politics in Sri Lanka are now required to complete this authorization process. However, our monitors continue to record political advertisements carried out by unverified/unauthorized pages (Figure 8).
On the 19th of March, the Chairperson and Members of the Election Commission of Sri Lanka announced that the country’s Parliamentary Election would be postponed indefinitely due to the Coronavirus pandemic. We write to you at this stage to recommend that, in order to preserve the integrity of the impending electoral process, Facebook declare response time commitments for content reported as potential violations of its community standards during the election cycle; introduce the false news disclaimer product so that it appears in a timely fashion on fact-checked false news; boost third-party fact-checker content to reach a wider audience; and audit the effectiveness of the Ad Library tool, ensuring that features such as the “Paid for by” disclaimer are fully functional during the upcoming parliamentary election cycle.
Rohana Hettiarachchi, Executive Director – PAFFREL
Senel Wanniarachchi, Co-Founder/Director – Hashtag Generation