
Social media platforms should be accountable for disinformation —Senate panel


A Senate committee has approved a report which recommends holding social media platforms accountable for the spread of disinformation in the country.

The Senate Committee on Constitutional Amendments, at the last session of the 18th Congress on June 1, gave its nod to the draft committee report contained in Senate Resolution 953. It was penned by panel chairperson Senator Francis “Kiko” Pangilinan.

“To discourage inaction by the social media platforms, malice should be presumed on the part of the publisher (i.e., social media platform) if the libelous comment is made by a fake or fictitious person and such platform fails to take down the libelous content within a reasonable time,” the report reads.

It also stressed the need for a law broad enough to capture the techniques that disinformation producers use to elude accountability.

Likewise, the report seeks to amend the libel law to make the use of fake accounts or fake names in posting libelous comments, in itself, proof of malice. It also seeks the revision of the Cybercrime Prevention Act of 2012 to enable it to deal with the exponential rise in the use of social media platforms for disinformation activities.

The report also recommended that government offices should ensure that their employees are not engaging in or spreading disinformation and hate speech outside of their official functions.

To recall, the Senate body held four hearings on the rise of social media platforms and the rapid advancement of technology in the country. 

The panel report also recommended the following:

1. Refile the SIM card registration bill. Persons spreading hate speech and disinformation hide behind fake accounts and fake names. The SIM card registration bill may help in determining the identities of these disinformation and hate speech peddlers. Penalties should be imposed on telcos that violate the said law.

2. File a bill that will compel social media platforms to require users to prove their identities before they can proceed with the social media platforms’ service. 

3. File a bill requiring influencers or social media personalities with a large following to disclose to their followers whether they received material or monetary considerations from advertisers, politicians, and personalities. 

4. Require government offices to have policies governing their employees’ “sideline” digital media work while handling their respective official social media accounts. 

5. Campaign finance regulations should ensure transparency and accountability. People hiring digital campaigners should be compelled to disclose what campaigns they have commissioned, how much they paid, and who are involved. Also, campaigns now take on very different formats, such as influencers posting sponsored content or hashtags that are made to trend.

6. Pass the following legislative measures:

a. Impose administrative sanctions against government officials or employees who use government resources to wage disinformation campaigns against the public they are supposed to serve;

b. Strengthen the capacity of the educational bureaucracy to produce high-quality textbooks;

c. Enact a memory law patterned after that of Germany, penalizing the denial of agreed-upon historical truths, subject to the right to freedom of expression and with assurance of independent judicial intervention;

d. Strengthen the capacity of the government’s own massive media and information infrastructure to report the news independent of government influence;

e. Hold social media platforms accountable and treat them as information utilities;

f. Strengthen the capacity of the public to become critical and discriminating users of content by reviewing and improving the Department of Education's current Media and Information Literacy or MIL program for high school students;

g. Prohibit creators of harmful content from monetizing their content;

7. Promote a “whole of society” approach that will require a lot of monitoring, civil society support, and support for independent audits. Ensure the participation of all stakeholders, especially the social media platforms, advertisers, media, and public relations agencies;

8. Social media platforms should be more transparent in relation to microtargeting. These platforms should also provide the necessary tools for advertisers to better monitor and have more control over their ad placements. This is in response to advertisers and media agencies, which said that it is “practically impossible” for them to monitor their ads when these are inadvertently placed alongside disinformation content;

9. Social media platforms should be made responsible for their algorithms, which in some instances create a cycle of feeding harmful, inflammatory, or untrue content to their users. They should be compelled to release the details of their algorithms and core functions to trusted independent researchers to determine if such algorithms artificially amplify false and manipulative information on a wide scale;

10. Social media platforms should allow advertisers to do a detailed audit on where (i.e., specific page, channel, or video) their ads appear;

11. Social media platforms should extend their direct reporting system for requests for takedown to civil society organizations as well. Reporting should be more open to the public;

12. Social media platforms should have policies requiring collaboration with civil society groups, advertisers, media agencies, and government;

13. The academe and civil society groups should be allowed to help in the direct reporting of takedown requests, which is currently only available to law enforcement agencies;

14. Government and/or social media platforms should consider accrediting independent civil society groups, nongovernmental organizations, or members of the academe to review content and identify which channels are purveyors of disinformation;

15. Academe and civil society groups should collaborate with regional and global institutions to have a wider perspective in combatting disinformation;

16. Advertisers and their creative and media agencies should be in constant dialogue with social media platforms to ensure clean and credible content and improve algorithms for the proper placement of advertisements;

17. The 4As, MSAP, ASC, PANA, IMMAP, and other similar associations should update their self-regulatory standards, including their respective Code of Ethics, to encourage transparency and accountability in digital marketing;

18. The KBP should expand the coverage of its self-regulatory standards to cover the social media accounts and podcasts of its anchors;

19. Government should provide more support or social safety nets to digital workers;

20. Government should fund more research on networked disinformation;

21. Government should identify the role of government officials in disinformation efforts (based on its fact-checking efforts, most online disinformation comes from government officials);

22. Government should focus on strengthening enforcement actions and compelling compliance of internet service providers with their obligations under the Cybercrime law; and

23. Schools should include multi-platform information literacy and critical thinking in the basic education curriculum, similar to Finland’s model. —LDF, GMA News