Ministry of Communications and Information
Consultation Period:
13 Jul 2022 - 10 Aug 2022
Status:
Closed - Summary of Responses

Consultation Outcome

Summary of Responses to Public Consultation on Enhancing Online Safety for Users in Singapore

The Ministry of Communications and Information (MCI) conducted a public consultation on proposed measures to enhance online safety for Singapore-based users of social media services from 13 July 2022 to 10 August 2022. MCI also organised a series of engagement sessions with parents, youths, community group representatives and academics to gather their feedback and suggestions. At the close of the public consultation and engagement exercise, MCI received over 600 responses from a wide range of stakeholders, including members of the public, community groups and industry groups. 

Overall, respondents were supportive of the measures proposed by MCI to enhance online safety, and provided suggestions and feedback on the proposals. A summary of the key feedback received and MCI’s responses is set out below.

1. Proposal for designated social media services to put in place systems and processes to safeguard against harmful online content

Respondents generally agreed with the proposal for designated social media services to have appropriate systems and processes to reduce exposure to harmful online content for Singapore-based users. Many agreed with the categories of harmful content identified, especially content associated with cyberbullying and explicit sexual content. Parents expressed concern over viral social media content featuring dangerous pranks and challenges that could be copied by their children. Some highlighted other areas of concern, such as harmful advertisements, online gaming, scams, misinformation and online impersonation. Some respondents suggested that penalties be imposed on social media services for non-compliance, while others sought assurance that the proposed measures would not affect user privacy or freedom of expression.

Industry groups called for an outcome-based approach to regulating social media services, one that takes into account, for example, their business models and size when implementing the proposed requirements. They also sought clarity on how designated services would be identified and defined, and asked that services designed primarily for enterprise use be excluded.

MCI’s priority is to address harmful online content on designated social media services with significant reach or impact in Singapore, given the prevalence and impact of such content on users, especially young users. While we focus on social media services for this round of measures, we will continue to study the other areas of concern raised by respondents. We will also need to strike the right balance between prioritising user safety and safeguarding privacy and freedom of expression.

MCI agrees with the industry on the need to adopt an outcome-based approach towards enhancing online safety. In this regard, designated social media services will be given some flexibility to develop and implement the most appropriate solutions to tackle harmful online content on their services, taking into account their unique operating models. 

2. Proposed safety features and tools to manage exposure to harmful online content 

Respondents generally agreed on the importance of having safety features and tools on social media services to allow users to manage their exposure to harmful online content. Many were not aware of the existing safety features on social media services, while some parents expressed concern that they lacked the knowledge to guide their children in using social media services safely. Respondents felt that social media services could do more to raise users’ awareness and uptake of the safety features on their services. Respondents also suggested that social media services proactively highlight self-help resources (e.g. counselling services and hotlines) to users seeking high-risk content.

MCI encourages social media services to step up their efforts to raise users’ awareness of the safety features available on their services, and to convey information on self-help resources to users. MCI will also continue to work with other Government agencies as well as community partners to enhance public education efforts on online safety, to complement online safety regulations. 

3. Additional safeguards to protect young users from harmful online content

Respondents were supportive of the proposal for designated social media services to have additional safeguards for young users, such as online safety tools for parents/guardians and young users. Some suggested age verification systems for young users and mandatory tutorials on how users can protect themselves online. Industry groups suggested adopting an outcome-based approach when implementing safeguards for young users and for social media services to be given some flexibility to develop tools appropriate to their services’ risk profiles. 

MCI agrees that young users and/or their parents/guardians should have access to tools that enable them to manage their exposure to harmful content and unwanted interactions. These include tools to limit the visibility of young users’ accounts to the public, and to restrict who can contact and interact with them. Recognising that social media services differ in their user profiles and the type of content published on their services, we will continue to work with the industry to study the feasibility of these suggestions as we apply an outcome-based approach to improving the safety of young users on these services.

4. Ensuring an effective user reporting mechanism on social media services, and greater accountability to users

Most respondents supported the proposal for social media services to have an effective user reporting and resolution process that is prominent and easy to use, and under which action is taken on user reports in a timely manner. Some respondents shared experiences where content they reported was not removed by the services. Many suggested that social media services should update users on the decisions taken and allow appeals against dismissed reports.

Respondents were also supportive of the proposal for social media services to release annual reports on the effectiveness of their content moderation policies and practices in combating harmful content. Some noted that the reports were a way to hold social media companies accountable to the public for user safety.

MCI agrees that designated social media services should have an accessible, effective, and easy-to-use user reporting mechanism. Designated social media services should also submit annual accountability reports on the effectiveness of their measures to combat harmful content, and these reports will be made public. 

5. Proposal to allow IMDA to direct social media services to disable access to egregious content

Respondents, including industry groups, said that explanations should be provided on why specific content, for which access is disabled, was deemed harmful. Industry groups also suggested that social media services be given some flexibility on the timelines for removing such content, taking into consideration the severity of the harmful content and the resources of the service.

When such directions are issued, the egregious content of concern will be made clear to the social media services. The timeline requirements for complying with the directions will take into account the need to limit users’ exposure to egregious content circulating on the services.

6. Collaboration between community, private sector and Government in the area of online safety

Most respondents recognised the importance of public education in guiding users, especially young users, to deal with harmful online content and to engage with other users online in a safe and respectful manner. Many suggested that, in addition to user education efforts by the social media services, the Government could tap on school and parent/peer networks as key conduits for public education and outreach to parents and young users.

Respondents, especially representatives from community groups, suggested that the community could partner the Government and industry to raise awareness of existing resources for parents and children on online safety. Academics suggested that social media services could share data with the research community to facilitate studies on the prevalence and severity of harmful content in Singapore’s context. They noted that such data and research could help in the formulation of community standards that were both data-driven and sensitive to Singapore’s local context. 

Some respondents also suggested that the Government could set up advisory panels, comprising experts and appointed members of the public, who could continue to reflect public feedback and concerns on online safety issues to the Government and social media services. 

MCI agrees that online safety regulations need to be complemented by effective public and user education. To equip Singaporeans with the knowledge and skills to go online safely and securely, and to safeguard themselves against online harms and threats, the Government has launched public education programmes, such as NLB’s Source, Understand, Research and Evaluate (S.U.R.E.) programme and MOE’s refreshed Character and Citizenship Education (CCE) curriculum, to impart digital media and information literacy and cybersecurity skills to Singaporeans, including young users.

We welcome the participation of the community and industry in making online spaces safer, as part of the nation-wide Digital for Life (DfL) movement to help citizens of all ages and walks of life embrace digital learning as a lifelong pursuit. In support of the DfL movement, the Media Literacy Council (MLC) runs programmes for youths and parents that promote responsible online behaviour to create a safer and kinder internet. The MLC has also produced resources (e.g. infographics, handbooks, tip sheets), accessible at www.betterinternet.sg, to educate members of the public on how they can keep themselves and their loved ones safe online. We will also further explore the suggestion on advisory panels, and how the Government can facilitate and coordinate ground-up efforts.

Creating a Safer Online Space for All

MCI thanks all respondents for the wide-ranging feedback provided. While this summary may not capture every piece of feedback received, all feedback has been noted and carefully considered. MCI will provide further updates on our plans to enhance online safety for Singapore-based users of social media services in due course. We will continue to work closely with stakeholders in the community, industry and public sectors to equip Singaporeans with the knowledge and skills to keep themselves and their loved ones safe online. We also continue to welcome feedback and suggestions to enhance online safety for users in Singapore.

Ministry of Communications and Information
29 September 2022

Detailed Description

Public Consultation on Enhancing Online Safety For Users in Singapore

Aim

1. The Ministry of Communications and Information (MCI) invites the public to provide feedback on proposed measures to enhance online safety for Singapore-based users of social media services. The public consultation will run from 13 July 2022 to 10 August 2022.


Background

Prevalence of harmful online content on social media services

2. Social media services have transformed the way we live, work and play, bringing new and interactive opportunities for people and businesses in Singapore. However, for all the good that these services bring, they can also be a place of danger. Harmful online content can lead to serious consequences in the real world. Globally, there is widespread acceptance that services distributing online content, even where it is user-generated, have a responsibility to keep their users safe from harm.

3. While many of these services have made efforts to address harmful content, it remains a concern, especially when published on services that reach a wide audience, or when the content is targeted at specific groups of users. This includes content that:

a. Endorses acts of terrorism, extreme violence, or hateful acts against certain communities;

b. Encourages suicide and self-harm, including engaging in risky behaviours that threaten one’s life;

c. Threatens one’s physical or mental well-being, through harassment, bullying, or the non-consensual sharing of sexual images.

4. Such harmful online content can be amplified on social media services. For example:

a. Propelled by platform algorithms and user interest, content such as dangerous video challenges can go viral rapidly, leading to injuries and deaths;

b. In the case of terrorist acts, the impact of the events is worsened by the spread of live-streamed videos and the resharing of content.

5. Tackling harmful online content is a global issue. Countries such as Germany and Australia have enacted new laws requiring online services to limit users’ exposure to harmful content. The United Kingdom and the European Union are also working on laws to address this issue.

6. Harmful online content also affects users in Singapore. For example:

a. Harassment and threats of sexual violence. In 2021, a poll asking people to rank local female asatizah (religious teachers) according to their sexual attractiveness was posted on social media. The post also promoted sexual violence and caused immense distress to the individuals involved.1

b. Religiously or racially offensive content that can incite religious intolerance and prejudice our racial harmony. In 2021, a Singaporean man pretended to be a woman from another ethnic group, and posted multiple racially offensive and insensitive public posts on a social media service.2

7. The negative impact of harmful online content on users is of concern:

a. Almost half of Singaporeans polled3 by the Sunlight Alliance for Action4 in January 2022 said they have personally encountered such content. 

b. In another study, more than half of parents (54%) in Singapore reported that their children had encountered inappropriate content online5. Young users may be vulnerable and lack the capacity or experience to deal with harmful online information and content, for example when exposed to age-inappropriate material such as sexual and violent content.

8. We recognise that some social media services have put in place measures to protect their users. However, such measures vary from service to service. Additionally, when evaluating harmful content on social media services, Singapore’s unique socio-cultural context needs to be considered. Given the evolving nature of harmful online content, more can be done, especially to protect young users. 


Proposed Measures to Enhance Online Safety

9. To address the risks of harmful online content, MCI is considering two new measures: 

a. Code of Practice for Online Safety: Designated social media services with significant reach or impact will be required to have appropriate measures and safeguards to mitigate exposure to harmful online content for Singapore-based users. These include system-wide processes to enhance online safety for all users, and additional safeguards for young users6.

b. Content Code for Social Media Services: There may be content that is particularly harmful to our society, such as content that incites racial or religious disharmony or intolerance. Where such content has not been detected by the social media services themselves, we intend for the Infocomm Media Development Authority (IMDA) to be granted powers to direct any social media service to disable access to such content for users in Singapore.


Code of Practice for Online Safety

User Safety 

10. We are considering requiring designated social media services to have community standards for the following categories of content:

a. Sexual content 
b. Violent content
c. Self-harm content
d. Cyberbullying content
e. Content endangering public health
f. Content facilitating vice and organised crime

(Illustrative and non-exhaustive examples of such content are at Annex A)

11. These designated services will also be expected to moderate content to reduce users’ exposure to such harmful content, for example by disabling access to such content when it is reported by users.

12. These services will be required to proactively detect and remove child sexual exploitation and abuse material, as well as terrorism content.

13. Designated social media services could also provide users with tools and options to manage their own exposure to unwanted content and interactions. These could include tools that:

a. Allow users to hide unwanted comments on their feeds

b. Allow users to limit contact and interactions with other users

14. We propose that designated social media services provide safety information that is easily accessible to users. This could include Singapore-based resources or contact information for local support centres.

a. We also propose that relevant safety information (e.g., helplines and counselling information) be pushed to users who search for high-risk content (e.g., content related to self-harm and suicide).

Additional safeguards for young users

15. Given our concerns about the impact of harmful online content on young users, we propose for designated social media services to put in place additional safeguards to protect young users. 

16. These additional safeguards could include stricter community standards for young users, and tools that allow young users or parents/guardians to manage and mitigate young users’ exposure to harmful content and unwanted interactions. For example, tools that: 

a. Limit the visibility of young users’ accounts to others, including their profile and content;

b. Limit who can contact and/or interact with accounts for young users; and

c. Manage the content that young users see and/or experience.

17. These tools could be activated by default for services that allow users below 18 to sign up for an account. The services could warn young users and their parents/guardians of the implications when they choose to weaken these settings.

18. Likewise, social media services should provide safety information that is easy for young users to access and understand. The safety information should provide guidance to young users and parents/guardians on how to protect young users from content that is harmful or age-inappropriate, and from unwanted interactions. 


User Reporting and Resolution

19. Given the sheer volume of content being created and shared on social media services, there may be instances where users come across harmful content, despite the safeguards put in place by social media services. As such, we propose for designated social media services to provide an efficient and transparent user reporting and resolution process, to enable users to alert these services to content of concern. 

20. The user reporting and resolution process could:

a. Allow users to report harmful online content (in relation to the categories of harmful content outlined at para 10) to the social media service;

b. Ensure that the reporting mechanism is easy to access and easy to use.

21. As part of this process, the service should assess and take appropriate action on user reports in a timely and diligent manner.

Accountability

22. We propose for designated social media services to produce annual reports on their content moderation policies and practices, as well as the effectiveness of their measures in improving user safety. These reports would be made available on IMDA’s website for the public to view. Through these reports, users will be able to better understand how their exposure to harmful content is reduced on the services they use.


Content Code for Social Media Services

23. The proposed measures under the Code of Practice for Online Safety are expected to deal with most of the harmful online content that Singapore users may encounter when using designated social media services. However, there may be instances where extremely harmful content remains online in relation to:

• Suicide and self-harm 
• Sexual harm
• Public health
• Public security 
• Racial or religious disharmony or intolerance

(Illustrative and non-exhaustive examples of such content are at Annex B)

24. Given the concerns about the impact of such extremely harmful content, we propose for the Content Code for Social Media Services to allow IMDA to direct any social media service to disable access to specified harmful content for users in Singapore, or to disallow specified online accounts on the social media service from communicating content and/or interacting with users in Singapore. 


Working Together to Improve Online Safety

25. The aim of the proposed Code of Practice for Online Safety and Content Code for Social Media Services is to safeguard Singapore-based users on social media services, so that they can feel as safe there as they do in the real world. The Government cannot achieve this outcome alone. We will continue to work closely with stakeholders in the people, private and public sectors to strengthen online safety for all users.


We Welcome Your Feedback

26. We invite members of the public to provide their feedback in response to the above proposals by 10 August 2022. Members of the public can submit their feedback via the online feedback form at https://go.gov.sg/online-safety-consultation. Organisations may wish to provide feedback using the email template attached.



27. We will review all feedback received and refine our proposals where appropriate. We will also publish a summary of the key feedback received, together with our response, following the end of the public consultation.

 



1 “Police investigating offensive poll ranking female Islamic teachers; President Halimah and other leaders criticise poll”, The Straits Times, 27 May 2021

2 “Man jailed for racially offensive tweets under pseudonym ‘Sharon Liew’”, CNA, 8 Jun 2021

3 Online poll conducted by the Sunlight Alliance for Action (AfA) in January 2022 with more than 1,000 Singaporeans on the perceptions, experiences, and prevalence of online harms in Singapore.

4 The Sunlight AfA was launched in July 2021 to tackle online harms, especially those targeted at women and girls. The AfA takes a whole-of-nation partnership approach; its members include individuals from across the people, private and public (3P) sectors, coming together with the aim of closing the digital safety gap and creating an inclusive digital space.

5 “Rising concerns about children’s online wellbeing amid increased encounters of cyber threats in 2020”, Google Survey, 9 Feb 2021.

6 “Young users” refers to individuals below the age of 18.