Speech by Mr Louis Ng Kok Kwang, MP for Nee Soon GRC, at the Second Reading of the Online Safety (Miscellaneous Amendments) Bill (Bill No. 28/2022)
Introduction
This Bill empowers the Government to order restrictions on online access to harmful content and to require social media services to comply with requirements on the handling of harmful content.
I thank the Government for holding a public consultation and multiple engagement sessions in preparing this Bill.
After all, the Government cannot by itself ensure online safety for Singaporeans. Internet companies, experts, parents and young people are all essential partners it must work with.
This discussion on online safety also comes at a time when I’m being pressured by my daughter, Ella, to allow her to play Roblox. She has been nagging me for years to be allowed to play this game, but the nagging has intensified lately as all her friends are now playing it.
I’m terrified of her being exposed to harmful and inappropriate content online. All parents are, and I hope this discussion and this Bill will be steps forward towards ensuring a safer online world for all of us.
I have three clarifications on the Bill.
Definitions of egregious and harmful content
My first clarification is about the definitions of egregious content and harmful content.
First, the Bill defines one category of “egregious content” as content that is “likely to cause feelings of enmity, hatred, ill will or hostility against, or contempt for or ridicule of, different racial or religious groups in Singapore.”
Can Minister share why this category does not include content that has a similar impact on other demographic segments, such as gender?
Second, IMDA’s draft of the Code of Practice for Online Safety lists six categories of harmful content. Social media platforms must “minimise users’ exposure” to such content.
In the final version of the Code, will IMDA provide more specific category names, detailed explanations for each category, or subcategories for exclusion?
The ambiguity makes it possible for educational or otherwise beneficial content to be caught in the dragnet for harmful content.
Third, will IMDA also consider adding new categories for harmful content?
For instance, harmful content should include content that promotes extreme beauty standards. Such content harms our youths by giving them unrealistic expectations. It affects their self-esteem and encourages them to engage in unhealthy behaviour to meet these standards.
Scope of regulations for different electronic services
My second set of clarifications relates to the scope of these new provisions.
It seems clear that these regulations will apply to platforms like Facebook, YouTube or TikTok. But many other companies also host user-generated content.
For example, e-commerce platforms may rely on user reviews and comments. Online games may have extensive user interaction. Can Minister clarify if social media services also include online platforms whose core business is not social media?
Private or domestic communications are also excluded from these provisions. Can Minister clarify if this excludes direct messages or other user-to-user interactions?
This is a potential channel for harmful content to be transmitted. For example, a study has found that 1 in 15 direct messages sent by strangers to high-profile women are potentially abusive.
Can Minister also clarify if semi-private communities such as Discord servers or Telegram groups will be treated as private or domestic communication?
Protection for individual victims
My last set of clarifications relates to whether we can do more to help individual victims of harmful content.
Victims of revenge porn, cyberbullying or doxxing suffer direct harm to their lives.
Since 2016, AWARE’s Sexual Assault Care Centre has supported 747 clients who experienced technology-facilitated sexual violence. Survivors suffer a loss of dignity and privacy, and face an uphill battle in containing the spread of content once it is uploaded to the internet.
I have three suggestions on how we can help these victims.
First, funds from penalties under this Bill can be set aside to support these victim-survivors. These funds should be used in partnership with civil society groups that are already active in the community helping these victims.
Second, we can impose a general duty of care on online communication providers to compensate individuals for harm they suffer due to a platform’s negligence in managing harmful content.
Platforms could be negligent if they are too slow to take down harmful content, or if they fail to meet the standards in the Code of Practice for Online Safety. This duty of care would allow victims to be compensated for their harm under the law of negligence.
This duty of care should apply not only to large social media platforms but to all online communication providers, as the potential for harm does not discriminate. Bad actors may also use small platforms to escape detection, sharing harmful content there and then circulating links to circumvent the safeguards on larger platforms.
Finally, as long as end users believe they can hide behind the anonymity of the digital world, they will continue to try to publish harmful content. Affected individuals currently have to work with the online platform on their own to get the content removed.
Going to the police may not be useful, as the police may lack the jurisdiction or capability to investigate matters of this nature. To ensure sufficient deterrence against end users, will the Government consider increasing resources and training for the police to assist victims and take swift action against end users who post harmful content?
Conclusion
Sir, notwithstanding my clarifications, I stand in support of the Bill.