Support Centre

Singapore

Summary

Law: Personal Data Protection Act 2012 (No. 26 of 2012) (PDPA)

Regulator: Personal Data Protection Commission (PDPC)

Summary: The Personal Data Protection Act 2012 (No. 26 of 2012) (PDPA) sets out general personal data protection requirements and contains provisions on data subject rights, the appointment of a data protection officer, and the obligations of organisations and data intermediaries. The PDPA designates the Personal Data Protection Commission (PDPC) as the authority responsible for enforcing its provisions. The PDPC has also released a number of advisory guidelines clarifying its interpretation of the PDPA and is active in enforcement. Furthermore, amendments to the PDPA entered into force on February 1, 2021, introducing a number of key reforms, including mandatory data breach notification requirements, amendments to the consent obligation, offences for egregious mishandling of personal data, prohibitions relating to the use of dictionary attacks and address-harvesting software, and the PDPC's power to accept voluntary undertakings as part of its enforcement regime.

In addition to the PDPA, the Cybersecurity Act 2018 (No. 9 of 2018) sets out the regulatory framework governing cybersecurity in Singapore and stipulates requirements for operators of critical information infrastructure. Likewise, the Online Safety (Miscellaneous Amendments) Act (No. 38 of 2022) entered into force on February 1, 2023, outlining obligations for online communications service providers more generally.

Insights

Data scarcity, or more specifically the lack of usable, high-quality data (that is, data that has been cleaned, processed, and formatted to a widely accepted standard, and that is accurate and relevant), is an endemic problem across industries, and the healthcare and pharmaceutical industries are no exception. An article published by the World Economic Forum (WEF) observed that although the healthcare sector generates a 'staggering amount of data,' '97% of it goes unused.'

In an ideal world, centralized digital repositories of complete, up-to-date, fully usable health information would be available to the community within an accountable and secure framework.

The lack of such a framework can lead to undesirable outcomes. One US commentator, noting that the COVID-19 pandemic threw a spotlight on the dangers arising from the lack of such access, argued for a national health information exchange: one that formally establishes required standards for creating, maintaining, and sharing these records, and that creates the infrastructure and capability to gather, integrate, store, and use them.

Other commentators argue that data scarcity, or 'health data poverty,' affecting certain communities could result in unrepresentative datasets that produce poor-quality, ineffective medical solutions developed with artificial intelligence (AI) and machine learning - essentially creating a digital divide in which some of the best advances in medical innovation leave out (and leave behind) those communities. Indeed, persistent data scarcity in medical machine learning has led some authors to propose solutions such as reliance on synthetic data, but even these solutions must ensure realism and avoid harmful biases.

The WEF article identified three areas of concern to be addressed: 'the sheer volume of data, data privacy concerns, and the need for interoperability.'

In this Insight article, Jeffrey Lim and Frederick Tay, from Joyce A. Tan & Partners LLC, look at this issue from a data privacy perspective and discuss ongoing work on the Health Information Bill to develop a potential national health data repository in Singapore. The Bill has the potential to provide the framework for a high-quality, usable healthcare data exchange. The authors ask how such legislation might establish not only a national repository and database to serve patients, but also a sound, secure, and accountable framework for data collection, use, and processing in the name of innovation, improved healthcare and pharmaceutical solutions, and better medical outcomes from therapies and treatments.

A year in the law and regulation of artificial intelligence (AI) is an eon. Whilst a robust regulatory framework or approach should be consistent, grounded in principles, and predictable, it must also evolve to address changing attitudes and perceptions. And whilst no one wants to kill the goose that lays the golden egg with innovation-curbing regulation, the shift towards favoring some form of regulation is noticeably gaining speed, driven by growing fears of social, economic, and political ills.

Now, with the EU successfully rolling out its extensive AI regulatory framework in the EU Artificial Intelligence Act (the EU AI Act), one question is whether other countries will follow suit to enact their own compatible or comparable 'hard law' in a classic demonstration of the 'Brussels effect.'

In his Insight article last year, Jeffrey Lim, Director at Joyce A. Tan & Partners LLC, described Singapore as a prime example of a regulation-lite, community-fostered, and inclusive multi-stakeholder jurisdiction. With its voluntary Model Governance Framework now in its second version and encompassing generative AI (genAI Model Framework), in this Insight article, Jeffrey explains how Singapore appears set to continue in this vein - an approach that has already yielded significant private sector in-country investment in AI capability building.

In a time when competing approaches to artificial intelligence (AI) governance develop in different parts of the world, Singapore is charting a path that emphasizes pragmatism and enablement.

The National AI Strategy, a high-level strategy statement by the Singaporean government, envisions Singapore as a global hub for developing, test-bedding, deploying, and scaling solutions, with an additional focus on strengthening the country's AI ecosystem enablers. Since its publication four years ago, developments in Singapore's landscape of AI governance have been consistent with this approach, employing a decidedly 'light touch' in regulation and emphasizing the provision of practical tools and frameworks for responsible development and adoption. In this Insight article, Jeffrey Lim, Director at Joyce A. Tan & Partners LLC, will summarize Singapore's approach to AI governance in this context.

In the aftermath of the now ebbing COVID-19 pandemic, the importance of technology and the need for digitalisation have been thrown into sharp relief. To ensure Singapore remains competitive and able to capitalise on the surging digital wave, Singapore's Parliament unveiled a slew of measures, policy plans, and updates to legislation in its Committee of Supply ('COS') speech on 4 March 2022. Charmian Aw, Adrian Aw, and Leon Goh, from Reed Smith LLP and Resource Law LLC, provide insight into the contents of the speech and the proposed changes to Singapore's digital future.

The processing of children's personal data, from collection to destruction, generally carries with it special considerations. Indeed, the level of protection afforded to children is often higher, due in part to their limited capacity to understand the consequences of providing their information and the potential risks associated with its use or misuse. In part two of this series, OneTrust DataGuidance considers the rules in the APAC region that govern children's personal data, featuring perspectives from New Zealand, the Philippines, and Singapore.

For insight into handling children's personal data in Australia, China, India, and Japan, please see part one here.

Diversity and inclusion programmes are becoming increasingly popular across the globe due to growing awareness and a demand for organisations to support values such as equity and inclusion. While actively engaging in diversity and inclusion initiatives may help organisations to better understand, manage, and develop their business, it is not always clear what data can, and cannot, be included in diversity monitoring surveys, or what the rules are for such data collection.

The legal requirements surrounding information relating to an individual's race, gender, ethnicity, sexuality, and health differ from country to country, with some classifying such data as 'sensitive data', while others view it under the umbrella of 'personal information'.

OneTrust DataGuidance Research has consulted with a number of legal experts operating within the Asia Pacific region in order to uncover the requirements for the collection and use of employee data for diversity and inclusion surveys. The countries covered in this Insight article include Australia, China, Singapore, Japan, Hong Kong, and India.