
The AI Safety Governance Framework v1.0 (the Framework) was published by the National Information Security Standardization Technical Committee (TC260) on September 9, 2024. It is neither a mandatory law or regulation nor a national standard for AI; rather, it offers a general risk assessment and guidance for the different roles involved in artificial intelligence (AI) safety. Although not a law, it contributes useful perspectives on AI safety governance, particularly through the risk assessment it proposes for safety risks. Dehao Zhang, Counsel at Fieldfisher, provides an overview of the Framework and how it may impact future AI legislation in China.

According to the 2023 Philippine Judiciary Annual Report, there were 658,101 pending cases across the Supreme Court and the lower courts. Various laws have been enacted to help litigants protect their rights and defend their claims under the law. In the domain of data privacy, the Data Privacy Act of 2012 (the Act) permits the processing of sensitive and privileged personal information if it is necessary for the protection of lawful rights and interests of natural or legal persons in court proceedings, for the establishment, exercise, or defense of legal claims, or when provided to the Government or a public authority.

Pursuant to its rule-making authority, the National Privacy Commission (NPC) issued Advisory No. 2024-02, entitled Guidelines on Personal Data Processing Based on Section 13(f) of the Data Privacy Act (the Advisory). The Advisory clarifies the application of Section 13(f) of the Act so that Personal Information Controllers (PICs) can properly rely on this basis for the processing of personal data. Notably, according to the Advisory, the Act does not intend to grant a blanket exemption to public authorities, but rather to strike a balance between, on the one hand, the need for public authorities to process personal data pursuant to their functions and mandates and, on the other, the need to safeguard the rights and interests of data subjects.

In this Insight article, Edsel F. Tupaz and Julia Antoinette S. Unarce, of Gorriceta Africa Cauton & Saavedra, discuss the key points under the Advisory and how to ensure compliance with its provisions on lawful bases for processing.

Thailand has made significant strides in establishing a robust framework for data privacy and data protection. The Personal Data Protection Act 2019 (PDPA) came into full effect on June 1, 2022, following two enforcement suspensions. The PDPA is Thailand's first comprehensive data protection law covering private sector entities. Before the PDPA, data protection regulations in Thailand were limited to specific sectors, such as government agencies handling personal data, telecommunications, and the National Credit Bureau. These sectors must now comply with both their existing sector-specific regulations and the PDPA.

Since the PDPA came into full effect, the Personal Data Protection Committee (PDPC), Thailand's regulatory body under the PDPA, has gradually issued various pieces of sub-legislation. This marks a critical development in the country's legal landscape. However, although the PDPA aims to protect individuals' personal data while balancing the needs of businesses, certain areas remain unaddressed by sub-legislation and official guidelines, making it challenging for businesses to fully comply with the PDPA.

Furthermore, as globalization reshapes the business landscape, multinational organizations must navigate a complex web of legal and regulatory frameworks, especially as data protection and privacy continue to evolve in Thailand. Standards under the PDPA that govern vendor relationships are one of the key considerations businesses should carefully review, given the increasing reliance on third-party vendors for data processing. In this Insight article, Chanakarn Boonyasith, Pitchabsorn Whangruammit, and Pattaranun Hanwongpaiboon, from Nishimura & Asahi, provide an overview of vendor privacy contracts in Thailand, highlight key legal requirements, and outline important considerations for multinational organizations operating in Thailand.

The Philippines is preparing for a quantum leap in artificial intelligence (AI) adoption with the launch of the National AI Strategy Roadmap 2.0 (NAISR 2.0). Announced publicly on July 3, 2024, by the Department of Trade and Industry (DTI), and supported by the Asian Development Bank and a legislative mandate from the Tatak Pinoy Act (Republic Act No. 11981), this updated roadmap aims to position the country as a regional AI powerhouse, fostering innovation and sustainable economic growth. For businesses and stakeholders keen on staying ahead of the curve, understanding the differences between the 2021 NAISR (NAISR 1.0) and NAISR 2.0 is key to anticipating the upcoming legislative and regulatory landscape in the Philippines.

In this Insight article, Edsel F. Tupaz and Danica Anne S. Escobiñas, from Gorriceta Africa Cauton & Saavedra, take a deep dive into the NAISR 2.0, its interplay with other privacy laws, and the impact it may have on development in the Philippines.

Data scarcity, or more specifically the lack of usable, high-quality data (that is, data that has been cleaned, processed, and formatted, and that is accurate and relevant, in each case to a widely and commonly accepted standard), is an endemic problem across industries, and the healthcare and pharmaceutical industries are no exception. An article published by the World Economic Forum (WEF) noted that although the healthcare sector generates a 'staggering amount of data,' a whopping '97% of it goes unused.'

In an ideal world, we would see centralized digital repositories with complete, up-to-date, fully usable health information available to the community within an accountable and secure framework.

The lack of such a framework can lead to undesirable outcomes. One US commentator, arguing that the COVID-19 pandemic threw a spotlight on the dangers arising from the lack of such access, called for a national health information exchange to address this issue: one that formally establishes required standards for creating, maintaining, and sharing these records, and that creates the infrastructure and capability to gather, integrate, store, and use health records.

Other commentators argue that data scarcity, or 'health data poverty,' in certain communities could result in unrepresentative datasets that produce poor-quality and ineffective artificial intelligence (AI) and machine learning-based medical solutions, essentially creating a digital divide where some of the best advances in medical innovation leave out (and leave behind) such communities. Indeed, persistent data scarcity in medical machine learning has led some authors to propose solutions such as reliance on synthetic data, but even such solutions need to ensure realism and avoid harmful biases.

The WEF article identified three themes of concern to be addressed: 'the sheer volume of data, data privacy concerns, and the need for interoperability.'

In this Insight article, Jeffrey Lim and Frederick Tay, from Joyce A. Tan & Partners LLC, look at this issue from a data privacy perspective and discuss ongoing work on the Health Information Bill to develop a potential national health data repository in Singapore. This Bill has the potential to provide the framework for delivering a high-quality, usable healthcare data exchange. The authors ask how such legislation might establish not only a national repository and database to serve patients, but also a sound, secure, and accountable framework for data collection, use, and processing in the name of innovation, improved healthcare and pharmaceutical solutions, and better medical outcomes from therapies and treatments.

The National Privacy Commission (NPC) issued NPC Circular No. 2024-02, entitled Closed-Circuit Television (CCTV) Systems (the Circular), on August 9, 2024. It notes the previously issued NPC Advisory No. 2020-04, entitled Guidelines on the Use of Closed-Circuit Television (CCTV) Systems, and recognizes the need for an updated policy on the use of CCTV systems, given the continuously evolving nature of CCTV technology. Thus, the NPC has provided guidelines to assist all personal information controllers (PICs) and personal information processors (PIPs) in navigating the emerging privacy risks arising from the use of CCTV systems.

In this Insight article, Edsel F. Tupaz and Luis Teodoro B. Pascua, from Gorriceta Africa Cauton & Saavedra, highlight salient changes in the NPC's policy since NPC Advisory No. 2020-04 and discuss the practical implications arising from these changes.

Along with the growth and increasing prevalence of artificial intelligence (AI), including generative AI, the privacy and ethical risks brought about by the new technology cannot be overstated. Ada Chung Lai-Ling, Privacy Commissioner for Personal Data (PCPD), Hong Kong, China, looks at what steps organizations can take to ensure compliance and the guidance offered by the PCPD to help with this.

On September 5, 2024, the Department of Industry, Science and Resources (DISR) published a paper on Safe and responsible AI in Australia: Proposals paper for introducing mandatory guardrails for AI in high-risk settings (the Proposed Guardrails). The DISR also announced a public consultation on the paper to receive feedback, which will be used to guide Australia's approach to artificial intelligence (AI) regulation.

OneTrust DataGuidance provides an overview of the proposed guardrails and how they compare to other emerging AI regulations, with expert comments provided by Alec Christie, from Clyde & Co LLP.

In this Insight article, Vick Chien, Ken-Ying Tseng, and Evelyn Shih, from Lee and Li, Attorneys-at-Law, introduce Taiwan's progressive steps towards artificial intelligence (AI) governance. With the 2024 Draft Artificial Intelligence Basic Act (the Draft Act), Taiwan addresses legal challenges while promoting sustainable development, transparency, and innovation in AI technologies.

Further to the Provisions on Promoting and Regulating Cross-border Data Transmission promulgated by the Cyberspace Administration of China on March 22, 2024 (the CBDT Provisions), which substantially reduce the procedural burden associated with transmitting data out of China, Beijing authorities issued the Administrative Measures for the Negative List for Data Export from China (Beijing) Pilot Free Trade Zone (for trial implementation), including a detailed negative list (the Beijing Negative List), on August 30, 2024. The Beijing Negative List aims to further liberalize the transmission of data out of the Beijing pilot free trade zone (FTZ) and provide more clarity on the identification of important data. On top of benefiting existing companies in the Beijing FTZ, the business-friendly tone of the Beijing Negative List gives companies good reason to revisit their current data export set-up and make the best use of the benefits offered under these new rules. Do these rules indeed make a substantial difference, and are they a game changer?

Dr. Michael Tan and Julian Sun, from Taylor Wessing, take a deep dive into the new measures and FTZ, comparing them with other FTZs in China.

In this Insight article, Kritiyanee Buranatrevedhya and Phatrajarin Tanjaturon, from Baker & McKenzie Limited Attorneys at Law, introduce Thailand's National Science and Technology Development Agency (NSTDA) Ethical Guidelines for artificial intelligence (AI) (the Guideline), announced in March 2022, a framework designed to promote responsible and ethical AI practices.

The Personal Data Protection Act B.E. 2562 (2019) of Thailand (PDPA), effective from June 1, 2022, is the key legislation in Thailand providing comprehensive protection for personal data. Local and foreign entities that collect, use, or disclose personal data of data subjects in Thailand are subject to the PDPA. Cross-border data transfers are subject to stringent requirements under the provisions of the PDPA and the applicable rules issued under it. Multinational corporations (MNCs) are required to have in place adequate data protection measures for their cross-border data transfer activities.

Kowit Somwaiya and Usa Ua-areetham, from LawPlus Ltd., provide an overview of the key considerations for MNCs when implementing cross-border data transfer mechanisms. The overview focuses on the key requirements for Binding Corporate Rules (BCRs) and the Data Transfer Agreement (DTA) as set out in relevant notifications issued by the Personal Data Protection Committee (PDPC) under the PDPA, such as the implementing rules on the criteria for protecting personal data sent or transferred abroad under Section 28 of the PDPA (PDPC rules).