PDP Commissioner launches Guidelines on Data Protection Impact Assessment, Automated Decision-Making and Profiling, and Data Protection by Design
On 30 April 2026, the Personal Data Protection Commissioner (“PDP Commissioner”) issued the following guidelines aimed at enhancing data protection practices:
- Data Protection Impact Assessment (“DPIA”);
- Automated Decision-Making and Profiling (“ADMP”); and
- Data Protection by Design (“DPBD”).
These guidelines introduce novel requirements that align with international practices and address emerging issues in the Malaysian data protection landscape, including the use of artificial intelligence (“AI”) in data processing and the processing of children’s personal data.
This article discusses key requirements set out in these guidelines.
Guidelines on DPIA
A DPIA is an assessment of how planned operations by data controllers involving personal data may impact personal data protection. An organisation’s senior management must ensure that a DPIA is conducted whenever new personal data processing operations are introduced, such as engaging an external cloud service provider or launching a new mobile application. The organisation’s data protection officer (“DPO”) should assess planned processing activities to determine whether they meet the threshold that triggers a DPIA, based on the following criteria:
- The processing of personal data is expected to exceed 20,000 data subjects, or involve sensitive personal data (including financial information) of more than 10,000 data subjects (collectively, the “Quantitative Threshold”); or
- The processing of personal data may involve a high risk to the protection of personal data (“Qualitative Threshold”).
If the activities do not meet the Quantitative Threshold, the DPO should exercise their best judgment to determine whether the organisation’s processing of personal data falls within the Qualitative Threshold. This assessment includes considering whether the processing:
- may significantly affect the data subjects, including their legal, economic, social, or financial status, rights, health, reputation, or access to services;
- involves systematic monitoring of data subjects, for example, through the use of facial recognition technology to offer personalised discounts or the collection of geolocation data to track spending patterns;
- uses innovative or emerging technologies such as AI-driven features;
- may limit data subjects’ rights, for example by requiring mandatory consent to access services or creating administrative hurdles that discourage individuals from exercising their rights to access their personal data;
- involves personal data of children or vulnerable individuals; and
- involves automated decision making and profiling that carries a high risk to the data subject.
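For readers implementing a compliance checklist, the trigger assessment above can be sketched as a simple decision rule. This is an illustrative sketch only, not part of the guidelines: the class, field names, and qualitative factor keys are hypothetical, and the numeric thresholds are the Quantitative Threshold figures stated above.

```python
from dataclasses import dataclass, field

@dataclass
class ProcessingActivity:
    # Illustrative fields; names are not taken from the guidelines.
    data_subjects: int = 0
    sensitive_data_subjects: int = 0
    qualitative_factors: dict = field(default_factory=dict)

def meets_quantitative_threshold(a: ProcessingActivity) -> bool:
    # Quantitative Threshold: more than 20,000 data subjects, or
    # sensitive personal data of more than 10,000 data subjects.
    return a.data_subjects > 20_000 or a.sensitive_data_subjects > 10_000

def meets_qualitative_threshold(a: ProcessingActivity) -> bool:
    # Qualitative Threshold: any single high-risk factor from the
    # bullet list above is treated here as enough to trigger a DPIA.
    return any(a.qualitative_factors.values())

def dpia_required(a: ProcessingActivity) -> bool:
    return meets_quantitative_threshold(a) or meets_qualitative_threshold(a)

# Example: a small-scale activity that still triggers a DPIA because
# it involves systematic monitoring of data subjects.
activity = ProcessingActivity(
    data_subjects=5_000,
    qualitative_factors={
        "systematic_monitoring": True,   # e.g. facial recognition
        "children_or_vulnerable": False,
        "emerging_technology": False,
    },
)
print(dpia_required(activity))  # → True
```

In practice the qualitative assessment calls for the DPO’s judgment rather than a boolean checklist; the sketch simply shows that either threshold independently triggers a DPIA.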
A DPIA may be conducted through a five-step process known as “DEICA”, the steps of which are as follows:
- Describe the processing operations, including the nature and scope of personal data, and the context and purposes of processing;
- Evaluate the necessity of the processing, including whether it is proportionate to achieve the intended outcome;
- Identify risks to personal data, such as security risks and any limitations on the rights of data subjects, by considering the likelihood and impact of the harm identified;
- Consider steps to safeguard personal data by addressing the identified risks; and
- Assess the overall residual risk level of the processing.
Where the DPIA shows a high overall residual risk, the organisation’s senior management should decide whether to accept the risk, implement any additional mitigation measures, and allocate sufficient resources accordingly.
The DPIA Guidelines, which include a DPIA template, can be accessed here.
Guidelines on ADMP
The Guidelines on ADMP define “automated decision-making” as any decision made by automated means without human involvement. “Profiling” refers to the automated processing of personal data to predict or generate information about an individual’s characteristics, such as their work performance, financial situation, health, and preferences; it may also involve making generalisations about populations based on that individual’s data.
The ADMP Guidelines apply to ADMP activities that result in outcomes which:
- affect the legal status or legal rights of a data subject, such as where an automated system terminates a contract or entitlement, or refuses a social benefit provided under the law; or
- significantly affect the data subject’s circumstances, behaviour or choices, have a prolonged or permanent impact, or result in discrimination against the data subject. This may include, for example, outcomes that impair an individual’s access to essential services, employment opportunities, credit eligibility, or reputation.
An organisation conducting ADMP activities should obtain the data subject’s explicit consent if the automated decision-making and profiling process involves sensitive personal data, such as information relating to the data subject’s physical or mental health, political opinions, religious beliefs, the commission or alleged commission of an offence, or biometric data. Strong security measures should also be implemented when processing sensitive personal data, such as encrypting, anonymising, or pseudonymising the data, or implementing stricter access controls within the organisation.
Data subjects should also be provided with a personal data protection notice or privacy notice containing the following information:
- That the processing of the data subject’s personal data involves ADMP processes;
- Types of decisions made through ADMP processes;
- Justifications for the automated decisions made;
- Possible consequences of the automated decisions; and
- Scope of use of AI if the organisation uses AI for the processing of personal data.
If the organisation uses AI for ADMP purposes, the following best practices may also be adopted:
- Identifying the commercial purposes of using AI and examining any risks related to ADMP processes;
- Ensuring that AI is used in a manner that upholds a data subject’s dignity, produces accurate outputs, takes into account the limitations of AI and any possible adverse impact, and is restricted to its intended purpose;
- Implementing appropriate security measures to mitigate the risks of over-dependence on AI systems or services;
- Training the relevant personnel on the operations and limitations of AI;
- Ensuring that AI is not relied upon as the sole basis for making policies or decisions relating to a data subject; and
- Designating relevant personnel to review the use of AI in the automated decision-making process. Such personnel should be appropriately trained and be capable of proactively evaluating and interpreting AI outputs.
The ADMP Guidelines can be accessed here.
Guidelines on DPBD
The Guidelines on DPBD set out recommended best practices, applications, and examples for integrating appropriate technical and organisational measures into the entire lifecycle of data processing activities to ensure compliance with the personal data protection principles under the Personal Data Protection Act 2010 (“PDPA”).
The Guidelines on DPBD are neither prescriptive nor exhaustive, and compliance with them is voluntary. Nonetheless, they provide practical illustrations of how the PDPA, the Personal Data Protection Standard 2015, and other relevant guidelines issued by the PDP Commissioner interact.
The DPBD framework rests on four key pillars, as follows:
- Proactiveness: Data controllers or data processors should anticipate and prevent privacy risks before they occur, and actively put in place measures to prevent personal data breaches. This involves:
- setting up proper governance and ensuring sufficient resources are in place to manage personal data risks within the organisation; and
- designing systems, programmes, projects, and processes so that only the minimum amount of personal data is collected, used, and kept, ensuring that personal data is protected by default.
- End-to-end protection: Data controllers and data processors should protect personal data throughout its entire lifecycle, from design and development through to deployment and decommissioning.
- Transparency: Data controllers and data processors should be transparent about how personal data is processed and must be able to demonstrate compliance with the data protection practices that they have stated.
- User-centricity: Systems, products, services, and processes should be designed with the data subjects’ interests and needs in mind, giving them practical control over how their personal data is used.
The DPBD Guidelines encourage organisations, including any person processing personal data on behalf of another, to take a proactive approach when handling personal data by assessing risks and tailoring their data protection measures to how the data is used, the nature and context of the data processing activities, and the purposes of processing.
The DPBD Guidelines can be accessed here.
Further information
This article has been prepared with the assistance of Associates Siah An Gel and Mohamad Syafiq bin Mohamad Tazri.