Around the world, new personal data protection laws are coming into force and older laws continue to be updated: the EU GDPR, Brazil’s Lei Geral de Proteção de Dados Pessoais (LGPD), Thailand’s Personal Data Protection Act (PDPA), India’s and Indonesia’s proposed bills, California’s Consumer Privacy Act (CCPA), and the various efforts elsewhere in the United States at the federal and state levels. This proliferation of national personal data protection laws is bound to continue.
Yet the impact of modern information technologies, and of the data and business ecosystems they enable, transcends national and regional boundaries. The economic value of the global data flows associated with this reality has surpassed the value of traditional trade in goods. Consumers and citizens, too, are increasingly mobile and global. In this globally connected world, it will be essential to drive convergence and ensure consistency and interoperability between data protection laws and regulations. This is crucial both for delivering effective personal data protection and for maximising the ability to engage in responsible and beneficial data uses.
Modern data protection laws and regulatory guidance must be developed with an eye to other global personal data protection regimes rather than solely within domestic echo chambers. While the goal of global convergence must be balanced with unique local requirements and priorities, the vast majority of core personal data protections and principles can and should be harmonised in the digital world of today.
Lawmakers are just starting to dip their toes into AI regulation, but there seems to be widespread recognition that premature or excessive regulation could stifle AI growth and development. Traditional data protection principles don’t necessarily align neatly with AI technologies, so lawmakers and regulators must be ready to establish and interpret rules more flexibly, based on the risks to individuals and the benefits to society. At the same time, some AI-based technologies, such as facial recognition, are creating a sense of urgency among some stakeholders, as well as calls for specific regulation.
While such steps are being debated, AI use is increasing at a rapid pace. It is, therefore, imperative that organisations that develop and use AI technologies start building internal programmes, processes, tools and techniques to deliver accountable and trustworthy AI. Indeed, more than in any other context, organisational accountability and best practices will be the key ingredients in the “secret sauce” of an effective regulatory response to AI. Organisations can start by implementing relevant AI governance frameworks, such as the voluntary Model AI Governance Framework from the Personal Data Protection Commission (PDPC) of Singapore and its companion Implementation and Self-Assessment Guide for Organisations (ISAGO).
Accordingly, CIPL has proposed a layered approach to governing AI based on three pillars: (1) leveraging and expanding the existing personal data protection tools and norms, such as data protection impact assessments and certifications; (2) setting expectations for organisations in all sectors to build demonstrable and verifiable accountability programmes and best practices for AI; and (3) adopting governance practices that are risk-based and transparent, emphasise thought-leadership, and incorporate innovative oversight mechanisms such as sandboxes, seals and certifications, along with constructive engagement between all stakeholders to foster understanding and build trust.
Free flow of data is essential to a sustainable global digital economy. Safeguarding this free flow of data requires coherent and efficient cross-border data transfer mechanisms and avoidance of data localisation requirements that artificially constrain data flows.
Both organisations and individuals expect personal data protections and accountability to follow the data. Policymakers and lawmakers must build on those expectations. They must drive convergence between data transfer mechanisms, so that organisations can use the same types of mechanisms, such as contracts, certifications or binding corporate rules, no matter where they operate. They must also drive interoperability between different transfer mechanisms and create “adaptors” for the different national personal data protection “plugs”. GDPR Binding Corporate Rules (BCR) and APEC Cross-Border Privacy Rules (CBPR) should be able to talk to one another, just as future GDPR certifications and the APEC CBPR should.
Accountability frameworks, such as certifications and codes of conduct, will have to become part and parcel of any effective comprehensive data protection law and framework around the world. This includes certifications based on ISO standards, EU binding corporate rules, APEC CBPR or similar formal accountability frameworks. The Singapore PDPC, in fact, has taken a leadership role in advancing the use of certifications and the CBPR. Organisations should be actively considering these certifications and start using them not only as transfer mechanisms, but also to provide assurance of compliance with the ever-growing body of national laws and to demonstrate that they are responsible partners to businesses, consumers and regulators. Organisations that have comprehensive personal data protection programmes will automatically be ready to go this extra step and successfully obtain such certifications.
Regulating cutting-edge, complex and evolving technologies presents unprecedented challenges for regulators, particularly when resources are limited. Deterrence and punishment alone have proven to have limited effectiveness in achieving desired results, much less in encouraging a race to the top in the market. If regulators want to be effective, they too must apply modern and innovative regulatory approaches and prioritise open and constructive relationships with the organisations they regulate. This is true for all regulators, including those in jurisdictions with well-established data protection cultures and histories. In countries that have, or soon will have, new or first-time personal data protection laws, like Brazil and India, newly established data protection authorities now have the advantage of getting this right from the start. Providing guidance and thought-leadership and encouraging and incentivising good behaviours and accountability will yield results, fuelling a race to the top and tapping into the desire of organisations to be seen as responsible businesses and trusted users of data.
Regulated entities must also prioritise and be ready for such constructive engagement. They must share knowledge and help educate regulators about new technologies. There is considerable scope for building compliance solutions cooperatively and ensuring responsible innovation that also protects the rights and interests of individuals. Regulatory sandboxes are a perfect example. These allow businesses to test innovative products, services and business models in real life and with actual customers under the supervision of a regulatory body. They create a safe haven for regulated companies to experiment and innovate, while helping regulators better understand the technologies they are regulating.
Data sharing between public and private organisations and within and across industry sectors will likely become more important, even transformational, to the modern data economy and digital society. It fosters competition and innovation, particularly in the context of AI-based technologies and data-driven business models. It is essential for academic and scientific research, as well as for machine learning and algorithmic training. Data sharing also improves the effectiveness of governments and public policy, from health, education and tax to social policy, all of which increasingly rely on data-driven decisions. There is a real need to develop frameworks for trusted data sharing based on organisational accountability. Indeed, here too, Singapore has taken a leadership role with its Trusted Data Sharing Framework. An overly “user-centric” approach that makes data sharing dependent on choices made by individuals may actually defeat the benefits and full potential of data sharing. Instead, the focus of the debate should be the wide range of accountability measures that organisations could employ in this context, from risk assessments, transparency, proportionality and articulation of values and benefits to governance and data sharing agreements.
What is an approach to personal data protection regulation that is fit for the 21st Century’s Fourth Industrial Revolution? Will we start to design and apply our data protection laws in ways that truly work for individuals, or will we continue to design and apply them to make individuals work? Under the old model – making individuals work – consumers have to read endless personal data protection notices and constantly make choices about how their personal data may be used as they use different services and go about their daily life and work. Being able to make such choices is, of course, important in certain situations. In Singapore’s context, for example, consent remains important for purposes such as direct marketing for which consumers still wish to exercise choice and control. But in many cases, there are better ways to protect individuals that don’t require consumers to become full-time data protection professionals.
Despite mounting evidence of its ineffectiveness in protecting personal data, and calls from all corners to shift away from this approach, the primacy of the notice, choice and consent model of personal data protection is still alive and well in 2020. Several recent proposals for U.S. federal data protection legislation have relied heavily on notice and consent, as has nearly every state bill introduced so far this year. The EU’s ePrivacy regulation has the same problem. Even where personal data protection laws, such as the GDPR, do not privilege consent over other bases for processing personal data, the deep-rooted habits of regulators and policymakers continue to treat notice and consent as a sine qua non of personal data protection, at the expense of better options grounded in “organisational accountability” that are available in plain sight.
Our digital world and society need new and different approaches to regulating personal data protection, while still empowering individuals. Now more than ever, it is essential that we unite in educating law- and policymakers in Asia, the US, Europe and beyond about the benefits of the accountability-based model of data protection regulation and about alternatives to the old-fashioned individual consent approach as grounds for processing personal data.
Organisational accountability requires organisations to implement comprehensive personal data protection programmes governing all aspects of the collection and use of personal information. It also requires organisations to be able to demonstrate the existence and effectiveness of such programmes upon request. It ensures robust protections for individuals and their data while enabling responsible data collection, use and sharing, placing more responsibility on the organisations that collect and use data and less burden on individuals. Such data protection programmes may also be provided by, or based on, formal data protection codes of conduct or certifications; a good example is Singapore’s Data Protection Trustmark (DPTM).
One of accountability’s core features is risk-based personal data protection, which can give organisations broad latitude to use personal data in no- or low-risk contexts, enable more targeted and effective protection where actual risks are identified, and include legal or regulatory prohibitions on certain high-risk activities that cannot be made safe. Risk-based personal data protection programmes enable organisations to focus on truly risky processing and prioritise their protections in the areas where they really matter. Under this approach, the primary burden of protecting individuals would lie with organisations, which would be required to formally identify personal data protection risks, mitigate them, and be able to demonstrate and justify their risk assessments and mitigations.
Individuals are clearly concerned about how organisations use their personal data, and they are looking for value and responsible stewardship of that data. Many organisations are increasingly waking up to the existing trust deficit and taking proactive steps to address it, even when it’s not yet explicitly required by law. They are proactively building data protection management programmes that include leadership and oversight, risk assessments, policies and procedures, transparency, training and awareness, monitoring and verification, and response and enforcement. Enlightened organisations that take this approach are realising the business and competitive benefits that flow from having such comprehensive programmes. These programmes enable them to unlock the potential of one of their major assets – data – and to drive business growth and competitiveness through data-driven innovation.
The next year and decade will be all about accountability and corporate digital responsibility. CEOs and senior business leaders, as well as corporate boards, must be ready for this step change and set the tone for this transformation.