Maintaining Public Trust Through Responsible Privacy Practices
The COVID-19 pandemic has required emergency measures to preserve our physical, social and economic welfare. As policymakers have focused on protecting public health, ethics-centred concerns such as privacy have sometimes become a secondary priority. Now a rebalancing is underway: leaders are increasingly treating the preservation of privacy and other individual rights as an essential component of reentry plans.
Technology companies have long been confronted with questions about how personal data should be treated. Individuals insist that their information be subject to normal privacy protections, whether the data is collected and processed digitally or otherwise. Regulators, particularly in Europe, and especially since the General Data Protection Regulation (GDPR) entered into force in May 2018, have sought to ensure that individuals maintain control over their personal information by restraining tech companies from simply collecting and handling personal data as they please.
The aftermath of the COVID-19 crisis will inevitably feature renewed efforts to protect the privacy of individuals. Technology companies must be thinking now about the types of policies and practices they should adopt to ensure that they do business in a manner that fully addresses privacy needs.
APCO recently assembled a group of thought leaders to discuss best practices for protecting individual rights as pandemic-driven restrictions are loosened and economic and social orders begin to be restored. The lessons from that dialogue serve as navigation points, not only in the current crisis but also in broader, ongoing regulatory considerations of privacy rights, such as the review of Europe’s GDPR scheduled for late June.
Vilas Dhar, a trustee of the Patrick J. McGovern Foundation, which seeks to create positive social impact through the application of information technology, said that the current debate is grounded in the concept of public trust. If technology companies do not pay sufficient attention to the potential harms arising from their data-handling practices, he argued, trust in the idea that technology contributes to the social good will erode. Broad-based civic participation in fashioning policies, Dhar added, is the key to preserving not only individual rights but also the health of the digital economy.
Dakota Gruener, executive director of ID2020, a public-private partnership that advocates for ethical, privacy-protecting approaches to digital ID, echoed Dhar’s focus on the importance of trust to the development of technology-based solutions. Using digital identification systems as an example, Gruener noted how privacy, portability and interoperability can be incorporated into the architecture of technologies in a way that leaves “data subjects” in full control of their personal information.
Laura Gardner, Microsoft’s director of global policy, explained that Microsoft has adopted a set of principles to ensure the protection of personal privacy as it develops technical solutions. Those principles, ranging from obtaining meaningful consent from data subjects to providing appropriate safeguards to secure data, are aligned with the European Union’s GDPR framework and reflect a growing consensus on appropriate approaches to the protection of privacy.
The increasing prominence of privacy and ethical concerns in the effort to control and defeat the COVID-19 pandemic sets the stage for the next round of regulatory consideration of the proper use and handling of individuals’ data. Stakeholders must be prepared for challenging debates that could have profound effects on their business practices.