Privacy-Enhancing Computation

Privacy-enhancing computation is on a growth curve as new privacy and data protection laws respond to rising consumer concern over eroding control of personal data. The continuous improvement of privacy-protection techniques will allow for value-added, transparent use of data while still meeting compliance requirements. These techniques span multiple processes, including data processing and sharing, cross-border transfers, and analytics, and they protect data both at rest and in motion. The technology is rapidly moving from academic research to real projects delivering real value, enabling new forms of computing and sharing with reduced risk of data breaches.
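As one concrete illustration of the kind of technique involved, the sketch below adds calibrated Laplace noise to an aggregate count before it is shared, in the style of differential privacy. This is a minimal sketch under stated assumptions, not any specific product's method; the function name and epsilon values are illustrative.

```python
import random

def noisy_count(true_count: int, epsilon: float = 1.0) -> float:
    """Return a differentially private estimate of a count.

    Adds Laplace noise with scale 1/epsilon, the standard calibration
    for a counting query whose sensitivity is 1: adding or removing
    one individual changes the true count by at most 1.
    """
    scale = 1.0 / epsilon
    # Sample Laplace(0, scale) noise as the difference of two exponentials.
    noise = random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)
    return true_count + noise

# An analyst sees only the noisy aggregate, never individual records.
estimate = noisy_count(1042, epsilon=0.5)
```

Smaller epsilon values inject more noise, trading accuracy of the shared statistic for stronger protection of any one individual's presence in the data.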

Engineering Trust in Technology

We all rely heavily on technology to keep us safe, connected and secure, in exchange for ever more intrusive data collection by private companies and governments. The agency we once had over our health information and location data has significantly diminished. Users face complex trade-offs between privacy and health, privacy and security, and privacy and convenience. AI-enabled IoT devices will intensely monitor our everyday health and provide biological feedback through even greater data harvesting, and they will also introduce new cybersecurity challenges. The risks compound over time as algorithms gain more predictive power and, in turn, the ability to reveal more privacy-relevant information. Unless we embed appropriate safeguards into every layer of data processing and handling, we will experience further loss of control over our choices and ethical decisions. These privacy issues permeate many contexts with similar characteristics, such as using search engines or social media platforms, playing online games, and transacting in the metaverse.

Definition of Personal Data

The European Union’s General Data Protection Regulation (GDPR) defines personal data as any information relating to an identified or identifiable natural person. Identifiability can rest on identifiers such as a name, an identification number, location data, an online identifier, or factors specific to a person’s physical, physiological, genetic, mental, economic, cultural or social identity. Personal data includes both information that users voluntarily provide and data generated by their online activity, and it can cover everything from a medical history to a favorite TV series. The GDPR separately treats special categories of personal data in which vital individual and societal interests are likely to be at stake: data about ethnicity, political opinions, religious or philosophical beliefs, health, sex life and sexual orientation, as well as genetic and biometric data. Each day, we exchange access to one or more of these data types in order to use social networking services. But our consent is arguably often ill-informed and ill-considered, and therefore not autonomous. Some commentators have argued that personal data processing is permissible not because of users’ autonomous consent but because it enables the provision of those services. Ethicists have countered that the legitimacy of exchanging data for services should be assessed in light of users’ differing decision-making abilities and of the effects on third parties.
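Identifiers of the kind listed above are often pseudonymized before analysis. The sketch below is an assumed, minimal example that replaces a direct identifier with a keyed hash; the salt and function names are hypothetical, and note that under the GDPR pseudonymized data remains personal data so long as it can be linked back to a person.

```python
import hashlib
import hmac

def pseudonymize(identifier: str, secret_salt: bytes) -> str:
    """Replace a direct identifier (name, email, ID number) with a
    keyed SHA-256 hash. Without the secret salt, the pseudonym is hard
    to link back to the individual; with it, records for the same
    person can still be joined for analytics."""
    digest = hmac.new(secret_salt, identifier.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()

salt = b"keep-this-secret-and-rotate-it"  # illustrative salt value
token = pseudonymize("alice@example.com", salt)

# Same input and salt always yield the same pseudonym, so joins still work.
assert token == pseudonymize("alice@example.com", salt)
```

Using a keyed hash (HMAC) rather than a plain hash matters: a plain hash of an email address can be reversed by simply hashing guessed addresses, while the keyed version requires the secret salt.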

Users have a range of important interests in exercising control over at least a significant part of their personal data. Important interests are also at stake in others’ disclosures, because one person’s exposure may permit inferences about other people. In the prevailing model, users control privacy settings and disclose information voluntarily, and this is accepted as sufficient to make the collection and use of personal data both legally and morally legitimate. However, this privacy self-management paradigm has drawn heavy criticism, especially regarding providers’ ulterior motives and how data is harvested. The paradigm fails in several key areas:

- The purposes for which a service provider uses the data go beyond what users intended, and people associated with the primary service user complain of an invasion of privacy.
- The sale of data to third parties is buried in complex terms, and in-app default settings induce users to transfer too much data, with no way to use the service while declining the onward sale of one’s data.
- Data inference and targeted ads expose users to negative feelings (such as body shaming) and depression, especially among younger users.
- Providers fail to disclose ample information about the possible impacts of data exchange on key aims and values, or a sense of how likely those impacts are. This omission invalidates users’ consent because it interferes with their decision-making process, thereby undermining the autonomy of their decisions.

It is becoming clear that more radical steps will be required to help ensure that users understand the terms of exchange.

Regulatory regimes should shift their focus from obtaining autonomous consent to use personal data toward ensuring that users have a genuine range of options. Most users lack well-considered preferences for balancing privacy against other goods, and a menu of privacy options could help them form such preferences. Choices concerning the disclosure of information could then be expected to advance both individual and common interests. These privacy options would carry additional protective value if they were based on information about risks, including a company’s internal research on those risks.
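Such a menu of privacy options could be represented in software as a small set of named tiers, each bundling concrete data-handling choices. The tier names and fields below are hypothetical, sketched only to show the shape of the idea, not any regulator's actual requirements.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PrivacyOption:
    """One entry in a menu of privacy options (hypothetical fields)."""
    name: str
    share_with_third_parties: bool
    personalized_ads: bool
    retention_days: int

# A hypothetical three-tier menu a service might be required to offer.
MENU = [
    PrivacyOption("minimal",  share_with_third_parties=False, personalized_ads=False, retention_days=30),
    PrivacyOption("balanced", share_with_third_parties=False, personalized_ads=True,  retention_days=180),
    PrivacyOption("open",     share_with_third_parties=True,  personalized_ads=True,  retention_days=365),
]

def choose(menu, name):
    """Look up a tier by name; default to the most protective option."""
    return next((o for o in menu if o.name == name), menu[0])
```

Defaulting to the most protective tier when no explicit choice is made mirrors the privacy-by-default principle discussed above: inaction should not cost users data.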

Evolving Data Protection Laws

Until now, there has been a lack of collective commitment to further secure our rights to data ownership and privacy. Regulatory constructs, such as the Health Insurance Portability and Accountability Act (HIPAA), have been put in place to address privacy concerns, yet some see them as only hindering information sharing and creating added paperwork and delay at every step. At the same time, state privacy laws passed before the pandemic, such as the California Consumer Privacy Act (CCPA), might confer some protection against unfettered data collection and usage. Efforts to pass privacy-related federal laws are also underway, but their passage will depend on the alignment of agendas among various stakeholders. Ethical data risk management around emerging technologies is critical to creating standards that can keep pace with new technical developments.

Meeting New Privacy Challenges

A select few companies will soon have vast amounts of data concentrated in their hands, especially in the telehealth sector. The uptake of telemedicine will continue as remote diagnosis advances through image recognition, access to AI, and more extensive data sets for machine learning, and a great deal is at stake here when it comes to privacy protection. Today’s technology companies should at least meet minimum acceptable standards for data handling, privacy and safety, and there are ways to encourage them to do more to protect privacy, including mandating transparency about how they do it. We will likely see certification programs for privacy-preserving and ethical software applicable to AI and online tools, as well as international treaties governing information on social media. The IEEE Standards Association’s Standard for Ethical Artificial Intelligence and Autonomous Systems is one example of this trend toward developing ethically aligned AI design. Education focused on identity, data privacy, security and open-source uses of technology should also receive more attention in our school systems, to create more informed communities of users.

Putting It All Together

Chief information officers and IT executives should heed these trends and use them to analyze how their organizations need to evolve their implementations and technology adoption over the next three to five years. Those that adapt and plan accordingly will create long-term roadmaps for resilient and sustainable business growth.

This content is accurate and true to the best of the author’s knowledge and is not meant to substitute for formal and individualized advice from a qualified professional.

© 2022 Camille Bienvenu

The Advent of Privacy as a Service