The Power of Data
Massive amounts of new data are generated every day. In 2017, IBM calculated that 90% of all the world’s data had been created within the past two years.1 Data shapes our knowledge, decisions, and everyday life; data has power. Because devices keep getting smaller, data accompanies us around the clock. At the same time, processing power keeps increasing and grows ever more capable of assisting us in our everyday lives. Programs, platforms, and apps analyze every data element provided by billions of users: our whereabouts, the photos we upload, where we shop and what we buy, individual keystrokes, and broader behavioral data. For technology creators, there is no upper bound on how detailed this processing can be.2 As a result, application algorithms try to predict what consumers want. The more data we provide, the more accurate the predictions become, sometimes stating what we want and need before we know it ourselves. Very often, this happens without any oversight of the processing activities that take place with the data users share and entrust to service providers.
The Law Struggles to Keep Up
In recent years we have witnessed changes in data protection laws and regulations across the globe. Regulators and the public acknowledge the need to regulate the data processing practices of the tech industry while also providing more transparency and enabling individuals to control how organizations process their data.
These steps are progressive and necessary, but are they enough? Do the privacy policies and notices presented on websites and service offerings provide the transparency and understanding users seek, leaving them empowered and in control?
Who Reads Those Privacy Notices?
As a privacy professional, I admit that I rarely read through the terms and conditions or privacy policies when I sign up for new services online. Like others, I simply do not have the time, nor do I wish to get lost in legal jargon that often leaves me with more questions and doubts than I had in the first place.
Individuals are granted several rights under some data protection regulations—such as the right to access the data organizations hold about them, restrict certain processing activities, or delete personal information where no overriding laws and regulations apply. Still, there is more that should be done.
Privacy by Design
Data protection laws like the General Data Protection Regulation (GDPR) include the requirement to incorporate privacy principles into the entire lifecycle of a product or service. Privacy by Design is not a novel concept introduced with GDPR. It was first introduced by Dr. Ann Cavoukian in the mid-1990s as an approach to systems engineering.3 The concept has thus been around for a long time, but it has not received the consideration we would like to see.
Regulations such as GDPR call for organizations not only to react to data protection and privacy infringements but also to actively design and embed privacy requirements into product design, preventing privacy blunders from happening in the first place. So, what should be done? Organizations should actively incorporate privacy by design principles and requirements throughout the entire design process of products and services. They should question and challenge proposed personal data collection designs. Until recently, organizations collected and processed all the data they could, simply because it was convenient.
Artificial Intelligence and Privacy
The presumption sometimes has been that even if not all the processed data elements can be used at the moment, they could prove beneficial at a later stage. Furthermore, in machine learning, the argument for excessive data collection has been that more data yields more accurate models. One privacy by design principle obliges product developers and service providers to avoid excessive data collection and limit data hoarding. Organizations should only collect and process what is absolutely necessary to achieve the purpose they have in mind. We should remind ourselves that the more data we have, the more risk we carry.
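The data minimization principle above can be sketched as a simple allowlist applied at the point of collection: anything not strictly required for the stated purpose never gets stored. The field names and the signup payload here are hypothetical examples, not taken from any real product.

```python
# Hypothetical sketch of data minimization: keep only the fields the
# stated purpose actually requires, and silently drop everything else.

REQUIRED_FIELDS = {"email", "display_name"}  # the purpose needs only these

def minimize(payload: dict) -> dict:
    """Return a copy of the payload containing only required fields."""
    return {k: v for k, v in payload.items() if k in REQUIRED_FIELDS}

# An over-collecting signup form (illustrative data):
signup = {
    "email": "user@example.com",
    "display_name": "Ada",
    "birth_date": "1990-01-01",  # not needed for this purpose
    "location": "Berlin",        # not needed for this purpose
}

stored = minimize(signup)  # only email and display_name survive
```

Making the allowlist explicit also forces a design conversation: every new field someone wants to collect has to be justified against the purpose before it can be added.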
How Empowered Are We?
As an individual user, how often have you struggled to find the privacy settings on an app, or been unpleasantly surprised to find that certain settings were turned on by default and were already actively sharing your data? Have you ever felt manipulated into approving a feature without fully understanding what you agreed to? The use of dark patterns—manipulative design features that intentionally deceive or influence user behavior—to improve conversion is not acceptable. The usability of an app’s settings is just as important as the rest of the software design. Security and privacy features of a product must be easy for users to find and understand so they can exercise their data protection rights and controls. It is the responsibility of developers, designers, and product owners to raise awareness and enable their users.
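The opposite of the dark pattern described above is privacy by default: every sharing option starts switched off, and only an explicit user action can turn one on. A minimal sketch, with illustrative setting names of my own choosing, might look like this:

```python
# Hedged sketch of privacy-by-default settings: all sharing is opt-in,
# so a brand-new user shares nothing until they deliberately choose to.

from dataclasses import dataclass

@dataclass
class PrivacySettings:
    share_location: bool = False         # off by default
    share_usage_analytics: bool = False  # off by default
    personalized_ads: bool = False       # off by default

    def enable(self, setting: str) -> None:
        """Flip a setting on; only an explicit call does this (opt-in)."""
        if not hasattr(self, setting):
            raise ValueError(f"unknown setting: {setting}")
        setattr(self, setting, True)

settings = PrivacySettings()       # a new user shares nothing
settings.enable("share_location")  # requires a deliberate user choice
```

Encoding the defaults in one place like this also makes them easy to audit: a reviewer can see at a glance that no sharing is on unless the user opted in.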
Users often state that they are concerned about their data protection and privacy, yet their online behavior is inconsistent with what they say they value. This is called the privacy paradox. So why don’t people act consistently? Perhaps because our ability to gather and analyze information before deciding how an app will share our personal information is limited. It is also impossible to predict with certainty how this information will be processed in the future and how it will affect us. This makes people vulnerable to unintended outcomes, and while regulations try to support and empower users, more effort is needed.
The Need to Anticipate and Design for All Possible Uses
The Ethics Centre in Australia has published a list of principles for good technology that I would invite everyone who is involved with product development to review.4 They include the following principles:
- Ought before can. The fact that we can do something does not mean that we should.
- Non-Instrumentalism. Never design technology in which people are merely a part of the machine.
- Self-determination. Maximize the freedom of those affected by your design.
- Responsibility. Anticipate and design for all possible uses.
- Net Benefit. Maximize good, minimize bad.
- Fairness. Treat like cases in a like manner; different cases differently.
- Accessibility. Design to include the most vulnerable user.
- Purpose. Design with honesty, clarity, and fitness of purpose.
These principles emphasize that we should always try to anticipate and design for all possible uses. Of course, this is very difficult advice to follow. It’s doubtful the founders of social media platforms used by billions of people envisioned, intended, or predicted the use cases and societal impacts their platforms have today. However, as digital products and their purposes evolve over time, the anticipation of possible uses should be regularly revisited and the associated risks reassessed. Also, do not forget Dr. Ann Cavoukian’s privacy by design principles.
The Industry Needs to Step Up for the Sake of Privacy
Technology and data can provide us with amazing rewards but also come with some risks. We should keep in mind that the products and services created are shaped by the values of their makers. As consumers, we are responsible for creating the world we want to live in. If we don’t take that responsibility, then someone with a different vision will make theirs happen.
I wish you all a safe International Privacy Day!