©2021 Senza Fili. All rights reserved.
As more of our lives and identities move online – into a virtual space where a stunning amount of information about us is, or may become, publicly available – it becomes increasingly difficult to know and control what personal data is available to others, and how and when. At the same time, the online world gives us tools to maintain more privacy than the physical world: it is far easier, for example, to create and manage multiple online identities than physical disguises.
Protecting our privacy – retaining some control over what data about us is public online – is central to how we manage our identity, reputation, career, and social relationships and, perhaps even more importantly, our wellbeing.
Encryption is a powerful tool for protecting privacy: it keeps content private as it travels through the channels we use to communicate with trusted parties. We routinely use encryption to protect data in transit, but data stored on devices is typically not encrypted.
But encryption does little for privacy if the trusted parties decide to make our communications public or share them with others, or if they are unable to protect the data. You may send a photo to a friend over an encrypted channel, but your friend may make it public – inadvertently or deliberately – without your consent.
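This limit is structural, not a flaw in any particular cipher: encryption protects bytes on the wire, not what the recipient does with them after decryption. As a purely illustrative sketch – using a one-time pad as a toy stand-in for the AES-based ciphers real messaging services use – the following Python snippet shows that an eavesdropper on the channel sees only ciphertext, while the key holder recovers the full plaintext and is then free to do with it as they please:

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # One-time pad: XOR each byte with a random key byte. Secure only if
    # the key is truly random, as long as the message, and never reused.
    return bytes(d ^ k for d, k in zip(data, key))

message = b"photo sent in confidence"
key = secrets.token_bytes(len(message))   # shared secret between sender and friend

ciphertext = xor_cipher(message, key)     # all an eavesdropper on the channel sees
recovered = xor_cipher(ciphertext, key)   # XOR is its own inverse: decryption

assert recovered == message               # the trusted recipient holds the plaintext
```

A real system would use an authenticated cipher and a key-exchange protocol, but the point stands: once the recipient decrypts the content, encryption no longer constrains what happens to it.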
Users can take matters into their own hands, turning to apps like Signal for stronger end-to-end encryption or to a VPN to encrypt traffic to and from their devices.
But for most users, it is service and content providers – including social media platforms – that encrypt transmissions. They also decide which content, apps, or services to encrypt and the type of encryption used, not only to protect the privacy of the user, but also to protect and retain control of the content itself.
Because encryption is so effective at addressing crucial privacy requirements, there is a broad trend to encrypt as aggressively as possible, and to completely block access to encrypted content not only for hackers, but also for government agencies, even when there is suspicion of criminal behavior.
But are there cases where strong encryption may be undesirable, or may unduly expose us to risk? Can excessive protection of our privacy reduce our ability to protect ourselves when bad actors use encryption to cover their tracks? And if so, how do we balance the need to protect ourselves with the need to preserve privacy?
Encryption can be a double-edged sword. It protects us from others intruding into our lives, but it also protects others intent on harming us. In a variation on the prisoner’s dilemma, we all benefit from encryption if we are all reasonably well behaved. But with activities such as child exploitation – an area where encryption is an enabler of criminal activity – it is the bad guys who benefit from encryption, leaving the rest of us – and children in particular – more vulnerable.
Much of today’s debate on encryption presents an uncompromising defense of ever-higher levels of encryption (i.e., no access to encrypted information for anyone, including the companies managing the data, device manufacturers, or even government entities) as the only way to protect ourselves against unwarranted invasions of privacy by governments, social media platforms, and the other private companies we engage with.
While motivated by well-documented, repeated cases in which these players have abused their access to our data and used it to their benefit, this approach may backfire: it indirectly and unintentionally facilitates the abuse of encryption – e.g., to cover up child sexual abuse or to plan terrorist attacks and other criminal acts. In doing so, it may undermine support for the socially valuable uses of encryption and justify calls to restrict encryption from those who want access to our data for political or financial gain.
Is there a more nuanced way to support wide use of encryption to protect privacy, while discouraging the abuse of it that harms society? Can we find a way to develop trusted encryption frameworks that combine privacy protection with detection and prevention of criminal activities? How can we maximize the benefits of encryption and minimize its abuse?
Deciding what the appropriate level of encryption is, which public and private entities should have access to potentially harmful encrypted data, and how to hold those entities accountable for such access are complex new questions whose scope goes well beyond technology.
As a society, we do not yet have answers to these questions. This is new ground, and we are all learning how best to balance the need to protect our privacy and ourselves – and to decide which data is fair and acceptable to share and which data we are entitled to see protected.
The wireless community, however, can take action to provide a more flexible way to address the tradeoffs between the right to privacy and the need for protection when it comes to encryption.
Instead of focusing the debate on privacy and protection on technology (i.e., what is the most effective way to encrypt data) or on how service and content providers handle privacy, we could create platforms that give users more choices – and more awareness of the tradeoffs – so they can set the balance as they see fit. We could also be more transparent in acknowledging the challenges we in the wireless ecosystem collectively face in doing so, rather than pointing fingers at each other.
Today the responsibility for choosing the right approach to encryption is left largely to service and content providers. But why should we expect the Facebooks or Googles of this world to decide how to protect our privacy and to enforce that decision? Their voice should be heard – they are key stakeholders and enablers of encryption – and we should expect them to follow the rules, but not necessarily to set them. Users should have a say too.
Most of the time, encryption is presented as a binary option (i.e., a channel is or is not encrypted), often with the implication that the more encryption the better off we are.
What if, instead, we established different levels of encryption, which could be channel-dependent, with options available to users? Not all content needs the same protection. I would venture that most of the traffic on WhatsApp, for instance, needs less protection than financial transactions. People, too, may feel differently about the protection they need.
To protect children, for instance, platforms could leave content directed at children unencrypted, or only lightly encrypted, so that the service provider can screen video and graphic content. Parents could then allow their kids to communicate only with users who accept a lower level of encryption.
At the other end of the spectrum, political activists worried about exposing themselves or others to monitoring may choose tighter encryption.
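One way to picture such a tiered scheme: each party states the minimum encryption level it requires and the maximum level it is allowed (a parent might cap a child’s account below end-to-end), and a session is established only if a level acceptable to both exists. The sketch below is hypothetical – the `EncLevel` tiers and `negotiate` function are invented for illustration, not drawn from any existing protocol:

```python
from enum import IntEnum

class EncLevel(IntEnum):
    NONE = 0       # no encryption: the provider can screen content
    TRANSPORT = 1  # encrypted in transit, but the provider can inspect it
    E2E = 2        # end-to-end: only the endpoints can read the content

def negotiate(a_min: EncLevel, b_min: EncLevel,
              a_max: EncLevel = EncLevel.E2E,
              b_max: EncLevel = EncLevel.E2E):
    """Pick the strongest level either party requires, provided it stays
    within the cap each party (e.g., a parent) allows; otherwise refuse."""
    level = max(a_min, b_min)
    if level > min(a_max, b_max):
        return None  # no mutually acceptable level: no session
    return level

# Two ordinary users settle on the higher of their minimum requirements.
assert negotiate(EncLevel.NONE, EncLevel.TRANSPORT) == EncLevel.TRANSPORT
# A child capped at TRANSPORT cannot open a channel with an E2E-only peer.
assert negotiate(EncLevel.TRANSPORT, EncLevel.E2E,
                 a_max=EncLevel.TRANSPORT) is None
```

The design choice worth noting is that refusal is an explicit outcome: rather than silently downgrading, the platform surfaces the mismatch, which is what lets parents enforce a screenable tier while activists keep theirs uncompromised.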
We have something similar with payments. Depending on what you are paying for, you may choose any option from cash to wireless payments to cryptocurrency. Each has its own level of convenience, traceability, and reliability, and we all have our preferences and requirements.
To some extent, this is already happening with encryption. You can, for instance, turn to applications such as Signal, or use a VPN, if you want stronger protection. But these remain niche options.
Service providers should offer more options and more transparency, to create wide awareness and to empower users to decide what type of protection they need. While this would require outreach, it could also benefit their business, as a way to consolidate – or establish – a trust relationship with users.
Being able to choose among encryption options – and understand their tradeoffs – will not eliminate the abuse of encryption overnight, but it may be a first step beyond the current encryption on/off approach, toward more effective privacy and personal protection.