Making Privacy Consumable
As engineers we tend to be bad at simplifying topics for lay folks. Yes, we've talked about this before. But when it comes to the data security conversation, I think we have hit a new low. I say this because of the disconnect between the tech sector and the public on such a pivotal issue. The fact that so many Americans are left uninterested by the ongoing debate highlights a collective failure of tech folks to distill a complicated issue into simpler terms.
The fact remains that in an increasingly technical world, it is critical that we make topics like security more approachable to the masses. To do that, we need to fundamentally change how we talk about privacy. These changes must cover the whole communication spectrum: from how we talk to our clients and teams to how corporations speak about these issues to the media and politicians.
While the recent iPhone debate started bringing this into the public eye, there is a lot of ground yet to cover. Tim Cook's "it's the software version of cancer" comments were perhaps a bit too distilled. While I agree with the sentiment, I think we can do more than simply scare the public. We should try to help them understand where we are and where we stand to go, in language they can relate to.
Encryption and mailmen
Encryption is a good place to start. Most folks haven't a clue what end-to-end encryption is, what it does, or why it matters. That needs to change. We need a way to bring it closer to home.
Encryption is a lot like mailing a letter. The Post Office knows where it is going. They might know who sent it. Either way, the contents of the letter remain private.
But say Bob commits a crime. Before Bob committed his crime, he mailed some letters. The Post Office knows he mailed them, but they don't know what the letters said. The investigators would really like to know more about these letters. What did they say? Did the envelopes contain anything other than a letter?
To get these answers, the police ask the mailman. Of course, our mailman has no idea. He doesn't open people's mail. He has no clue what the mail he delivers says. The sender's privacy is ensured by the letters being left unopened. The same can be said for our encrypted messages.
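For readers who want to see the analogy in code, here is a minimal sketch using Python and the PyNaCl library (my choice of tooling, purely for illustration): Alice seals a letter that only Bob can open, and the mailman in the middle handles nothing but opaque bytes.

```python
# A toy model of the mailman scenario: Alice writes to Bob, and the
# carrier in the middle never sees anything readable.
# Requires: pip install pynacl
from nacl.public import PrivateKey, Box

# Each person has a keypair. Public keys are shared freely;
# private keys never leave their owner's hands.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice seals the letter so that only Bob can open it.
sealed = Box(alice_key, bob_key.public_key).encrypt(b"Meet me at noon.")

# The mailman sees the outside of the envelope: who it is from,
# where it is going, and a blob of bytes he cannot read.
def mailman(sender, recipient, envelope):
    print(f"Delivering {len(envelope)} opaque bytes from {sender} to {recipient}")
    return envelope

delivered = mailman("Alice", "Bob", sealed)

# Only Bob, holding his private key, can open the letter.
letter = Box(bob_key, alice_key.public_key).decrypt(delivered)
print(letter)  # b'Meet me at noon.'
```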
But wait
Frustrated by the lack of information, the investigative team seeks to have a law passed. From now on, every mailman must open every letter before delivering it. Once opened, he is to photograph it and save the scan to a folder about the sender. This stands to help possible future investigations. The privacy senders typically assume is forfeit. After all, innocent people have nothing to hide.
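In software terms, that law looks something like the toy sketch below (a hypothetical illustration, not any real system): the carrier must read and file a copy of every message before delivering it.

```python
# The mandated version: the carrier opens, photographs, and files
# every letter before delivery. In software, that means the relay
# keeps a plaintext copy of everything, organized by sender.
archive = {}

def mailman_with_scanner(sender, recipient, letter):
    # The carrier reads the letter and saves a copy to the sender's folder.
    archive.setdefault(sender, []).append(letter)
    return letter  # ...then delivers it as usual.

mailman_with_scanner("Bob", "Alice", b"Meet me at noon.")

# One archive holding every sender's mail, readable by anyone who gets in.
print(archive)
```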
That "nothing to hide" line sounds wrong to just about anyone, regardless of technical expertise. Encryption and security are nebulous, voodoo-y topics to the public. We need tangible examples people can easily grasp and draw uncomfortable parallels from. Once they draw those parallels, they tend to react the same way we do.
After all, this very scenario is what caused WhatsApp to be shut down in Brazil last week. WhatsApp has implemented end-to-end encryption, but the Brazilian government wanted message access. WhatsApp, like the mailman, was unable to give the investigators the messages, and so it was punished. Not the end of the world for an app with that kind of backing, but a death sentence for many smaller apps.
Is data evidence or testimony?
Another topic that needs to be addressed is whether data is evidence or testimony. Traditionally, testimony has been what a person knows, while evidence has been something tangible. In the past this distinction was rather obvious. But the line is getting blurry as technology becomes more pervasive.
Our phones now passively collect data that even five years ago would have been strictly in the realm of testimony. As we continue to shift from PCs to phones and wearables, more and more knowledge is being stored as data. At some point we will have embedded computers in our bodies. The idea of someone plugging a USB cable into my neck, Matrix-style, to download "evidence" is a chilling prospect.
Dystopian sci-fi aside, the evidence/testimony distinction is important because the US Constitution recognizes the problem of having to testify against yourself while standing trial.
"No person ... shall be compelled in any criminal case to be a witness against himself" - 5th Amendment of the US Constitution
The line between evidence and testimony will only grow harder to draw as the years pass. This could be one of the major legal precedents decided in the next decade. In the meantime, our duty as technologists is to protect our clients' data to the utmost. If it is decided that data is, by definition, evidence, many startups stand to lose consumer trust overnight. Wearables immediately shift from a luxury to a liability.
Complexity made easy
As mentioned above, the recent Apple case began to shed some public light on this topic in a consumable way. Tim Cook did a reasonable job discussing the core issues in a way the public could understand. But up until Cook's comments, the best public explanation of privacy may have been John Oliver's "Yes, they can read your sexts" piece.
Think about that. For a topic as pivotal as the security and trust of our customers moving forward, a comedy piece about dick pics has been the most approachable bit of reporting on the issue. I can think of no better way to highlight our failure to communicate software privacy issues in a simplified, concise way. It seems that while the importance of security should never be understated, much of the minutiae should be.
Avoiding politics
I have been hesitant to write this post for fear of straying too far into politics on this blog. But the recent events in San Bernardino and Brazil, and the introduction of the Burr-Feinstein your-browser-is-literally-illegal bill, have forced my hand.
In our effort to improve how we communicate about security and privacy, we must avoid politics at all costs. No topic causes defensive disengagement quite like politics.
So while the privacy battle may be fought in the political arena, the path to understanding it is paved another way. As technologists, we should charge ourselves with leading people toward that understanding. We must play the role of teachers.
The public will make whatever decisions they will. We can help them do so from an informed position, a position they are unlikely to achieve given the current level of jargon in the security and privacy conversation.
A professor once told me the duty of a software engineer is "to make software suck less." Unless we fundamentally change the way we introduce the public to security and privacy issues, software stands to suck more. A great deal more.