Appointed as the Information and Privacy Commissioner of Ontario, Canada in 1997, Dr. Ann Cavoukian served for an unprecedented three terms as Commissioner. In that time, she elevated the Office of the Information and Privacy Commissioner from a novice regulatory body to a first-class agency, known around the world for its cutting-edge innovation and leadership. Dr. Cavoukian is best known for her creation of Privacy by Design – unanimously adopted as an international framework for privacy and data protection in 2010, and now translated into 40 languages.
What is your approach to privacy and data protection?
Because I’m not a lawyer, the approach I brought to being Privacy Commissioner was almost a medical model: how do you prevent privacy harms from arising, as opposed to just offering regulatory compliance, which is always after the fact? You wait for the privacy breach to happen, it goes to the regulator, the regulator investigates and offers a remedy. That’s valuable, but it’s too little, too late. So I developed “Privacy by Design”, which is about proactively embedding privacy-protective measures into the design of your operations, your policies, your code, your programs and your systems, so that ideally you can prevent the privacy harms from arising in the first place, very much like a medical model of prevention. I had great success with it. I developed it in the late nineties, but it really took off after 9/11. In 2010 it was unanimously passed as an international standard by the international assembly of privacy commissioners and data protection authorities. I was delighted, because they recognized that we needed something to parallel and complement regulatory compliance; they recognized that we needed both. Privacy by Design has grown since then: it has now been translated into forty languages and has a real global presence. In May of this year a new law comes into effect in the European Union, the General Data Protection Regulation (GDPR), and for the very first time it includes Privacy by Design as the default in the law. So it’s very exciting to finally get that formal recognition, and it’s going to raise the bar dramatically on privacy.
How did you get to become an expert in privacy?
I studied psychology and the law when I was doing my PhD, and I wanted to show how you can examine legal issues empirically. My very first job was as chief of research services for the Attorney General here in the province of Ontario. It was really fun. The deputy minister who hired me liked my work and my research, and he said, “I’m going to hire you on contract for a year; you have to convince my lawyers that taking an empirical approach to legal issues, in addition to a legal approach, has value.” So I jumped at it, said I’d love to do that, and it proved to be quite successful. I ended up having a department, and I was there for five years. During that time Sidney Linden became the first Information and Privacy Commissioner of Ontario. He was familiar with my work and knew I’d done a lot of work on privacy with the Canadian Civil Liberties Association (the CCLA), and he said, “Ann, come and head up the privacy side of our operations.” My first job title was Director of Compliance at the Freedom of Information and Protection of Privacy office. I jumped at that and started working there, and was then promoted to Assistant Commissioner of Privacy. When Sidney Linden was appointed as Justice Linden and went to the bench, I applied for the job of Commissioner when it opened up, and I had the good fortune of getting it.
So I was very, very fortunate, and it was different because, as I said, I brought a different focus, not just a legal focus. What I said to companies especially was that privacy is a business issue: make it work for you, gain a competitive advantage by respecting your customers’ privacy and telling them the lengths you go to to protect it. It’s a very different approach from a purely legal, regulatory one.
What areas of privacy are you most passionate about?
As I mentioned, for me it’s all about preventing the harms, so I’m very big on encryption. I tell people, companies and government departments to encrypt the sensitive, private, personal information they hold on individuals. When you collect data you use it for a particular purpose; once you’ve completed that purpose, make sure you encrypt the data. A lot of people keep data for a long time (data at rest) and leave it in plain text, and that’s what attracts the bad guys, both hackers and the accidental breaches that happen on the inside. People forget to secure the data, it somehow gets spilled out, and all the harm arises. So protecting data is very important, and encryption is a key tool.
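The "encrypt your data at rest" advice above can be sketched in a few lines. This is a toy illustration only: a keystream built from HMAC-SHA256 in counter mode, applied to a hypothetical customer record. A real system would use a vetted library (for example, the `cryptography` package's Fernet or AES-GCM) rather than hand-rolled primitives:

```python
import hmac, hashlib, os

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Derive a pseudorandom keystream by running HMAC-SHA256 in counter mode.
    out = b""
    counter = 0
    while len(out) < length:
        msg = nonce + counter.to_bytes(8, "big")
        out += hmac.new(key, msg, hashlib.sha256).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # Prepend a fresh random nonce so the same record never encrypts identically twice.
    nonce = os.urandom(16)
    ct = bytes(p ^ k for p, k in zip(plaintext, keystream(key, nonce, len(plaintext))))
    return nonce + ct

def decrypt(key: bytes, blob: bytes) -> bytes:
    nonce, ct = blob[:16], blob[16:]
    return bytes(c ^ k for c, k in zip(ct, keystream(key, nonce, len(ct))))

key = os.urandom(32)
record = b"customer: Jane Doe, dob: 1970-01-01"   # hypothetical sensitive record
stored = encrypt(key, record)                     # what lands on disk: not plain text
assert decrypt(key, stored) == record             # recoverable only with the key
```

The point of the sketch is the workflow, not the cipher: once a collection purpose is complete, the stored form should be ciphertext, and the key should live somewhere the data does not.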
The other thing I tell companies especially is: engage in a dialogue with your customers. When you build trust with your customers, they will gladly give you consent to use their information for a secondary use that may not have been contemplated at the time of the initial data collection. That building of trust is critical, and you do it by engaging in a dialogue with your customers and being very straight and upfront. Where there’s trust, I’ve found customers have no problem giving additional permissions for secondary use of their data by a company they trust. What they don’t want is their data released to third parties with whom they have no relationship whatsoever and whom they don’t know. So that’s something I encourage companies to do: build that relationship.
The third topic I am most passionate about is Privacy by Design, which is about engaging with customers in a way that is an essential component of your operations, so that privacy is embedded seamlessly into the design of what you’re doing, into the code. I’ve done a lot of work with engineers, software designers and data scientists, telling the technical people why it’s so critical to embed privacy into their actual operations, to bake it into the code they’re developing. One year I called it “the year of the engineer”: I went around talking almost exclusively to engineers and software designers, telling them, this is what I want in terms of Privacy by Design, can you deliver? Can you in fact embed privacy into the code and into your systems? Every single person I spoke to said, of course we can do that. But then they said to me, wait a minute, we’re not the problem; we rarely get asked to embed privacy into the code when we’re asked to write a program. We get instructions to write the code, we deliver the program, and then they say, oh, can you bolt on a privacy solution after the fact, and that’s the problem. The engineers said to me, go talk to those guys, meaning the executives, the boards of directors and so on.

So the following year I did. I called it “the year of the board” and talked almost exclusively to senior executives and boards of directors, saying: you have to get rid of the silos, the here’s-engineering, here’s-legal, here’s-policy, here’s-marketing mentality. Get rid of the silos and take an integrated approach, so that when you do go to your code writers, your tech people, and ask for something, privacy is embedded into the process. I’ve focused a lot on speaking to boards of directors and senior executives, trying to get them engaged in this process and to understand that if you don’t treat privacy seriously, it will come back to bite you.
I can give you one example, where the CEO of a large UK-based telecommunications provider is taking absolutely no responsibility whatsoever. Privacy Matters’ Pat Walsh has been writing about this; he requested access to his own data through a subject access request. Companies are permitted to charge up to ten pounds, but really that’s only meant for requests that are cumbersome and extensive; you shouldn’t routinely be charging people for access to their own data. Pat Walsh has filed complaints with the company and its CEO, who has simply ignored him. They basically said, we can charge up to ten pounds, so that’s what we’re doing. There’s absolutely no customer service, no understanding of the negative press this has generated, and to me that’s the worst thing you can do. If you’re a smart CEO, you reach out to those customers and say: my apologies, we shouldn’t have charged you; it’s your data and you have a right to access it. We may have custody and control of your data, but it doesn’t belong to us, it belongs to you; here’s your data. That’s what you should be doing.
Which privacy influencers influence you?
I hate to say this, but I do a lot of tweeting, and I get a lot of very valuable information from tweets from sources I might not otherwise engage with. I’m very much into the technical side of things; I believe in encryption very strongly, as I mentioned. There’s something called homomorphic encryption, which is about being able to run statistical analyses and data analytics on encrypted data, so the data always remains protected while still enabling you to do the work you need to do. So I do a lot of reading in the area of homomorphic encryption. In general, Bruce Schneier is an amazing cryptographer; he has a blog and writes regular articles, he’s brilliant, so I follow his work a lot. I also try to stay on top of the publications that the public reads, because I want to know what they are exposed to. I try to be very responsive.
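For a small taste of what "analytics on encrypted data" means: textbook (unpadded) RSA happens to be multiplicatively homomorphic, so multiplying two ciphertexts yields an encryption of the product of the plaintexts. The sketch below uses tiny fixed primes purely for illustration; practical homomorphic schemes (Paillier, BGV, CKKS) support richer operations at real key sizes, and unpadded RSA should never be used as an actual cipher:

```python
# Textbook RSA with tiny, well-known demo primes (61 and 53).
p, q = 61, 53
n = p * q                            # 3233
e = 17
d = pow(e, -1, (p - 1) * (q - 1))    # modular inverse (Python 3.8+)

encrypt = lambda m: pow(m, e, n)
decrypt = lambda c: pow(c, d, n)

c1, c2 = encrypt(4), encrypt(7)
# Multiplying the ciphertexts multiplies the hidden plaintexts:
product = decrypt(c1 * c2 % n)
assert product == 28   # 4 * 7, computed without ever decrypting the inputs
```

A server holding only `c1` and `c2` can compute an encrypted product and return it; the data owner alone, holding `d`, learns the answer. Fully homomorphic schemes extend this idea to addition and arbitrary circuits.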
Here in my jurisdiction of Toronto, Canada, there is a new smart city venture just starting. Sidewalk Labs, a Google-based company, has won the contract to build a smart city in our waterfront area by the lake. I’ve been retained by Sidewalk Labs to show them how to do Privacy by Design, because unfortunately smart cities often become cities of surveillance, and that’s the exact opposite of what we want. We want to build a smart city that is a total pillar of privacy, and that’s what we’re hoping to do. Now, the problem is there’s a lot of distrust, because Sidewalk Labs is a Google-based company and people don’t believe they are going to protect their privacy. You would not believe how many emails and tweets I’ve been getting from the public saying, how can you work with them? They’re not going to protect privacy! And I’m saying, yes they are, that’s why they’ve retained me. One of the first things I said to Sidewalk Labs was: if you want me to do Privacy by Design in the smart city you’re contemplating, rest assured I’m going to be a thorn in your side, because I’m going to tell you to do things you’re not going to want to do, and if you don’t do them I’m going to walk away. Privacy by Design is all about being preventative and proactive; it’s about embedding the much-needed privacy-protective measures into the design of your operations. This is what’s keeping me up at night. Last night I had tweets coming in late into the evening, 10-11pm, which for me is late (I try to get to bed by then), and I kept responding, telling the naysayers: yes, smart cities elsewhere have become cities of surveillance, but we’re going to do the opposite here. It’s as if people don’t believe you can change anything, but I’m the eternal optimist. You have to have optimism, otherwise we’re not going to have anything going into the future.
I recognize that surveillance is growing dramatically all around the world. I feel like David versus Goliath, but we know how that story ended. Except I don’t want to slay Goliath; I want to embrace Goliath in a dance. I want to have both privacy and data utility; we have to have both. I’ve just started a new international council called Global Privacy and Security by Design. I expanded the name to specifically say “and security”, which has always been part of Privacy by Design, but I wanted to spell it out. Whenever there is a surge of terrorist incidents, as we have had dating back to Charlie Hebdo, San Bernardino, Paris and Brussels (there have been so many), the pendulum swings right back from “forget about privacy” to “we need security”. Of course we need security, but not to the exclusion of privacy; you can do both. My mission is to rid the world of zero-sum thinking. Zero sum simply means a win-lose proposition: either/or, you can have privacy or security, one versus the other, never both. Not in my world; you have to have both. So there’s the positive-sum model: positive sum means you can have positive increments in two areas at the same time, and that’s what we need. My new international council, GPS by Design, is having great success in generating interest in this. We actually launched with a major fundraising gala here in Toronto on January 25th, to raise funds to promote research into machine learning, neural networks and artificial intelligence. There’s going to be so much work in this area, and we want to ensure it follows the model of privacy and security, privacy and data utility: win-win, not win-lose, not either/or. So that’s the other big passion of mine, ridding the world of the zero-sum model, which is so prominent.
Khaled El Emam is the head of a company called Privacy Analytics, and he specializes in the strong de-identification of personal data, which he balances against the risk of re-identification. You may have heard a lot of people say that de-identification may work sometimes, but often it doesn’t, and re-identification is possible. When you use Khaled’s models of de-identification, it is highly unlikely that there will be any real risk of re-identification. I always tell people about the myth of zero risk: you can never eliminate all risk, not in anything. You send your children to school, you tell them to look both ways before they cross the street, and you pray that everything is fine. Occasionally bad things happen, but you can minimize the risk dramatically, and that’s our goal. Khaled excels at that, so I often refer people to him for the strength of his de-identification protocols and the way he addresses this. I’ve written papers with him, and I look up to him enormously.
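De-identification of the kind described here typically works by coarsening quasi-identifiers (attributes that could be linked to outside data) until every person hides in a crowd, then measuring the residual risk. A minimal sketch, with entirely hypothetical records and a simple k-anonymity check; this is an illustration of the general idea, not Privacy Analytics' actual methodology:

```python
from collections import Counter

# Toy records: (age, postal_code) are quasi-identifiers; diagnosis is the
# sensitive value we want to release for analysis.
records = [
    (34, "M5V 2T6", "flu"),
    (36, "M5V 1K4", "asthma"),
    (51, "M4C 5B1", "flu"),
    (58, "M4C 9A2", "diabetes"),
]

def generalize(rec):
    age, postal, diagnosis = rec
    # Coarsen quasi-identifiers: 10-year age bands, 3-character postal prefix.
    band = f"{age // 10 * 10}-{age // 10 * 10 + 9}"
    return (band, postal[:3], diagnosis)

deidentified = [generalize(r) for r in records]

def smallest_group(rows):
    # Size of the smallest equivalence class over the quasi-identifiers.
    # A value of k means every individual is indistinguishable from at
    # least k-1 others: a crude proxy for re-identification risk.
    return min(Counter((band, postal) for band, postal, _ in rows).values())

k = smallest_group(deidentified)
assert k == 2   # each generalized record matches at least one other
```

Real protocols go much further: they model the attacker's background knowledge, the sampling fraction, and the sensitivity of the data, and pick the generalization that keeps risk below a threshold while preserving as much analytic utility as possible.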
What are going to be the key developments in the industry in the next 12 months?
The one major development I’m hoping for, as a result of the influence of the GDPR, is in the area of the IoT and connected devices. I’m a real critic of the IoT. Everyone is so excited about the Internet of Things and connected devices, and people are just running out the door developing these things without any thought to security or privacy. It’s already beginning to come back to haunt them, because the data breaches are becoming massive data breaches. This is growing dramatically and will continue. Class action lawsuits are happening in the United States and Canada, and this is what I’m cautioning people about in terms of connected devices and the things you bring into your house. Alexa, Siri, Home: all of these little speakers you have in your house, that you speak to and get information from, are also listening to you all the time. The sweet nothings you say to your partner, the personal, private conversations you have with your children, all of that is going out to unknown third parties, and this is a huge problem that will only accelerate. In Germany they outlawed the connected toy Cayla, a doll that children would speak to; all the information coming from the children was going to third parties. So these are the areas I am very, very concerned about, and I’m hoping that with the introduction of the GDPR there will be a real crackdown on, and a backlash against, companies that don’t take it seriously. A lot of this comes down to being upfront with your customers and telling them what you’re doing and why you’re doing it. That’s the missing piece here.
If a brand wanted to work with you, what offline / online activities would you be most interested in?
I love doing public speaking; I love getting the message out that privacy breeds innovation and creativity. Don’t view it as a negative. A lot of people and a lot of companies think privacy is going to stifle innovation, and it’s the exact opposite: you’ve got to be really smart to embed privacy into whatever your company is trying to do, but the end result is so much better. I love speaking to companies, trying to educate them, meeting with their senior executives and getting the message out that this is a positive you should be embracing. Then, once you do Privacy by Design, don’t keep it to yourself, shout it from the rooftops! Tell your customers how much you respect their privacy and the lengths you’re going to to protect their data. They will reward you with their repeat business, and it will attract new opportunities. So that’s a message I’m trying to get across in a big way. I enjoy public speaking so much; to me it is one of the most appealing activities because I can see faces. I always monitor the audience very carefully: at the beginning they’re usually very sceptical, and by the end I see people nodding, I’ve got them on point, and that’s the challenge and the reward for me, showing people that privacy is not a negative, it’s in fact a positive.
I’m also keen to collaborate on content: adding quotes for a blog, or contributing to a whitepaper or a webinar. Anything I can do to help companies advance privacy in their operations, I want to do.
Privacy forms the foundation of our freedom, and that’s what people don’t understand: it’s not just a personal human right, which of course it is, but the basis of our freedoms. It’s no accident that Germany is the leading privacy and data protection country in the world. Germany had to endure the abuses of the Third Reich and the complete loss of its freedoms, and when that ended, Germany said, never again. I’ve been to many conferences in Germany, and every one starts with a reference to that time and to “never again”. Freedom is essential, and you cannot have freedom without privacy.
Product development too, of course: I’d be keen to show developers how they can in fact embed privacy into their operations. People think of biometrics, for example, and think, oh my God, there goes privacy. No. There’s something called biometric encryption, which lets you gain the strength of the biometric for the identification you need, without having it used for unintended purposes. There are so many ways we can do this, in so many areas.
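One way biometric encryption can work is the fuzzy-commitment idea: bind a random key to the biometric template so that the stored data reveals neither the key nor the template, yet a matching scan can recover the key. The sketch below is a simplified illustration that omits the error-correcting code a real system needs to tolerate noisy scans, and the "fingerprint" is just random bytes standing in for a feature vector:

```python
import hashlib, os, secrets

def enroll(template: bytes):
    # Bind a fresh random key to the biometric template. What we store
    # (helper, commitment) reveals neither the key nor the template on its own.
    key = secrets.token_bytes(len(template))
    helper = bytes(t ^ k for t, k in zip(template, key))
    commitment = hashlib.sha256(key).hexdigest()
    return helper, commitment

def verify(template: bytes, helper: bytes, commitment: str) -> bool:
    # Re-derive the candidate key from the presented template; only the
    # enrolled biometric reproduces a key whose hash matches the commitment.
    key = bytes(t ^ h for t, h in zip(template, helper))
    return hashlib.sha256(key).hexdigest() == commitment

fingerprint = os.urandom(32)               # stand-in for a biometric feature vector
helper, commitment = enroll(fingerprint)
assert verify(fingerprint, helper, commitment)           # genuine user succeeds
assert not verify(os.urandom(32), helper, commitment)    # impostor fails
```

The privacy payoff is that the raw biometric never needs to be stored: a breach of the database yields only the helper data and a hash, from which neither the fingerprint nor the key can be recovered.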
What would be the best way for a brand to contact you?
E-mail me at email@example.com