
Leading with a security-first mentality

With cyberthreats increasing every day, the way to set your company apart is by getting ahead of privacy and security concerns, argues Ann Cavoukian.

Sponsored Content

Produced in partnership with Microsoft Security

As technology rapidly develops, the number of security and privacy concerns will only continue to grow. In this episode, we look at how companies can build cybersecurity into their business strategies—instead of scrambling to respond when a breach happens.

Even with danger lurking around the corner, today's guest, cybersecurity expert Ann Cavoukian, argues that companies are turning a blind eye to security and privacy issues until it is too late. Cavoukian is the executive director of the Global Privacy and Security by Design Centre, as well as a senior fellow of the Ted Rogers Leadership Centre at Ryerson University. She has worked closely with the Canadian government, as well as private companies, on the best ways to defend against security attacks.

Cavoukian also says that privacy is vital to our society and an indispensable form of freedom, and that developments such as facial recognition technology are among the most egregious breaches of that freedom.

Business Lab is hosted by Laurel Ruma, director of Insights, the custom publishing division of MIT Technology Review. The show is a production of MIT Technology Review, with production help from Collective Next. Music is by Merlean, from Epidemic Sound.

Show notes and links

Ann Cavoukian, Ryerson University

Global Privacy and Security by Design

“Microsoft presents Dr. Ann Cavoukian on privacy and your business,” YouTube

“Dr Ann Cavoukian – Privacy By Design,” YouTube

“Will Privacy First Be The New Normal? An Interview With Privacy Guru, Dr. Ann Cavoukian,” by Hessie Jones, Forbes

“Dr. Ann Cavoukian: Why Big Business Should Proactively Build for Privacy,” by Hessie Jones, Forbes

Full transcript

Laurel Ruma: From MIT Technology Review, I'm Laurel Ruma, and this is Business Lab, the show that helps business leaders make sense of new technologies coming out of the lab and into the marketplace.

Security threats are everywhere. That's why Microsoft Security has over 3,500 cybercrime experts constantly monitoring for threats to help protect your business. More at microsoft.com/cybersecurity.

Our topic today is security and privacy, but more specifically designing your products and company with a security- and privacy-first mentality. We'll also explore how government regulation of data will continue to affect organizations, as well as opting out--because privacy is a form of freedom.

Two words for you, facial recognition.

My guest is Dr. Ann Cavoukian, who is an expert on privacy and who served three terms as the Information and Privacy Commissioner of Ontario, Canada. Currently, Dr. Cavoukian is the executive director of the Global Privacy and Security by Design Centre, as well as a senior fellow of the Ted Rogers Leadership Centre at Ryerson University. Dr. Cavoukian created the privacy by design framework and successfully advocated for its inclusion in government regulations, including GDPR. Ann, thank you for joining me on Business Lab.

Ann Cavoukian: Oh, my pleasure. Thank you for inviting me.

Laurel: So, to start, why are privacy and security so important to you personally?

Ann: Privacy forms the foundation of our freedom. You cannot have free and democratic societies without a solid foundation of privacy. And we know this. Look at history. It's no accident that Germany is the leading privacy and data protection country in the world: they had to endure the abuses of the Third Reich, and when that ended, they said never again. Never again will we allow the state to strip us of our freedom and our privacy. And they've stood by that. When I was Privacy Commissioner, I went to a number of conferences in Germany, and every conference started with a reference to that time. They developed this term, “informational self-determination”: that it should be the individual who determines the fate of his or her personal information. They consider it to be such an important value that they enshrined it in their constitution in 1983.

Laurel: That is amazing. Could you imagine that happening in North America?

Ann: Well, we're working toward that. There has been such an acceptance of surveillance. But just recently, in the last two years, all of the public opinion polls (Pew Research, etc.) have come in at the 90th percentile for concern about privacy: 90% very concerned about their privacy, 92% concerned about loss of control over their personal information. So, people are getting exercised about this, and they want their privacy back. There is such a trust deficit right now in society, in terms of both the private sector and the public sector. So, I think we're going to begin to see changes. I tell people: you cannot give up, never give up. We always have to move forward toward our intended goals, and we can do this.

Laurel: Is this why you work so closely with governments? Because some companies and consumers and citizens may actually see governments as that third way to provide some kind of structure, framework, and regulation?

Ann: Absolutely. Regulation will be critical in some areas, like facial recognition and others, which we can talk about. But I also want to caution people: government also has to be controlled in terms of its unauthorized uses of personal information. It's not as if governments are complete protectors of privacy. Not at all. In my three terms as Privacy Commissioner, a different political party governed in each term, and before they became the government, they were all interested in privacy and wanting to protect consumers and citizens; privacy was paramount. Then they became the government, and everything changed. Then they wanted to control the information, gain access to it, and use it in their interests. So, I caution people against both the private and public sectors.

Laurel: Can you outline generally the differences that you see between privacy and security?

Ann: Let me be clear. While privacy subsumes a broader set of protections than security alone, in this day and age of daily cybersecurity attacks and data breaches, if you don't have a solid foundation of security from end to end, with full life-cycle protection, you don't get to have any privacy. And I always like to get rid of the zero-sum mindset of privacy vs. security. Utter nonsense. You need privacy wedded with security, privacy and security together, to protect everything; you must have both. It's an absurd proposition that you can have one versus the other. That's the model I try to advance, and I work with cybersecurity companies and firms to show how we can enhance privacy through security by offering the strongest data protection: end-to-end encryption, full life-cycle protection. We have to have that. That then enables you to have privacy, meaning being in control of your information.

Privacy is not about secrecy. It's all about personal control: you decide how you want your information used and to whom you wish to have it disclosed. You have to be in the driver's seat, and you can only do that if you've got a solid foundation of security.

Laurel: So, formally, privacy and security by design is this end-to-end mentality of building products, as well as organizations?

Ann: Yes, exactly. That's why I called my organization the Privacy and Security by Design Centre. I wanted people to understand that they have to be married together; it's critical. And I've been doing this work, obviously, for a long time. We're now beginning to see advances. Last year, I should add, Tim Berners-Lee, the creator of the World Wide Web, went public and said [paraphrasing], "I'm devastated by what I created in the web. It's become the centralized honeypot for personal information that companies control--the Googles and Facebooks--and give out to third parties, unknown, without any authorization. I'm so devastated that I'm walking away from it. I'm creating a decentralized model." His model is called Solid, but the point is, it started the movement toward decentralization.

See, when your data is decentralized, retained under your control in a secure enclave, in the cloud for example, or wherever, then you decide how you want it used and who you want it shared with. As long as you're in control, your privacy is intact. And that's the model we're increasingly moving toward: a decentralized model where the individual will be able to exert greater control over his or her personal information. It belongs to them, the individuals, not to the companies or the governments.
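
To make that concrete, here is a minimal sketch, in Python, of the kind of user-held data store being described: the individual keeps the data and grants or revokes access per requester and per field. The names (DataPod, grant, revoke) are hypothetical illustrations, not Solid's actual API; a real decentralized system adds authentication and encryption on top.

```python
# Hypothetical sketch of a user-controlled data store ("pod").
# Data stays with the owner; access is an explicit, revocable grant.
class DataPod:
    def __init__(self, owner: str):
        self.owner = owner
        self._data: dict = {}
        self._grants: set = set()  # (requester, field) pairs

    def store(self, field: str, value: str) -> None:
        self._data[field] = value                 # the data never leaves the pod

    def grant(self, requester: str, field: str) -> None:
        self._grants.add((requester, field))      # positive, explicit consent

    def revoke(self, requester: str, field: str) -> None:
        self._grants.discard((requester, field))  # the owner can change her mind

    def read(self, requester: str, field: str) -> str:
        if (requester, field) not in self._grants:
            raise PermissionError(f"{requester} has no grant for '{field}'")
        return self._data[field]

pod = DataPod(owner="alice")
pod.store("email", "alice@example.com")
pod.grant("shop.example", "email")            # Alice decides who sees what
print(pod.read("shop.example", "email"))      # allowed while the grant stands
pod.revoke("shop.example", "email")           # and only while it stands
```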

Laurel: So, once individuals start to be able to make these choices, companies will have to respond and their data systems will necessarily become decentralized as well.

Ann: Yes, and they will respond favorably, because, as you know, I do privacy by design, and a number of companies are increasingly becoming certified for privacy by design. And what I tell them is: when you're certified for privacy by design, shout it from the rooftops; tell your customers the lengths you're going to in order to protect their personal information. What that does is build trusted business relationships. The trust that has been dwindling returns, and what companies have told me is that when there's a secondary use of the personal information that they would like to gain access to, and they go back to their customers, they are always told, "Yes, you can have the information. I give you my consent to use it for that purpose." They love it because they gain a competitive advantage, and they reap the rewards: loyalty and new opportunities. It's a win-win proposition.

Laurel: So, what are some common problems that companies are facing with privacy and security kind of melded together here?

Ann: One of the problems is that security tends to be very under-resourced, which is such a folly on anyone's part. So, invariably, data breaches and privacy infractions arise, and then they're doing a cleanup. And often there are lawsuits, and these days there are class action lawsuits. So, the cost of ignoring security is just crazy. It's the biggest mistake you can make as a company. So, I tell companies, don't be frugal about security or privacy, because first of all, you're not going to have any privacy without a solid foundation of security. And once you have that, with end-to-end encryption, firewalls, everything in place, then privacy is much easier to address, because you can tell your customers, your citizens, "We have the strongest protections in place to protect your data, and now we want to ensure that you have control over your data, so that you truly benefit from the privacy we have promised you; it must be under your control." Then the organization becomes stronger, it grows, and it doesn't suffer from the data breaches and the lawsuits that would otherwise arise.

Laurel: Speaking of the data breaches and lawsuits, which, like you said, are daily if not weekly, why aren't companies paying more attention to security? Why do these keep happening?

Ann: That is such a good question. Truly, I have no idea; I don't understand it. They think of it, I'm assuming, as a cost factor. Of course it costs money to secure data, but a lot less, I assure you, than it costs once you encounter a data breach and have to face a class action lawsuit. So, part of it is a learning curve. It hasn't been at the forefront of the attention of boards of directors and senior executives until recently, with all of the massive breaches. So, I think you're going to see a change coming. It's not coming fast enough, but this is what we are increasingly addressing, and I learned last week that boards of directors are now much more interested in security and privacy directly than they were in the past. So, I'm the eternal optimist; we never give up. I'm hoping this trend will shift very soon.

Laurel: Do you think there's going to be more pressure coming up from consumers who do expect this kind of control and accountability?

Ann: Absolutely. I do a lot of public speaking, and I speak to consumer groups as well. In the past, I would have to explain to them why they should care about their privacy and protecting their data and things of that nature. Now, in the last year or so, I don't have to explain anything. They are coming to me with the questions: "Tell me how I can get greater privacy and security. What do I have to say to businesses and to governments?" It's very rewarding, actually, to see this shift.

And honestly, one of the first things I say to them is, "If you're shopping, either online or in a physical store, do what I do." And I do it politely: after I pay, often [the clerk will] ask for your postal code or some other information. I'll say, "Oh, do you share that with any third parties, or would you require my consent any time you want to do that?" The clerk you're dealing with doesn't have a clue. They'll go get the manager; the manager will come. Once the manager sees that you're interested in privacy, he or she will say, "Oh, of course you care about privacy. We'll make sure your data are protected. We'll put the walls up. We won't share it with any third parties." It's almost like you have to bring your privacy concerns to their attention, and then they know what to do. It just often isn't done automatically.

Laurel: That's a really interesting idea. I like that. Do you have other tips for people to protect their privacy?

Ann: One thing you should think about is the uses made of the data you routinely share. I'm sure you've heard recently about Clearview AI, this horrible company, in my view, that has scraped so many facial images off of the internet: 3.9 billion facial images from Facebook and Instagram and YouTube videos.

What I tell people is, when you put a photo on Facebook, for example--and I'm not going to tell people to get off Facebook, because they won't do it--put the walls up around it; make it clear you only want to share it with your finite number of friends and family. Facebook, believe it or not, has very strong privacy protective measures you can invoke. Nobody knows about them. At the CES conference in Vegas in January, I was watching a panel, and the person from Facebook, the chief privacy officer, said, "Oh, of course we follow privacy by design. We have very strong privacy protective measures." My jaw dropped, but the point is, it's there; you have to seek it out and find it. So, search for the strongest privacy protective measures on any of the apps or websites you use, and invoke them right away. That's what I urge people to do.

Laurel: Excellent, excellent advice.

Cybersecurity isn't only about stopping the threats you see, it's about stopping the ones you can't see. That's why Microsoft Security employs over 3,500 cybercrime experts and uses AI to help anticipate, identify, and eliminate threats so you can focus on growing your business and Microsoft Security can focus on protecting it. Learn more at microsoft.com/cybersecurity.

Laurel: So, to go back a couple of years, why was GDPR needed? What was the construct around it?

Ann: That's a very good question. Europe has always led the way in terms of the strongest privacy laws, data protection laws, anywhere. And they could see that concern was mounting about unauthorized access to your data, [about] surveillance using that data and tracking of your activities, and that this was getting wider and wider. And as I said, with the roots of all the privacy abuses in Germany, etc., they care about this very deeply, so they wanted to create a very, very strong privacy and data protection law, which they did. And to my delight, they included my privacy by design framework in it.

In the GDPR, it appears as data protection by design and data protection by default. Let me explain privacy as the default; this is huge. It's the second of my seven foundational principles of privacy by design. Privacy by default is the opposite of the existing framework in most of the world, where you, the consumer, are expected to search through all the terms of service and all the legalese in the privacy policy to find the opt-out box that says, "Do not use my information for any purpose other than this primary purpose that I have consented to."

Now let's get real. Life is short; nobody has the time to do that. It actually takes hours, so nobody's doing it. It's not because they're not concerned about their privacy; it's just a ridiculous expectation. Privacy as the default flips that on its head and says, "You don't have to search for privacy. We give it to you automatically. It's the default setting. We can only use your personal information for the purpose you gave it to us for: the primary purpose of the data collection that you consented to. Beyond that, we can't use it for anything else. And if a secondary use arises down the road that we would like to use your information for, we have to come back to you and seek your positive consent for that secondary use."
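
As a concrete illustration, here is a minimal, hypothetical sketch of what privacy as the default can look like in application code: the consent record starts empty, the primary purpose of collection is always permitted, and any secondary use is denied until the user explicitly opts in. The ConsentRecord type and its method names are invented for this example, not drawn from any real library.

```python
# Hypothetical sketch: secondary uses are denied unless explicitly granted.
from dataclasses import dataclass, field

@dataclass
class ConsentRecord:
    primary_purpose: str                              # why the data was collected
    secondary_uses: set = field(default_factory=set)  # empty by default

    def may_use(self, purpose: str) -> bool:
        # The primary purpose is always allowed; anything else needs opt-in.
        return purpose == self.primary_purpose or purpose in self.secondary_uses

    def grant(self, purpose: str) -> None:
        # Positive consent is recorded explicitly--never assumed.
        self.secondary_uses.add(purpose)

consent = ConsentRecord(primary_purpose="order_fulfillment")
assert consent.may_use("order_fulfillment")        # allowed automatically
assert not consent.may_use("marketing_email")      # denied by default
consent.grant("marketing_email")                   # the customer opts in
assert consent.may_use("marketing_email")          # now, and only now, allowed
```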

It's amazing. People love it. Consumers, individuals, love this, and the companies that are doing it love it also, because they've told me that when they do go and seek consent for a secondary use, they always get it, because they've built this trusted business relationship with their customers. It becomes a win-win. So that's what I'd like to point people to. And the GDPR is, trust me, much more than just privacy by design. I have this quote, I made a slide of it, from Information Age a few years ago, which said that “if you follow privacy by design, you're compliant with the GDPR.” I assure you, you have to do much more. But that shows you the mentality associated with privacy by design: that it is the highest form of privacy protection, and that if you follow it, you have a good chance of being in compliance with the GDPR. So, I urge people to do this.

Laurel: How about the California Consumer Privacy Act (CCPA)? It's not GDPR, but is it a first step for a large American state to take on privacy and data concerns?

Ann: Absolutely.

Laurel: How are they incorporating privacy by design?

Ann: I'm not going to suggest that they're incorporating privacy by design, but I want to applaud California. As you said, they're the first, leading the way for other states to have very strong privacy regulation, which has been largely non-existent for the private sector in the United States, and they're setting the tone. There are some very strong privacy protective measures in it, and we're moving this forward. So, I really applaud California for taking the lead on this and other measures. For example, various cities in California--San Francisco, Oakland, San Diego, and a number of others--are outright banning facial recognition on the part of the police, law enforcement, city departments, etc. It's amazing, an outright ban, because they recognize all the inherent problems associated with it. They led the way in doing this, and now other states are following: Massachusetts is following, Texas is looking at it, and others. So, California has to be applauded for taking the lead on privacy.

Laurel: Well, now we're on the topic of facial recognition. What are the privacy implications behind it? I mean, I don't want to say it's obvious, but …

Ann: No. There are so many, and I'm going to lead with this point: your facial image is your most sensitive biometric. Biometric data is very sensitive above all else, but your facial image is the most sensitive, because once it's compromised, you can't get it back. When I was Privacy Commissioner, a number of victims of identity theft came to me seeking my assistance in restoring their identity. They said, “These companies are all claiming that I racked up all these charges, and it wasn't me. It was someone else who stole my identity.” It's a nightmare trying to clear your name once your identity has been compromised. And let me point you to the high rate of inaccuracy in facial recognition.

In the UK, the police use it regularly with their massive number of CCTV cameras. Two months ago, it was noted that there was an 81% false positive rate on the part of the UK police. An 81% false positive rate means that 81% of the matches through facial recognition were not only incorrect, they falsely identified law-abiding citizens as the bad guys, the persons of interest. Try clearing your name; like I said, it's a nightmare. And 81% is the most conservative estimate: two weeks ago, there was another figure, 96% inaccurate, false positives. I'm not using that one just because people won't believe me; 81% is real and solid. So, imagine trying to clear your name. Quite apart from the fact that they shouldn't be collecting this information anyway, without your consent or notice of some kind, which is non-existent, it's also highly inaccurate. It's ridiculous. So, I'm hoping we put some kind of ban on the use of facial recognition in terms of regulatory developments.
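
To see why figures like these arise, here is a short back-of-the-envelope calculation with hypothetical numbers (these are not the UK figures): when nearly everyone scanned is law-abiding, even a matcher with seemingly small error rates produces mostly false matches.

```python
# Illustrative base-rate arithmetic (hypothetical numbers).
crowd = 100_000        # faces scanned at an event
suspects = 10          # genuine persons of interest actually present
tpr = 0.90             # matcher correctly flags 90% of real suspects
fpr = 0.001            # and wrongly flags 0.1% of everyone else

true_hits = suspects * tpr                    # 9 correct matches
false_hits = (crowd - suspects) * fpr         # ~100 innocent people flagged
share_false = false_hits / (true_hits + false_hits)
print(f"{share_false:.0%} of matches are false positives")  # ~92%
```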

Laurel: Do you think it's because [facial recognition] is such a nascent technology that companies are getting ahead of themselves, in the sense of, well, "let's test it in the real world, because how else are you going to test it?" Which is kind of how Clearview AI went about it. So, it's sort of that fail-fast model of Silicon Valley.

Ann: It might be, but there's already a massive class action lawsuit against Clearview AI. And what I dislike so much about the position of Clearview AI's CEO is the claim that this information is out there and publicly available, [so] "we're just scraping it." It's not out there publicly available. Yes, it might be on a public forum, a social media forum like Facebook or something, posted by people who haven't thought through [how] to safeguard the information. But it was never intended for massive public use the way Clearview AI is [making] it available to law enforcement all around the world. It's as if all of us are now in the mugshots that the police can compare against. The great majority of those images, I'm betting well over 99%, are of law-abiding citizens, but now our facial images have been captured forevermore. It's completely unacceptable.

Laurel: And this also brings in the security aspect of securing this very sensitive biometric data. It's not just facial recognition, obviously; it's also fingerprints, right? Because people are using [them] more and more for opening devices. How do companies secure that data, and would you advocate for a second layer, a different layer of security, for biometric data?

Ann: Let me tell you about the strongest form of protection for biometric data. It's called biometric encryption. It was developed years ago by Dr. George Tomko, and it works this way. Your biometric, a fingerprint or facial image, is not what's captured in the database. What's captured is something else, a hundred-digit PIN, a strong password, whatever, that becomes encrypted through the use of your biometric, either your face or your finger. And that's what is retained: your biometrically encrypted information. The beauty of it is, if there is a massive cybersecurity hack and somehow they break into the biometrically encrypted data, what do they get? They don't get your biometric; they get your biometrically encrypted hundred-digit PIN, which is meaningless, or something else. Nonsense data. It's beautiful. It protects your biometric 100% but still uses it for the authentication or identification you're seeking. It's a win-win. So, I give you that as an example. There are ways to use the benefits of biometrics in privacy- and security-protective ways. That's what we have to develop and really explore.
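
For listeners curious how something like this can work, below is a toy sketch of a fuzzy-commitment-style scheme in the spirit of what's described: a random secret is bound to the biometric, and only an XOR offset plus a hash of the secret are stored, so a breach of the database reveals neither. This is illustrative only; it uses a toy repetition code where production systems use proper error-correcting codes and vetted cryptographic libraries, and it is not a rendering of Dr. Tomko's actual design.

```python
# Toy fuzzy-commitment sketch: bind a random secret to a (noisy) biometric.
import hashlib
import secrets as rng

REP = 5            # repetition factor: tolerate noise by majority vote
SECRET_BITS = 32   # length of the bound secret (stand-in for a long PIN/key)

def encode(bits):
    """Repetition-encode: each secret bit is stored REP times."""
    return [b for b in bits for _ in range(REP)]

def decode(bits):
    """Majority-vote each group of REP noisy bits back into one bit."""
    return [1 if sum(bits[i:i + REP]) > REP // 2 else 0
            for i in range(0, len(bits), REP)]

def enroll(biometric):
    """Store only the XOR offset and a hash--neither reveals the biometric."""
    secret = [rng.randbelow(2) for _ in range(SECRET_BITS)]
    offset = [b ^ c for b, c in zip(biometric, encode(secret))]
    return offset, hashlib.sha256(bytes(secret)).hexdigest()

def verify(biometric, offset, digest):
    """Recover the secret from a fresh reading and check it against the hash."""
    candidate = decode([b ^ o for b, o in zip(biometric, offset)])
    return hashlib.sha256(bytes(candidate)).hexdigest() == digest

# Demo: enroll, then verify with a slightly noisy re-reading of the same trait.
template = [rng.randbelow(2) for _ in range(SECRET_BITS * REP)]
offset, digest = enroll(template)
noisy = list(template)
noisy[3] ^= 1                                  # simulate one bit of sensor noise
assert verify(noisy, offset, digest)           # genuine user still authenticates
stranger = [rng.randbelow(2) for _ in range(SECRET_BITS * REP)]
assert not verify(stranger, offset, digest)    # imposter fails (w.h.p.)
```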

Laurel: So, security first, privacy first, ethics first, because we're looking at a pretty massive mind shift, aren't we?

Ann: We are. And I actually have a slide on this in my talks, because somehow people think ethics is separate from privacy and security. Nonsense. Ethics has always been a part of privacy. I take you back to 1980, when the Code of Fair Information Practices was developed, the principles that form the basis of virtually all the privacy laws around the world. And I actually looked up the meaning of ethics in the Oxford English Dictionary, because I wanted to see exactly how they define it. They define it in several ways: the fair treatment of data, etc., captured in a moral code, a code relating to information. So, then I point them to the Code of Fair Information Practices, which is all about privacy and data protection.

That's a long way of saying that ethics is absolutely critical, but it is an inherent component of privacy; to suggest otherwise is folly. I'm now preparing a new course that I'm going to be teaching in addition to privacy by design: a course on trust by design, which is all about ethics and trust. They're interwoven, and of course the fair treatment and use of data is critical in how it's applied, but those have always been central tenets of privacy and data protection as well. So, I'm going to take you back to the '80s and then bring you back to where we are.

Laurel: That's fantastic. And I think what we've learned today is that informational self-determination is one of the most important things you can advocate for, not just as a consumer, but as an employee at a company, and as a company itself, because obviously it affects everybody; you too are a consumer.

Ann: You've captured it perfectly. Truly, that is the essence of it, and I've always applauded the Germans for developing that term, informational self-determination. It nails it. You get to determine the fate of your personal information, and one of the reasons that's so important is that context is key. Only you know the context associated with your data, its sensitivity or lack thereof. I tell people, look, privacy's not a religion. You want to give away your information? Be my guest, as long as you make that determination and you've turned your mind to whether it's sensitive or not. Informational self-determination captures it perfectly.

Laurel: Ann, thank you so much for joining us today on the Business Lab. This has been a fantastic discussion.

Ann: It was my pleasure, Laurel. Thank you so much for inviting me to do this.

Laurel: That was Dr. Ann Cavoukian, the executive director of the Global Privacy and Security by Design Centre, who I spoke to from Cambridge, Massachusetts, the home of MIT and MIT Technology Review, overlooking the Charles River. Also, thank you to our partner, Microsoft Security.

That's it for this episode of Business Lab. I'm your host, Laurel Ruma. I'm the director of Insights, the custom publishing division of MIT Technology Review. We were founded in 1899 at the Massachusetts Institute of Technology, and you can find us in print, on the web, and at dozens of live events each year around the world. For more information about us and the show, please check out our website at technologyreview.com. The show is available wherever you get your podcasts. If you enjoyed this episode, we hope you'll take a moment to rate and review us. Business Lab is a production of MIT Technology Review. This episode was produced by Collective Next. Thanks for listening.

Security threats are everywhere. That's why Microsoft Security has over 3,500 cybercrime experts constantly monitoring for threats to help protect your business. More at microsoft.com/cybersecurity.