Carey Parker: Bruce Schneier is an internationally renowned security technologist and the author of 14 books, including the bestseller Click Here to Kill Everybody. He's written many articles and papers, has an influential blog called Schneier on Security, is a board member or board advisor for several organizations including the EFF, EPIC, Tor and Verified Voting, and he's even testified in front of Congress. The list goes on and on. I have been dying to interview you since day one on this show, and you've graciously agreed to come speak to our audience on our pod-centennial episode. Bruce, thanks so much for coming! Welcome to the show!
Bruce Schneier: Oh, it is nice to be here. Congratulations on 100 episodes!
Carey Parker: Thank you very much, and of course, what great timing. It just so happens, and I did not plan this... it just so happens that today is Data Privacy Day. International Day-
Bruce Schneier: Is it really?
Carey Parker: Well-
Bruce Schneier: Happy Data Privacy Day.
Carey Parker: January 28th.
Bruce Schneier: So, what, to celebrate, we should not go on Facebook or something?
Carey Parker: Yeah, we'll talk about that. I've got so many questions for you. For me, the crux of the matter always seems to come back to this, and I don't understand it, and maybe you can help me understand. Is it that people don't know what's going on, or do they just not care? Why are we so cavalier about sharing all this data in exchange for quote-unquote free services, and what will it take for people to get mad about this? How did we get here?
Bruce Schneier: So it's complicated, and your question really brings that out. It kind of makes no sense, yet it happens. A couple of things are going on. One, privacy is a right, and most rights you don't notice until they're gone. You don't really notice that you have freedom or privacy until suddenly you don't have it anymore.
So it's really hard to proactively take steps when you don't really understand what the lack of the thing means, and that makes it difficult. The second is that these services are designed to make people not think of these things. So when you go on Facebook, you don't think about privacy, you think about talking to your friends. When you pick up your cell phone, you don't think, I'm going to put my tracking device in my pocket today. You think, I need to make and receive calls and get my messages and email. So the things we want are very real and very visceral, and the thing it costs us to get them, the privacy, is very abstract. It's not salient. So it's not that... You ask people rationally, what do you value? People value their privacy, again and again and again.
Bruce Schneier: But when it comes time to the moment, what you want is to get your email, to talk to your friends, to navigate to your destination, to make that call. Those are all the things you want in the moment. It's not privacy. You see this in environmentalism and you see this in health. The things we want in the abstract are not the things we do moment to moment, because the things we want are hard and virtuous, and the things in the moment are, you know, extra-large fries, that'd be great. So what will it take? In a sense, it's almost not possible for us at the cash register to make these decisions properly, which is why, in all these things, in health, in public safety, in privacy and environmentalism, we tend to reach to our best selves, which is our collective selves and our policy selves, and not our immediate-gratification selves.
Bruce Schneier: If it were up to us individually to buy pajamas that didn't catch on fire and maybe burn a child later, that would never happen, because it's five dollars cheaper. At the store, five dollars is real money and everything else is more abstract.
But if we can come together and agree, like pass a law that constrains our immediate-gratification selves, then we do better. So that's a complex answer, but that's the kind of thing that's going on, and I don't know what it'll take. There have been so many major privacy breaches.
Carey Parker: I know.
Bruce Schneier: And every one, you say, this will be the one, right? Equifax.
Carey Parker: Oh yeah.
Bruce Schneier: OPM, every single government official. This'll be the one, this'll be the one, and it never is. Because in the end, those are all abstract risks, and an abstract risk paired against a real and immediate benefit is going to lose every time.
Carey Parker: So let's jump ahead. I was going to get to this eventually, but you've pretty much called it out. So by any other word, that's regulation, and as soon as you say that word, and I've done this myself on Twitter, I've said this word and someone said, you had me until you said "regulation," because it just turns off half the populace as soon as you say regulation. I don't know what it is about this country, but they believe that there's no role for government, and yet time and time again, as you explain in your books and your essays and your blog, you come back to the same point: the market cannot solve this. So explain to the audience why, and you've talked a little bit about this, why the market cannot solve this. Do we just need more transparency? Would that do it? Would that let people make smart decisions, so the market would just work? Or does it really come down to, we have to have regulation?
Bruce Schneier: Again, it's complicated, and those people who say no regulation do drive cars that don't crash, and do eat at restaurants without getting sick, and buy medicines without fear of being poisoned. In fact, people who say they want no regulation are just not paying attention. In fact, the only way to survive modern life is through regulation.
When I flew back from Europe a few days ago and got on the airplane, I was unable, and had no expertise, to verify that that plane was safe. I can't say to Delta Airlines, okay, before I get on, let me see the maintenance records of this aircraft, please. You laugh because that's ridiculous, and it is ridiculous. And yet I can get on that plane without even worrying about it, because I know there are regulations backing up the safety record, backing up the pilot training and pilot rest and all the things that I never even thought of.
Carey Parker: Yeah, right.
Bruce Schneier: It's not really about education, because you cannot become an expert in food safety, in pharmaceutical safety, in airline safety, in internet safety, in automobile safety, in building safety. You just can't do that. That's impossible.
Carey Parker: Right.
Bruce Schneier: So what we do as a society is we say, okay, look, I can't become all of those experts, so I'm going to delegate to an organization that will do this work for me, and that organization is a public organization, that organization is a government organization. That's what we do, and that's what works. Now, there are other issues with the fact that you have market failures. I mean, take Equifax. Let's assume that it was a great market, and Equifax lost all your data. Now go fire Equifax. Oh wait, you can't. You're not their customer.
Carey Parker: Right.
Bruce Schneier: You have no actual business relationship with them. So what could the market possibly do? The market rewards cheap information about you, sold to third parties without your knowledge or consent. That's what the market is. Now, does the landlord who buys the credit report from Equifax when you rent the apartment care about your privacy? Kind of not really. So you're going to have enormous market failures. Like Facebook. Facebook is a monopoly. You can choose not to be on Facebook, and some people do, but socially it can be very difficult. That's where things happen.
I'm not on Facebook, and I feel it.
Carey Parker: Yeah, right.
Bruce Schneier: Of course, Facebook goes out of its way to ensure that you're not really thinking about privacy when you're sharing. You're thinking about friends, you're thinking about telling them things, being closer to them, all the things that allow them to mine your data on the side, and there is no competitor. It's interesting that there really aren't viable competitors to these large services that don't spy on you, which shows you the market isn't working.
Carey Parker: Right.
Bruce Schneier: I can't choose the more private version of Facebook. There is none.
Carey Parker: Right.
Bruce Schneier: And if there were, and my friends aren't on it, it's kind of boring to be there.
Carey Parker: Right, yeah.
Bruce Schneier: So again, I'm sort of stuck. But this is not unusual. There exists no industry in the past century and a half that has improved its security and privacy, so-called safety, without being forced to by the government. Just go through the list: cars, planes, pharmaceuticals, medical devices, consumer goods, workplace safety, agricultural safety, most recently financial products. The market doesn't reward providing these safety and security features. The market rewards churning out the thing as cheaply as possible, foisting it on the public, and really making it so that they don't know the difference.
Carey Parker: Well, there are no consequences either. Look at Equifax. You thought for sure that maybe Congress or somebody would hold those guys accountable, and it just didn't happen. There seems to be no downside.
Bruce Schneier: There is no downside. That's because there's no regulation. If you look at Equifax, what did they learn from the disaster? Skimp on security, hope for the best, and if the worst happens, weather the press storm; you'll be fine. Facebook learned that after the election. There's no consequence, and because there's no consequence, there's no reason to do better.
But there has to be consequence, and regulations are effectively consequences. Regulations are how we're going to spur innovation here. The myth you'll hear from Silicon Valley, from everybody, is that regulations hamper innovation.
Carey Parker: Right.
Bruce Schneier: Which is fundamental bullshit. It just isn't true. What regulation does is raise the cost of not doing the thing. I have a lot of computer security technologies that I can bring to bear on this problem. The thing is that Equifax doesn't want to buy them, because they're expensive. If I want Equifax to buy them, it has to be more expensive for them not to, and that's what regulation does. Regulation puts a finger on the scales of the market and says, we collectively are going to make this group decision that privacy, security, whatever, is more important, and then everybody innovates. Suddenly there's a market for data privacy tools. Suddenly there's a market for another way of doing business than surveillance capitalism that might be more profitable.
Carey Parker: So we're talking about security, and this is the subject of your most recent book, Click Here to Kill Everybody, and I would-
Bruce Schneier: We should pause and marvel at the title.
Carey Parker: Yes. I know what it's like to try to come up with a title. That one, that is eye-catching. That one's going to bring you in.
Bruce Schneier: It's my first clickbait title. You know, the secret here is that the title just gets you to open the book, or read the blurb on the Amazon page, and then the blurb gets you to buy it. So the title has to be arresting. It's my first clickbait title, and I'm really proud of it.
Carey Parker: Well, unfortunately, it's really not that hyperbolic, because as you go through it-
Bruce Schneier: It's a little hyperbolic, let's be fair. But it is a real thing. I'm talking about how computers fail, and this notion of a class break. That, in fact, unlike... oh, what's a good example? Cars. We know how cars fail.
Cars have parts. Parts have mean times between failures, and every once in a while cars fail, and there's this entire ecosystem of auto repair shops in your community that handles the steady stream of broken cars. We know how that happens. Computers fail differently. They all work perfectly until one day when none of them do. That different way of failing is what I'm really alluding to, and the fact that computers now affect the world in a direct physical manner, that they're in cars and medical devices, means it's no longer about data, it's about people's lives. So the title really is evocative of what I'm talking about, even as it is a clickbait title.
Carey Parker: Well, and to your point earlier about cost and not spending the money if you don't have to. When you start talking about IoT, the Internet of Things, which is, for some reason, our compulsion now from a marketing perspective: take something that was "dumb," something that probably had a computer in it anyway but wasn't connected to the internet, and think, hey, what could we do if we connected that refrigerator or that light bulb or that whatever, toaster... they've done it with toasters... to the internet? What else could we do? If nothing else, it's a marketing term. But because the cost of those things is so, so important, and the margins are so small, security is an afterthought at best. Right?
Bruce Schneier: It's an afterthought. And we tend to laugh at the Internet of Things, but I think it has untold marvels.
Carey Parker: Oh, sure.
Bruce Schneier: When we connect our refrigerators and toys and toasters and cars and everything in cities to the internet, there will be these magical benefits that we can only imagine today. I don't want to minimize the benefits of all these things. Our phones, Facebook, the IoT, it's freaking amazing, and it will continue to be so.
But yes, there are these downsides, and yes, when you have these low-cost devices, there's really not a lot of impetus to put security in. The buyer doesn't care. The buyer can't make a decision based on it, so it's what gets left on the engineering floor. So again, without some regulation, you'll have very, very poor security in these devices, and they'll be around for years and decades, and they'll cause all sorts of problems, both in privacy and in safety.
Carey Parker: Well, and there are even indirect costs. Take the Mirai botnet. These devices, when insecure and connected to the public internet, are vulnerable, and hackers know this, and there are even search engines for these devices, like Shodan, that allow them to find and compromise these devices. And even though it's not affecting you directly, these things are being co-opted into this zombie army to have real effects on things you do care about, like Dyn's DNS service. So it's more than just direct.
Bruce Schneier: The Mirai botnet is a good example of a bunch of things we're talking about. So here it is: these were vulnerable, interconnected devices, particularly webcams and digital video recorders, and I think some routers. They had default and lousy passwords, and hackers were able to break into them and make them part of botnets that were used to do pretty serious damage on the internet. Okay, so a couple of things. Let's say your DVR at home is part of that botnet. You have no way of knowing. There's no light that goes on saying, bling, I'm on a botnet. It still works as a DVR.
Carey Parker: Right.
Bruce Schneier: And kind of, you don't care.
Carey Parker: Right.
Bruce Schneier: It still works as a DVR, so you have no economic incentive to replace it.
Carey Parker: Yeah, right.
Bruce Schneier: Right now, I mean, that was the Mirai botnet, that was fall of 2016. There are now hundreds of botnets using that same code.
So it is likely that your DVR is still part of one, or several, botnets because of that vulnerability. And you're going to keep that in your living room for another decade, having no idea, and then when you buy a replacement, you're not going to see a label that says "botnet free."
Carey Parker: Right, right.
Bruce Schneier: You could very well buy another insecure one. The market's not going to fix this. The DVR manufacturer, who's doing this at a very low profit margin, with an engineering team that's offshore and done, is not going to make this better, because they have no economic incentive to do anything except give you a box that works at as cheap a price as possible.
Carey Parker: Right. All right. So let's talk a little bit about data sharing, data oversharing maybe. And for me, there are at least two aspects: there's active data sharing, and there's what I'll call passive. So when we go on Facebook or Instagram or Twitter and we consciously share photos and political views, we even give our DNA away just to find out what our ancestry is, at least you can say that was a conscious thing we did. Maybe we don't fully understand the consequences. But then, to me, like the classic iceberg metaphor, there's this passive layer where we generate tons of data exhaust constantly and leave digital footprints everywhere. That to me is more of a passive thing, and I think that's probably the area where a lot of people get real fuzzy, or maybe don't even think about it. So help our audience understand what we are doing, just by existing in our digital world today, that we're really not conscious of.
Bruce Schneier: So this is actually nice, because it's that difference between data and metadata, which sort of came into public view in 2013 after the Snowden revelations. So basically, every time you interact with a computer, it produces data about that interaction.
So if you make a cell phone call, if you walk by a camera on the street, if you use an ATM, all of these things produce data about what happened. Now, that's been true since forever. Computers do that, but until recently, most of that data was thrown away, because data storage was expensive, data processing was expensive, data transfer was expensive. Now all of that is cheap, so all that data is just saved forever. You go and use a search engine, there's data, and you know Google saves that data. You use a credit card, there's data. You use a cell phone. There are two kinds of data. There's the data that you actually produce knowingly. So we are talking on Skype, so the data is the conversation between us that we are willingly producing. If I go to a search engine, it's the thing I type in and the results I get back, or the email I type, the Facebook post I type. Then there's metadata, which is really data about data, the data the system needs to operate.
Bruce Schneier: So in order for this Skype call to happen, the metadata is my ID, your ID, information about your computer, my computer, the date, the time, IP addresses. When I make a phone call, it's my cell phone, your cell phone, our location data, which cell we're in, the duration of the call. You can think about that for all sorts of things. That's the data that we inadvertently give away, and when I walk around with my cell phone, data about where I am is generated constantly. It has to be, otherwise the phone can't ring.
Carey Parker: Right, yeah.
Bruce Schneier: I was in Munich a few days ago. I'm sorry, not Munich, I was in Warsaw. And if the phone system didn't know that I was in Warsaw, people couldn't call me. The fact that the phone works means the system knows where I am.
Carey Parker: Right, yeah.
Bruce Schneier: That's metadata, and that makes the phone a perfect tracking device.
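[Editor's note: the data/metadata split Schneier describes can be made concrete with a small sketch. The field names below are hypothetical, not any real Skype or phone-company schema; the point is that even if the content of a call is encrypted or discarded, the metadata alone, who talked to whom, when, from which cell, and for how long, still tells a story.]

```python
from dataclasses import dataclass, asdict

@dataclass
class CallRecord:
    # Data: the content the parties knowingly produce.
    audio: bytes
    # Metadata: what the system needs just to connect the call.
    caller_id: str
    callee_id: str
    caller_cell_tower: str   # coarse location -- why the phone can ring
    callee_cell_tower: str
    start_time: str
    duration_seconds: int

call = CallRecord(
    audio=b"<encrypted or discarded>",
    caller_id="+1-555-0100",
    callee_id="+48-22-555-0199",
    caller_cell_tower="BOS-114",
    callee_cell_tower="WAW-031",
    start_time="2019-01-25T09:30:00Z",
    duration_seconds=540,
)

# Strip the content entirely; the remaining metadata still reveals
# who, when, where, and for how long.
metadata = {k: v for k, v in asdict(call).items() if k != "audio"}
print(metadata)
```

A carrier that "only keeps metadata" keeps everything in that second dictionary, which is exactly what makes the phone a tracking device.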
So when you type something into Google or Facebook or any other website, you have a typing pattern: how fast you type, how fast you move between the letters. That's metadata, and it could be used for authentication, and it can be saved and collected. And really, we're at the point where all of this data and metadata can be collected and saved forever, because the cost of data storage, data transport and data processing has dropped so far that it's basically free. So all of this data, and especially the metadata, that companies used to throw away, now they save, because it's easier to save it all than to figure out what you should save.
Carey Parker: So that leads to my next question. This is an angle you've brought up that I haven't seen often referred to elsewhere. Maybe I'm just reading the wrong things, but: the power imbalances that are generated by this data collection, not only by corporations but by governments, and how that affects your life in ways that people probably don't understand until maybe they run afoul of it. So talk a little bit to me about how the power imbalances grow with this, and why that's such an important aspect of this.
Bruce Schneier: Well, data is effectively power. If you have data about someone, you have some power over them. It might be minimal, it might not be useful, but that's kind of what's going on. The reason companies like Facebook and Google are so powerful is because they have so much data and they can do things with it. So it's not the same for you to have data about a Facebook employee as for them to have data about you. Who they are and their position make them a more powerful agent. It's also true for governments: if the police have data about you, they can use it in certain ways. I mean, if you get stopped by the police and they want to see your ID, and you say, well, show me yours first, that doesn't equalize it. You laugh because what I said was absurd.
It doesn't make the interaction equal. I showed you my ID, now you show me yours? No, because it's the police; they can do more with it. So data is fundamentally tied up with power. You'll see this in privacy laws and open government laws. Sometimes you will see privacy being used as a way for the powerful to protect themselves, like why you can't film the police.
Carey Parker: Right, yeah, I was going to bring that up.
Bruce Schneier: Right, right. But that makes no sense. The police are in a position of power. Think of the power imbalance: the police high, the individual low. Privacy increases your power, openness reduces it. This is very visual, so stick with me. Okay, the police are high and the individual is low, so naturally the police have more power. If we, the individuals, get privacy, it raises our power, which lessens the power differential between the police and the individual, and that's good for liberty. If we're under surveillance, that lowers the individual's power, which increases the differential between the police and the individual. That's bad for liberty. Now let's go to the police on top. Privacy for the police increases their power, which increases the differential, and that's bad for liberty. Open government laws that reduce the privacy of the police reduce their power, which reduces the differential, and that's good for liberty. That's why you can be pro-privacy for individuals and pro-forced-openness in government: they're both good for liberty.
Carey Parker: Well, that's a classic thing too, the trade-off people always cite: that you must give up your privacy in order to have security. Can you debunk that one for us? Because that is commonly thrown about.
Bruce Schneier: Yes, that one is easy: a tall fence. There's lots of security. A wall, a barking dog. There are lots of security measures that have nothing to do with privacy. When someone says you must give up privacy for security, you look at them and say, door lock.
A door lock gives you security, and you do not give up privacy at all. As a matter of fact, very often when you give up privacy, it's bad for security. You know, mass surveillance. Nobody felt more secure living in former East Germany. You're all under surveillance, don't you feel secure? No. That's why they didn't like it. Now, this is a little facile, and of course there are some measures of security that do reduce privacy. We do want the police to be able to invade our personal space in order to solve crimes. That's a good thing, but we recognize that that is an awesome power, and we have controls. We have a warrant process. We have various due processes that limit the police's ability to intrude into our private lives while giving them that capability. We willingly give it to them, with controls, because we recognize that at some level the police need the ability to invade our privacy for our security. But we have additional controls on them.
Carey Parker: So let's come back down to earth, to some more practical stuff.
Bruce Schneier: That's pretty practical. I mean, the warrant process is a really practical thing. Don't dis it so quickly.
Carey Parker: So I often hear, and often find myself saying as well, that one of the key problems is that we don't pay money for these services. If the product is free, then you are the product. Is that really the problem? Obviously, look at Verizon and your ISPs: we pay them a lot of money, and yet they still track us.
Bruce Schneier: I think that's just part of the problem. It's certainly true that companies give away their services in exchange for spying on us. That's just part of it. In the beginning, it was these companies delivering free services that adopted a surveillance model to basically get revenue, because there was no way to charge for the thing they were giving away. But this is surveillance capitalism.
Shoshana Zuboff just published a book with that title, a phenomenal book, and really everybody is now doing it. So you go to Amazon. They don't give stuff away, they sell stuff, that's their business, and they are spying on you. You mentioned Verizon. All companies are trying to get into the surveillance game, because right now it is so lucrative, and that's surveillance capitalism, really, at all levels. You want to buy a car, and they want to spy on you. You want to buy a coffee machine, they want to spy on you.
Carey Parker: So these companies often come back to a couple of things. First of all, they'll say, yeah, sure, we collect data, but you know, it's all anonymized. It's aggregated.
Bruce Schneier: But it's not. They actually don't say that anymore. It turned out to be a lie, so they don't say it. It's not anonymized. It's not aggregated. Verizon sells your location with your name on it. Stores sell your purchasing history. Facebook doesn't sell your data, and they don't give it away, but they sell access to your data, to you, not some anonymous person, to you. Surveillance capitalism is about the individual. It's not about the anonymous, it's not about the aggregate, it's about the individual.
Carey Parker: Well, of course, the other thing they say is, well, you clicked the service agreement. You agreed to everything that we're doing. And I found this thing the other day which is just wonderful. An artist named Dima Yarovinsky had a project called I Agree. He took the largest social media companies, printed out all of their terms of service in a regular font, and ran them end to end, so you can actually, visually, see how long these things are. The winner, I think, was Instagram, at like 17,000 words, which he estimated would take an average individual 86 minutes to read. So, well, look, you clicked, you agreed. If you didn't read it, that's not my fault.
Bruce Schneier: But you just said that that's not true. Right? And it's not.
We talked about this in the beginning. Why isn't notice enough? Because it's coerced, because it's not real. Doc Searls calls this the biggest lie on the internet: every time you click "I read this and I agree." You do it, I do it, every listener does it. You click it even though it's not true. You lie all the time.
Carey Parker: Yeah.
Bruce Schneier: And yes, legally it means you read it, but it isn't true, and this again gets back to government regulation. The reason you click and don't worry about it is because, in the back of your mind, you think the government took care of it. Just like when I walked onto that airplane. In the back of my mind, if I thought about it, you know, someone else is handling airplane safety. Someone else read that agreement, and it's not onerous, and I didn't sell my soul. Even though that's not true, it is what we think. We believe the government is actually looking out for us, so we all click those agreements without reading them. We lie and we say we agreed. But that's disappearing. Even that fiction is disappearing.
Bruce Schneier: Because for that agreement to work, you have to visit the website. You have to go do the thing: open up the software, visit the website, get the agreement, click. As computers become more ambient in our lives, there's no point of interaction where we could agree. Like when you walk into a store and there are 17 surveillance cameras and your data's been collected, there's no way for you to agree. When I walk into a friend's house and they have an Amazon Alexa, there's no way for me to agree to it listening and interpreting and doing whatever it does. That agreeing implies a screen we visit, and that's disappearing, so even notice and consent, as laughable as it is, is not surviving.
Carey Parker: So all this comes back to, and you've talked about this as well, the fact that we blame the victim.
All these companies say, oh, if they would've done this, if they'd changed those privacy settings, those arcane privacy settings, if they'd just found that one button we gave them, hidden four levels deep and given a weird title like "customize experience," which really means "please track me." We're blaming the victim. The other part of that is not just the dark patterns of the companies trying to collect this data, but the sheer complexity of it all: security and privacy can be hard. I mean, even if we wanted to do the right thing, often it's difficult. We've talked about regulation, and I agree. But do you envision some sort of technical solution to this that might help?
Bruce Schneier: There are lots, but again, the market doesn't reward them. Facebook deliberately makes it hard, deliberately makes it obscure, deliberately makes it so you don't want to take the time. When you go to the site and you get a dialog box that says, our privacy policy has changed, click here if you want to read the new policy-
Carey Parker: Right.
Bruce Schneier: What you see is: I'm a box. I'm in your way. Click here and I will disappear, and you can talk to your friends. And you go do that. So it's not that it's hard, it's that it's deliberately designed to be hard by the companies, because they want you to accept the defaults, which is to be tracked all the time. And again, this speaks to regulation. If regulation forced the companies to use plain language, and forced them to give actual options that made sense to the users, and made the defaults more private, then we would have more privacy. So yes, there are lots of technologies that can change this: technologies of anonymization, which you mentioned; technologies of deleting, of not collecting, of storing data more securely, of using it more securely. But the companies will never use them, because they cost money and there's no benefit to using them.
Unless the cost of not using them rises, we'll never see that. We don't have a tech problem here, we have a market problem.
Carey Parker: So, if regulation is the ultimate solution to this, and the only thing that's really going to cause any real change in this market, do you have, and you've testified in front of Congress, (a) confidence that the representatives we've elected to cover all sorts of things, not just technology, understand technology well enough to implement proper solutions, and (b), given, and this is probably a whole other podcast, the campaign finance situation, how do we expect them to really act in our interest and not the interest of the people funding their campaigns?
Bruce Schneier: So I'm going to ignore the second question, because you're right, that's its own podcast. And that's a bigger problem than security. That's a problem with special interests and money in politics. But let's talk about privacy. So, no, I don't expect the federal government to do anything. I don't expect them to do anything at all. What I'm really looking for here are other groups that can take the lead. Europe. Europe is the regulatory superpower on the planet. Two years ago, they passed a comprehensive privacy law, GDPR, the General Data Protection Regulation. It came into effect last year. We're going to start seeing prosecutions this year. That is a big deal. And in the United States, some states are doing things: California, New York, Massachusetts in particular. California passed a data privacy law last year. They passed an Internet of Things security law last year. New York is regulating cryptocurrencies. And the neat thing about this is that software is write once, sell everywhere, so we all benefit.
Bruce Schneier: Take a car, for example. The car that you buy in the United States is not the same car that you buy in Mexico. The environmental laws are different, and the manufacturers tune the engines to the local laws.
But the Facebook you get in one place is the same Facebook you get everywhere else, because it's easier. So California just passed an IoT law that doesn't take effect until 2020. One of the provisions is no more default passwords. So imagine a company that makes an internet-connected toy or a DVR or something says, okay, we need to take out our default password to comply with this California law. They're not going to make two versions of their DVR, one for California and one for the rest of the country, because that's stupid.

Carey Parker: Right.

Bruce Schneier: It's easier and cheaper to sell one version everywhere. So a good security or privacy law in a large enough market affects everyone. Lots of companies have decided to implement GDPR everywhere because it's easier than figuring out who's European and who isn't.

Carey Parker: Right, yeah. The internet's global.

Bruce Schneier: So we are benefiting from a European law. So that's what I think is going to happen.

Carey Parker: Interesting.

Bruce Schneier: We're not going to see the federal government do anything, for a whole bunch of reasons that are too complicated. But other markets are taking over, and we're going to benefit from that.

Carey Parker: All right. One more policy kind of question, then I've got to get in some practical advice before we let you go. So, at odds with all of that, it seems to me ... or maybe it's a different side of this argument ... is the Australian law that just got passed, the Assistance and Access Act, and of course the UK's Investigatory Powers Act that came two years prior to that. And of course our FBI is talking about going dark. So is this really ... Is the difference here that we want to control corporations and in one way regulate them, but we want to deregulate or unfetter law enforcement?

Bruce Schneier: It's more complicated than that.
It's that we want defense to win, we want security to be what's most important, and these laws in Australia and the UK, what the FBI wants, would weaken security. The FBI says, we need to have a less secure iPhone, a less secure WhatsApp chat, because we want to eavesdrop on what these devices and services are doing. That's what they want. The problem is that these devices and services are critical infrastructure. Every U.S. legislator has an iPhone. Our nuclear power plant operators, our CEOs of industry, all of these people use these devices and services, and they need to be secure. Yes, there is a security benefit in being able to eavesdrop, but there's an even greater security benefit in not being able to eavesdrop. Because once I build an eavesdropping capability, I can't decide who gets to use it.

Carey Parker: Right.

Bruce Schneier: It's a subtle trade-off, and as computers get more critical, as they go into cars, into IoT appliances and thermostats and medical devices, this becomes much more paramount, to the point that we cannot put back doors in devices for the benefit of law enforcement, because it makes us all much less secure.

Carey Parker: All right. So I always like to give my audience practical advice, things they can actually go and do. So I'd love to hear: what do you do on a personal level? What do you recommend for your friends and family in terms of just basic seat belts, smoke alarms, sunscreen kind of level? What do you tell them to do to guard their data privacy? What do you do?

Bruce Schneier: The sad thing is, there's not a lot you can do, because the data's not in your hands. Your data is at Facebook, at Google, at Equifax. So in the wake of the Equifax breach, what's my advice? Don't have a credit card. Don't have a credit rating. Never buy anything ever from anybody. It's stupid advice. Don't have a cell phone. Don't have an email address. This is stupid advice.
You can't live in the 21st century without these things. There are things you can do around the edges, and none of it's new. Have good anti-virus, be careful about attachments. But in the end, your data is not controlled by you. This is not a problem that you can solve, other than by not engaging, which is not really an option. If you really care about this, you must get involved at the policy level. Make this an issue. This was not an issue in the last election, or any election before it. It needs to be.

Carey Parker: For sure. So, other than, you know, the standard call-your-representative, or maybe donate some money, is there anything else on a personal level? I guess we can hold these guys' feet to the fire at the town halls, ask them these kinds of questions and ...

Bruce Schneier: Yeah, it's politics, and politics is complicated, and politics is pretty dysfunctional right now. But it really is where we have the power, and until this becomes an issue, it's not going to get solved. My fear is that it's going to become a public safety issue. That in fact it will take a "click here to kill everybody" moment, where someone crashes all the cars ... or I guess, more realistically, all the cars of one make and one model year ... for Congress to say, my God, something must be done. But that's the kind of environment we're in.

Carey Parker: I worry that the next thing's going to be another terrorist attack, and the pendulum's going to swing the other way.

Bruce Schneier: Yeah, I worry about that too. It won't be cyber terrorism. That's largely a media myth. But yes, if there is another terrorist attack with a reasonable death toll, and a demographic of attacker that we can demonize ... so it can't be a right-wing terrorist, it has to be a Muslim terrorist ... then yes, I think we'll lose a lot of our freedoms, and privacy will be one of them.

Carey Parker: Well, I can't let this end on a downer note, so give us some hope. Give me something to smile about.
What are the options here that we can really look forward to? How's this going to come out?

Bruce Schneier: So I think we're actually going to solve this. I don't see the end of our society here. I think we will figure it out, and we will emerge from this with more security and more privacy. I think it'll take a lot of ups and downs, and the near term is going to be rocky, but long term we'll figure this out. My belief is that surveillance capitalism will one day be an illegal business model, and we'll look back at it the way we look back at child labor. What do you mean, you took small children and put them up chimneys to clean them? That's immoral. We'll look at this sort of thing in much the same way. And we will figure out better business models, and what market economies look like in the information age, in ways that are moral. And yeah, I think we'll figure this out.

Carey Parker: Well, I certainly hope you're right. Thank you so much again for coming on and talking to the audience. It's been wonderful having you on the show, Bruce. Good luck with all you're doing, and keep doing it, because you're making a big difference out there.

Bruce Schneier: Well, thank you very much.