Coratti: Good morning, everyone. My name is Kris Coratti. I’m vice president of communications and events at The Washington Post. Thank you for joining us this morning for our latest cybersecurity program. From the Equifax data breach to Russian interference in the 2016 presidential election, cyberattacks are not just a threat to our national security but impact our everyday lives. So this morning, you’re going to hear from a variety of experts on the matter, but before we begin, I’d like to thank our presenting sponsor for this event, Symantec. You’ll hear from them a little later in the program, and our supporting sponsor, the Center for Identity at the University of Texas at Austin. So now, I would like to welcome to the stage the Department of Homeland Security assistant secretary, Jeanette Manfra, with The Washington Post’s Amber Phillips.
Bridging The Gap: Cyber Cooperation Across Sectors
Phillips: Good morning. Thank y’all so much for being here. We appreciate it and assistant secretary, thank you especially.
Manfra: Thank you for having me.
Phillips: I know everyone had a fun time getting through Washington traffic this morning. Anyways, just to introduce the assistant secretary a little bit more, she is the assistant secretary for the Office of Cybersecurity and Communications at the Department of Homeland Security. It’s a mouthful. You have a really important job we’ll get to in a minute. If any of y’all have a question, feel free to tweet at #WPCyber, especially those of y’all watching online, and I’ll get it here and try to ask it. Thank you again for being here. It feels like to me, and I’m a layperson in all of this, that every few weeks there’s a massive data breach that we hear about, whether it’s in the private sector or even in government. And yet, as one cybersecurity analyst pointed out to me, he said, “I don’t really see people protesting in the streets about the Equifax breach, which affected almost half the country.” Have Americans not realized that our data is not safe online? Has government? Where do we need to go to recognize the threat that clearly exists?
Manfra: I would say that some Americans have come to the conclusion that they need to adopt more safe and secure practices, both within their own capabilities and their own computers at home, and also in how they practice safe behavior online for themselves and, increasingly, for their children. We do spend a lot of time trying to raise awareness with the public, to help them understand: these are the sorts of activities that we’d like you to undertake to be safe within your home computing environment as well as your engagement online. But the government has had a long history now of being engaged on the issue of cybersecurity; well over a decade of policy and operational expertise thinking about what the role of government is in cybersecurity. And we’ve developed over time a set of principles, the first one being that a public-private partnership is critical to us being successful here. And the second one is around us needing to be a leader in building best practices, both within the government that we adopt ourselves, but also championing those practices throughout the United States and globally. One of those is the cybersecurity framework that was developed a couple of years ago, based off of a lot of engagement and collaboration with industry. We came to this mutual understanding that we didn’t have a common way of talking about cybersecurity.
We didn’t have a common lexicon, and we really needed to boil that down for executives, for people that maybe weren’t as engaged in cybersecurity, as we talked with industry and with government agencies about how they think about cyber and how they manage cyber as a risk to their enterprise and to their organization. And that framework really helped distill it: “These are the things you need to do. You need to think about cyber risk as part of the risk to your business or to your government agency, and here’s the way that you think about doing that. Here are the standards and the best practices you can use to think about everything from identifying those assets that are most critical to the functioning of your business or your organization to thinking about how we respond and recover.”
And where we are now is evolving, continuing to ensure that consumers and organizations and critical infrastructure are adopting these practices and thinking about risk. But what we’re trying to think about at DHS is a broader national effort to address national risk and systemic risk, to think about those services and functions that are critical to our way of life and how they could be disrupted through cyber means. So we’re taking those same concepts but applying them to a broader national perspective, and thinking about, “What are the programs, what are the capabilities that the government has to understand that, in partnership with industry?” But then, how can we best prevent that from happening? And importantly, if a network is breached or one of those critical services and functions is being targeted, how can we mitigate the potential consequences before they become catastrophic? It’s like traditional disaster response: you want to get in there early and mitigate those consequences before they become something more catastrophic in nature.
And so that’s really our priority in thinking about how we’re going forward.
Phillips: What I heard you say is there’s like a cultural re-education that cybersecurity is a threat, both among consumers and with large and small businesses. You’re trying to develop a new lexicon so you can talk to CEOs who might not understand this, and you’re now at a point where you’re kind of stepping out to recognize that cybersecurity is an issue with our election systems, our electrical grid, and our financial services sector, and asking how DHS and the government can be involved in trying to protect all that. Is that what I heard you say?
Manfra: Mm-hmm. Absolutely.
Phillips: How’s that going? [LAUGHS]
Manfra: I think we have done a tremendous amount of work, and when I say we, I mean not just the government but industry across the United States, in thinking about this question very hard and deepening our mutual understanding between industry and the government about, frankly, how data moves within their industry and how organizations work. What is the value chain to deliver that critical service and function, whether you’re talking about the ability to clear transactions and maintain the integrity of the financial system, or ensuring that citizens have access to clean water? So we’re thinking about this across our country, and about how a lot of industries are becoming more and more digitally dependent. And also understanding, and doing some myth busting around, parts of our critical infrastructure that don’t have that risk because they are not yet so digitally dependent.
But we have a different conversation with them, because for a lot of these institutions and entities across the United States, when you think about their cyber systems, if you will, it’s not the computer that you’re using on your desk. These are very complicated control systems, things that are running factories, our water plants, our electric grid. And they buy these systems once every 20 or 30 years. So while they might not be quite as dependent as, say, the financial sector is on having that faith and trust in the data and an ability to move quickly, they’re modernizing their systems. And we want to talk to them about how they’re thinking about security and resilience as they buy and build those systems. Then there’s also a workforce component to it: thinking about a workforce that is able to treat cyber as a part of their everyday job. It’s not just for the IT or the operational technology specialists. As our society increasingly becomes a more digital society, having people that understand how to manage this as a part of their work, as a part of their home, and as a part of how they manage risk to their enterprise is increasingly important. But again, we’re trying to stop those potentially regional or nationally catastrophic events through cyber means.
Phillips: This all strikes me as a massive job for DHS, which I know is leading the federal government’s efforts on all this. Do you feel like you guys have enough resources? Are there any bills pending in Congress that you would support? I think there’s one that would establish cybersecurity as part of the department’s critical infrastructure program, or establish an entirely new dedicated cyber agency.
Manfra: Yes, we’ve been working with Congress for years on legislation, and Congress has actually been able to pass a lot of legislation that has been tremendously useful to us. We have authorities, and some of them actually stem from authorities that we were given for our counterterrorism mission and for partnering with critical infrastructure, that we’re now able to leverage for our cybersecurity partnerships. So we’re able to have protected conversations with industry around security topics; they’re able to report information to us that we can apply protections to, so that we can build that trust with them, and we have years of experience using these authorities. Also, now, with the Cybersecurity Act of 2015, industry is given liability protections for sharing information with DHS as a further incentive. And one of the bills we’re working on, which we’re very hopeful Chairman McCaul will be able to get through, would establish a cybersecurity and infrastructure protection agency within the department. A lot of what it does is take existing authorities and authorize them under one new piece of legislation. Really importantly, it changes our name.
We’re currently the “National Protection and Programs Directorate”, which doesn’t exactly roll off the tongue.
Phillips: It doesn’t tell me what you’re about, either. [LAUGHTER]
Manfra: It doesn’t tell anybody what we’re about, and so I think it’s incredibly important that we have a name that reflects our mission, both to our stakeholders, not just domestically but globally, and to a workforce that aligns to that mission. And so we’re very excited to continue to assist in any way possible to get this legislation passed.
Phillips: [LAUGHS] “Hi, Congress. Please pass this.” [LAUGHTER] Let me step outside our borders. A lot of the cybersecurity threats come from nation-states or actors within nation-states that we’re not necessarily friends with. Can I get you to rank the countries you’re most concerned about in terms of cyber threats? Russia, China, North Korea, Iran. Someone I didn’t list.
Manfra: No, I think you’ve captured it. [LAUGHS]
Phillips: Any ones you’re most concerned about?
Manfra: I think state actors and non-state actors have different capabilities and different intents, and those are always evolving. Leveraging cyberspace has increasingly become a tool of other mission areas: how you leverage cyberspace for your intelligence and your national security needs, how you leverage it for trade, how you leverage it for public safety, for foreign policy. Governments are continuing to do the things that we’ve always done, but now we have a new mechanism to do them. And I think you also have increasing sophistication among criminal actors that are finding new and interesting ways to monetize information, so we have increasing concern about sophisticated criminal organizations. Frankly, though, a lot of what you see is still low-level, basic criminal activity, and a lot of what we spend time doing is figuring out how we can make that activity harder. We obviously spend a lot of time focused on advanced persistent threats and these very sophisticated actors because, again, we’re concerned about the ability to disrupt critical services and functions and the ability to take government information, of which we hold a great deal.
But we also want to make it harder across the board. So one of the programs we have is our Automated Indicator Sharing program, which tries to get as many people and organizations as possible sharing information. The idea is to move beyond the notion that an indicator, which is just a technical artifact, an IP address, for example, is something to compete on or protect, whether from a government perspective or a private sector perspective. It should become a commodity that can be shared broadly and widely. We have hundreds of organizations now signed up for this, and as much as we can automate that sharing, when one organization is a victim, or even a potential victim, and was able to stop it, they put that into the program. You get this neighborhood-awareness kind of effect where everybody benefits: everybody can put that indicator into their systems and block it, and that makes it harder, because right now a lot of criminals don’t have to be very sophisticated. They can take advantage of the same things over and over again.
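The “neighborhood awareness” effect Manfra describes can be sketched in a few lines of code: one organization publishes an indicator, and every subscriber folds it into a local blocklist. This is a toy illustration only; the `IndicatorFeed` and `Subscriber` names are invented, and the real AIS program exchanges indicators in the STIX format over TAXII rather than the ad hoc JSON used here.

```python
# Toy model of automated indicator sharing: one victim reports an
# indicator, and every subscriber blocks it ("neighborhood awareness").
import json
import ipaddress


class IndicatorFeed:
    """A shared feed; in practice this would be a TAXII server, not a list."""

    def __init__(self):
        self.indicators = []

    def publish(self, indicator_type, value, reporter):
        # Validate IP indicators before sharing them onward.
        if indicator_type == "ipv4":
            ipaddress.IPv4Address(value)  # raises ValueError if malformed
        self.indicators.append(
            {"type": indicator_type, "value": value, "reporter": reporter}
        )

    def export(self):
        return json.dumps(self.indicators)


class Subscriber:
    """Each participating organization consumes the feed and blocks what it sees."""

    def __init__(self):
        self.blocklist = set()

    def ingest(self, feed_json):
        for ind in json.loads(feed_json):
            if ind["type"] == "ipv4":
                self.blocklist.add(ind["value"])

    def is_blocked(self, ip):
        return ip in self.blocklist


# Organization A is targeted once; organization B benefits automatically.
feed = IndicatorFeed()
feed.publish("ipv4", "203.0.113.7", reporter="org-A")

org_b = Subscriber()
org_b.ingest(feed.export())
print(org_b.is_blocked("203.0.113.7"))  # True
```

The point of automating this, as Manfra notes, is that an attacker reusing the same infrastructure against a second organization hits a blocklist the first victim already populated.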
And so that’s a lot of what we’re trying to do as well. But to your first question, I would say we continue to be concerned about those states. All of those capabilities are always evolving, and the US government also needs to consider how we are leveraging cyberspace to accomplish our own missions.
Phillips: I think the average American suddenly started talking about cybersecurity after the 2016 election, when you guys announced that Russia tried to get into at least 20 states’ election systems in various ways. Is that a concern for you guys? We just had another election, and I know DHS was trying to make sure all of these election systems were safe. Where are we on that? Can I vote and not have my vote hacked or my voter registration hacked?
Manfra: Yes, please vote. [LAUGHTER] Everybody should vote. The election security issue is complicated, and there’s a lot of nuance. I’ve learned a great deal over the past couple of years in engaging with the election community. We talked a lot about the voting machine itself. There have been people who have talked about how they’ve been able to hack into those voting machines, but they still largely need physical access, and the states and localities are very circumspect in how they treat those machines. They’re locked in warehouses. They’re transported securely. And while they may be digital, not a lot of them are actually online; those are different things that the average person doesn’t always appreciate. We also learned a lot about existing transparency mechanisms in our election process that would alert us to anything untoward happening, right?
So you have observers observing the election and the counting at every single stage of the process, and so we really had a lot of faith that people would notice if something were wrong. So while it has been demonstrated that there is some technical capability to manipulate the vote count, to do that unobserved, and to do it at scale, seems to us very challenging. But we are engaged with the vendors who make voting machines. The Election Assistance Commission, an organization that was stood up under the Help America Vote Act in 2002, has actually done great work with NIST, the National Institute of Standards and Technology, on standards for voting machines. A lot of vendors have voluntarily submitted their machines for compliance testing against these standards over the years. And so what we’re doing now is working with the Election Assistance Commission and with NIST, with DHS now taking a role on the security standards, to help that community and to ensure that we have secure machines moving forward.
The other side of it is the broader election system itself. That’s everything from voter registration rolls, which, by the way, are usually either public or available for sale, so the concern was less about the data itself and more about the ability to cause confusion. We’ve spent a lot of time engaging with the election community to ensure that we understand how their systems work and offering any assistance that we can. We’ve built a lot of great partnerships over the past several months. Just last month, we stood up the formal body where election officials will engage with us on a regular basis, and we had a lot of people engaged with states like Virginia yesterday. Just as we’ve had to do with all of the other sectors, you have to build trust. You have to learn. The government has to learn how states and localities run these systems. We have no intention of telling anybody what to do, but rather of learning and then finding where we can provide them value, which is largely in helping them understand the threat, helping them understand best practices that they could be adopting, and helping create mechanisms where we can share information, whether classified or unclassified, with those communities.
Phillips: And you make a really good point that I just want to drive home. There have been lots of investigations here in the US, and there’s absolutely no indication that Russia’s attempts to tap into election systems changed any vote whatsoever. It’s hard to do. That’s also a big job. [LAUGHS] And we’re at a time when the president has kind of brushed off a lot of this Russia stuff as a hoax, broadly speaking. Do you feel like you’re getting the support you need from the White House to make securing our election systems, and then cybersecurity more broadly, a priority?
Manfra: Absolutely. One of the first executive orders that the president issued was on cybersecurity, and it builds on years of capabilities and policies that we’ve worked on with industry. It really evolves us toward thinking about enterprise risk in the government. Previously, and rightly so, the way the legal framework works is that each agency has, like a CEO, a head who is responsible for their cybersecurity, while DHS provides services, information, and capabilities that help agencies think about their risk. But where we want to get to is: where is there risk that we’re not seeing because we have an agency-specific lens? How are we thinking about the government as an enterprise? Just as we’ve talked about with critical infrastructure, how do we think about risk across the federal government, and how does DHS deliver more cost-effective and more consistent, standardized capabilities and tools? We’ve had tremendous support from the White House, and it’s gratifying for me, as somebody who has worked in this area for a long time, to see leadership across government and industry taking this as a core, fundamental competency that we need to have. We have a lot of support for the work we’re doing, the capabilities we have built, and the capabilities we want to build.
Phillips: And that executive order the president signed in May was praised by Obama officials, even Bush officials, and of course, this administration. Let me pivot back to the private sector. I’ve heard a common thread from you that information sharing is so important. Talk a little bit more about that. Do you mean information sharing with all of the data that DHS has that you’re sharing with these big companies, like Target and Home Depot that could be hacked? Do you mean sharing information with me to make sure I know how to keep my credit card safe? What does that mean?
Manfra: I think it means all of those things, actually. When we think about information sharing, it’s not just one thing, right? The goal is to help somebody make a risk decision, or to stop something from happening. Where we see somebody or an entity being targeted, or in the midst of an incident, we want to share information quickly so we can figure out what’s going on and resolve it quickly. And it’s not just about what DHS has, although with our authorities and the capabilities we’ve built, DHS is now in a unique position where we’re sitting at the hub of a lot of different pieces of information: we’re getting information from the intelligence community, from law enforcement, from industry, and, frankly, increasingly from overseas. One of the things that we’ve done over the years is build capacity for other countries to have capabilities similar to ours, so they can have a computer security incident response team at a national level that can engage with industry.
One of the stories I tell a lot is about WannaCry, the ransomware outbreak in May, because I think it highlights all of the mechanisms, institutions, and capabilities that have been built and how it all came together. If you don’t recall, it started in Asia early in the morning on the Friday before Mother’s Day weekend, and because we had been building partnerships with organizations similar to ours in Asia and Europe, we were getting information from them early in the morning, before anything had even happened in the United States. We were a receiver; we were completely benefitting from their experiences, and they were sharing not only what their governments were seeing, but what they were getting from their industries. As it traversed the globe and got into Europe, and we had the National Health Service issue, we were having just incredible engagement with the similar entities across Europe and Asia, passing information back and forth. That got us into a position where we had at least a limited understanding, because in the early moments not everybody knows exactly what’s happening. But we were getting the sense of, “Okay, here’s what’s happening, here’s how it’s being executed, here’s what’s being used.” And because all of the major telecommunications providers actually sit in our building with us, we had them on a call in the early afternoon, sharing what they were seeing and what actions they could take if we started to see something happening in the United States, sharing information in real time with people in the room with us. We’ve also got financial sector representatives that sit on the floor. We have aviation.
We have health and human services folks, and we have state and local representatives on the floor. And here is one of the things that I think highlights the commitment and the partnerships that we have with industry.
When we started to see things, and we were getting more and more concerned after we saw what was going on on the European side, I put out a call to our IT and security company partners and I said, “Hey, if you want to get on a call at 9:00 p.m. on a Friday night, we’re going to have a call. We’ll share with you what we know, and we’d love to hear what you have.” We had 45 companies get on the call with us at 9:00 at night. Every single major security firm basically donated analysts to us over the weekend. We were sharing malware back and forth. They were reviewing our products. We were able to get a public product out on Saturday morning that was a result of that collaboration. And all through the weekend, they stayed on the line with us, in these chat rooms, helping us pick all this apart. I really believe that that’s the model for the future, and it highlights all of the work that’s gone on for years and years. It doesn’t get a lot of public play; I think a lot of people don’t know about it. But there’s just tremendous partnership and a willingness to recognize that we’re all in this together. We have to have that willingness. We have to be able to share things that maybe aren’t perfect yet. It’s such an exciting time, and I’m truly honored to be able to be a part of it.
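The weekend of “sharing malware back and forth” that Manfra describes often comes down, at its simplest, to trading cryptographic hashes of suspect files so each partner can sort what it has captured. Below is a minimal sketch of that triage step; the `triage` helper and the sample data are invented for illustration, and real exchanges use structured formats and vetted hash feeds rather than an in-memory set.

```python
# Sort captured files into known-bad (hash already shared by a partner)
# and unknown (still needs analyst attention).
import hashlib


def sha256_of(data: bytes) -> str:
    """Return the hex SHA-256 digest of a byte string."""
    return hashlib.sha256(data).hexdigest()


def triage(samples, shared_hashes):
    """Split {filename: bytes} samples against a set of shared bad hashes."""
    known, unknown = [], []
    for name, data in samples.items():
        (known if sha256_of(data) in shared_hashes else unknown).append(name)
    return known, unknown


# Hypothetical indicator set received from partner organizations.
shared = {sha256_of(b"malicious payload v1")}

samples = {
    "invoice.exe": b"malicious payload v1",  # matches a shared hash
    "report.pdf": b"benign content",         # no match; analyst reviews it
}

known, unknown = triage(samples, shared)
print(known, unknown)  # ['invoice.exe'] ['report.pdf']
```

Hash matching only catches byte-identical samples, which is why, as the transcript notes, the analysts stayed on the line all weekend picking apart what the hashes alone could not classify.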
Phillips: Well, thank you so much. Before I let you go, what’s one thing you want people to think of when they walk away from this conversation? They think DHS, cybersecurity does what or is focused on what right now? What’s the bottom line takeaway?
Manfra: We’re focused on managing risk to our critical infrastructure in our government and preventing anything bad from happening and when we can’t, being in a position where we can respond as a community, as a nation successfully.
Phillips: It’s a big job. We wish you luck. Assistant Secretary, thank you so much for being here. We appreciate it.
Manfra: Thank you. [APPLAUSE]
Phillips: And now I’m going to turn this over to my colleague, Geoffrey Fowler, who will be interviewing two excellent experts. Thank y’all so much. [APPLAUSE]
Threats and Responses: Trends in Detection and Prevention
Fowler: All right, as we get sorted, good morning. I’m Geoffrey Fowler, a technology columnist here at The Washington Post. If you’re watching here in the audience, at home, or from your desk in a cubicle somewhere, you can join in the conversation: hit us up on Twitter and send your questions using the hashtag #WPCyber. It sounds like a Cyber Monday discount, but actually, it will send the questions up here so we can make you part of the conversation. So that’s #WPCyber. I want to continue this morning’s conversation with a really scary but important set of questions, and that is how do you prevent a cyberattack, and how do you know if you’ve already been breached? So I’ve got two smart guests here to help us figure that out. First of all, Sam Curry is the chief product officer and security officer at the firm Cybereason, and he has over 25 years of IT security experience. And over on the other side, we have Rob Knake. He’s a senior fellow at the Council on Foreign Relations and previously spent years as the director of cybersecurity policy at the NSC for the Obama Administration. I hope I got that right.
So let’s dive right in with this: okay, so we have this Hollywood image of how a breach happens. It’s like a guy in a hoodie. He’s probably somewhere in Eastern Europe and he wants your good stuff. What’s wrong with that image versus reality? How does this really happen?
Curry: I’ll maybe kick it off and then maybe Rob can say something. First, while people may identify with that image, attackers are not sitting around wearing hoodies. They’re not necessarily lone wolves. They’re not necessarily a guy, either. They come from all walks of life, and they tend to be very organized. If anything, they look like we do. They look like companies. They look like very agile organizations. They’re well-funded. They have HR departments. They have all sorts of functions. The image in the movies is usually somebody with a funny accent who starts typing, hits “ENTER,” and says, “We’re done.” Very flashy. While there may be some of those, or folks who wear the trappings, that’s not what they’re about. They’re very motivated, very dedicated, and organized.

Fowler: What do you think, Rob?

Knake: So I think the really bad news here is that how you find out you’ve been breached is usually not because you’ve got a crack IT security team that discovered it. Most breaches are discovered by third parties. You find out when law enforcement tells you. You find out when your credit card number gets used by a third party. You find out when your bank accounts have been drained. So it’s not usually because you’ve detected the adversary on your network and been able to contain them. That’s obviously where most companies would like to get if they can’t stop the breach from happening in the first place. But for the most part, right now, we are in catch-up and response mode on cybersecurity.

Curry: It’s worth unpeeling what a breach is a bit. We tend to use the word to mean many things, but there’s a big difference between an infrastructure breach and either an information breach or abuse of access. These things are slow. The best in the world move about four to five systems a day as they escalate privilege, shift systems, move laterally, and scan. The goal is to go deeper, understand a network, and usually then be able to exploit it. But there’s time in there, and to Rob’s point, it’s really hard to know if you’ve been breached, not the infrastructure breach so much, but the more serious case where access has been developed and is actually being used. The question is often, “How has something bad happened, and what does it look like?” Because even when the external authorities find out, it’s usually from some telemetry about the development of access by the bad guys. They know the adversaries are in the networks, but there’s a fog of war that applies. And as you start to investigate and act on this, you actually affect their behavior.

It’s a lot like submarine warfare. They’ll go dormant, they’ll go silent, and then the question is, what’s the fact base? Have they got something? Don’t they have something? How do you get control of the network again? It’s worth unpacking that to some degree.
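The slow, host-to-host movement described above is exactly what defenders hunt for in authentication logs. As a rough sketch, one crude heuristic is to flag any account that reaches an unusually large number of distinct hosts within a short window; the `flag_lateral_movement` helper, the threshold, and the sample log are all invented for illustration and are not taken from any real detection product.

```python
# Crude lateral-movement heuristic: flag accounts touching more than
# `max_hosts` distinct hosts within a sliding time window.
from collections import defaultdict
from datetime import datetime, timedelta


def flag_lateral_movement(events, max_hosts=4, window=timedelta(days=1)):
    """events: iterable of (timestamp, account, destination_host) tuples.

    Returns the set of accounts whose distinct-host count inside any
    window exceeds max_hosts. Thresholds are illustrative, not tuned.
    """
    by_user = defaultdict(list)
    for ts, user, host in sorted(events):
        by_user[user].append((ts, host))

    flagged = set()
    for user, logins in by_user.items():
        for i, (start, _) in enumerate(logins):
            # Distinct hosts reached within `window` of this login.
            hosts = {h for t, h in logins[i:] if t - start <= window}
            if len(hosts) > max_hosts:
                flagged.add(user)
                break
    return flagged


# Toy auth log: a service account hops across six hosts in six hours,
# while a normal user stays on one workstation.
t0 = datetime(2017, 5, 12, 9, 0)
events = [(t0 + timedelta(hours=i), "svc-backup", f"host-{i}") for i in range(6)]
events += [(t0, "alice", "workstation-1")]

print(flag_lateral_movement(events))  # {'svc-backup'}
```

A real hunt team would layer many such signals (privilege escalation, unusual protocols, off-hours activity) rather than rely on one threshold, but the shape of the analysis is the same.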
Fowler: So what do companies and organizations do wrong on these questions? What do you think the most common mistakes are? They think, “Hey, we take security seriously here at Bank XYZ.” But what are they really doing wrong?
Curry: There are an awful lot of companies and organizations out there, so for those who may be watching or listening, I realize that it’s very hard to generalize. But the majority treat security as a set of checklist items. They think of it as a box they can stick in a rack, or a product they can buy, and it will just take care of it. And they want that. I had a—
Fowler: Who wouldn’t want that? That’s how organizations work.
Curry: I had a chief information security officer tell me once, “I would like a one-year rack-mountable security solution I don’t have to think about. Tell me the price for that.” And I said, “Yeah, I would like one to end world hunger and I would like one that stops cancer as well.” The difficult thing to realize is, the analogy I used the other day, it’s like health. I’d love a pill that just made you healthy, but it takes exercise and it takes eating well. The equivalent in security is that you’ve got to do the checklist right, but you’re also going to have to learn how to get healthy. You’re going to have to have people who live to stop bad guys and who have the right tools at their disposal to find a human opponent. It’s the only thing in IT where the adversary is human. In other words, we haven’t yet solved the Turing challenge.
In spite of all of the marketing, there is no AI that is just going to be HAL 9000 and find it for you. We have to use people, and we have to use them very effectively, build their skills, and have them at the sharp end fighting this. That’s the biggest mistake I see.
Fowler: So organizations have to eat right and exercise?
Curry: And exercise.
Fowler: What do you think? What are people doing wrong?
Knake: So if you take it from a basic level on up, right? Most breaches in this country, if you look at the data coming out of something like Verizon’s data breach report, where every year they collect data on how breaches happened, eighty-five percent of them are due to the most basic hygiene issues: weak passwords or stolen credentials.

So you’ve got to do the basics, right? What are the basics in this case? Well, rotate passwords. Better yet, start replacing passwords or going to multi-factor authentication. Most companies aren’t doing that; hence, 85% of breaches happen when somebody steals credentials. From there, the next set of breaches are due to vulnerabilities that are already known. So this is the basic hygiene that Sam is talking about. If you do those two things right, then you get into the next class of attacks, the ones that target you specifically. That’s what government agencies and major corporations face: spear-phishing attacks, targeted attacks that try to compromise an individual account, get someone to click on a link, download malware, take over the host, and then go through that escalation process that Sam was talking about.
If you want to stop that, you need skilled analysts, you need the ability to hunt within your network. You need the ability to bring in large amounts of outside information. It is no easy or simple task. I think the good news is, though, it is possible. There is a lot of nihilism in cybersecurity and yet, there are companies who are effectively able to manage this problem day after day, month after month, year after year. What we need to see is the spreading of these kinds of capabilities from a small number of government agencies, a small number of defense contractors, a small number of IT companies and banks down to the rest of the economy.
Fowler: How much is it going to cost? How much does an organization need to spend on cybersecurity to live up to your vision?
Curry: So I was at a company that had a breach, and we spent $125 million in the year of the breach as a write-down, which is atrocious, right? And I remember saying to some folks that we had doubled our spending post-breach. But frankly, the way you spend it matters more than what you spend. My belief is that good operations, good execution, and good management discipline over security—by which I mean really focusing on the things that have the highest risk—matter most. So rather than just say, “Hey, we spent $5 million last year and now we’re spending $10 million. That must be twice as good,” I think it’s better to talk about what you’re spending it on, what the mission is, how you’re focusing, how you’re doing triage. Whether you’re in the government or in the private sector, the way you spend that money actually matters more than how much. And typical, by the way, is about 7 or 8% of the IT budget spent on security. But I’ve seen it spent terribly and I’ve seen it spent very well.
Fowler: Would you put a number on it?
Knake: Well, I would say that if you look globally at IT spending versus IT security spending, the global number is about 2%. So a lot of catching up to do to get to Sam’s number of 6 to 8%. I’ve seen some organizations that are spending upwards of 20% of their IT budgets on cybersecurity but they’re spending it for long-term capital investments to create secure networks, to create secure architectures, to migrate away from legacy systems. That’s very expensive. It’s also, I think, ultimately what many companies need to do to make themselves secure.
Fowler: So if you’re a CEO or a member of a board, you’re probably aware that cybersecurity is an issue, but how do you know if you’re getting enough, or the right kind, of information from your people to measure whether you’re doing enough? Is there a metric that people should be looking for? Aren’t organizations kind of under perpetual attack now? So you don’t want to get a call every time.
Curry: That’s an excellent question. Probably the biggest problem in security, I think, in general, is this lack of alignment between the security discipline and the business. So if you’re a chief information security officer, or whoever sits at the top of the security chain, you, on the one hand, have to be deeply rooted in the technology, which is very esoteric. This is not something you talk about at the C-level. If you get on an elevator with a CEO, you should not be talking about how many viruses you stopped or the latest exploit. That’s not interesting to them, right?
Instead, the alignment of the CSO to what the business cares about is their biggest source of alienation. CSOs generally don’t last in their position more than about 12 to 13 months, because in many cases they’re not perceived to be true C-level. And my advice to CSOs is that there are only really three or four things you should be talking about at the C-level, because you’re the translation between the security department and risk for the business—legal risk, operational risk, basically all of the other forms of risk in the organization. So you should be talking about revenue and cost and margin—the impact on the business of risk reduction, in a risk language that’s common to everyone. Or you should be talking about customer satisfaction, or employee efficiency, which is the third one. Or finally, how you align with a strategic initiative.
Everything out of your mouth has to be about that and you have to stop thinking of yourself as defending what you do in security from the rest of the business and instead, showing that you really understand the core business or mission in the case of the government.
Fowler: Is there any other metric that you think folks should be looking at?
Knake: I think the way I’ve seen some companies break it down effectively is they look at your standard inputs, outputs, and outcomes metrics. So if you treat your inputs as your money, your people, what you’re doing on the contractual side, how many technologies you’re onboarding—okay, that may be useful to a C-level executive. They’re going to understand that kind of thinking. Then for your outputs, you want to start looking at the state of your cybersecurity hygiene and how you can measure that in a quantifiable way, continuously, right? So it’s not reporting on the latest checklist that was done a year ago; you’re actually able to provide real-time data on the state of your network and its security. And then I think the outcome question is a really difficult one for many companies to answer. In my view, the companies that do this the best are the ones that can say they’re tracking their adversaries.
They know when they’re being targeted, they know where they are in the kill chain and what stage of the attack they were able to stop it at, and they’re able to tell their senior leadership, “We kept them from getting anywhere near our crown assets, our critical assets.” If you can do that, you can show your effectiveness. You can also make the case, repeatedly, that you need more inputs—more money and more people—because the attackers may not have been stopped at the first or second opportunity.
Curry: This is a problem, though, because if you’re the CEO, you’re sitting there saying, “You’ve already got 30 people in the department and you’re saying you need more money?” CEOs know two things, right? Technical departments always want more people and marketing always wants more money. They know that. So how do they make the tradeoff between who gets the next $100K or $200K? The best metrics are, as you say, the ones that show the efficiency of the department and the true numbers in terms of risk avoidance. Now, right now, we’ve seen a flash of stuff that is very low-level but very, very public—things like ransomware and destructionware that grab the headlines. I’ve been a CSO now four times, and in one of my CSO stints, when we had a DDoS attack or a ransomware outbreak, I knew something more nefarious was happening and went to go look for it.
In other words, the bad guys would use this as a form of distraction and bury the actual thing they were doing in the noise. So rather than just say, “Hey”—we all collaborated. We shunted that DDoS off to one side, or on one occasion we prevented a massive logic bomb in our environment, where somebody detonated a very destructive thing to go after the data in our data centers. I said, “Okay, congratulations, but now what’s in the noise? What is buried in there that this was a cover for?” And that’s sometimes harder to quantify, because you can actually measure the impact of an averted piece of ransomware, but it’s hard when you don’t know the denominator; you just know the numerator.
We caught this number of advanced attacks. What ratio does that represent or what risk was averted? That’s harder to measure sometimes.
Fowler: What did Equifax do wrong? Whoever wants to take that.
Curry: That’s a horrible question but—
Fowler: We only have 15 minutes left.
Curry: Well, I’ll dive on it and then maybe Rob will think of something in that time. But the first thing is, I very rarely see the media get it right. I think Equifax is a whole lot of issues wrapped up in how it handled the incident, as opposed to what actually happened. And one of the worst things you can do, if you weren’t there, is play Monday morning armchair quarterback. I believe the truth comes out over time in these things. I think they probably sat on it far too long. I don’t think they took it seriously high up in the organization. And by and large, I think one of the biggest issues that we have with Equifax is it’s hard to see who the customer is, what the privileged information they hold is, and how they respect it. There’s a lot of privacy legislation coming out now in Europe. We have GDPR about to come into force, which says that you don’t have a right to this data—it’s privileged data and you need to treat it as such—and our own privacy legislation is evolving as well here in the United States. But I think the biggest thing was they took too long and they didn’t treat it as a crisis seriously enough, soon enough.
I think internal processes will come to light. Did they do the right things? It’s hard to say from the outside.
Knake: My answer is, what they did wrong is they didn’t value the data sufficiently. If they had understood the societal costs—not the costs to the company, but the societal costs of losing that data—they never would have lost it, right?
Fowler: But this is their whole business, though, right? How could they not know the value of it?
Knake: They knew the value of it to them. The biggest risk of having the Equifax breach isn’t that I might have my identity stolen immediately. It’s the cost of data on everyone just went down.
Curry: Yeah, I think the challenge I see here is they still have the data. We say it was stolen, but they still have it. They’re still in operation. From a certain perspective, they are fully resilient to this cyberattack because they can keep doing what they’re doing, right? The problem is unlike in other areas, we didn’t say to them, “Look, you’re going to have to make every American whole for this.” Which would easily put them out of business, right? If they had understood that that was going to be the level of consequence, they would have made sure it never happened and that’s what we’re missing.
Knake: And that would have required different legislation.
Curry: That would have required different legislation. The analogy I use is the way we treat oil spills. We say to an oil tanker coming into US waters, “If a drop of oil lands in the ocean, you’re responsible for the full cost of cleanup and remediation, and you need to prove that you have those financial resources at your disposal.” If we had that same kind of setup for a data leak, you could guarantee that a company like Equifax would be saying, “Okay, we need an insurance policy that nobody would write, because we don’t have the ability to prove that we have security in place such that there won’t be uncontained losses.” That’s what we need to address problems like Equifax. It’s not actually about the technical controls. Those will need to evolve over time.
Fowler: So Equifax was a data leak, as you called it. But Rob, if you were CSO or a CIO or even a CEO, what would keep you up at night right now? What ought to be keeping you up at night?
Knake: I think last spring, within the cybersecurity community, a lot of fear shifted from “I’m going to lose my data” to “I’m going to have my data destroyed.” There was a series of ransomware attacks, or really destructive attacks, that had quite a widespread impact on global companies. And many of these weren’t targeted attacks; it was a problem of contagion. So I think that that’s where the fear should be right now. It’s, “Oh, my goodness. I’ve been so worried about somebody stealing my data, and I’ve been thinking that my critical assets are my data, when my critical assets may be the IT systems that allow my company to function at a basic level.”
Fowler: Do you agree?
Curry: No, I don’t. But that’s okay. I think that’s a very good point. But you did ask what the thing that would keep them up at night is and as a CSO, very few things keep us up at night. We live in a constant state of concern and no human being can do that, right?
Fowler: So you don’t sleep anymore?
Curry: No, I have children, right? So they keep me up at night. But the biggest single concern is how to get what we know we need to do in security taken seriously by the business. I’ll give you an example in the IT security domain, right? The sort of low-level hygiene functions—the patches that need to be deployed to stop a lot of ransomware and destructionware—we know what they are. It’s not like GI Joe, where knowing is half the battle, right? It’s not like that. We know. The problem is that there’s accumulated tech debt; every time you touch IT, that impacts people’s tickets and time to resolution for tickets, and those are key metrics monitored by the business. So here you are as a CSO and you go, “Hey, we’ve got 100 severe tickets that we have to patch.” And if I patch all 100, I guarantee something will break. I guarantee it, and on top of that, I guarantee there will be an interruption to business.
You’ll lose your five nines. So what does the CIO say? They say, “Well, pick five; pick 10.” So you pick your 10, and guess what? Because you patched them, they never become an issue. So you’ve become the person who is always screaming and saying, “I need to patch these things.” If, on the other hand, one of the ones you didn’t patch happens to get exploited, you’re out of a job, right? And so the question is how you get a full seat at the table, so you do what you know needs doing, and how you translate that—and that is the hardest thing. So CSOs sit there and go, “Okay, I’m now going to do some triage and build some relationships so that I can get some things done that I know need doing, but in the end, I can’t get them all done.” And that leads to long-term anxiety, right? Some CSOs ask themselves, “Why do I even do this job?” But as for—
Fowler: That’s why they don’t last a year on the job?
Curry: That’s why. I met with one CSO, a decade ago now, and said, “Hey, how’s your data?” He said, “Great.” I said, “How’s your security?” He said, “Awful.” I said, “Well, do you want to fix it?” I had worked for a number of vendors, and we sold software that does that, and he goes, “No.” He said the life expectancy for him in the job, at the time, was about 18 months. He said, “If I’m not already succeeding in nine months, I’m really just cushioning things for my successor. I’ll take all of the heat to buy your solution and deploy it, and let’s assume it works. Then the pressure is on me to actually deliver something, and I’m just going to sit here and wait, spending a lot of money with no results.” It was proving a negative. That’s the hardest thing, but I don’t think it keeps them up at night, right?
Fowler: So, Rob, let’s say you just discovered that a pretty critical part of your organization’s infrastructure has been taken over by a ransomware attack, and you have to decide: what do you do next? Do you pay the ransom? Do you call the FBI?
Knake: I’m a policy guy so I’m going to give a policy answer. You should never pay ransoms. You pay ransoms, you are exacerbating the problem for the rest of the community. So, you’re funding criminals who are going to take that money and they’re going to spend a lot of it on Lamborghinis and leather jackets, if they’re in Ukraine. They’re going to take the rest of that, and they’re going to hire more people, and develop new tools, and they’re going to come back at us three or four times harder. So, from a policy perspective, I would say don’t pay a ransom. I think it should be illegal to pay a ransom.
If you are a CISO or you are further up the chain—if you’re the CEO of a company, you’re most likely going to pay the ransom, if you believe—if there’s any evidence that paying the ransom will get your data back. That’s what most companies are doing in response to it. They’re—if you’re dealing with these kinds of attacks, and you haven’t put in place the prevention mechanisms to have a more secure network, there’s very little you can do after the fact. You can’t go back in time.
Fowler: Yeah, that’s why you want to pay the ransom. Do you agree?
Curry: It’s a risk-based decision for everybody in that spot, it really is, and the job of a CEO is not public safety. It’s not. The job of a CEO is to protect the shareholders, to do right by the employees, and to follow the mission of whatever their company is. If you’re a hospital, and you’ve just been nailed, and people are going to die if you don’t have data, what do you do? This is not an easy answer, and with all due respect, I don’t think we should be talking about making it illegal yet. I think it’s on us as an industry to make sure that the right measures are in place beforehand—to get backups and recovery up quickly, to be able to say, “I’m actually more resilient to ransomware.” Make sure that happens.
Make sure that you have the ability to survive an attack and continue critical operations. But for us to say, as a blanket matter, that it’s going to be illegal is a horrendous thing. It is a tough risk-based decision. Now, I think we’ve got to hit these attackers economically. There should be fewer Lamborghinis bought illicitly in all parts of the world. But honestly, I don’t think we can quite say that it’s illegal. I think we should say: if you can, don’t. There’ll be an inspection of why you did. There will be inquiries into it, and you shouldn’t be damaging public safety. But it is a risk-based decision for that organization. They may go out of business, and in some businesses people may die, and that’s a tough call to make.
Fowler: Do you want to argue with that?
Knake: Yeah, I mean, I think the ultimate answer is, if you continue to take an approach where you don’t have to invest in cybersecurity, where it’s not a core, fundamental part of your business, then you’re never going to make those investments, despite those bad outcomes that can come from these attacks. What we need to do is create a level playing field in which companies say, “We’ve got to make these investments; we don’t have a choice.” Absent that, they won’t make them. It will always look like the right answer from a bottom-line perspective to say, if we can pay a ransom of $20,000, why put $6 million into our cybersecurity? And I think that’s—
Curry: I don’t think that tradeoff is happening, though. Look, your point earlier was quite valid, that many of these attacks take the path of least resistance, and that’s too easy right now. But even if we raise the bar, these attacks will still happen. It’s not like anybody is going to have perfect security just with a checklist mentality and say, “Okay, now I’ve done enough.” You can spend a lot of money and still get hit. So I’m just very leery of anything that is a blanket policy like that. And one thing I’ll say is, as a company, you have to think about not just the actual damage, but also how you will weather it. Companies can’t be victims. They can be heroes or they can be villains, and when they do their crisis management, and when they think about how the wounded will be bayoneted after the incident, they have to keep that in mind.
And so, do it right beforehand, but in that moment, I believe it’s a risk-based decision, and we should by all means put pressure on them to not pay it. But to just blanket make it illegal is very concerning.
Knake: We turned your tech discussion into a policy discussion.
Curry: And by the way, I’m a techie, so.
Fowler: No, I love it. We’re getting some questions here from Twitter, so let’s talk with—about a couple of them. David on Twitter asks, what are the new ways hackers and others are finding to monetize stolen data? What are they doing with it?
Curry: Well, a lot of them are tying into organized crime’s already existing ways of monetizing things. There are perfectly good—or perfectly bad—ecosystems for how to take stolen goods and data and turn them into money. And so organized crime is angling in, and in fact we’re seeing nation-states start to use this to generate cash and credit. North Korea is a good example; they’ve got sanctions and embargos against them, and they’ve gone after places like Bangladesh and Taiwan in order to get cold, hard currency.
Knake: So, how are they doing that exactly? Can you sort of—
Curry: In the case of North Korea, it’s about moving it through systems and channels external to North Korea. There’s a ratio between how much you can steal electronically and how much hard cash you can get out at the end, or turn into something you can use. But these are very complex ecosystems that are very hard to trace, especially international ones. And to answer David’s question, they’re using tried-and-true money laundering and organized crime mechanisms.
Fowler: Any other new techniques you’ve been tracking?
Knake: I think what we see is—right, you steal the data, you take the data, you use it to commit fraud. And so it’s tax fraud, or it’s insurance fraud. It’s pretty basic. If I have stolen your name, your social security number, your date of birth, and other information like that, I can pretend to be you with the IRS. I have everything I need to get last year’s tax return; I take the information from that and use it to file a return, so I get a refund. I have the refund sent to me, to my bank account. Then you go and apply for your refund, and the IRS says, “Oh, wait. We’ve already received your return for this year.”
So, it’s that kind of play over and over again. It’s why this information is valuable. The solution that I think a lot of people are looking at now is to say, okay, we may not be able to protect all this data. It’s in so many places. It’s out there. If you had a security clearance, like a lot of people in this town did, anytime in the last 25 years, everything about you is gone. So the solution here isn’t to say we need to stop all these future data breaches; it’s to change the systems so this kind of data is not valuable—so it can’t be used to reset passwords, it can’t be used to gain account access, it can’t be used to open up new accounts.
Fowler: We’re almost out of time, but if you’ve got questions, get them in now, #WPCyber on Twitter. We’ve got another one—a question from Erica on Twitter. Are we spending too much time and resources on cyber prevention rather than cyber law enforcement?
Curry: I’m used to the question of cyber prevention versus detection and more advanced means of stopping stuff. I’m not used to the contrast with law enforcement, so I’m assuming she means after the fact in some way, or making it more of a deterrent to doing it in the first place—more of a disincentive. I don’t think they come from the same budgets, by and large. I think within IT there’s a lionizing of detection—sorry, prevention, rather—and we need more ability to detect what has slipped past, because of intelligent opponents, as we discussed earlier. But on the law enforcement angle, Rob, do you have a—
Knake: Yeah, I mean, there used to be this question of, can we do attribution? Can we find the attackers? And I think the answer now is, yeah, we’ve gotten pretty good at that. The problem is, you trace the attackers, and they are in China, and they are in Russia, and they are in countries that have no interest in aiding investigations, arresting people, or extraditing them to the United States. So I think more resources for law enforcement would be useful, but many of these attacks are going to be carried out from places where we’re just not going to be able to get law enforcement action to happen.
Fowler: Got it. Well, I’m afraid we are out of time. Thank you, Sam and Rob, for that discussion. We’ll continue the argument backstage about paying ransoms, and now we’ll move on to the next section of our program. Thanks.
Content from Symantec: Spotlight on Consumer Protection
O’Connor: Good morning. I’m Nuala O’Connor. I’m the president and CEO of the Center for Democracy and Technology. I’m here with Fran Rosch. Fran is the executive vice president of the consumer business at Symantec, and we are grateful for your sponsorship today, and your support of many good things in Washington. We are here to talk a little bit about what individuals can do. We’ve talked a lot about what companies and governments can do, and we’ll talk a little bit about the work you’re doing at Symantec, but certainly post-Equifax, is there still hope for the individual to protect their own data? What do you think they should be doing?
Rosch: I think the Equifax breach is certainly a big wakeup call for a lot of people. 145 million identities were exposed, and that’s a lot. But Equifax certainly was not the first breach, and it won’t be the last. Actually, there have been over a billion identity records exposed this year already, through dozens of other, smaller breaches that don’t make the front page. And I’m sure there are a lot of others that were never reported. So a lot of our information is out there, and what we hear from our customers is sometimes a sense of powerlessness.
I mean, it’s out there now, so I might as well just go about my day and accept that I’m going to be a victim. But I think there are things people can do, and, you know, we believe it’s very much a shared responsibility. As we’ve heard earlier, companies have a responsibility to keep this information safe. They’re collecting it, they’re using it to make money; it should be kept safe. We think there’s a policy role to play here, especially around education, which we heard a little bit about earlier this morning. But even though consumers are the victims here, there are things they can do to take action. They’re not powerless.
And one of the reasons is that just because a breach happens doesn’t necessarily mean it equals identity theft, or a crime yet, or a crime that impacts the consumer. Sometimes the cybercriminals will sit on this information for months or years before it actually gets used; there’s a black market where this information flows. So there are things consumers can do, and when we talk to our customers, who are not always super tech-literate, we try to use analogies. One we use is flu season. When flu season comes, we don’t just sit there, do nothing, and say, “It could happen.” We try to protect ourselves; we monitor, and we take action. In that analogy, we protect ourselves by maybe getting a flu shot and washing our hands. We monitor for coughs and symptoms. We stay home if we’re sick, or go see a doctor.
I think there are things that we can do as consumers to protect ourselves. And as part of the Equifax breach, we’ve heard a lot about credit locking, or credit freezes. And what that basically is, is you have to call the big credit bureaus, and ask them to—and sometimes, pay them, unfortunately—to put a lock, so that if someone has stolen your identity, and they try to open up a credit card or take out a mortgage or an auto loan in your name, they check with the credit bureaus and they say, “No, that’s blocked. You can’t do that.” So, that won’t eliminate your risk, but it can help kind of lower it. As the last panelist said, it won’t help you for tax fraud, but it can at least help you for some of these types of things.
A lot of banks now are offering two-factor authentication. We recommend if it’s out there, take advantage of it. If not, ask for it. Silly—not silly things, simple things like better password management, these things will help kind of prevent you from becoming victims. You should also monitor—look at your credit reports, look at your credit card bills, look for weird things that could happen. And if you find problems, take action. Call the banks, get these things cleaned up before they become bigger problems. So, there are things people can do to take action before they become victims.
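[Editor’s note: a concrete instance of the “simple things like better password management” Rosch mentions is using a long, random, unique password per account, which blunts the credential-stuffing that follows breaches. A minimal sketch using Python’s standard `secrets` module; the function name and the 12-character minimum are the editor’s illustrative choices, not a Symantec recommendation.]

```python
# Sketch of a password generator: cryptographically random characters
# drawn from letters, digits, and punctuation via the stdlib `secrets` module.
import secrets
import string

ALPHABET = string.ascii_letters + string.digits + string.punctuation

def generate_password(length=20):
    """Return a random password of the given length (minimum 12 here)."""
    if length < 12:
        raise ValueError("use at least 12 characters")
    # secrets.choice draws from the OS CSPRNG, unlike random.choice.
    return "".join(secrets.choice(ALPHABET) for _ in range(length))
```

In practice, a password manager does this per site and remembers the result, which is the behavior the panelists are urging.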
O’Connor: At CDT, we call that good data hygiene. We also recommend good VPN use and being careful about the data you hand out online. I will say this, not to be giving Equifax a hard time this morning, but in trying to do credit freezes, I successfully did that for myself. I could not do it for any of my minor children, so all of you in the data broker world, we’re coming for you next. Anyway, those are some of the things we can do today. What do you see coming down the road? What changes would you like to see, either in individual behavior, company behavior, or the policy environment?
Rosch: Well, I think one of the things that we’d like to see is just kind of more consumer control, more consumer participation, more consumer privacy. One of the things that we heard a lot from our customers is, like, I understand when there’s a breach at my bank, because I actually kind of gave my bank my information, or my healthcare company, but they didn’t even know they had a relationship with Equifax. They didn’t know that this data was being used and sold and monetized, and they didn’t feel like part of that process.
So, we think consumers have to be brought into the system and given some level of control, to say, “Okay, I authorize you to use my data for these services. And if you’re going to use it for any other reason, you have to check with me and let me give some sort of approval.” So, I think more consumer control, more consumer privacy in this whole ecosystem, because a lot of people are making money off all this data, and we think consumers should have a strong say in that.
O’Connor: We’re liking the sound of that at CDT. It sounds an awful lot like opt-in, for those of you who have been following at home in the opt-in, opt-out debate of years and years ago.
Rosch: And you bring up a good point about minor children. Today in this country, we have a social security number-based identity system. Right or wrong—a lot of people will say we need to move to something different, which we agree with—but for now, we have the social security number. And when your minor children get a social security number, credit files get opened for them, and the default when they get set up is open. We think that when a file gets opened, it should be locked right out of the gate, until either you unlock it or the child turns 18 and can unlock it themselves. But right now, it errs on the side of the companies making money, and not on the side of the consumer, which we think should change.
O’Connor: And with all these data breaches, they’re included, whether they’re minors or adults.
Rosch: That’s right.
O’Connor: So, we are just a couple weeks away from Black Friday and Cyber Monday, very exciting for those of us shopping at home, but what are you worried about? An uptick, perhaps, in fraud or phishing scams or the like? And also, hopefully to end on a positive note, what are you excited about?
Rosch: Yeah, I think there’s—certainly we’re all going to be online a lot more, shopping a lot more online. I think there are two things. We talked about kind of the things people do to protect and monitor and kind of take action, but I think the holidays kind of bring up two unique things. One is a lot of us are going to travel, and as part of that, we’re going to be in airports and hotels, and coffee shops and things like that, and everybody wants to take advantage of the public Wi-Fi, because we just—we do it naturally now. And the bad guys know that, and there’s a lot of vulnerabilities in public Wi-Fi systems today, where people can be snooping and watching and collecting all that information when you’re on that Wi-Fi.
And a lot of people aren’t very sophisticated on this topic. They’ll be in an airport, and it says “free airport Wi-Fi,” and they think, well, that must be the airport offering it, and it must be a secure Wi-Fi system, but it’s not. Any criminal can name a network whatever they want. So, I think being very careful about using those public Wi-Fi systems matters. As you mentioned, you can download and use a VPN, which keeps all that data protected so nobody can snoop on it. So, that’s one area.
I think the other is that some of the hot gifts we’re going to see this holiday are IoT devices—they’re connected. A lot of companies are racing to put products on the shelves at Best Buy or on Amazon that are connected to the internet and let you manage them through your phone, which is great. It’s exciting. And some companies take security very seriously, and those devices will be protected. But many companies don’t, and they’re just racing to get these devices on the shelves, many of them manufactured in China, where we don’t necessarily have a good understanding of the supply chain of what’s going in. And these are mini computers connected to the internet, and that’s bringing a lot of risk into people’s homes.
And you may say, well, it’s my camera on my front door, my baby monitor, the TV. Not only can these things be compromised, but they’re all connected. If you put in a refrigerator with weak security, the refrigerator becomes a vulnerability point: attackers break in through it to your router, and all of your information is flowing back and forth through that router out to the internet, where it can be collected, spied upon, or corrupted. So I think people should be really careful as they get into the holiday season, and take advantage of those things, but also put security front of mind.
O’Connor: Now, a little bird told me that you might have a solution for that home router issue, and it’s a quite attractive looking device. Did you bring the brochure? I’m going to give you 30 seconds to plug this. Tell us about it.
Rosch: You know, at Symantec, we’ve got about 50 million customers, and we spend a lot of time listening to what their concerns are. For many years, those concerns were around protecting their devices, their PCs, and a lot of progress has been made—though, as we saw with the WannaCry virus and ransomware, not solved. But their concerns have evolved. Identity theft has become a big concern, and I think Equifax has really driven that to the top. We recently acquired LifeLock, which brings a new solution to help protect identities. But also, when we looked at that connected home, it’s very difficult for consumers to distinguish between a secure camera and an insecure camera, a secure TV and an insecure TV.
So, we’ve created a router called the Norton Core that can sit in the home and provide an umbrella of protection over all those devices. As a consumer, you don’t have to understand which ones are safe and which are not. You can be confident that by putting in the Norton Core—which is attractive, thank you—you get that good protection for the home. So, even though consumers are absolutely the victims in these types of breaches, we’re trying to empower them with education, as well as tools they can use to stay safe during the holidays and year-round.
O’Connor: Sounds good, and we thank you so much for your sponsorship, and we’ll turn it back to our hosts at The Washington Post.
Rosch: Great, thank you very much.
O’Connor: Thank you very much.
Protecting Ourselves: Best Practices for Cyber Hygiene
Fung: Good morning. My name is Brian Fung. I’m a technology reporter here at The Washington Post. Joining me today for our final panel, we’ve got three fantastic guests. To my left is Mischel Kwon. Mischel is the founder and CEO of MKACyber, a former director of the US Computer Emergency Readiness Team (US-CERT), and the former deputy CSO for the Department of Justice. So, a lot of great credentials there.
Next to Mischel, we’ve got Dante Disparte, who’s the founder and CEO of Risk Cooperative, a strategy and risk advisory firm based in DC. And finally, last but not least, we have Eva Velasquez, who is the president and CEO of the Identity Theft Resource Center; she focuses her work on the consumer aspects and the victims—the human element—of a lot of the data breaches we’ve been seeing in recent weeks. So, I’d love to kick it off, again, just by reminding everyone that you can ask questions of the panel by using the hashtag #WPCyber. By this point, you probably know the drill.
Thanks for joining us. I’d love to start by talking about some of the lessons we’ve learned from the data breaches we’ve seen in recent weeks and months. What lessons can we apply to our approach to cybersecurity, privacy, and data as we move forward?
Kwon: Well, if you look at the problem, and you look at the solutions, there is no silver bullet, but one of our biggest problems is that we are not taking care of our systems. This is as much an IT problem as it is a security problem. It’s really hard to look at how to stop the sophisticated adversaries when the unsophisticated adversaries are winning because we’re not patching or replacing older machines. So, it’s not necessarily something fancy and new; it’s going back to lifecycle management; it’s actually patching machines.
I mean, the fact that with Petya, Microsoft had to go back and release patches for systems they said they weren’t going to patch anymore is really an indication that we are not taking care of our IT systems, and we’re not putting money toward the things that now run our lives. From a consumer perspective, I think we have to really look at the companies we’re doing business with and ask, “Are these companies doing the most current IT things they can be doing?” It’s not just security; it is important for them to offer two-factor authentication, and for you to use safe passwords. All those things are important. But even more important is whether the companies are making the investment to build IT systems that are secure enough for you to transact your personal data on.
Fung: Dante, anything to add?
Disparte: Yeah. I would say it’s fitting to have this conversation at The Washington Post, whose motto is “Democracy Dies in Darkness.” Well, cyber threats thrive there, and I think if there are any big lessons learned from 2017, the first is that this is a systemic risk. WannaCry spread to 150 countries in three days, over a weekend. The Equifax breach affects, effectively, 100% of the US workforce. And so, I think what we now need, from a policy vantage point and from a business vantage point, is to start meeting systemic threats with systemic solutions.
So, I’ve long advocated for this concept of a cyber FDIC: some risk-sharing and risk-transfer mechanism that would pool the economic consequences and create a clearing house for best practices, technological and otherwise. And I think, most importantly, the governance component of this risk is the cheapest solution we could bring to bear, but it’s the one we all too often fall down on. Equifax is an emblem of that: in five years of its annual reports, the words privacy, cybersecurity, and cyber risk aren’t mentioned even once. Meanwhile, growth, profit, shareholder value, and so on are the dominant themes. Governance, at the end of the day, doesn’t thrive in darkness either, and we need a consistent value system for these risks.
Fung: So, a couple of times this morning, we’ve heard this analogy of cyber security being much like health or hygiene. And I just wonder, clearly being healthy requires a lot more than taking a pill. It’s clearly not simple and cyber security isn’t simple either, but there are some things that we know do work in the health space, like putting calorie counts on restaurant labels, or stop light food labeling in a grocery store. And those types of innovations sort of take our natural inclinations to behave certain ways, and then either short-circuit the negatives, or they find a way to take advantage of them. And I wonder if there are any technologies or techniques that are being developed in the cyber space, that could potentially allow us to do the same thing.
Kwon: Well, actually, I think if you start to look at hygiene related to the types of attacks you’re having, before you have the attack, it’s doing exactly what you’re saying. What we do with our watchtower methodology is look at hygiene while we’re doing the threat analysis. So, if a threat analyst comes up with a TTP (tactics, techniques, and procedures) and an IOC (indicator of compromise), and you’re going to put a signature in your IDS and say, “I want to look for this kind of malware, this type of attack,” at the same time, they should be looking for the CVE that’s related to that TTP or IOC, so they can see how vulnerable the system is to that type of attack.
So, it’s that same thing: not patching for the sake of patching, but patching the kinds of things you are vulnerable to, to reduce your risk. Often, we do the patching after we’ve detected that an attack has already happened. Starting that remediation earlier in the process is a good piece of it.
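The pairing Kwon describes—matching active threat intelligence against what is actually unpatched—amounts to a join between a threat feed and an asset inventory. A minimal illustrative sketch in Python, with invented data shapes: real feeds (STIX or MISP exports, for instance) and real inventories look different, and `threat_intel`, `assets`, and `exposure_report` are names made up for this example.

```python
# Hypothetical threat feed: each entry ties a TTP to the CVEs it exploits.
threat_intel = [
    {"ttp": "EternalBlue SMB exploit", "cves": ["CVE-2017-0144"]},
    {"ttp": "Office macro dropper",    "cves": ["CVE-2017-0199"]},
]

# Hypothetical asset inventory: which CVEs each host has already patched.
assets = {
    "file-server-01": {"patched": {"CVE-2017-0199"}},
    "workstation-07": {"patched": {"CVE-2017-0144", "CVE-2017-0199"}},
}

def exposure_report(intel, inventory):
    """For each host, list the CVEs tied to active TTPs that remain unpatched."""
    report = {}
    for host, info in inventory.items():
        open_cves = sorted(
            cve
            for entry in intel
            for cve in entry["cves"]
            if cve not in info["patched"]
        )
        if open_cves:                       # only hosts with open exposure
            report[host] = open_cves
    return report
```

Run against the toy data, only `file-server-01` shows up, exposed to CVE-2017-0144—exactly the "patch what you are vulnerable to" prioritization, rather than patching everything blindly.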
Fung: Eva, have you noticed—
Velasquez: That relates—that’s why the health analogy is so great, because when it comes to our physical health, often we don’t do anything until it becomes a problem, right? Okay, oops, one too many cheeseburgers and the scale is too high; I think I need to address that. And that is very common with identity hygiene: victims of identity theft will put those practices in place. What we need to do, I think, is unite all the stakeholders, and I really think that consumers and victims are underutilized by industry and government as evangelists for better cybersecurity and better processes.
Let’s communicate a little bit more to them, industry and business. Tell your customers how they can help you in this effort. Are there little things that they can do? I think they will embrace it. I think about the things like the two-factor authentication, and every time they sign in from a new device, how much more tricky it makes for organizations to actually verify that that is their customer. Are there things they can do as simple as, well, when you’re going to do your online banking, only do it from one device. I know that’s over-simplifying it, but I’m talking about those types of things.
That is a component of that good identity hygiene, and getting people to practice those things, as you said, there’s no silver bullet. There’s no panacea. It’s not one thing, just like taking care of your health. You don’t go into your doctor, and go, “What’s the one thing I need to do to take care of my health?” Brush your teeth. All right, I’m going to do that once, now it’s done. I never have to think about this again. That’s just not the world we live in, and so as we continue to educate our consumers and tell them, “This is part of the world we live in, and your life, and here are some of the things that you can do on a regular basis.” Again, the multifactor authentication, and I heard—we’ve talked about what those different steps are, and I think we just need to continue to educate folks on what they can do to be part of the solution, instead of the weakest link and part of the problem.
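The multi-factor authentication the panelists keep returning to most often takes the form of a time-based one-time password (TOTP, standardized in RFC 6238): the server and the phone share a secret once at enrollment, and from then on both derive the same short-lived code from the current time. As a minimal illustration using only the Python standard library (the function name `totp` is chosen here, not a real API):

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, for_time=None, digits=6, step=30):
    """Time-based one-time password (RFC 6238, HMAC-SHA1, 30-second steps)."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((time.time() if for_time is None else for_time) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 reference secret: the ASCII string "12345678901234567890".
SECRET = base64.b32encode(b"12345678901234567890").decode()
```

Because the code is derived from a secret that never leaves the enrolled device, presenting it proves possession of that device—a second factor on top of the password, which is the customer-side habit Velasquez is describing.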
Fung: So, that’s a great transition to my next question. As a reporter, I feel like part of my job is to help folks understand, you know, what the stakes are, what the threats are, how to protect yourself, but have those of us in media done a good enough job of communicating that to people? And if not, what can journalists do better, maybe in terms of approach or tone to help the general public on this front?
Velasquez: I think one thing is to actually focus in on where the problem actually is. I think sometimes we get a bit of a shiny-object problem, focusing on the malware and saying how bad the malware is, how fast it spreads, how many machines it affected—when we should be going back to WannaCry or Petya and saying, “Wait a minute, this is an older patch. This should have been patched. This was clearly a hygiene problem. You could have replaced your network for less than a third of the cost of your actual loss.” And actually identifying it not as a cybersecurity problem, but as an IT problem, a lifecycle management problem: the company did not spend enough money on keeping the equipment that your data is riding on up to date.
Then, as consumers, if we are educated, we can make decisions as to whether we are going to spend money with companies that don’t take care of their systems, or with companies that do.
Kwon: I would agree with you, absolutely. That needs to be the litmus test for not just seamless and frictionless service, but actually maybe this is going to take a little bit longer to authenticate me, but that’s good for me, and that’s the company I want to do business with. It is going to take a huge change in our national perception.
Velasquez: And it’s going to take a change with us as consumers. Am I going to pay more for a credit card because its fraud detection is so much better than another’s? That’s the kind of consumer change we need to make.
Disparte: So, on the question of reporting, it’s about orientation. Too much of the media is obsessed with the ambulance-chasing component of cybersecurity and cyber risk, so it’s very easy to pick on Equifax, and it’s very easy to pick on organizations that are in the headlines. But the consumer component of this is, again, back to the financial crisis, exactly like a systemic risk: we’re privatizing gains and socializing losses. I think the American public has become inured and indifferent to cyber threats. But then you start to tell the tale that it’s not just about your information being leaked; it’s about information being changed. What do I care if my medical records are stolen? I would care if my blood type was changed, because it would profoundly change, for the rest of my life, the type of treatment I could get. That’s where these risks are evolving to.
Increasingly, the threat actor is not concerned with just information; they’re concerned with disinformation. Look what happened in our election last year. These things are not partisan. This is much, much bigger than just threat and vulnerability and certainly much bigger than technological solutions alone.
Fung: How much more control should consumers have over their own data? You have some folks who’ve floated this idea of data portability, and the idea that you should be able to—I think Fran was talking about this a little bit earlier—make clear decisions about who you’re going to release your data to. How would—how do we establish that kind of a system, and is that something that seems plausible and realistic?
Kwon: Well, let’s go back in history. We actually used to have the responsibility for managing our own data, but it wasn’t out on a server somewhere. We had our offices and our hard copy papers and we did have a lot more control about who we were sharing that with. And I think that we need to examine that and look at where the new normal lies. There is not enough consumer control right now. People don’t even understand how many data points they’re creating and how much data about them is out there, and I think one of the reasons that the Equifax breach had so much public outcry was—and we’ve been tracking data breaches for over a decade.
And when we hear from people that are notified of a breach, generally they have a business relationship. They made a choice to do business with that entity and so when that happens, they go, well, I was their customer. And I think what Equifax is helping to highlight for consumers is that you will have these relationships where you are the customer; you will also have relationships where you are the product. And that’s—being able to opt-in or opt-out of those types of sharing relationships where you’re the product, I think people want more control over that.
Fung: Are we, in some respect, too far gone now? Are you not going to be on Facebook? Are you going to commit not to using Google?
Kwon: I’m not on Facebook.
Fung: To search. There’s a lot of discussion right now about whether tech companies enjoy too much power over our economy and our society. I mean, the reality is, for a lot of people, there is no functional alternative to those companies, and so what does that—
Kwon: At the very least, they should understand what the relationship is so that they can make an educated decision. That’s really what I’m advocating for. I’m not saying social media—I personally—Facebook just got too hard to monitor. I personally don’t find any value in it, but my family—my mother is on Facebook and loves it and gets a benefit from it, because she’s able to connect with her relatives across the country. But she’s made that choice knowing what the risks are, and what she’s giving up, and so I’m definitely just pushing for more transparency in those relationships, so that consumers can opt in or opt out, and know what the benefits and the risks are.
Disparte: So, in May 2018, the Europeans embark on the world’s most stringent privacy and security regulatory regime, and I think one of the most important facets of it is the right to be forgotten. However, GDPR will also create a series of unintended consequences, first and foremost the concept of an information-security arbitrage. We run the risk—and we’re already seeing it today—of having a big trans-Atlantic gap between European privacy and security standards and what we’re seeing on this side of the Atlantic. And yet, global corporations that are effectively stateless, like Google, like Facebook, and like many others, are operating in an extraterritorial manner. So, the race to the bottom that we run a risk of is perilous, and it’s imminent, coming in May 2018.
Kwon: Well, I think we’re also not thinking about new ways of using data, and we’re not thinking about the consequences of some of the things we’re partaking in from a technological perspective—and we’re unaware. Think about the things that are done with our data and our choices, and recorded every day, that we have no idea are happening to us. The fact that now you drive on 66 and a camera takes a picture of your license plate every time you go down the road; the fact that when you go to the grocery store and buy something in one aisle, your cart has a device in it that tracks what aisle you’re in, and they’ll change the music in the store so you might go to a different aisle and choose something different.
We don’t realize that all of these technologies are actually happening today, and that they’re collecting data on us. And this data can become more powerful over time, and we—I think maybe in some instances, we’re putting the cart before the horse, and not looking at the consequences of those kinds of technologies. So, I think it’s not just the attacks and the malware and the problems in the IT, but it’s also how we’re looking forward and leaning forward with technology. We have to make sure we’re also leaning forward with public policy and even laws to keep track of where we’re going with that.
Fung: Are there types of data that we don’t collect today, or that companies don’t collect today, but you think might be really important in the future?
Kwon: That they don’t collect today? I’m not sure there is.
Velasquez: Is there anything that isn’t being collected? Just look back at the misuse of some of these identifiers: they were developed for a very specific purpose, and then we went, okay, hey, this is very handy, I’m going to use it to do X over here—and now this data is being monetized by the thieves. I am 100% certain that there will be new forms of identity theft, and thieves will find new ways to monetize our data—data points that are going to blow our minds. We’re going to ask, how did they figure out how to monetize that? That is going to happen over the next couple of years.
And we keep encouraging this collection of data: well, we don’t know what we’ll use it for yet, but we’ll figure it out later. That’s definitely an industry attitude—collect it all; we’ll figure out what to use it for. Well, the thieves are doing the exact same thing. They are looking at what we’re collecting and figuring out how we’re going to use it. And as we move to a world where we use more data points as authenticators, multi-layered, which is really the way we need to go, those data points are going to become more valuable. So, it’s a tough proposition, because we have to do this for the solutions, but then we are creating more value for the thieves, and they’ll figure out how to make some money on it.
Kwon: But it’s also important—
Fung: I’m sorry, I want to get to Dante.
Disparte: Well, no. So, I was just going to add a point to this. Everybody genuinely is obsessed with data, and we only learn that data is valuable when we lose it. And so, having had countless conversations with organizations about cyber insurance, and how much insurance was enough, what we realized was that there was no proxy for data valuation. And so, that’s where many people like Eric Schmidt, the executive chairman of Alphabet and Google, are talking about data being the new oil. But that’s where the comparison stops, because oil has universal valuation; it has geographic restrictions, etc., where data does not. And so, in that sense, it gets back to my message about this is a systemic risk.
We only cared about liquidity in banks when there was a risk of a run on banks. We only care about informational assets when it’s exposed. And so, I think the regulatory platform for the future has to require organizations, especially publicly quoted ones, and the systemic institutions like Equifax that are hiding in plain sight, to disclose to the public what share of their valuation is derived from informational assets, because everybody owns it but nobody wants it at all, which is why we’re rushing to put things into the cloud, we’re rushing to create vendor relationships to offset the risk, but we still own it, at the end of the day, just like liquidity in banks. And you only know that it’s valuable in a stress scenario, or when people are claiming it.
Fung: Mischel, did you have something to add?
Kwon: Well, I want to make sure that we also touch on the fact that it’s more than data now. We’ve pushed it past data with IOT and with the way in which we’re using computers, whether it’s hotel locks, whether it’s different types of picture storages that are now in your house, or that are attached to your routers, or other types of operational type of systems that are also at risk. And we have a big, broad landscape now to look at, that’s even broader than just data.
Fung: So, I was looking through some of the questions on the hashtag a little earlier, and one came up that I think is really interesting. I don’t think it’s been asked yet: have cryptocurrencies added to the growth of ransomware attacks? It really gets to this question of cryptocurrencies being used for good and for ill.
Disparte: Can I jump into that? I do a decent amount of work with blockchain and the whole digital currency universe. The short answer is, while most professionals and households would have first learned of bitcoin when computers were held for ransom, the truth of the matter is, it leaves a digital breadcrumb trail that is much, much harder for criminals to escape with than regular currency. As an example, the WannaCry attack went from zero to 150 countries over a weekend; the attackers asked for bitcoin as ransom, but they only succeeded in collecting about $70,000 worth of ransom money.
So, if you think of that enormous dragnet for a ransomware attack, to get that little economic value out of it, says a lot about digital currency and cryptocurrency being a secure platform for value transfer.
Fung: Does that say more about the security of the platform, or more about the usability of the platform?
Disparte: Well, I think it’s a bit of both, because the digital wallets are traceable, while the identity of the owner of the digital wallet is anonymous, the wallet is not. And that traceability really creates a lot of protection, especially at the tail end of a risk, and from a forensic vantage point, it’s easier for authorities and others to trace it backwards. Whereas with M2 money, or regular, traditional currency, that traceability doesn’t exist. So, I think there’s an innate security in digital currency and these types of technologies that we’re not necessarily seeing in more traditional forms of value.
Kwon: Well, in my conversations with the law enforcement folks that we work with, particularly the Secret Service, what I have heard them tell me is that it does create the pathway for the thieves, but it isn’t necessarily the huge rings, that it creates the—it’s lowered the bar on your ability to commit fraud, because if you’re just a small-time guy, this is a great way for you to do this. It’s very easy to do those transactions in bitcoin. And so, that’s—those have been their comments to me over the last couple of years.
Velasquez: Well, it’s interesting to me that adversaries are coming up with more interesting ways to use technology, and I think we need to be doing the same thing on our end: looking at the problems we’re facing and figuring out new and different ways to solve them. I mean, a lot of the things we do today are actually really old. IDS is a very, very old technology; antivirus is old technology. We should be spending as much time creating innovative technologies, not just from the security perspective, but also from the IT perspective—looking at how we transmit information, at encryption.
Is it just encryption? Is it actually a new technology? I think it should make us stand up and turn around and say, “Hmm, maybe we need to be being more innovative on the IT side.”
Fung: After the Equifax breach, there was a lot of talk about the social security number and whether we should still use it—and if not, what the alternative might be. I’d love to put that question to you and see what you think.
Disparte: Well, if there was ever a great business case for digital identity, this was it, and the—as I think the earlier panel discussed, when we have a social security based identity system, and your kid gets a social security number—an analog identity system in a digital era—that’s the mismatch. We’re not approaching systemic issues around cyber, around data theft, around information privacy with systemic solutions.
Flash back to just before the takeoff of e-commerce: many people were afraid of buying online because of fraud and people stealing their money. And when the banks collectively said, you the consumer have a zero-liability proposition, e-commerce took off. I think we have to confront cyber risks with similar big thinking that starts to address, at least at the consumer and middle-market level, that kind of zero-liability proposition, in both our legal system and our regulatory environment. We’re using really blunt-force instruments to attack a pretty sophisticated risk. So, I think digital identity has to come post-Equifax.
Fung: Eva, any thoughts on that?
Velasquez: Well, there have been a lot of discussions about whether we completely get rid of the social security number and issue another identifier, which is absolutely the wrong way to go—we would just be recreating the same problem we already have. As you said, it’s a static number; it never changes. It’s a good identifier; it’s not a good authenticator. So, in my mind, we need to be looking at adding layers—and this is where the privacy and security conundrum comes into play. What’s the saying? The tree of security is watered by the blood of privacy. I’m really starting to realize that even more, because we can go ahead and keep the social security number system as it is, but pretending the number is a secret is just silly.
It’s not a secret to anybody. We need to add more layers of authenticators, and I think we need to be really innovative about what those are. There are technologies out there that are private but not necessarily personal—things on your phone: the angle at which you hold it, which hand you use, the pressure and pattern with which you enter your passcode. All of those could be added as authenticators—not to replace the social security number, but to add more layers, so that, okay, this is really you. So, no, I don’t think we should completely get rid of it and take away more pieces. Doesn’t that just make it easier? I’m using five pieces of data to authenticate you; well, now you can only use four.
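The layering Velasquez describes can be sketched as combining several weak behavioral signals into one risk decision. Everything below is invented for illustration—the signal names, tolerance, and threshold are made up, and real behavioral-biometric systems train statistical models per user—but it shows the design idea: no single signal is decisive, and a low combined score triggers step-up authentication rather than an outright denial.

```python
def match_fraction(observed, profile, tol=0.15):
    """Share of behavioral signals within tolerance of the enrolled profile."""
    hits = sum(1 for name, value in observed.items()
               if abs(value - profile[name]) <= tol)
    return hits / len(observed)

# Invented, normalized signals: phone hold angle, keypress pressure, tap rhythm.
profile  = {"hold_angle": 0.42, "key_pressure": 0.61, "tap_interval": 0.30}
observed = {"hold_angle": 0.45, "key_pressure": 0.58, "tap_interval": 0.72}

score = match_fraction(observed, profile)
# Two of three signals match; below the (made-up) 0.75 threshold we don't
# deny outright, we escalate to another factor, e.g. a one-time code.
decision = "pass" if score >= 0.75 else "step-up"
```

The social security number stays a mere identifier in this scheme; the behavioral layers do the authenticating, which is exactly the identifier-versus-authenticator split drawn above.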
Kwon: Well, it totally makes sense. We’re telling people they have to change their passwords and make them complex, and that a password is not enough—you have to have two-factor authentication. But our most fundamental password, the one everyone in the United States has, never changes, and it is a single password. So, obviously, we have to do something. And as we do that something, and as we do it with technology, we have to make sure that technology stays current, and that our privacy is taken into consideration and protected.
Fung: Well, I think I’d like to close with this question that you kind of just alluded to, which is this idea of consumer fatigue and security fatigue and hacking fatigue, and how do we sort of circumvent the problem of folks feeling like, oh, now here’s another thing that I have to do in order to stay secure?
Velasquez: You know, I don’t see it as fatigue. I see it as this is the price you pay for using technology. And it’s a change in the way we do life. Everything that we’re doing with our technology is something that we did manually before, and we had to take care of things before in a different way. It’s a different way of living today, and I think we have to teach that early, we have to teach that to our children, and we have to continue to be vigilant about taking care of our technology, taking care of our personally identifiable information, and caring deeply for our own personal privacy.
Disparte: Look, I think it’s problematic because in a country with an educational system that depends on a zip code lottery, to now add a barrier of digital literacy being the difference between getting credit or not, or having your information extorted or not, I think it’s very, very complex from a policy point of view. I would like to see things that get the risk off the personal balance sheets of the most vulnerable, and create mechanisms where it is owned systemically, just like the FDIC took the risk of bank runs away from financial institutions. It’s a trust issue.
If you can withstand sunlight, you could withstand cyber threats. And it’s the gap between values and actions behind the scenes that hurt companies the most.
Fung: Eva, anything?
Velasquez: Well, and here’s this crazy lady wanting to talk about identity hygiene again. I do think it’s about access to resources and expertise. I completely agree with you; we have to start young. Just like we teach our children how to take care of their physical bodies, we need to teach them how to take care of their identities and their safety online. It has to be part of how we raise our kids, and we need access to resources that help people navigate that space, because it’s complicated.
Fung: Well, thank you so much for joining us today. I really appreciate it. Please join me in thanking our panelists, and that concludes our agenda today. Please remember that you can keep an eye on future events at WashingtonPostLive.com. Thank you.