
Transcript: Cybersecurity Summit 2017

October 3, 2017 at 4:41 PM

On October 3, 2017, The Washington Post brought together government leaders, security experts and advocates to discuss and debate top issues in cybersecurity.

This transcript has been edited for readability.

The View from the White House

Nakashima:     Great.  Thank you very much, and welcome, everybody.  Welcome to our Washington Post cyber summit.  Rob, it’s great to have you here today.  As you know, Rob is the White House cybersecurity coordinator, and in that position he is responsible for the development of national and international cyber strategy and policy.  He is probably the most technically qualified person ever to hold that position, in that he spent more than 27 years at the National Security Agency including in top positions on the defensive or cybersecurity side of the house and on the offensive side of the house where they hack for foreign intelligence.  But my favorite detail from his bio is that every year around the Christmas holidays Rob runs a computerized light show synchronized to music that he claims is visible from the International Space Station.  [LAUGHTER] Yeah, and he says he hasn’t blown the electric grid in his neighborhood yet.  So good work.

Let’s get started.  And I’d like to invite our audience to participate in this discussion.  You can feel free to tweet your questions to Rob using the hashtag #WPCyber.  And I will try to take one or two at the end.  Rob, you’re a technologist at heart.  What do you think cyber conflict will look like in 10 years?  Will it be cyber warfare?  Will it be information warfare?  Will it be fought predominantly by the private sector or by governments, by human hackers or by artificial intelligence and algorithms?

Joyce:              I think we’re on a path where all of the above are going to be involved, and that’s one of the reasons we’ve got to work so hard on the norms and expectations in this international space.  I think that governments are seeing the asymmetric value of doing some of these operations, but the most impactful seem to be the information operations.  Right?  I don’t think we’ll see governments and nations doing things exclusively in the cyber domain.  It’s when you connect the physical, the kinetic, the information, the intelligence sides all together that things get really impactful.

Nakashima:     So it’s very interesting that you mention information operations, which have come to the fore in a big way in the last year.  What are you in the Trump administration doing to prepare the government and the private sector to counter information operations, especially those that could be harmful to our democracy?

Joyce:              Certainly.  I think the biggest thing we’ve got to do is we’ve got to build that ability to impose costs for people who are operating outside the norms.  Right now there is very little cost, and there is often big benefit for doing things in the information realm or the cyber realm.  And so our focus has been figuring out how we can impose those costs.  And the costs are whole-of-government things, whether it’s diplomatic tools, whether it’s judicial tools, subpoenas, arrests.  The idea that we can use sanctions, that’s a very powerful tool for the U.S. government.  There is cyber-on-cyber, but that’s rarely the case where cyber solves cyber.  And then there’s even military options when things are really egregious threats.

Nakashima:     Well, in fact you just rattled off a long list of tools and options.  And many of those the Obama administration has already used.  They’ve deployed economic sanctions; they’ve indicted nation-state actors; they’ve kicked out Russian operatives.  How are you in your strategy for the Trump administration going to go beyond what they did?

Joyce:              I think it’s the focus, pace, and operational use of it.  What we’ve found is that in the past we’ve often tried to get big groups together to go after the problems of the day using the U.N. or even the U.N. Security Council.  And what we’ve found is the lowest common denominator in that group is usually unwilling to act.  And so our focus is now looking at bilateral agreements where we can find a friend, a partner, someone willing to come along with us to push back and impose those costs.  And then at that point we can act and we’ll bring a coalition along with us, but we’re operating at the speed of the two most willing nations rather than the speed of a large coalition and the least willing, which often means either no action or very delayed action following an issue.

Nakashima:     Hmm.  And so when it comes to some of these multilateral bodies where you have China and Russia present do you still think you’ll participate in those?

Joyce:              We absolutely will.  And they’re important for setting those norms.  But those multilateral bodies are often not the best place for imposing norms.

Nakashima:     And what is the most impactful type of action you think you can take to impose costs on a nation-state adversary?

Joyce:              I think it’s a whole-of-government campaign.  It really is applying just an array of the tools we have at our disposal and acting in concert with allies and friends.

Nakashima:     Great.  So let’s talk a little bit about the critical infrastructure sector, or those companies, Section 9 entities they’re sometimes called, where if there is an attack there could be catastrophic consequences.  How will you and this administration better enable government agencies to help these companies, including in terms of deterring attacks?  And are you thinking at all of giving the military or intelligence agencies more of a direct role in helping them?

Joyce:              So first and foremost the primary responsibility for critical infrastructure is DHS, but they lead in that community with a group of friends.  So the intel community is often supplying information about the threats.  The law enforcement community is following up; every hack is a violation of law.  And then there is a role for the military.  Cyber Command, DOD has a role defending the nation.  When we think about a missile attack on the East Coast, the military is not going to look at trying to stop a missile only headed for a military base; they’re going to look at defending that whole East Coast.

Nakashima:     So if they detect an attack coming at, you know, Wall Street or Sony or something, are they going to stop it?

Joyce:              I think we need to look at the capabilities to do that.  But today those architectures, those capabilities don’t exist.  And so it really is, again, that portfolio approach, a belt and suspenders.  We have to have some tools that we use from DOD.  DHS has been doing exceptional work out there with the critical infrastructure sectors.  And that’s going to be the predominant push, but I think we’re going to use a little bit of everything.

Nakashima:     Okay.  So let’s move to a topic that’s much in the news of late: the Equifax breach, which exposed the personal data, as you know, of 143 million Americans.  When a company has almost half of Americans’ sensitive personal data, should there be a government role in protecting it?

Joyce:              Yeah.  So I believe there needs to be and should be a government role in some of that.

Nakashima:     What should it be?

Joyce:              If you look at Equifax, you know, you and I didn’t decide for Equifax to have our personal information.  That’s their business model.  They aggregate our information, and by holding our information we don’t get a choice as to whether they hold it or not.  And so I think Congress is looking at what are the requirements and regulations that ought to surround those personal information activities, what is required for breach notifications, and we’ll participate in all those dialogues.  I think it’s really clear there needs to be a change, but we’ll have to look at the details of what is being proposed.

Nakashima:     Governor Cuomo in New York has called for a requirement that these credit-reporting agencies protect their data or lose their licenses.  Would you support such a move at least for the credit reporting agencies?

Joyce:              I think there needs to be some oversight and regulation in that.  One thing I’m convinced of, though, is we need to be careful about balkanizing the regulations.  It’s really hard on companies today with state regulators, local regulators, maybe multiple federal regulators, federal law, international law.  And so one of the things I want to be careful of in the response to Equifax is that we don’t balkanize the regulations so that a company doing business in 50 states and internationally has to contend with 75 different rules and regulations.

Nakashima:     Right.  So what the Equifax breach also brings to mind is that one of the reasons it’s so easy today to conduct identity theft is that at the basis of it is the Social Security number, which is almost as old as the abacus.  With that and your birthdate you can get access to credit cards and all sorts of other services online.  Should we be looking at whether we should continue to use the Social Security number as our number one building block for identity?

Joyce:              Yeah, I feel very strongly that the Social Security number has outlived its usefulness.  It’s a flawed system.  If you think about it, every time we use the Social Security number you put it at risk.  By interacting with it you’ve given a key piece of information out publicly, maybe to a private company, maybe to an individual, but that is the identifier that connects you to all sorts of credit and digital information online.  There are technologies we can look at advancing.  And, you know, I think our call is let’s look at what would be a better system.  Certainly the idea that we could use a public and private key, something that I can use publicly but not put the information at risk, something that could be revoked if it’s known to be compromised.  Right?

How many people here today have changed their Social Security numbers knowing that the Equifax breach happened?  Right?  Nobody.  Right?  So it’s a flawed system in which we can’t roll back that risk after we know we had a compromise.  I personally know my Social Security number has been compromised at least four times in my lifetime.  That’s just untenable.  So we’ve got to move.  We’re in a modern digital age.  We have got to find a way to use a modern cryptographic identifier to help us drive down that risk.
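[Editor’s note: For readers unfamiliar with the public/private-key idea Joyce describes, here is a minimal sketch in Python using the pyca/cryptography library.  It illustrates the general technique only, not any system the administration has proposed; the key names and the challenge string are invented for the example.]

    # Sketch: proving identity with a key pair instead of a reusable shared secret.
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.exceptions import InvalidSignature

    # The citizen holds the private key; only the public key is ever shared.
    private_key = Ed25519PrivateKey.generate()
    public_key = private_key.public_key()

    # A verifier (say, a lender) sends a fresh challenge; the holder signs it.
    challenge = b"credit-application-nonce-0001"   # hypothetical challenge
    signature = private_key.sign(challenge)

    try:
        public_key.verify(signature, challenge)    # raises if the signature is forged
        print("Identity proven; no reusable secret was revealed.")
    except InvalidSignature:
        print("Verification failed.")

    # Unlike a Social Security number, the public key discloses nothing secret,
    # and a compromised key pair can simply be revoked and reissued.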

Nakashima:     So will the Trump administration be rolling out a proposal to use a sort of public-key, private-key system [ph] in the near future?

Joyce:              Yeah, so at the policy coordination level in a group I run we’ve called for the departments and agencies to bring forward their ideas, and let’s start talking about the Social Security number and the vulnerabilities in the cyber world.

Nakashima:     All right.  Okay, a little bit of news there.  One more question on Equifax before I move on: do you have any clearer sense today of who was behind it, a nation-state or a criminal organization, and what the motive was?

Joyce:              Yeah, I don’t have a good sense.

Nakashima:     Don’t have a good sense, okay.  I don’t think we’ve seen any of the data appear yet on the black market, have we?

Joyce:              We’ve not.

Nakashima:     So that might—

Joyce:              Yeah, so there are people that hold that opinion.  You know, I don’t have any facts, no [ph].

Nakashima:     That it might be [OVERLAPPING].  Okay.  Well, let’s turn to a topic that again is also very current, foremost in the minds of many watching, which is Russia.  And, as you know, the intelligence community is absolutely certain that Russia interfered in the election last year, and many senior or former senior officials have said that they fully expect Russia to try again.  So what are you doing in the administration to try to prevent that?

Joyce:              So at the administration level, you know, President Trump did come out and accept the intel community’s assessment on the election hacks.  But what we’ve been very clear about is that those hacks were not influencing the outcomes.  Those hacks were involved in scanning.  The majority of them we saw were scanning the election system, looking at the voter rolls and information in that space, which in and of itself is a strong concern for us.  So DHS has the lead.  First, they named the electoral system as critical infrastructure, which then allows the states to request assistance from the federal government, which we’re willing and able to provide.  So that starts that cycle.  But they’ve also set up a coordinating council on election security.

Nakashima:     DHS has?

Joyce:              DHS.  And NIST has developed a guide for election security standards.  And so I think a comprehensive effort—it is work between the federal government, the state government, and the local government to address this [ph].

Nakashima:     Because it’s really the states and locals that run their own elections and electoral systems.  But are you seeing any signs today that Russians are targeting these electoral systems or still scanning them, probing them?

Joyce:              We’re not in the middle of an election cycle, so I don’t know that—I would expect it, but—

Nakashima:     Yeah, there’s one next year.

Joyce:              Yeah.  Absence of it doesn’t mean that it’s not happening or going to happen.  You know, the best thing is we’re going to get ahead of it and be prepared.

Nakashima:     And as soon as you do see some of that activity, will you be notifying the states?

Joyce:              So another thing we’ve done is we’ve cleared the lead state election official as well as key state leadership, so that they’re cleared at the secret level to facilitate those kinds of comms [ph].

Nakashima:     Another element of the interference was the purchase of thousands of ads on Facebook and in other social media and the pushing out of posts and stories that tended to stoke racial and religious divides in the country.  What do you think the government’s role is in countering influence operations of this kind?  And what do the social media companies need to do?

Joyce:              You know, I applaud Facebook for the research.  I think the companies that are involved supporting these platforms, supplying these platforms are the best positioned to understand who is using their platform and to what ends.  So I think the research Facebook did was really insightful in showing us what ads were bought and how they were directed.  I think from that point you’ve got to do a little intel work to understand what the intent is, so that’s a role of the government.  But overall, you know, we can’t allow other nations to hold our free and democratic process at risk.

Nakashima:     And when you do get that intelligence are you sharing it and should you be sharing it with companies?

Joyce:              To some extent.  There is a difference.  Not all people in all companies are cleared, certainly not all.  But there are key people we can interact with.  And we don’t necessarily have to share the intelligence to warn them that there is a concern going on.

Nakashima:     Some of the companies have complained privately that they brought their information to the government and they were never then given any information back that could help them, let’s say, really understand the threat here or figure out who the actors were.  Do you think the government could do more?

Joyce:              I don’t know.  I haven’t heard that complaint, Ellen, so I don’t know.

Nakashima:     Okay, but do you think the intelligence community could do more to share some of this information with the companies so that they can at least identify some of the actors?

Joyce:              Well, since I didn’t know what that interaction was, I can’t comment.

Nakashima:     Okay, fair enough.  Another question: one of the big controversies last year was the Obama administration’s reluctance to publicly name Russia in the interference because of its concerns that it would appear to be overly partisan.  So to address that, some have suggested that there be a requirement that the director of national intelligence and the FBI director, say, several months before a federal election, be required to disclose to Congress the state of the election systems.  Do you think that’s a good idea?

Joyce:              I don’t know about the requirement, but I have absolutely no doubt Congress will be asking in the run-up to the election.  So, you know, they put good oversight on the executive branch, and I think there is constant communication.  For example, the lead cyber officials at DHS, Jeanette Manfra and Chris Krebs, are up on the Hill today talking to the Homeland Security Committee.  I have no doubt that, whether there is a requirement or not, Congress will call some hearings to understand the state of preparedness before the election.

Nakashima:     Well, here’s the thing: the point of making it a requirement, making it mandatory, is that it removes the partisan tinge, and that way there is no question.  “Well, you have to disclose.”  It’s not, “If I disclose, will I look like I’m putting my thumb on the scales for one party over the other?”  And so—

Joyce:              Yeah, I think you wind up in the same place.  If you get called to a congressional hearing on election security, it doesn’t matter if it’s a law saying you’ve got to do it or if it’s just the congressional scheduling that says we’re going to have this hearing.  You know, DHS, the FBI, the intel community will go up there and be forthcoming if they’re asked.

Nakashima:     And should it be public?  Shouldn’t it be public?

Joyce:              It can’t be public.

Nakashima:     It can’t?

Joyce:              Some of it can’t, because of the intelligence.  So there is an aspect of it that can be public, but there are always going to be some of the most influential secrets that we’re able to understand at the government level that they’ll do in the intel committees behind closed doors.

Nakashima:     I’m just saying I guess if you want to try to inoculate the public to sort of an influence campaign like this, shouldn’t it be public disclosure rather than just to intelligence committees behind closed doors?

Joyce:              Well, understanding that it’s going on and inoculating is different than the level of detail we would share with Congress about all of the details we understand.

Nakashima:     Mm-hmm.  Okay.  You know, you mentioned earlier that President Trump has acknowledged the ICA, the intelligence community’s assessment, and said though that it never changed any result.  But he also has repeatedly voiced some skepticism about Russia’s election meddling.  Given that skepticism, people are wondering how aggressive will this administration be, you think, in seeking to deter Putin from waging another influence operation?

Joyce:              I think we have to.  The basis of our nation is this free and democratic society.  The electoral process is a bedrock to that, so we’ll push back.

Nakashima:     You are certain we’ll push back?

Joyce:              Mm-hmm.

Nakashima:     Great.  Thank you.  Let’s move to an issue I know you care deeply about, what to do about the software flaws that the NSA uses to obtain foreign intelligence but at the same time could potentially put at risk Americans’ cybersecurity and the military and DOD cybersecurity.  So these flaws are at the heart of something called the vulnerabilities equities process where agencies weigh whether or not to disclose the flaws to the software maker or hold onto them so the agencies can use them.  How are you planning to reform this process, Rob, to make it more transparent?  Are you going to reveal more information on the number and types of flaws?

Joyce:              So, you know, it’s clear to me from some of the public discussion on the vulnerabilities equities process, the VEP, that it’s not well understood.  And that’s our fault in the executive branch.  In the previous administration the VEP was an executive-privilege, run-out-of-the-White-House process where the charter, the participants, and the decision criteria weren’t made public.  Upon arriving, one of the things I did was sit down with the interagency and start asking the question, “Why can’t we talk about this?”  And there wasn’t a good reason.

So we kicked off the policymaking process.  We are right at the goal line of finalizing the charter.  But the intent is a VEP charter that is public and unclassified.  So it will be seen and understood by you and the world as to what we weigh in terms of making the decision to disclose a flaw or protect it for an intel community activity.  It will include the criteria that the panel weighs, and it will also include the participants.  Right?  There was a lot of smoke-filled-room mystery surrounding who was in there, but, you know, it is the intel community, but it’s also the defensive side of NSA; it’s multiple elements in DHS; it’s the Department of Commerce; it’s the Federal Bureau of Investigation and others.  So we’ll talk about who is in it and how they make that decision.  And then there is an element of talking about the outcomes, you know, what goes in and what goes out, so that people understand.  I think NSA was on record that something north of 90% of the vulnerabilities they discovered were disclosed and patched.  And so that’ll give you an ongoing understanding of whether that is still the case and where the government is in that space.

Nakashima:     Well, that’s good.  I mean, would you also reveal more information on—does the 90% relate to five flaws or 50?  What’s the denominator?

Joyce:              Yeah, so understanding the quantity going in and the quantity coming out.

Nakashima:     Great.  What is the lifetime on average between NSA’s discovery of a flaw and its disclosure?

Joyce:              Sure.  And, just to be clear, it’s not just NSA bringing things.

Nakashima:     Fair point.

Joyce:              NSA, CIA, DOD, FBI.  It varies.  So to get a little technical here, the discovery of a flaw is hard to determine.  So there’s a process you can do to find flaws in software called fuzzing, which is you throw a whole bunch of inputs at a computer and an operating system until it crashes, and at that point technically you have a vulnerability, but that may be a whole pile of stuff.  The next step is looking through those results and deciding if there is a vulnerability that somebody could exploit.  And that takes some experience and some analysis [ph].

Nakashima:     Exploit, turn it into a tool or a weapon, something that can be used to actually hack into a computer?

Joyce:              Yes, correct.  Now, some would define a vulnerability at that first technical point.  Some would define it when you’ve actually shown that there is an exploitable flaw.  Either way, the point where we do it is that second one, where you can describe the exploitable flaw.  And that, because it’s an art, not a science, is sometimes days, sometimes weeks, and sometimes even a year or two.  But it’s unhelpful to talk about things before you can describe what the potential use is and what the potential vulnerability and even danger is, because otherwise the defensive folks can’t make a good assessment of how important that flaw is.
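[Editor’s note: Joyce’s description of fuzzing, and of triaging the “whole pile of stuff” it produces, can be illustrated with a small self-contained Python sketch.  The toy parser and its planted bug are invented for the example; real fuzzers are far more sophisticated.]

    import random

    def parse_record(data: bytes) -> bytes:
        """Toy length-prefixed parser with a planted bug: it trusts the length byte."""
        if not data:
            raise ValueError("empty input")     # expected, handled rejection
        length = data[0]
        checksum = data[1 + length]             # bug: unchecked index past the end
        return data[1:1 + length]

    crashes = []
    random.seed(1)
    for _ in range(10_000):                     # throw a whole bunch of inputs at it
        blob = bytes(random.randrange(256) for _ in range(random.randrange(20)))
        try:
            parse_record(blob)
        except ValueError:
            pass                                # cleanly rejected input, not a flaw
        except Exception as exc:                # a crash: a candidate vulnerability
            crashes.append((blob, exc))

    # The crashes list is the "pile of stuff"; the experienced analysis Joyce
    # mentions is deciding which of these crashes are actually exploitable.
    print(f"{len(crashes)} crashing inputs collected for triage")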

Nakashima:     But what is the average lag time between the time you decide you can exploit a flaw and the time you eventually disclose it?

Joyce:              It depends.

Nakashima:     Could it be years or?

Joyce:              No, usually not years, unless we’ve made a conscious decision to withhold it.  And if we’ve made a conscious decision to withhold, they’re periodically reviewed.  The current policy is every six months we re-review a decision to hold [ph].  So these are the kinds of details I’m happy to push out there and share once we get the final interagency agreement.

Nakashima:     And I’m also just curious how long on average is an exploit useful these days given that there’s constant software patching and updates?

Joyce:              You know, the half-life is shrinking, so it really depends.  So browser vulnerabilities, those are very short lived these days.  The browser industry does a great job of understanding their flaws, and there’s a lot of people poking at it.  Some esoteric device that never has an upgrade path will have what we call a forever day, meaning that there is a flaw and nobody is updating that software, and even if you brought it to the original manufacturer they’re not going to issue a patch.

Nakashima:     Okay, great.  And you mentioned that the rate is over 90%.  Is that good enough when you have viruses like WannaCry out there, which was built around a repackaged NSA hacking tool that had been stolen and put out there?

Joyce:              So it depends.  You know, in the end it’s the individual vulnerability that we have to make a decision on all of the equities at play.  So we think we’re doing a responsible job in terms of the U.S. talking about this and being proactive in even having this process.  This is something most nations don’t do.  I can tell you the companies in Silicon Valley have gotten a lot of vulnerabilities from the U.S. government over the years.  I don’t personally know of the Chinese, the North Koreans, the Russians bringing any, even though they’re looking at that same software.  So they have vulnerabilities that they’re exploiting and they’re using, and they have no intention of doing responsible disclosure the way we’re doing.

Nakashima:     So let me move to a couple of other questions.  One arises from the news.  Reuters yesterday published an article that found that Hewlett-Packard let the Russians review the source code for a cybersecurity system called ArcSight, widely used by the U.S. military, so that it could sell to Russia’s public sector.  How concerned are you about that, and should tech companies be letting Russia and China inspect their source code as a condition of doing business?

Joyce:              I’m very worried about the protectionist rules that are going up in a lot of countries.  The idea that you can’t enter China’s market without offering up your intellectual property in this way, without agreeing maybe to hobble some of the security and privacy features of it, like encryption.  Russia is heading that way.  A bunch of the totalitarian regimes are headed that way.  It’s a problem for the free and open internet that we designed and we pushed out there for the world’s benefit.

The security aspects of those disclosures, they’re problematic.  I’m a little more worried about it from the intellectual property point of view of our innovation than I am the security side of it.

Nakashima:     Why?

Joyce:              It gives you an advantage if you can look at the source code, but even having a copy of the system makes it a little easier.  You can get a copy of ArcSight; it’s commercially available around the world.  You can bring it into a lab and you could likely find a flaw.  But if you give your source code to China as a condition of entering that market, you’ve got to wonder if competitors are then going to start to adopt those features, and we’ve seen some examples of that in the past, and that really concerns us.

Nakashima:     DHS stated in a report this year that all U.S. phone carriers are vulnerable to SS7 exploits, vulnerabilities that can be exploited by criminals, terrorists, and nation-state actors to track phones and intercept calls in the United States.  Do you agree with DHS that these vulnerabilities threaten our national security?

Joyce:              I do.

Nakashima:     And what are you doing to make sure that the phone companies secure their networks?

Joyce:              So the vulnerability they talk about is actually a known problem.  There are some industry solutions to it.  There are add-on boxes for the networks.  Some of the vendors have patched their software to lessen the impacts of these.  I think there was a 60 Minutes demonstration last year of some of the things you could do by exploiting the vulnerabilities in the protocol.  The protocol was built at a time when we were a little more naïve about the way phone systems could or would be exploited, so it didn’t have the necessary security protections.  So it’s clear we’ve got to retrofit and start closing down these holes.

I’ve personally talked to at least two of the carriers and it’s their intent to move and close down these holes.  They can’t do it overnight.  It’s an investment decision.  I know of at least one carrier that has moved pretty aggressively to tighten up those flaws and I think with the light being shined on it, they’ll be pushed harder.  I also know there’s some congressional interest in maybe even pushing or mandating the closure of those holes as well.

Nakashima:     So speaking of Congress, the Senate recently passed a government-wide ban on Kaspersky software on the grounds that it provides a channel for Russian government espionage.  What kind of evidence can you provide to justify such a move, and what kind of precedent does it set for the U.S. to name a particular company in a ban like this?  Do you think other countries might retaliate by banning U.S. companies that they might accuse of cooperating with the NSA?

Joyce:              Yeah, so I think that we made bad decisions in the past and there’s no reason to perpetuate those.  You won’t find major U.S. software providers operating at the root level on the Russian foreign ministry or the Russian intel services.  They just didn’t make those mistakes in the past.  So we looked at that and said, if you think about what antivirus is, it is something that runs at the very lowest level of the operating system, beneath all of the other software and checks you have.  It’s designed to scan every file on your computer.  It scans those files looking for things based on a series of commands that come from the company.  That company is a Russian company.  We have plenty of examples of Russian companies being compelled by and cooperating with Russian intelligence, and there’s even a law requiring participation in intelligence activities by those communications and computer companies in Russia.

So as that data comes off your machine and goes back to Russia, it’s vulnerable and available.  And so we looked at that and made a risk decision that we can’t tolerate these on government networks, and that’s the basis for it.  People keep asking for the intelligence justification.  There’s no reason we should be compelled to show the intelligence.  That in and of itself is a very compelling case.  Just on those merits, we would be making poor decisions if we put that on sensitive U.S. government networks.
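[Editor’s note: Joyce’s description of how antivirus works, reading every file and matching it against instructions supplied by the vendor, can be sketched in a few lines of Python.  This toy signature scanner is illustrative only; the hash below is the widely published SHA-256 of the harmless EICAR test file, standing in for a real vendor signature feed.]

    import hashlib
    from pathlib import Path

    # The signature list arrives from the vendor; that feed is the trust point.
    KNOWN_BAD_SHA256 = {
        "275a021bbfb6489e54d471899f7db9d1663fc695ec2fe2a2c4538aabf651fd0f",
    }

    def scan(root: str) -> list[Path]:
        """Hash every readable file under root and flag known-bad matches."""
        flagged = []
        for path in Path(root).rglob("*"):
            if not path.is_file():
                continue
            try:
                digest = hashlib.sha256(path.read_bytes()).hexdigest()
            except OSError:
                continue                        # unreadable file: skip it
            if digest in KNOWN_BAD_SHA256:
                flagged.append(path)
        return flagged

    print(scan("/tmp"))                         # the scanner must read every file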

Nakashima:     But do you know for a fact that there has been sensitive data that was siphoned off by the Russians to the Russian government through Kaspersky software from the U.S.?

Joyce:              I’m not going to address our intelligence sources and methods.

Nakashima:     So you’re not the cyber czar, but if you were, what’s the number one structural or organizational change you would make in the U.S. government?

Joyce:              I’m not going to push that rock uphill, right?  [LAUGHTER] If you look at the way the U.S. government is presently organized, cyber has so many committees of jurisdiction.  The great thing about cyber is it’s an apolitical issue.  Democrats and Republicans alike understand the risk, want to help, and are passionate about driving down the risks we face in cyber.  Unfortunately, what we see is the committees of jurisdiction have very strong opinions about the way we should be structured and who should be in charge and who should have a lane.  And we can either expend a lot of energy on trying to shape those decisions or we can go and optimize what we’ve got, and that’s what I’m focused on right now: optimizing what we have.

Nakashima:     And I’ll try one more time.  Forget Congress, what would you say?  Forget Congress.

Joyce:              Nice try, Ellen.  [LAUGHTER]

Nakashima:     Finally, and we have two minutes left: can you give us an update on your reports?  I think you have a deterrence strategy due soon.  When are you going to come out with that, and then maybe another report on just how you’re going to organize to help critical infrastructure, the Section 9 entities?

Joyce:              So through the executive order that President Trump signed out, there was a series of studies and reports kicked off.  Those have been rolling in.  All of them that were due through October are drafted and in a final review from the secretaries.  But those were meant to inform our greater strategy work, and what it’s done is galvanize the interagency to come up with great ideas and talk about where we are and, more importantly, where we need to be.  So I’m really happy with how they’re responding.

Nakashima:     When do we expect to see a deterrence strategy come out?

Joyce:              So the deterrence report that comes in is not intended to be public, right?  It is intended to inform us in a greater cyber strategy.  And so hopefully, depending on interagency coordination, you’ll see a cyber strategy later this fall.

Nakashima:     Later this fall?  Okay.  Well, thank you very much, Rob, for your time today.  Up next, we have my colleague Craig Timberg, who will be speaking with General Michael Hayden and Richard Clarke.  Let’s give Rob a big hand for coming here with us today.  Thank you, Rob.

Threats Facing America

Timberg:          Good morning, everybody.  I’m Craig Timberg.  I’m the national technology reporter at the Washington Post.  Beside me, we have two of the giants of cybersecurity and intelligence in Washington for the longest—

M:                    Whoa.  We’re old.  [LAUGHTER]

Timberg:          For decades now.

M:                    That’s his way of saying we’re old.

Timberg:          This is General Hayden.  He’s the former head of the NSA and the CIA.  Richard Clarke here has been a top White House advisor, a counterterrorism advisor to presidents, for some years now.  And I wanted to dive right into the news that has been occupying all of us at the Washington Post and other news organizations for a while and get their perspective on it.  We’ve been writing a lot about Facebook and propaganda that emanated from the Russians, and having covered cybersecurity for the past few years, this is something of a novel take on the issue, right?  I think we all think of cybersecurity as hacks, penetrating systems.  And here, we have a foreign adversary essentially using tools that Silicon Valley created to influence a U.S. election.  So I’m curious if we should think about this as a cybersecurity issue and what sort of lessons we should take away from what’s happened.  Why don’t we start with General Hayden?

Hayden:          Sure.  First of all, thank you for your reporting, and as I follow it along, the wheels start to turn here as to what this stuff in the foreground means for how we should think about the deep background.  And Craig, what I’m reminded of: I kind of got dipped into this cyber thing in the mid-90s.  I’d come out of the Balkans, where there was a war that was medieval in cause and conduct, and I’m dropped into San Antonio to take command of the Air Intelligence Agency.  We were kind of leading-edge, thinking about this within the Department of Defense, and a lot of the things that have happened today, you can see the roots back on Security Hill in San Antonio in the ’90s.

Cyber command, cyber as a domain: those concepts, which are now American organization and American military doctrine.  We had a debate; I still recall this.  We had the AIA, the Air Force unit I commanded, and I also headed up something called the Joint Command and Control Warfare Center, which is by definition a joint body.  And we had this philosophical discussion (we looked like Jesuits at a medieval university) [LAUGHTER] as to what it is we’re talking about.  There was a body of thought that we were talking about cyber.  Land, sea, air, space, cyber.  It’s a domain.  It’s an operational environment.  We’re going to fight there.  And then there was a body of thought that what we really were talking about was information.  And so if you thought it was the latter, then all of a sudden what you needed to be concerned about was not just your network, not just your computer, not just digital things, but information things.  Looking backward, it included defense suppression, electronic countermeasures.

Looking forward, you would actually bring in psyops, or what we’re now calling MISO, psychological operations.  We even thought you probably had to bring the public affairs guys into this bubble, all right?  So we had this grand debate.  I think history will judge.  We broke over here, with cyber, and that’s how you get a Cyber Command, and that’s been quite successful.  That’s a good news story, and this is not that.  The Russians went the other way, and if you read the writings of General Gerasimov, who is now the chief of the general staff, he talks quite plainly, quite openly about combat in the information domain.

And now we see the Russians performing mightily against our political structures in the information domain, and even now we are still trying to look at it through our lens over here, which is cyber: they stole our stuff.  When the intel community finally got up to Trump Tower on 6 January, you had Jim Clapper and Mike Rogers and Jim Comey, and they finally laid out the case that, no, the Russians stole the DNC data and then they did this and then they did that.  The Trump campaign, no longer able to kind of deny the fact of it, redefined what they had just been briefed on as, “We’ve really got a cyber problem.  It’s a bad cyber problem.  The Russians, the Chinese, that 400-pound guy.  We’ve really got a cyber problem.”

And that is a reflection of our lack of definitional expertise.  We think about this as a digital thing, the theft of data and the firewall.  All-important, and we’ll get to that directly.  But what the Russians are doing is taking the game to a completely different level.  They’re trying to dominate in the information space.  It’s not just a cyber issue.

Timberg:          And Mr. Clarke, does he have it right?

Clarke:             He does, but let me frame it as two distinct points.  This is a Venn diagram between cybersecurity, which is clearly an aspect of this, and psychological warfare, which is clearly an aspect, and they overlap.  The Russians are using cybersecurity techniques as a way to conduct their psychological warfare operations, but it’s not new.  The Russians have been doing what they call dezinformatsiya and active measures since the Russian Revolution and before.  And again, let me compliment you on your great reporting.  Thank God for the Washington Post.

Timberg:          Can I get a t-shirt that says that?  [LAUGHTER]

Clarke:             I actually have one that says something about the—

Clarke:             Or how about this: “American intelligence says thank you, Washington Post.”  First time ever.  But I do have a slight criticism of your reporting.  It’s very tactical, and we want those tactical stories and revelations, but we need to step back and look at the strategic issue here.  What’s going on?  What’s going on is that Russia has decided that it’s going to be aggressive in regaining its standing in the world, and what happened in the United States is a piece of that.  What continues to happen in the United States is a piece of that.  They seek to have countries divide into little bits.  They’re very pro-Brexit.  I’m sure they would be supportive of California breaking off.  In fact, they were.  They’re very supportive of putting one faction against the other, and that’s why they’re behind the Facebook ads and whatnot.  They’re trying to create chaos, and as we go forward, as Bob Mueller comes out with the results of his investigation, we need to recognize that their goal is chaos, in the United States and Western Europe.  We have to constantly remind people of that, and we need to think about that in everything we do.

We don’t want to give them the goal that they seek, and when they’re involved in manipulating things to advance chaos in our society here and in Western Europe, we need to call them out.  That’s the first point.  The second point is that this presidential election was the first presidential election after social media became probably the major way in which American voters get news.  That’s an important point.  The majority of Americans get news from social media.  They don’t watch the ABC News at 6:30, but they should, since I work for ABC News.  [LAUGHTER]

M:                    CNN is good too about that.

Clarke:             I hear that.  I wouldn’t know.  I don’t watch it.  It’s on cable, I think [ph].  [LAUGHTER] But most Americans get their news now from social media, and therefore it’s not surprising that this was the first election in which social media were used as a weapon by a foreign government.  Because a foreign government has now decided to get involved in our elections, both through social media and through hacking of political campaigns, political parties, and state election offices, we can no longer afford the approach that federal elections can be left up to county commissioners.  As much as I love county commissioners, and I know they’re trying to do a good job on a small budget, the security of federal elections cannot be their responsibility.  The federal government needs to reassert control, and under the Constitution, it can.

The Constitution says the federal government can designate the manner of these elections.  The federal government needs to reassert its control of federal elections, monitor them for security, and adequately fund their security, and not give that over to county commissioners who don’t have the expertise or the money.

Timberg:          It’s interesting to hear you say that.  Back a year ago, I was working on a story about whether the U.S. election could be hacked, right?  And I was looking at those issues, the voting machines and paper ballots and those touch-screen ballots, and looking back on that reporting and our thinking about it at the time, we really kind of missed the boat.  The election wasn’t going to be hacked in the formal kind of—

M:                    The count.

Timberg:          Yeah, the count wasn’t going to be hacked but the election apparently—

M:                    Affected.

Timberg:          Yeah, it could be affected by these forces that we didn’t perceive adequately at the time, and it makes me maybe a tiny bit more sympathetic to Facebook’s account that, “Oh, we just didn’t see this coming.”  So Mark Zuckerberg famously says a few days after the election that it was a pretty crazy idea to think that fake news could have altered the outcome of the election, and maybe he’s right.  But I think it is very clear now that they had trouble conceptualizing the kind of attack that actually was leveled on them and on the U.S. political system, which makes me wonder: is it reasonable to think that somebody should have understood this sooner and put up alarm bells in a more forceful—let me start with you, Mr. Clarke, on this one.

Clarke:             I’ve just written a book called Warnings.  It’s available on Amazon.  [LAUGHTER] In it, I look at 14 case studies across a whole variety of disciplines, asking why people didn’t pay attention when an expert gave a warning that later turned out to be true.  And I have a little methodology for figuring out which experts are likely to be right.  The bottom line is this happens in every field of endeavor; you miss warnings.  There are institutional settings that look for warnings.  Sometimes they’re good, sometimes they’re not.  This one, I think, people can be forgiven for not seeing coming.

Timberg:          General Hayden, you were warning before the election about the prospect of the Russians influencing this?

Hayden:          Yeah.

Timberg:          Did you conceptualize the kind of—

Hayden:          No, no—

Timberg:          Sort of attack?

Hayden:          Your reporting suggests to me that they had gotten their game to a point beyond what I had assumed.  So let’s parse out what happened.  They stole the data; I think there’s general agreement on that.  That’s a baby step.  By the way, I’ve got to tell you: stealing the DNC emails, hacking into that, that is actually honorable international espionage.  All countries do that.

Timberg:          Including ours?  Would we do that?

Hayden:          Yeah.  If I were director of NSA and we could penetrate United Russia, just Putin’s party, and we actually really cared what they said, that would be a legitimate foreign intelligence target.

Timberg:          Hypothetically speaking?

Hayden:          Yes, yeah.  It would be a legitimate target under accepted international practice.  What happens next, then, that’s the big deal.  That’s the term we’ve been using: weaponizing the information.  So it’s WikiLeaks, it’s DCLeaks.  It’s pushing it forward.  That’s kind of linear.  I’d kind of expect that.  Then the army of trolls who touched the data and who kind of convinced the Google algorithms that these stories were trending.  Whoa, now, wow.  That’s not linear.  That’s a discontinuity.  That’s getting more interesting.  And now what your stories are reporting is micro-targeting of specific groups in the American electoral process.  Whoa.  You can see the revelations growing here; the sophistication of the Russian effort was, I think, far greater than even—I’ve talked to John Brennan and Jim Clapper.  They kind of got wind of “this isn’t right” about April, May.  It was different in kind from what we’ve seen in the past.

By August, they were urgent: this is different in scale, it’s different in style, and it’s different in that it’s approved at the highest level of the Russian Federation.  They had a little trouble getting the political attention that they needed because, frankly, it forced a really ugly decision on the president.

Timberg:          And it had never happened before.

Hayden:          Yeah, and the instinct is to make sure you’re really sure before you force an ugly decision on me, so you lose some time.  So in August, John Brennan gets to throw a brushback pitch at Bortnikov, who is his counterpart, saying, “Knock this off.”  In September, the president does the verbal warning to Putin at the G20, and then in October, Jim Clapper and Jeh Johnson go out with kind of the facts of the case.  But even the facts of that case don’t get to what it is you’re now reporting.

Clarke:             But if you look back for the warning, I think the warning was, again, in American journalism.  There was a great piece in 2015 in another newspaper’s Sunday magazine about the Internet Institute in Petersburg, a fascinating story about the trolls, about the bots.  And the key to this story was that the Internet Institute had run a story in U.S. social media that a refinery had blown up in Louisiana.  And it was very well done, very convincing.  It looked like CNN stories and whatnot.  And of course, the refinery had not blown up, and the article asked, “Why would they have done that?  Why would they have tried to convince people that a refinery had blown up in Louisiana?”  And I remember sitting at the dining room table on a Sunday, looking at that story and saying, “This is an important story and I don’t know what’s going on.  I don’t understand why Russia would want to convince people that a refinery had blown up in Louisiana.  But we need to keep thinking about this.”  And then I forgot about it.

Hayden:          [LAUGHTER] Before you move on, Craig, can I just add—and this is getting harder, all right?  Because there are Americans who are using everything we just talked about here to challenge the legitimacy of the current president of the United States, and I think it’s really important intellectually for us to separate those two things.  Did the Russians affect the election?  I think the answer is absolutely.  How much?  I have no idea.  It’s not just that I don’t know; it is unknowable.  So we just have to cabin that or put that away.  There’s no question about legitimacy, and now we’ve got to get on with this.

Clarke:             There is a question about collusion.

Hayden:          And we’ll find more about that, yes or no?

Clarke:             I think we will, yes.

Timberg:          Well, as fascinating as this is.  [LAUGHTER] Let’s pivot to another issue that’s been in the news, one that’s more of a classic cybersecurity issue: the Equifax hack.  One of the worst in history by various measures: 145 million consumers now seem wrapped up in it.  Incredibly personal data: Social Security numbers, names, birthdays, credit card numbers and such.  Let me start with you, Mr. Clarke.  Aside from the obvious stuff, like you need to patch your software, at a policy level, what are the lessons learned from the Equifax hack?

Clarke:             Well, there’s the policy level within companies, and then there’s the policy level at the national level.  Let’s start with companies.  Policy at the company level means you have to have a good risk profile of your company, know what could go wrong.  That matters, and you have to have a governing structure that monitors that.  So it cannot be the case that you get hacked and the CFO doesn’t know that day, or the CEO doesn’t know.  I’m a member of corporate boards, and I’m the guy who is supposed to be keeping an eye on cybersecurity for these companies at the board level.  I need to know the day one of my companies gets hacked.  That day.  So they had a governance problem.  But part of governance is knowing whether or not you’re doing enough.  Obviously, they were not doing enough.  They didn’t have network discovery.  They didn’t know what was going on on their own network.  They apparently didn’t even know they had machines on their network that hadn’t been patched.

Companies need to realize that you need to spend money to do this.  Beginning in the 1990s, we underpriced information technology.  We put information technology into companies across the board, every kind of company.  We did it quickly.  American productivity went through the roof as a result, and we said, “Wow, we can do things so much more cheaply now that we have information technology.”  Wrong.  The calculation of how much information technology costs never priced in security, and companies today are, on average, spending between 3 and 5% of their IT budget on cybersecurity.  That is a recipe for disaster.  If you’re spending 3 to 5%, and I know it’s a gross metric, but if that’s where you are, you’re going to get hacked.  You cannot do all of the many, many, many things you need to do to secure a network today.

We know how to secure networks today.  There are companies that do not get hacked.  There are government organizations that do not get hacked.  We know how to do it.  It takes a lot of skill, a lot of skilled people, and a lot of software.  And it costs money; it costs much more, like 8 or 10 or 12% of your IT budget, than three to five.  So that’s at the corporate level.  At the national level, I agree with Rob.  I think I overheard him say that we need a national standard for breach notification.  We do.  That national standard cannot be less than the existing California standard.  The last time we tried to do this at the national level, the Chamber of Commerce and people like that came out of the woodwork and said, “Oh, yes.  Let’s have a national standard and let’s make it meaningless.”  We do need a national breach notification law, and it needs to be at least as good as the best state law.  Number one.

Number two, companies like Equifax will continue to screw up until there is a penalty for doing so.  My colleague Rob Knake, who used to serve at the White House, said the other day, “The way we solved oil spills and made them something an oil company really doesn’t want to have happen is we charged them by the gallon for what they spilled, by federal law.  And after that federal law, oil spills plummeted.”  If we charge people for the data loss, if personally identifiable information is lost, you should pay for it by the person.  You should pay $10 for every one of them.  Multiply that times 143 million.  If you pass that law, this problem will go away.
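[Editor’s note: a quick worked figure for the hypothetical penalty Clarke sketches, assuming his $10-per-record rate:]

    # Clarke's hypothetical per-record penalty applied to the Equifax numbers.
    records = 143_000_000          # people exposed, per the figure used above
    penalty_per_record = 10        # dollars, Clarke's hypothetical rate
    print(f"${records * penalty_per_record:,}")   # -> $1,430,000,000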

Hayden:          I was just going to pile on.  I was going to bring up Rob Knake’s article as well.  Rob reasons by analogy to the Exxon Valdez.  And after the Exxon Valdez, indeed, Congress said, “You’re going to pay for it by the drink.  If it goes out, you’re going to pay for it,” which then drove oil companies to look for insurance.  And so there’s a private sector motivator.

Clarke:             And insurance requirements came in.  If you’re going to get insurance, you have to do X.

Hayden:          It’s not the government saying “thou shalt,” or how you should do it.  It was, “Here’s the cost for doing it,” and then they want to go get insured.  Then the insurance companies say, “Well, let me give you the number here.  There’s your number if you just do it in single-hulled tankers.  On the other hand, if you invest in double-hulled tankers, let me give you the insurance.”  And everyone moved to double-hulled tankers.  So what Rob is suggesting is a government role [ph] that doesn’t compel how you do this, sets up a penalty, and American business models with insurance will take you to a very happy place, the way it did with oil.

Clarke:             Right.  And in that example, an insurance company would have said to Equifax, “You have all of this data.  You have to encrypt it if you want to be insured.”  And encryption doesn’t work unless it’s paired with good identity access management, multifactor privileged access management.  So have a multifactor PAM system, have encryption, and we’ll insure you.  If that had happened, we wouldn’t have an Equifax problem.

Hayden:          But do you see?  Rather than the government, you know, with whistle, red hat, and clipboard, walking in and giving you check marks, the government sets up a structure that drives the business case over the long term to be responsive and responsible.

Clarke:             But that will never happen as long as the Congress thinks that regulation is a four-letter word.  We need additional regulation.  All regulation is not bad.  And this nonsense that we have to get rid of two regulations every time we put one in is just silly.  We need to do regulation like the one we’ve just discussed.

Timberg:          So here we all agree that we haven’t always seen the new threat coming, other than Mr. Clarke, who’s made a career of seeing the next threat.

M:                    I missed the last one.

Timberg:          So tell me, what aren’t we seeing now?  What other threats are coming over the horizon that you all, in the business, can see?

Hayden:          When I was director I used to get this question, “What do you think is going to surprise you next year?”  [LAUGHTER]

Timberg:          So, director, what’s going to surprise us?

M:                    There’s a difference between intelligence and fortune telling.  [LAUGHTER]

Hayden:          You know, Craig, I don’t have a good answer.  And I didn’t have a good answer for that question either, what’s going to surprise you next year.  I just kind of made a face, you know, for the last several years in government.  I will offer a couple of thoughts, kind of factors bearing on the problem.  This is my best shot.

There is a tendency, particularly within government, where the walls are high and you keep the secrets in, to underplay discontinuities.  There’s a tendency in government to think tomorrow is going to look a lot like today.  I had the phrase used to me recently by an intelligence veteran: “We’re not so good at looking around corners.”  And what you get is the tyranny of expertise.  So, one very concrete example: you have a poor unfortunate setting fire to himself in a midsized Tunisian city.  Recall?  And the original estimate out of the American intelligence community was that Ben Ali, the dictator of Tunisia, had been there, done that, seen the movie.  He knows how to deal with this.  They’ll hit white water for 7 to 10 days, but in the end Ben Ali will still be there.

Well, not only was Ben Ali gone, there were a million people in Tahrir Square in Egypt within five or six weeks.  It was the tyranny of expertise, the projection of continuity, that caused this.  And in answering this, Craig, as a government guy: one way the government might be able to reduce that danger is to actually open the doors to the views of the broader American society, which actually has some interesting perspectives on things going on out there.  And your Sunday morning magazine guy had a thought out there—no intelligence, no purloined secrets, no espionage—but just because he’s not trapped into “I’ve got the PDB by tomorrow,” he is able to look at these alternative scenarios.

So I don’t have a good answer, but I do have a way to perhaps get us to a better place, and that is a far more permeable membrane between our security services and the wisdom of the broader society.

Timberg:          Mr. Clarke?

Clarke:             Well, that’s what I call for in the book Warnings, which is available on Amazon.  [LAUGHTER]

Hayden:          I have a book there too, but it’s not the same title.  [LAUGHTER]

Clarke:             The things that we look at in Warnings for the future are the internet of things, artificial intelligence, and gene editing—CRISPR/Cas9—among others.  Of those, the ones appropriate for this audience, I think, are AI and the internet of things.  If the numbers that Gartner, for example, uses on the internet of things are right, we’re going from approximately 5 billion devices on the internet of things today to approximately 30 billion devices in the next three to four years.  That’s an extraordinary revolution.  A quiet revolution.  One that will occur without your seeing it.

But to use the jargon of the cyber world, that vastly increases the attack surface.  Most of those devices are being rushed to market without embedded security.  And every time that has happened in the past it’s been disastrous.  So if I were forced to make a prediction, which I don’t like to do any more than you do, I would at least think about the things that could result as a consequence of the internet of things.

And then on AI, there’s a whole debate going on about whether or not we need to control, regulate, limit AI.  While I’m in favor of some regulation, and I don’t run away when someone says the word “regulation,” I think the idea of regulating AI right now is crazy.  We need to see where that innovation can take us, keeping an eye on it.  Maybe the people who worry about AI are right.  I doubt it, but keeping an eye on it.  But please do not suppress the innovation that can come from AI.

Hayden:          And just to take a thought that Richard had here about artificial intelligence and the internet of things—we’re just going to get more of this, right?  So the broad assumption, I think—and run the clock back 10 or 15 years—was that as you get more and more immediate, intimate communications between and among human beings, there is a natural instinct that that would lead broadly, over time, in the direction of greater understanding, because we get to know each other better.

It has led in the opposite direction.  It has driven most of our species back to tribe, back to family, back to faith, and away from a convergence in terms of a common human identity.  That was unexpected.  I guess one of the questions I would ask is, do we continue down that path, which is kind of nativist, nationalist, populist?  Do we continue down that path, or is that just the byproduct of first-generation drivers in the new domain?  And as we become more experienced drivers, does what we had expected in the beginning become more and more prominent?  I don’t know.  But right now, this thing that connects us magically, technologically, seems to be most useful as a tool for those who would divide us.  Is that a permanent condition or not?

Timberg:          Thank you both, gentlemen.  I really appreciate the robust discussion we’ve had.  If everyone can give a round of applause for this—[APPLAUSE] Thank you.

M:                    We have to disagree more.  [LAUGHTER]

Timberg:          Next time.  So up next is my colleague, Brian Fung, and a panel of experts to keep up the discussion.  Thank you all for being here with us today.  We’ll do this again sometime soon.  Thank you, gentlemen.

Cybersecurity and Civil Liberties

Fung:               Well, good morning.  Thanks for joining us.  This is going to be the third and final panel of the morning.  The congressmen will be joining us very shortly, but for the moment, I’m joined by—here we go.  Welcome, congressman.

Hurd:               Thank you.

Fung:               So I’m joined by Congressman Will Hurd, who represents the 23rd District of Texas.  He is a former CIA operative, and a former cybersecurity professional.  To his left, we’ve got Michelle Richardson, who is with the Center for Democracy and Technology.  She formerly was doing legislative issues at the ACLU, and is an expert in civil liberties and privacy.  Next, we’ve got Chris Furlow, who is at Ridge Global, which is a risk management firm.  Chris is a vaunted expert in risk management techniques and practices.  And finally, we’ve got Marcy Wheeler, who is an expert national security journalist, and knows about all things government surveillance and privacy.

So let’s just jump right in.  Congressman, based on your professional cybersecurity background, just tell us a little bit about how you think this administration is doing on cybersecurity issues thus far, nine months in.  And if anyone else wants to jump in after that, feel free.

Hurd:               Well, I think cybersecurity is probably one of the few remaining bipartisan issues on Capitol Hill.  And that’s something that I’ve enjoyed working on.  One of the difficulties we’re looking at now is the number of acting chief information officers, the CIOs, across the agencies; I’m worried that could stall some of the progress that we’ve achieved over the last couple of years.  But there’s recent legislation that I was involved in, called the Modernizing Government Technology Act, which empowers CIOs to introduce the latest and greatest technology into their digital infrastructure, which is important.

So we’re trying to change the tools that the federal CIOs have in order to protect the .gov space.  This is something that this administration has made a priority.  They’ve let the agency heads know that cybersecurity is one of their functions, and that they’re going to be ultimately held responsible for it.  We still have agencies where the federal CIO doesn’t report directly to the agency head or the deputy agency head, which would never happen in the private sector.

But the broader conversation on what is a digital act of war, and what should our response be—those conversations haven’t happened.  And again, this is not just being critical of this current administration; we had the same issues under the last administration.  If North Korea had launched a missile at the Equifax headquarters, we all know what the response would be.

F:                     We’d cheer.  [LAUGHTER] Kidding.

Hurd:               And by the way, I do not condone violence against any commercial building.  We know how we would respond.  And so why do we not have some of those same rules of engagement when it comes to the digital space?  A response to a digital attack may not always be a digital response.  Look to the U.N. on what they define as an act of war.  Messing with someone’s utility grid is considered an act of war, but when the Russians did it against the Ukrainians, what was the response?  Nothing.

There are still so many countries that don’t have criminal laws when it comes to hacking, and that’s why I think the elimination of Chris Painter’s position at the State Department was a bad move, because we still need to be working with many of our allies to tighten up criminal laws so that we have an international framework to push some of these issues.  So there are still a lot of questions that we need to answer.  But again, because this is a bipartisan issue, I think we can keep pushing and getting to those answers.

Fung:               Anyone else want to jump in on that?

Furlow:            I would just say the conversation this morning has centered quite a bit around policy that happens here in Washington, D.C., and we’re talking about global paradigms and the like.  But we can’t talk about regulation and legislation without thinking about small and medium-sized businesses.  I think that oftentimes when we think about the private sector as it relates to cybersecurity, we forget they are a really important element in all this.

Intelligence community aside—all those things are very important—but if you look at who’s being targeted, quite often it’s small and medium-sized businesses.  They have to be a part of that conversation.  So as we look at what the administration is doing relative to the EO, for example, and they’re looking for inputs coming in through that process, we have to make sure that we don’t take that element off the table.  Because if you think about the big companies in this country, you’ve got small and medium-sized businesses in their supply chain.  They can potentially be a backdoor.  So we need to include them in this national conversation that’s ongoing.

Fung:               Marcy, how well is that process happening of including smaller and medium-sized businesses in this conversation?

Wheeler:          I live in Michigan, which is one of the kind of frontlines of IoT, because people are putting stuff in cars that has no business being in cars.  But you know, I’ve been in Silicon Valley talking about cybersecurity with people there.  They’re like, “Why can’t the car companies deal with X, Y, and Z?”  And I’m like, “They’re dealing with the same problem Google can’t fix with updating phones.  If Google cannot update Android phones, we shouldn’t expect Chrysler to be able to update their Jeeps.”  I mean, it is absolutely true that most people who are in this conversation have no idea what the development cycle in the automotive industry is—you know, you’re talking five years of exposure from some provider who has no margins.  So it’s—[OVERLAPPING]

Hurd:               And automakers have a long track record of working in safety, and they have their minds attuned to it.  So I’ve just recently started installing some smart devices in my house, and this is very top of mind for me.  What worries me—

Furlow:            Still the packages at your door, right?

Hurd:               Right.  What worries me is, you know, what happens when a company whose experience is in making dishwashers, or making toasters, doesn’t have the background in privacy by design, or security by design, to handle the threats that might arise.  So how do you deal with that?

Richardson:     Yeah.  And this is somewhere the administration can go forward in a way that actually protects civil liberties and advances cybersecurity at the same time.  And Mr. Joyce has talked about the internet of things and the concern there.  Almost everything is connected.  It’s collecting the type of data that’s just never existed or been available before.  And it’s not protected by privacy law, right?  It doesn’t shoehorn into something that exists.  So we need to have a thorough conversation about all the people who are going to have access to this—and that means the companies, maybe hackers, and the government—and what are the rules around that?  That’s a way that we can move both civil liberties and cybersecurity forward at the same time.

Hurd:               When I leave here today, I’m going to do a hearing on the cybersecurity of the internet of things.  Many of the questions are, yes, let’s not make the same mistakes with the internet of things that we made with the internet.  Security should be there at the beginning, and not an afterthought.  But one of the questions when you legislate is, can you legislate for outcomes?  Because if we legislate that you should do X, Y, and Z, that’s not going to evolve over time.  By the time we’ve actually passed the legislation, is it already old news?

An example, I would say, is political advertising, right?  When the federal election law was written, we were thinking about broadcast, and news, and radio, but we weren’t talking about social media.  So that’s an example of legislation that doesn’t evolve over time with things that we can’t even think about right now.  And yes, a lot of the data that’s being collected, people haven’t been able to opt into it.  Who opted in to Equifax?  Nobody.  And so now that has changed how you do authentication.  What I find interesting is that the Equifax breach has probably not generated, in my opinion, kind of the anger that we saw with OPM.  And part of it is because we’re not going to see the impacts for months, if not years.  It really has changed the way that major banks and others are able to do authentication.

So if we want to legislate, how do you make it to where it evolves in an ever-evolving industry?

Fung:               So I want to come back to this issue of legislation, and making the case for legislation, because it often seems like in these types of situations, you have industries who argue, “Well, look, this is a newly evolving industry.  Don’t rush to regulate.”  I think it was one of our earlier panelists, Rob Joyce, who said, “Don’t rush to balkanize the regulation.”  How do you change the conversation around that so that legislation becomes possible?

Hurd:               Well, it starts with raising the common body of knowledge on these topics.  It’s hard.  It starts with educating staff.  You’ve got to have members that are dealing with these issues and understand them.  So many times, many of my colleagues think that the dark web is the direct messaging function of Twitter.  [LAUGHTER] That is not the dark web.  Or that the cloud is emerging technology.  The cloud has emerged.  [LAUGHTER] Those are some of the difficulties that we’re having to deal with.

But the same principles and theories that we’ve always been dealing with apply now.  When we talk about encryption, I’m on the side where you strengthen encryption, not weaken it, right?  Encryption is good for our national security.  Encryption is good for our economy.  We should be making it stronger.  We had these debates when locks came out, or when the latest safe came out.  So we’ve had these conversations already.  We just have to figure out how we apply the principles that we all hold so dear to the latest technology.  That means you have to understand it.

And by the way, this is all going to change when quantum computing becomes a thing.  Quantum computing is going to be here sooner rather than later.  To me, the issue is that whoever masters quantum computing is going to be the hegemon in the world, because how we do encryption and think about protection will be null and void.

Wheeler:          I think you’re right, Congressman Hurd, but one of the things I find in talking to people on the Hill—and I’m sure, Michelle, you’ve had the same experiences—is there really aren’t the staffers.  I mean, certainly, members of Congress, who tend to be old, don’t have the technical experience.  You’re one of the few people; there are probably eight that I can go to who I can trust to know something about technology.  The staffers aren’t there either.  So on the Hill, we really have a dearth of knowledge about technology, the impact of it, the risks of it, and that’s something I think we need to do better at, especially on the intelligence committee.

Furlow:            And that’s what’s happening at the policy-maker level, right?  Within companies as well, you also have catchup that needs to be done.  If you think about it, most of the folks who are in leadership positions in corporate entities or businesses today, or on the boards, for example, weren’t taught this type of risk management when they were in business school.  They were taught financial risk management.  This is a real challenge.  We’ve got to bring people up to speed.  So we can talk about quantum computing and those things, but right now, that’s over some folks’ heads.

I want to be very clear.  I’m not saying you can remove the accountability at the corporate level for what goes on in the cyber environment of the company.  You can’t do that.  But let’s be realistic about the education process that needs to take place, and try to pull public and private resources together in order to address that issue, so that from a governance standpoint we can begin to make progress.  Because what’s happened so far is that the technology has outpaced our ability to define a very good governance system, both at the corporate level and, frankly, at the policy level.

Richardson:     Well, Brian, I think one way forward is to focus on what the government purchases itself, which is why I’m excited about Mr. Hurd’s bill, or the American Technology Council out of the White House, and this idea that if we are going to be updating our systems and software, we make sure there are cybersecurity standards in them, right?  There’s privacy by design, we would argue, and these are things that will stand the test of time.  The government is at its zenith when it’s purchasing its own products, and hopefully it will lift all boats over time, as some of these things are built into widely used products.  That would be a way to cabin the issue, and do something more specific than trying to address the entire technology sector at one time.

Hurd:               We’re talking about some fairly high-level, maybe midlevel, concepts here.  But there’s some basic stuff that we’re still getting wrong.  Patch your software, okay?  [APPLAUSE] None of the hacks that we’re talking about are zero-day exploits—a zero day is a previously unknown vulnerability that’s been taken advantage of.  That’s not what’s happening.  It is people not patching their software; misconfiguring their systems; not doing basic stuff like having two-factor authentication.  You’re a network administrator using the word “password” as a password.

M:                    Folks being careless.

Hurd:               Who is still doing that?  [LAUGHTER] And so it’s basic digital hygiene.  So how can we improve this in the government?  Congressional oversight.  That’s why we do a thing called the FITARA scorecard, and we’re turning it into a digital hygiene scorecard.  So we’re looking at how agencies are consolidating data centers.  Every time we do a review, some agency finds out, “Oh, we have more data centers than we expected.”  You were supposed to do a review of this five years ago.  How are you still now finding that you have extra data centers that you never knew about?

Or software licensing.  We are introducing this into our scorecard.  If we had graded everybody last January, everyone would have gotten an F.  They still don’t know what software they’re using on their systems, right?  It’s unimaginable if that were to happen in the private sector.  We can’t let it continue.  That’s why it’s important for Congress to continue to shine a light on some of these problems.  I realize some of my hearings are really boring, because I bring them over and it’s like, “Why are you doing this?”  And they’re like, “Oh, congressman, we fixed it yesterday.”  Okay, great, you know?  But at least it’s getting done.

So we have to continue to shine a light, and it’s great to have other outside organizations that are looking at these issues, and helping to guide and talk about some of the problems we should be looking at.

Fung:               At this point, does the general public face more of a risk on cyber or privacy from the public sector or from the private sector?

Wheeler:          Yes.  No, both.  I mean, the lesson of the OPM and Equifax hacks is that you can’t trust either the government or the private sector with your data, right?  I mean, I think there are some tech companies that I might trust with my data, except they’re using it to advertise to me, or selling ads to the Russians.

So I think that one of the things that happened while we’ve been sitting here is that Europe ruled in favor of Max Schrems again.  For those who don’t know, he’s a privacy activist.  So Europe continues to say, “You need to be able to tell consumers what you’re doing with their data.”  In Europe, they laugh at us for how freely we, as Americans, give our data over.  Europe tends to add pressure to the Facebooks and Googles of the world, and we either need to start doing that in the United States as well, or we need to really leverage what Europe forces our tech companies to do, and our government to do.

Hurd:               But let’s not use Germany as an example, because the German government owns a third of the telecommunications infrastructure in that country.  The BND, their intelligence service, can do things against German citizens in Germany that the NSA would never even think about.  So I get frustrated with some of our European allies that want to act like privacy is not a concern in the United States.  That’s completely untrue.  It’s disingenuous for the Germans to act holier than thou when recently the BND was targeting a German reporter in Germany.  That kind of stuff is unacceptable.

And so privacy is important.  Our civil liberties are important.  But saying that Europe has a higher standard than we do is completely erroneous.

Furlow:            This conversation takes place on so many different levels, and obviously, that’s what makes it complex.  So you have the security issues and the privacy issues just at face value that you’re dealing with.  Then there’s the intelligence and the military side, the national security interests, or economic interests of a nation, and then beyond that you get into the political, right?  So you’ve got all these different layers, and it really makes it complex.

So, for example, many of you are familiar with GDPR, the General Data Protection Regulation that will be coming out in 2018 across the E.U.  Well, on its face, obviously, it’s about protecting the privacy of European citizens and that right to be left alone.  That’s important and laudable.  But by the same token, is it perhaps a backdoor trade sanction against companies from other nations outside the E.U.—

Hurd:               Yes, it is.

Furlow:            —if they don’t comply?  Again, there’s enough wiggle room in GDPR to where that potentially could happen.  So, again, some of these conversations have multiple levels to them, and we have to take all of that in, and understand that.

Fung:               Michelle, you’ve dealt with a lot of policy making on privacy.  What’s your take on what’s happening in Europe, and how much of that could migrate to the United States?

Richardson:     Well, you know, we often hear, here in the United States, that a lot of the functionality, and control, and granularity we would like to see in our products is just not possible.  But what Europe shows us, often, is that it is, and it’s really a choice.  And we do hope that some of the options that are over there will migrate here.  This is hard because there is no international privacy agreement.  There are little pieces in treaties about privacy or expression, but they haven’t been explored fully.  You’ve gotten the first decisions just in the last five or 10 years, and we don’t know where it’s going to go.  But hopefully, over time—this is going to be a very, very long-term project—there will be something more international, with a standard about privacy, that we hope will lift all boats.  It is hard, though, because the Europeans almost see it the opposite of the way we do; they’re more concerned about companies, where we’re often more focused on what government does.  So sometimes you feel like you’re having an apples-and-oranges conversation about who’s responsible, where the rules should fall, and what you should actually even be concerned about in the first place.

Hurd:               And one of the reasons I think that there is that disconnect is because we do something in the United States that many other countries don’t: we have congressional oversight of our intelligence organizations.  When I’m in Germany meeting with my Bundestag colleagues, they oftentimes ask us questions about intelligence cooperation between the U.S. and Germany, because they know we know more than they do about what their own intelligence service is doing, because of the oversight we have of ours.  And so that gives us a perspective, and I think some would say they wish they had more of it.  That oversight we have over our intelligence community is something we don’t champion or laud or take pride in enough, because many other countries—even many of our European allies—don’t have that opportunity.  So we have an insight into government activities that many of our allies don’t, and I think that’s why you see some of that focus, and I believe there is a tinge of protectionism in some of these activities that we see in Europe.

Furlow:            Harmonization of cyber norms is really going to be critical moving forward, but it is exceptionally difficult for the reasons I mentioned earlier.  But that’s really the challenge, right?  Because if you look at it, you’ve got GDPR, which is obviously emerging, about to go into effect in the E.U.  Singapore has its data protection regulation.  You’ve got the Chinese moving towards a new standard that potentially could come out.  And in some of these laws, it’s very vague what the standard might be.  So getting towards something that is more harmonized is really going to be critical in the years ahead.

Hurd:               I’m sorry to keep jumping in, but I think there are some positive examples.  Estonia.  You know, Estonia has 1.3 million people, which is basically the size of my hometown of San Antonio.  But they basically provide all services digitally.  There is a trust between the Estonians and their government, and the government has established that trust with them.  And they’ve been involved in a thing called the Tallinn Manual—Tallinn is the capital of Estonia—about how, you know, NATO allies should be working when it comes to cybersecurity.  So there are some pockets of examples of how this has worked.  And one of the things I think we’re all harping on is the lack of trust—lack of trust in businesses to protect your information, lack of trust in the government to protect it, or to provide oversight.  And that is something that undergirds all these problems.

Wheeler:          And I would say that as we move forward, whatever the legislation says—702 is up for reauthorization this year; the government was permitted this year to share 12333 data, which is raw data collected overseas, with other intelligence agencies; there’s CISA, which was passed in 2015—on all of those, there should be a default for oversight, both congressional and, more importantly, statutorily independent oversight.  12333 does not require the inspectors general of these agencies to be involved in the sharing and use of raw data collected overseas.  That’s a problem because, over and over again, we’ve seen at NSA that things don’t get fixed until the inspector general comes in and actually looks at them; it’s not enough for Congress.

So those are, I think, ways to build trust, and the intelligence community sort of doesn’t want to do that—it didn’t do it with 12333, and doesn’t necessarily even have that with 702, which has a ton of oversight on it.  And that’s, I think, a way that you can build trust in the government that isn’t there now.  I mean, PCLOB got thrown out of CISA at the last minute.  No reason for that.

Fung:               Let me jump in here with a question from Twitter.  Someone is asking, is blockchain a solution for information assurance?  And I might expand that a little bit.  What kind of technologies generally are emerging that could help us address the cybersecurity issue?

Hurd:               So blockchain, for those that aren’t familiar with it, is basically a distributed ledger: you are able to keep track of every piece of information that changes along that chain.  And the answer is yes.

I think when you look at what Maersk is doing with their supply chain—even Walmart is doing it for food safety, and Maersk is working with the Department of Homeland Security—we’re seeing the technology and how it can improve processes, speed things from point A to point B, and also provide a level of protection.  And I think this is something that we should follow.  And in the financial services industry, I think JPMorgan is one that really is kind of on the cutting edge of how you do this in private transactions.

So it is absolutely, I think, something that—[OVERLAPPING]
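
To make the definition above concrete, here is a minimal, hypothetical hash-chained ledger in Python.  Each entry commits to the hash of the previous entry, so any edit to history is detectable.  This illustrates the general idea only; the event names are made up, and it is not how Maersk, Walmart, or JPMorgan actually implement their systems:

    # A toy hash-chained ledger: each block commits to the previous block's
    # hash, so tampering anywhere breaks every later link.
    import hashlib
    import json
    import time

    def make_block(data: dict, prev_hash: str) -> dict:
        block = {"timestamp": time.time(), "data": data, "prev_hash": prev_hash}
        block["hash"] = hashlib.sha256(
            json.dumps(block, sort_keys=True).encode()
        ).hexdigest()
        return block

    def verify(chain: list) -> bool:
        """Recompute every hash and check each link to the previous block."""
        for i, block in enumerate(chain):
            body = {k: v for k, v in block.items() if k != "hash"}
            digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
            if digest != block["hash"]:
                return False
            if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
                return False
        return True

    # A hypothetical supply-chain ledger: each handoff appends an entry.
    chain = [make_block({"event": "container loaded"}, prev_hash="0" * 64)]
    chain.append(make_block({"event": "customs cleared"}, chain[-1]["hash"]))
    assert verify(chain)
    chain[0]["data"]["event"] = "tampered"   # any edit invalidates the chain
    assert not verify(chain)

A real distributed ledger adds consensus across many independent parties, which is what makes the tamper evidence useful among shippers, ports, and regulators.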

Wheeler:          But did that help Maersk come back from NotPetya?

Hurd:               I don’t know.

Wheeler:          So, I mean—

Hurd:               Yeah.

Wheeler:          Yeah.

Hurd:               Patch your software.  Have a password that’s more than 14 characters.  You know?  Don’t click on emails that you don’t know the sender from, right?  You know, you can follow those—[OVERLAPPING]

Wheeler:          Well, and supply chain.  Until you fix the supply chain—which I think was the problem there.  And you know, we’ve seen it again: we, even in the United States, don’t have visibility into what the supply chain is, and until you have that, you’re not going to know whether Maersk is going to be shut down for shipping because of some Ukrainian—[OVERLAPPING]

Furlow:            Technology is going to be really important.  Blockchain is one of those technologies that could really be transformational, particularly in financial services, insurance, and other industries.  But we can’t forget the people part of this equation.  I think sometimes when we’re talking about cyber, we forget about that.  The people-and-process part is, in some cases, what allows a hack—for example, someone clicks on the wrong email because they don’t have enough training or awareness to know what they shouldn’t do.  You know, those types of issues, inside companies and inside government agencies.  Obviously, we have the same issues from a governance standpoint; look at the SEC and the recent breach there, and the notification process that took place.  It mirrors some of the things we’ve seen on the private sector side.

We’ve got enough work to go around here, but it has to be integrated.  It has to have each of those elements.  We have to be addressing the governance and organizational issues, the process piece, and the technology, leveraging all of it in concert.

Fung:               Let me just piggyback on that.  A lot of what I write is sort of geared towards the average consumer.  And you know, often we go through these tips—“Don’t click on links from email senders you don’t know or anything like that.”  But it often seems like these rules tend to change with every hack.  I mean, we just found out that SS7 vulnerabilities make two-factor authentication via text messages insecure.  So, you know, how do we avoid a situation where consumers become disenfranchised with the state of security and throw up their hands and say, “Oh well”?

Hurd:               The SS7 vulnerability is if you’re using a 3G phone, right?  So again, I think you have to—

Fung:               Right, and the details associated with all of these little revelations, it’s really hard for people to keep track of every twist and turn.
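
For context on the SS7 point: codes sent by text message ride the carriers’ signaling network, where they can be intercepted, while app-based codes (TOTP, RFC 6238) are computed locally from a secret shared once at enrollment and never transit the network.  A minimal sketch, using only Python’s standard library; the demo secret below is a placeholder, not a real credential:

    # Time-based one-time passwords (RFC 6238): both sides derive the same
    # six-digit code from a shared secret and the current 30-second window,
    # so no code is ever sent over SMS.
    import base64
    import hashlib
    import hmac
    import struct
    import time

    def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
        key = base64.b32decode(secret_b32, casefold=True)
        counter = int(time.time()) // interval             # current time step
        msg = struct.pack(">Q", counter)                   # 8-byte big-endian counter
        digest = hmac.new(key, msg, hashlib.sha1).digest()
        offset = digest[-1] & 0x0F                         # dynamic truncation (RFC 4226)
        code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
        return str(code % 10 ** digits).zfill(digits)

    # Example with a placeholder secret (normally shared via QR code):
    print(totp("JBSWY3DPEHPK3PXP"))   # matches what an authenticator app would show

This is the scheme authenticator apps implement; it sidesteps the SS7 path entirely.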

Richardson:     I would say we have to just accept that we already are disenfranchised with our devices, right?  We are coming into a world where you’re not going to have the option of buying a disconnected car or appliance, right?  And the terms of service will say you have to share your information or this thing won’t work—we’ll brick it, right?  We’ve now lost control over this data.  And so at this point, it’s not responsible to put the burden on the individual.  Systemic change is going to have to come from the companies and the government.  I don’t say that lightly; it really concerns me, because there are serious surveillance and access-to-information issues involved when the government is making systematic changes.

But I think that’s just how we have to honestly deal with the problem.  Individuals are never going to be able to control dozens of devices with different rules that were not even made to be durable in the long term, right?  You paid $20 for this internet camera and you expect to have it for six months or a year.  How much do we expect people to be invested in this thing?

Wheeler:          One of the things that’s come out in recent discussions about Russian tampering is, you know, you’ve got Facebook before Congress.  You will have Twitter and Google before Congress.  And there is a discussion about the responsibility of these companies.  I mean, if you want to fix the security of the average user, you fix it at Facebook, you fix it at Google, you fix it at Twitter.  You start there.

And we are having a discussion about the responsibility of Facebook for policing Russian advertisements on its site.  But why can’t that discussion be larger—one that says, “Why can’t Twitter have functional two-factor authentication?”  Right?  How can we make it harder to attack these tools that everybody uses and that really are the infrastructure of our digital world?  That’s where you need to fix it.  And we’re now having that discussion because of the Russian stuff.  Let’s have the broader discussion about how to ensure that the security is there for the simple tools that everyone uses.

Furlow:            So we’ve come full circle on security and privacy by design, right?  Because we have to start having that conversation at that level.  Look, there’s a paradox that goes on here.  You know, we want openness, we want access, we want convenience; at the same time, we want security, we want safety, and we want privacy.

We live in a world here where, as our interconnectedness grows, the public and private sectors are more and more dependent and interdependent upon one another.  So I think, to Michelle’s point, it’s going to require having business at the table and government at the table.  But we haven’t gotten to the point—I mean, we’re talking big strategies, but we haven’t gotten to the point where we’re addressing each of these issues very systematically.  I think that’s part of the challenge that we have as we all play catchup.

Hurd:               And also, when I’m driving down Interstate 10 and there’s not a stop, I want my Whataburger delivered to my car.

Furlow:            By drone.  Right through the sunroof.

Hurd:               By drone.  Right?  You know.  So, you know, we should still embrace the opportunities we have, but we have to be able to continue to manage these issues around it [ph].

Wheeler:          Right, but I mean, the thing about going after Google and Facebook is their margins are huge.  They’re richer than God.  And so if you can’t get Google to update an Android phone, you’re never going to fix the problem, because they have the margins to do it.  Your drone manufacturer in Texas doesn’t.  Yet [ph].

Fung:               How much of this is just a failure to conceptualize the root causes of the problem, right?  Like we talk about cybersecurity primarily as being a technical problem.  But you know, what we see with disinformation or fake news, this type of thing is more of a sociological thing than a technical issue, right?  I mean, how do we address that?  How do we get Google and Facebook and Twitter to address that from a human perspective?

Wheeler:          I think that some of them are trying.  Some of them are not.  One of the neatest changes in cybersecurity in recent years is that Consumer Reports has started focusing on it, and Teen Vogue has started focusing on it.  If you can get Teen Vogue and Consumer Reports talking to your average soccer mom about what she needs to look for to keep her teenager safe online, then you’re going to have a much more responsible public than if it’s—no offense to Congressman Hurd—someone sitting here in D.C. saying, you know, “Patch your software.”  Consumer Reports is a much more important voice in this, I think, than—no offense.

Furlow:            That’s why I mentioned earlier the importance of small business.  I mean, let’s face it.  They’re the backbone of the economy.  We all hear that all the time.  But these are folks that don’t have a lot of resources.  They’re like the soccer mom, right?  So education is a really important key to how we’re going to make the societal change to where people understand, hey, everything about your life is on your devices, right?  It’s your tax information.  You’ve got health information on there.  You’ve got photos.  All of that stuff that people really care about is on their devices.

I think we just don’t have enough time in our rearview mirror—enough time behind us—for folks to have actually realized this yet.  The millennial generation, that’s a totally different conversation.  But we still have folks today who just haven’t gotten to the point where they understand the risk that they have, whether it’s the individual at home or in a small business.

Richardson:     I don’t know how this is going to look, but at some point we will have to have accountability injected into the system.  And if you look at other areas, they really made huge steps forward in their safety and usability because they were either sued or regulated, right?  And I don’t want to say that’s a great approach.  But those accountability mechanisms don’t exist in tech.  And so I don’t know how long we can persist in that sort of framework if no one is going to be forced to change the products or offer something different.

Fung:               So in the last panel Richard Clarke proposed the idea of penalizing companies that suffer from data breaches with monetary penalties, kind of like how we penalize oil spills.  Congressman, is that something that you’d support?

Hurd:               I would slightly disagree with the premise, because I think it was in this week’s Fortune—there was a great article that looked at a study of all the companies that had suffered some major breach.  And there was a significant impact to their bottom line.  And the reason that you’re seeing the cybersecurity industry grow, in valuations of companies and all these acquisitions that are happening, is because companies are recognizing that this is a gigantic impact to the bottom line.  I don’t think that is something that is discussed as much, but there is an impact.

I do believe that the FTC should be looking at issuing fines when some of these companies have these issues, so that there really is a stick in this area.  But there is an impact.  And on your previous question, I’d like to add: when I was in middle school and high school, I took a music class and a history class.  I am neither a musician nor a historian.  We should be educating some of our kids on how the internet actually works.  What does TCP/IP actually mean?  There should be a functional understanding.  And it’s not because they’re going to grow up and become computer scientists or cybersecurity professionals; it’s something we need to understand so that we can be informed citizens.

Fung:               Marcy, were you about to add something?

Wheeler:          I want to just say, on the liability issue, I think that the industry is better at assigning liability for financial data than for the stuff people really care about, which is the pictures I took on vacation with my cousin.  And so that needs to be part of the question: the risk to the personal pictures is actually more important to your soccer mom than her financial data, because her bank is going to take care of that.

Furlow:            In terms of the regulatory piece, I think it’s important, but we need to flip the model a bit.  So there’s the regulatory approach, right?  That’s the stick.  On the carrot side, corporate leaders need to understand the enterprise value of the data that they possess, because then we can put it into business terms.  And if we start to do that, then we’ll start to see the turn that we’re all hoping to see in terms of an appreciation of the data—why it needs to be protected, why the resources need to be brought inside a company to protect it.

Fung:               Michelle, anything to add?

Richardson:     No.  I think I put myself far enough out there for today.

Fung:               I think we’re out of time.  Thanks, everyone, for coming, and thank you to our panelists.   [APPLAUSE]

And make sure to come back on Thursday when my colleague David Ignatius is interviewing Senator Tom Cotton.  It will also be livestreamed on WashingtonPostLive.com.  Thank you.

Sponsored Remarks:

Lush:               Well, good morning, everyone.  Thanks for coming out.  HPE is pleased to support this timely cybersecurity summit along with The Washington Post.  You know, before I start my comments today, I thought we could take just a few moments of silence for all those who have been impacted by the terrible events that occurred in Las Vegas just yesterday.  You know, unfortunately, we live in a world today where cybersecurity impacts our organizations, our agencies, and the security of this great nation we live in.  The application of cybersecurity standards, policies, and accountability is no longer something that is nice to have.  Cybersecurity is a priority for all of us.

Cyberattacks range from what many agree was the first cyber hack—insulting Morse code messages sent through a supposedly secure wireless telegraph in 1903—to the largest hack on record today, which has been recorded at $460 million.  Now, that’s not lost revenue to the organization; that’s actually funds stolen from that organization.  Good cyber health is the best prevention against a cyberattack.  From improving cyber hygiene to active threat detection to staff awareness and education, cybersecurity health is a team effort and should be an active process.  My father’s words still echo in my ears: “It’s all about results.”  And HPE continues to deliver results.  HPE Labs receives funding that exceeds $1 billion annually, providing the latest in cybersecurity technologies across many solutions, most recently our HPE Gen10 server.

There is hope.  Many organizations develop key technologies designed for systems that simply cannot be compromised.  The time is now to develop our cyber strategies and embrace key technologies.  From our ability to stop a server boot upon notice of a compromise, with a silicon root of trust, to NIST-enabled infrastructure, continued inspection of our organizations’ cybersecurity strategy is a key to all of our success.  We hope that you enjoy the cybersecurity summit today.  Our individual security efforts collectively make us all a little more secure.  And HPE will continue to produce results, both financially and morally invested in cyber protection for us today and into the future.  Thank you again for your time.
