Cybersecurity and the Human Element
Guest: Kevin Mitnick
Thursday, Oct. 3, 2002
No matter how strong a computer security system is, it is always vulnerable to the human element: People can accidentally leave passwords open to prying eyes. A skilled criminal can pretend to be someone else and request sensitive information over the phone. Computer users can fail to practice e-mail and Internet safety, unwittingly downloading viruses and other rogue programs.
It's the human factor that cybersecurity experts need to worry about most, according to Kevin Mitnick, author of the new book, "The Art of Deception: Controlling the Human Element in Security" (John Wiley & Sons, Inc., 2002). Mitnick ought to know. He went to jail twice (in 1989 and in 1995) for cracking into corporate systems. He cracked the systems by using his technical know-how and by pretending to be who he wasn't, applying what he calls "social engineering" to prey on the human element.
Mitnick's parole expires in January 2003. He has cofounded Defensive Thinking, a Los Angeles-based security consulting firm.
Mitnick joined us on Thursday, Oct. 3 to take reader questions. Per the terms of his parole agreement, Mitnick is barred from using nearly all computers except a court-approved laptop. He is also prohibited from sending e-mail or surfing the Internet. As such, Mitnick dictated his responses to a washingtonpost.com staffer who transcribed. Washingtonpost.com tech policy reporter Brian Krebs moderated the discussion.
An Edited Transcript Follows:
Editor's Note: Washingtonpost.com moderators retain editorial control over Live Online discussions and choose the most relevant questions for guests and hosts; guests and hosts can decline to answer questions.
Brian Krebs: Hello, Kevin. Thanks for joining us today. Much of your book and new business details the tactics used by "social engineers." Could you describe for us in general terms just what a social engineer does and how he does it?
Kevin Mitnick: A social engineer, rather than manipulating technology, manipulates people who have access to technology or information, getting them to perform a particular task or reveal information that gives the attacker an advantage. Oftentimes, the social engineer impersonates somebody he's not, and is able to convince his target that, based on the role he's playing, he's authorized to make a request for information or an action item.
Brian Krebs: You describe a number of intriguing and somewhat disturbing social engineering tactics in your book. Are these fictional, or are you simply recounting successful attacks that you carried out in the past?
Kevin Mitnick: All the stories in my book are completely, 100 percent fictional, except for a story regarding a contest that a colleague hacker and I successfully won at the time. The techniques, tactics and strategies are real, but the stories surrounding them are fictional. I had to do that because, as part of settling my federal criminal case back in 1999, I signed an agreement that I wouldn't write the Kevin Mitnick "story" and profit from it. To respect that agreement, the book had to be written in this form.
Readers shouldn't dismiss these as mere fiction, because they are tried and true techniques that would be successful in these particular circumstances. While the stories are fictional, the techniques have been exploited before.
McLean, Va.: Hi Kevin. Biometrics seem to be a good way to reduce human security weaknesses, but they don't do much to prevent social engineering. Is there any other way to prevent social engineering security weaknesses, other than using common sense along with clearly defined and enforced policies? Thanks.
Kevin Mitnick: You can use technology to reduce the risk of social engineering. For example, in a company with a customer service department, when a customer calls, they have to identify themselves with a password or Social Security number. What social engineers often do is call and masquerade as an insider, saying they're assisting a customer on another line. The software oftentimes allows customer service reps to look up a Social Security number to verify that they're speaking with a real customer. The problem is that the system should not be designed to let reps look up identifiers at all; instead, the rep should type the password the caller provides into the software, which reports only whether the information is valid. That way, reps would no longer be able to look up the information for a possible social engineer.
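The verification design Mitnick describes can be sketched in a few lines. This is a hypothetical illustration, not code from the book: the account IDs, the salt, and the `verify_caller` helper are all invented names. The point is that the rep's tool only answers yes or no, so there is no identifier on screen for a social engineer to extract.

```python
# Illustrative sketch of "verify, don't reveal": the rep enters the
# caller's claimed password, and the system returns only valid/invalid.
# All names and data here are hypothetical.
import hashlib
import hmac

# Assumed backing store: salted password hashes, never plaintext.
_SALT = b"per-deployment-salt"
_CUSTOMERS = {
    "acct-1001": hashlib.sha256(_SALT + b"correct horse").hexdigest(),
}

def verify_caller(account_id: str, claimed_password: str) -> bool:
    """Return True only if the caller's password matches.

    The rep never sees the stored value, so there is nothing to
    leak to an impostor masquerading as an insider.
    """
    stored = _CUSTOMERS.get(account_id)
    if stored is None:
        return False
    candidate = hashlib.sha256(_SALT + claimed_password.encode()).hexdigest()
    # Constant-time comparison guards against timing side channels.
    return hmac.compare_digest(stored, candidate)

print(verify_caller("acct-1001", "correct horse"))  # True
print(verify_caller("acct-1001", "guess"))          # False
```

The design choice is the same one Mitnick argues for: the lookup capability itself is removed from the rep's interface, so a convincing phone pretext yields nothing.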
Maryland: So, like, you're a good guy now? If so, what made you give up your old principles?
Kevin Mitnick: First of all, through the trials and tribulations I've been through, I've certainly learned my lesson. I'm no longer interested in pursuing my hobby of computer hacking. My hacking hobby intrigued me, and my motivations for hacking were really for the intellectual stimulation, the challenge and the thrill of being somewhere where I shouldn't have been. Now I've grown up, I've learned my lesson, and I'm using my background experience and knowledge to help government and businesses to protect themselves against security threats. And I realize that a lot of people are skeptical about what my intentions or motivations are. And unfortunately, many people have formed their opinions of me by reading stories about me in the press. Those stories have not necessarily been accurate, and in some cases have been falsified to sensationalize my character.
Manassas, Va.: Cybersecurity experts have been saying for years that "people are the key factor in securing the information systems in an enterprise," but it appears that there is little effective leadership to transforming that phrase into effective action. In your opinion, what is the biggest failing of leaders that contributes to this situation? What is the most effective remedy to correct it?
Kevin Mitnick: One problem is that the people who are the decisionmakers may not be thoroughly convinced that the threat stems mainly from the human element, but feel that the security technologies will solve their problem. In reality, it's a combination of good policies, good security processes, security technologies and security awareness training that really go far in mitigating the risk.
The remedy is motivation. The remedy for creating a more security-conscious enterprise is convincing each and every employee that security is part of their role and responsibility, and that by participating they're not only protecting their company's interest but also their own personal information that the company maintains on them.
Baltimore: What would you consider to be the most common and dangerous security hole in modern corporations, and why?
Kevin Mitnick: Not being vigilant about keeping up with security patches, and installing computer systems with default configurations. I believe these shortcomings are on the SANS top 20 list. Also, people in organizations aren't familiar with the social engineering threat, and don't understand the value of information that they negligently release.
Falls Church, Va.: Should a security audit of source code by an objective third party be a required piece of R&D?
Kevin Mitnick: That's a hard question. I certainly believe that companies should have a process in place to effectively audit their source code. I believe that process is necessary. Whether it takes place with in-house experts or is outsourced is a decision that should be made by the company. In the present day, even large companies like Microsoft release buggy code... I need to think about whether there should be any government regulation. I think the consumer base could choose whether or not to purchase products from companies that choose to go to the extra length of auditing their code.
Brian Krebs: Why do you think people are so prone to fall victim to social engineering attacks?
Kevin Mitnick: People have a desire to trust. People inherently give others the benefit of the doubt, especially in a corporate environment where they're pressed for time and trying to get the job done. We have these positive human traits, the "love thy neighbor" instinct, built into our meta-programs, the thinking processes of human beings. We think in such a way that it gives the social engineer the advantage.
Brian Krebs: Someone reading your book could easily take the view that the only way to protect themselves is to trust no one. Is it possible for companies to strike a healthy balance between pure paranoia and security?
Kevin Mitnick: Yes. I mentioned in a keynote I did a couple of years ago that my mantra was "trust no one." But obviously that suggestion would be inaccurate. Companies have to strike a balance, by implementing the security policies I recommend in my book. It is absolutely necessary for companies to develop that balance based on their own culture, structure and the sensitivity and criticality of the information they're trying to protect. If everyone were operating from a paranoid state of mind, it would be very difficult for businesses to be productive in their core competencies. It's not one size fits all.
Alexandria, Va.: Uh, forgive me Kevin, but your media bashing strikes me as self-serving. The fact is, you went to prison twice for doing things you should not have done. Why the heavy-handed jibes at the press?
Kevin Mitnick: I have not made an argument that I haven't broken the law in the past. But on the other hand, there were certain journalists who have intentionally printed false and defamatory stories about me on the front pages of major newspapers around the world. One doesn't justify the other.
If you have any further questions in this regard, please ask.
Silver Spring, Md.: You said you never profited from your hacking. Is that really true? You _never_ profited?
Kevin Mitnick: That is absolutely correct. I have never profited from my computer hacking activities. If I had, the federal government certainly would have alleged it in their charges levied against me.
San Antonio, Tex.: Of your known exploits, which would you consider to have caused the most damage? That is, the govt. alleges many millions, but what figure would you support?
Kevin Mitnick: I believe my hacking caused some damage, but nowhere near what the government has alleged. The basis of the government's estimation of the loss stems from the research and development of computer source code that I accessed.
These companies, which are all publicly traded companies, did not report any losses to the Securities and Exchange Commission that were attributed to me. And according to federal law, any company that suffers a material loss must report these losses to their shareholders. The loss theory that the government used was simply the amount of money the companies invested in R&D to create the products that I accessed.
Microsoft's current service packs for Windows XP and 2000 contain a clause in the end-user licensing agreement that gives Microsoft the right to revise your operating system without your knowledge. What is your opinion of this agreement? What is the possibility of introducing new security holes with these automatic updates? Thank you.
Kevin Mitnick: I think it's a very scary proposition. I would be very concerned about using that product. That would go into my decisionmaking process to see if I even wanted to use that product. Giving a software manufacturer the right to revise your OS without consent or knowledge is a very dangerous thing in my opinion.
Washington, D.C.: If President Bush called you up and asked you for one piece of advice on cybersecurity, what would you tell him?
Kevin Mitnick: That there be some sort of regulation requiring software companies to turn on their security defaults. Windows XP has a built-in Internet firewall. These features should be turned on by default. The clueless end-user, or even IT professionals at organizations, leave themselves vulnerable to attack and to having attackers take over these computer systems and use them against other government or business computer systems.
Attackers can hijack computers that don't have security defaults turned on, or that don't require users to enter a secure password. These systems are ripe targets for hackers to take over and use as a tool to break into even more systems.
People also shouldn't choose "password" or "blank" as their password.
Takoma Park, Md.: What concerns/precautions should a company have if it decides to hire an ex-hacker to protect its information systems by breaking into them? Is it a good idea at all? Many thanks!
Kevin Mitnick: It really depends on the individual's background. Rather than stereotyping a hacker as a bad guy, you have to look at what a person has done, what their ethics and morals are, whether they have learned their lesson. Then you determine the risk. If I were a bank, would I hire a hacker who broke into Citibank and stole funds by accessing a banking system? No, I wouldn't. Would I hire a hacker who's been able to hack into systems or compromise security, but who hadn't intentionally damaged or crashed or destroyed information, and was simply after the challenge of circumventing computer security? Yes, I would hire that person, based on their talent.
Bethesda, Md.: Are you doing a book tour? Will you be anywhere in the D.C. area?
Kevin Mitnick: Yes, I am doing a book tour. I will be in D.C., I believe around October 30-31, and maybe until November 2-3, in Fairfax, Va., Washington and a couple of other places. People can check at www.defensivethinking.com, where I have my itinerary for the whole tour.
Arlington, Va.: What do you think of groups like CERT.org and the SANS Institute? Are they doing a good job promoting IT security? What more could groups like this be doing?
Kevin Mitnick: I like SANS; I believe they are doing a good job. They have a good security and training curriculum. I think CERT is too slow to respond. Typically, security vulnerabilities are reported to CERT, and they have a slow turnaround time getting information to the community. I think that's why we have mailing lists like Bugtraq, to get information out to the community as soon as possible.
Brian Krebs: You mentioned "the clueless end-user." What - if anything - can be done to educate consumers about their security responsibilities? Do you think the companies who sell consumer broadband service have an obligation to notify consumers of the risks involved in ignoring good security practices?
Kevin Mitnick: I think that ISPs certainly owe some sort of advisory information to the end consumer to advise them about security. I think that the software manufacturers should step up, and also ISPs should step up and have some sort of program in place either through documentation or some sort of tool that can be used to educate their customers on best security practices. The information should be not a boring read but should be relevant, entertaining and informative to encourage people to want to read the information so they can understand it.
Not only do they have to understand it, it has to be written in a way that causes people to care. A lot of people are of the mind that they have nothing to hide, so they have no reason to secure that information. A colleague of mine, Winn Schwartau, said that they were checking for open wireless networks in his neighborhood, and his son found an open network using NetStumbler, a common tool. When they advised the owner of the wireless network that his whole hard drive was open to the world, he basically stated, "I have nothing to hide." The important thing is to motivate and encourage these people who don't care that securing their computer systems is important, not only to protect their own information but to protect the national infrastructure, because their systems can be used to compromise other systems.
Washington, D.C.: What do you think of the various proposals to allow copyright holders to use various tactics to defeat P2P sharing of intellectual property?
Brian Krebs: We've got several variations on this question.
Kevin Mitnick: I don't believe that the government should permit big business to compromise people's computer systems to ensure that they are complying with copyrights. They shouldn't have the right to intrude, because the only way to determine if there is unauthorized information is by accessing it. And obviously, unauthorized access is not a good thing.
On peer-to-peer networks, by the way, people usually advertise the information they share on their hard drives. But I don't believe companies should have a license for unauthorized access.
Brian Krebs: You said you believe CERT is too slow in getting alerts out to the larger community. CERT says the delay is often to give companies that make the software in question adequate time to come up with a fix for the problem, and that releasing the information before a patch is available will only exacerbate the problem. Do you disagree with CERT's view on this?
Kevin Mitnick: No. I believe that the software manufacturers should be notified prior to releasing exploit code or details of a hole. But historically, CERT's performance has been less than acceptable: security vulnerabilities were not disclosed to the public for months and even years. So based on this performance, I question whether or not there are better avenues to distribute this information.
Brian Krebs: The consultancy you're starting - Defensive Thinking - sounds like it's creating a niche market. Are there other firms out there that currently train companies and their employees to thwart social engineers?
Kevin Mitnick: I'm not aware of other companies that are doing it, but there are companies that will try to do social engineering as one part of a vulnerability assessment.
Alexandria, VA: Congress is exclusively directing funds toward college and university Ph.D. and master's cybersecurity programs as the fix for addressing the "human" element of cybersecurity, and largely ignoring all other avenues of IT training. Despite criticism of the President's cybersecurity plan, it recognizes "training and education" as key components. In a recent poll, 80 percent of respondents pointed to systems/network administrator "human error" as the root cause of security breaches. What are your thoughts on Congress' approach and direction of funds? How important do you think it is to train mid- to entry-level IT professionals (IT foot soldiers) in foundational cybersecurity skills? Thanks.
Kevin Mitnick: I believe training is essential, but at the same time, deployment of security technologies is also essential. And there has to be a balance; to have a good security program in place, you also have to deploy cost-effective security technologies, have education and training and good processes.
Brian Krebs: Can you talk about some of the things that should alert employees to a potential social engineering attack?
Kevin Mitnick: There are several things that can be done. For example:
The caller. If they refuse to give a call-back number, or if they come up with every excuse in the book (their cell phone is dying, they're going to a meeting, they're calling from an 800 number), those are definite red flags.
The person that's receiving the request might notice it's an out-of-ordinary request. They're not normally asked to do the types of things that the social engineer is calling about. The employee should ask "why me? I don't even know this person."
If someone claims to be management, part of the executive suite, or an authority figure, yet the request is unusual and the employee hasn't spoken to this person before.
If the caller stresses urgency - that's so the employee can't give the request much thought.
If the requester threatens negative consequences should the employee not comply, such as being fired, reported or demoted. That's an intimidation tactic.
When questioning the person making the request, the employee might be uncomfortable asking questions. A social engineer hates hearing "What's your name again?"
Being over-complimentary, flattering.
Flirting is another one. If it's a woman doing the social engineering and the target is a man, she'll flirt because he's thinking he'll score a date or something.
That's a basic overview.
Vienna, Va.: Have you had a chance to read the National Strategy to Secure Cyberspace? If so, what's your overall opinion on the strategy?
Kevin Mitnick: I haven't had an opportunity to read it.
I plan to read it shortly, and I have read media reports. Seems like it was really nothing new.
Portola Valley, Calif.: Is Social Engineering generally utilized for nefarious purposes? Are there any instances when it might be used for the "greater good?"
Kevin Mitnick: The government uses social engineering to track down criminals. Bill collectors use it to track down deadbeats. Divorced mothers use it to track down deadbeat dads. The one constant is that social engineering is all about using psychological factors and influence to get people to comply with a request.
But it has the element of deception. In an honest society, deception is a bad thing, so social engineering is more in a sense a negative practice, but it can be used for good.
What about tracking down a kidnapper? The police might have the ability to spoof their caller ID to make it seem like a co-conspirator's telephone number to track down the real kidnapper.
So, it could be used for good and could be used for evil.
Brian Krebs: The Bush administration's latest draft of its national strategy to secure cyberspace has been criticized for not requiring companies to report vulnerabilities, among other things. Some have suggested that companies won't begin to make security a top priority until the government enacts requirements to hold companies responsible for shoddy IT security. Do you think new liability laws would help, or would they only encourage companies to be less forthcoming about security problems?
Kevin Mitnick: You would think that imposing regulations to make companies come clean with the public would encourage them to comply, but there'd probably be a sense of plausible deniability: companies would simply deny that they even knew about the vulnerability.
Manassas, Va.: It appears that one strength that the hacker community has in defeating the "establishment" is that it employs widespread collaboration in the sharing of tools and techniques, i.e. best practices and vulnerability intelligence. How is society going to collaborate if we foster a default atmosphere of distrust?
Kevin Mitnick: In reality, prohibiting underground sharing will never work. I don't see how this fosters an atmosphere of distrust. If people want to share negative things, how can that be stopped?
Pretty much anything in this world can be abused, so you can't censor or suppress information just on the fear that it might be abused.
I'm not sure if this answers the question, as I don't quite understand it.
Brian Krebs: Kevin, you'll officially be a free man in January. What do you want to do first that you can't do now? What are your long-term plans?
Kevin Mitnick: First of all, I would love to read e-mail because it's such an integral part of communication these days. My girlfriend, friends and business colleagues pull down e-mail for me from a laptop. When I want to write e-mail, they press the "send" button for me. I can use computers and a local network, but I can't cause that information to be transmitted.
This is until January 21, 2003.
I'm interested in using Internet messaging technology. As for my career at the moment, I just finished authoring this book.
I'm also starting up a company as a security consultancy to help businesses mitigate the risk of security threats. Computers and technology have always been of great interest to me, and I intend to pursue a career offering my background, experience and knowledge to make this a better world for all of us.
Brian Krebs: That's about all the time we have. Thanks to everyone who made this discussion a success, and thank you, Kevin, for your time and thoughtful answers. Best of luck to you and your new business.