As the digital economy has exploded, tech companies are collecting untold amounts of data on everyday Americans. At the center of the discussion of how to protect that information is the Federal Trade Commission, which has increasingly played the role of the country's top cop on digital privacy and security.

Edith Ramirez, a Harvard Law School classmate of President Obama, took the helm of the agency in 2013. Since then, it has secured settlements against tech giants, including Google and Apple, for allowing children to make mobile purchases within apps on their platforms, and against Snapchat for promising that photos taken with its service would disappear forever (they don't).

The job is only getting tougher as Americans grow more dependent on smartphones and tech companies wield more power in Washington.

The FTC has been doubling down on its efforts, in part, by opening a new research office to bolster its technical expertise. It also plans to host public workshops examining "sharing economy" companies such as Uber and Airbnb, as well as advertisers' ability to follow users online from their computers to their smartphones.

Ramirez sat down with The Washington Post to talk about what's next in the FTC's quest to keep consumers - and their data - safe. This interview has been edited for length and clarity.

Q: Oftentimes the FTC is described as the government's de facto privacy cop. Do you see it that way?

A: We absolutely are in my mind the key cop on the beat when it comes to privacy. We do a very effective job on enforcement and are also thinking on the policy side. It's very important for us to stay on top of technological developments, so we're not only thinking about what's happening today and ensuring companies are complying with the law, but also about what companies will do tomorrow.

These are issues that are complex and challenging. We want to deal with them in a way that allows companies to innovate and new players to develop new products and lead new frontiers. But how do we allow that to happen while at the same time make sure that consumers are in a marketplace they can trust?

Q: The commission's enforcement in the area of privacy and data security comes primarily from its authority to guard consumers from deceptive and unfair practices. Could you explain how that oversight applies to technology?

A: In terms of the deception principle, it's really very simple: We expect companies that make promises to actually fulfill those promises. If a company makes a particular promise in its privacy policy or through some other mechanism, we expect it to comply.

Similarly, when it comes to data security, if a company makes a particular promise to consumers about providing reasonable protections, we expect them to fulfill that promise. It's quite a simple test, and we've used it very effectively because we find that companies say things about their practices and don't follow through.

In terms of our unfairness authority, there the test is a bit different. It's basically: Has something caused consumers significant harm that they could not have reasonably avoided, and that isn't outweighed by some other benefit either to consumers or to competition?

One core example is the area of data security - we think a company's failure to provide reasonable data protections constitutes an unfair practice, because we think it's a reasonable expectation for a consumer. If a company is making use of personal financial information, they ought to have appropriate protections in place to make sure that information isn't compromised.

Q: It seems that companies facing FTC enforcement often get their first strike free, then get slapped with fines the next time, once they are under an enforcement order. Is that generally how it works?

A: As a general matter we don't have civil penalty authority - we can't simply fine a company because it failed to comply with Section 5 [of the FTC Act, which contains the authority to protect consumers from unfair and deceptive practices]. If, however, a company is under order [from the FTC] and it violates the order, at that point we do have civil penalty authority.

The commission as a whole has been urging Congress to enact data security legislation, and as part of that we believe we ought to have civil penalty authority. There are other areas where Congress has given us that specific authority, although privacy and data security isn't one of those areas right now.

What we can do, however, is seek monetary relief as redress for consumers - so even a company we don't have an order against may still be subject to a judgment that encompasses financial penalties. A lot of our privacy cases rely primarily on injunctive relief - where we mandate that the company put in place comprehensive privacy programs and also enjoin it from similar actions in violation of the FTC Act going forward.

Q: How has the FTC's approach to technology evolved as that technology has evolved? Where do you think that's going next?

A: One of our responsibilities has always been to stay on top of evolving business models. We were certainly looking at online commerce when the Internet first became popular.

Back in 2000, only a small percentage of Americans used mobile devices. Today, more than 60 percent are using smartphones.

As a consequence of that, we've been increasingly placing priority on making sure that consumer protection extends beyond the brick-and-mortar world and into the mobile ecosystem. We brought a number of cases, even just in 2014, that emphasized to companies the need to ensure that consumers have adequate information about purchases and that disclosures are made effectively on mobile devices.

Q: Do you think consumers are generally aware of the trade-offs they're making when it comes to privacy?

A: Most of us carry our phones all the time and that means a lot of information is being collected. That brings a lot of benefits to the consumer, but then also raises certain risks.

You might purchase a smartbed that could monitor your heart rate and your respiration, as well as capture snoring patterns. It might also permit you, from the comfort of your bed, to lock your doors or turn off your lights.

Not only is this smartbed collecting a lot of health information, it's also now providing connectivity that could raise security issues.

We want to highlight what those risks are, but also think about ways to mitigate those risks. We encourage companies to think hard about privacy and data security from the get-go. From the time they conceive a service or product, we want them to be thinking about how to incorporate protections.

Q: How has your background in corporate law influenced your approach to enforcement?

A: Having been both on the side of defending companies and on the side of enforcement gives you an important perspective - so it's not just one side pitted against the other. I think it's important to understand that the vast majority of companies do want to comply with the law. You need to have a constructive relationship with them and provide guidance.

Learning how to assess a case, learning how to evaluate whether it's appropriate to move forward with an enforcement action, determining what type of relief is needed - these are all things I was familiar with from my days as a litigator.

At the same time, being in an agency that places such an importance on policy and research has been new terrain for me, but it is something I feel is valuable.

Q: Do you feel like your background in litigation played into the commission's decision not to pursue antitrust action against Google for its search practices?

A: I'm not going to get into the Google situation. We issued a statement that articulated our thinking and why it was that we felt it appropriate to close that particular investigation.

But as a general matter, the first question we ask when we are determining whether to bring an enforcement action is: What is the right outcome here? Has there been a violation of the law? Sometimes we deal with issues that are very complex, but that's the first and foremost question that is in my mind when I'm helping to decide whether we ought to proceed.