
Companies are tracking more data about consumers than ever. Practically every click you make online creates new records in some distant database, and your real-world actions can increasingly be tracked, too, through your mobile phone or newer commercial surveillance tools.

But the Federal Trade Commission, one of the government's chief privacy watchdogs, just warned companies to think twice about how they use those vast data troves.

The agency on Wednesday released a new report that advises companies on how to avoid hurting the most vulnerable as they push further into the booming "big data" economy. For example, lenders can't refuse to offer loans to single people or offer them less favorable terms even if big data analytics suggests single people are less likely to repay loans than married people, the report said. That would violate the Equal Credit Opportunity Act.

“Big data’s role is growing in nearly every area of business, affecting millions of consumers in concrete ways,” said FTC Chairwoman Edith Ramirez in a press release. “The potential benefits to consumers are significant, but businesses must ensure that their big data use does not lead to harmful exclusion or discrimination.”

The report, based on a workshop held by the Commission in 2014, outlines potential benefits and risks that the growth of big data poses to low-income and underserved populations. "We think of this as a natural progression from the work we did looking at the data broker industry," Ramirez said in an interview with The Washington Post. "We need to be mindful of data, which is the currency of today's economy and tomorrow's economy."

A company using algorithms to parse consumer information should consider how accurate the predictions it draws from that information are, whether its models account for bias, and whether its reliance on big data raises ethical or fairness concerns, according to the agency.
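To make that kind of check concrete, here is a minimal sketch comparing a model's accuracy across two consumer groups; the records, group names and predictions below are invented for illustration and are not drawn from the FTC's report:

```python
# Hypothetical sketch: compare a model's prediction accuracy across groups.
# All records below are invented for illustration.

records = [
    # (group, actual_outcome, model_prediction)
    ("group_a", 1, 1), ("group_a", 0, 0), ("group_a", 1, 0), ("group_a", 0, 0),
    ("group_b", 1, 0), ("group_b", 1, 1), ("group_b", 0, 1), ("group_b", 0, 0),
]

def accuracy_by_group(rows):
    """Return the fraction of correct predictions for each group."""
    totals, correct = {}, {}
    for group, actual, predicted in rows:
        totals[group] = totals.get(group, 0) + 1
        if actual == predicted:
            correct[group] = correct.get(group, 0) + 1
    return {g: correct.get(g, 0) / totals[g] for g in totals}

print(accuracy_by_group(records))
# {'group_a': 0.75, 'group_b': 0.5}
# A large accuracy gap between groups is one signal worth investigating
# before acting on a model's output.
```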

The report also outlined how some big data practices might run afoul of laws already on the books, including the Fair Credit Reporting Act, equal opportunity laws and the Federal Trade Commission Act.

But discrimination that results from big data may not always be intentional, or easy to spot, experts say. In some cases, it may stem from problems with the underlying data or from how an algorithm is designed.

"You can't ask a computer what a good employee is. You have to define what a good employee is, and in doing so you choose one definition or another that may very well bias the outcome," explained Andrew Selbst, an attorney who co-authored a paper on the subject published in the California Law Review last year.

Privacy advocates praised the FTC's report as a step toward a more concrete approach to warding off potential big data pitfalls. "The commission’s message is clear — companies must proceed with caution as they use consumer surveillance tools made possible in today’s 'Big Data' era," said Jeffrey Chester, the executive director of the Center for Digital Democracy, in a statement.

Those connected to the data industry were less enthusiastic. Hudson Hollister, the executive director of the Data Transparency Coalition, said he didn't see much new in the report and argued that the issues it highlighted weren't unique to big data.

"There's always been a risk of violating equal opportunity laws if marketing reflects bias and it crosses the line from differential marketing into differential services, but it's not a new risk," he said.

But others argue that the scope of big data has changed the landscape. "Big data allows inferences to come about in ways that weren't necessarily possible with small data," according to Selbst.

That can actually help eliminate bias in some cases, but if the people handling the data aren't careful, it can make things worse, he said. "Any data miner has to watch out for introducing new bias or exacerbating old."
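One common heuristic for that kind of watchfulness, offered here as an illustration rather than anything the report prescribes, is to compare selection rates across groups, as in the "four-fifths rule" used in employment contexts. The decisions below are invented for illustration:

```python
# Hypothetical sketch: compare selection rates across groups.
# All decisions below are invented for illustration.

decisions = [
    ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

def selection_rates(rows):
    """Return the fraction of favorable decisions for each group."""
    totals, selected = {}, {}
    for group, chosen in rows:
        totals[group] = totals.get(group, 0) + 1
        selected[group] = selected.get(group, 0) + int(chosen)
    return {g: selected[g] / totals[g] for g in totals}

rates = selection_rates(decisions)
ratio = min(rates.values()) / max(rates.values())
print(rates)   # {'group_a': 0.75, 'group_b': 0.25}
print(ratio)   # 0.333...
# Under the four-fifths heuristic, a ratio well below 0.8 flags a
# disparity worth investigating.
```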

Hayley Tsukayama contributed to this story.