Apple CEO Tim Cook talks about the newest iPhones during an Apple launch event in San Francisco last week. (EPA/Monica Davey)

Apple's digital assistant gets an upgrade in the newest version of iOS. Users will no longer have to press a button to summon Siri; instead, they can set their smartphones to constantly listen to their conversations -- just waiting for an opportunity to give directions or tell a joke.

Convenient? Sure. But privacy advocates worry that this type of technology could become a hazard that users aren't prepared for.

"When you enter the realm of always-on devices, there are real privacy implications that need to be addressed," said Marc Rotenberg, executive director of the Electronic Privacy Information Center (EPIC). "Even if the user of the device consents, it doesn't necessarily mean other people who are there consent to the routine recording of everything they might say."

As technology makes our lives more convenient, it is also getting more personal. Digital assistants can sometimes seem like your best friend. Google Now, an app for Android devices, can give you tips on how to avoid rush hour traffic. Amazon's Echo speaker can turn on your favorite tunes. Even your Xbox One may just be waiting for you to tell it what to do.

So maybe it's time to ask, "Hey Siri, are you keeping my secrets?"

In previous versions, users had to turn on Siri manually by pushing the home button on their phone, or they could activate the "always listening" function while their phone was charging. Now, users have the option of letting Siri listen all the time -- even when they're not charging their phone.

When the feature is enabled, Siri listens to everything you and those around you say, Apple says -- constantly comparing what it hears to the trigger phrase "Hey, Siri." That comparison is done locally on users' devices, according to the company.

But when Siri recognizes the user saying the trigger phrase, an encrypted audio clip of the request that follows is sent to Apple.

"In no case is the device recording what the user says or sending that information to Apple before the feature is triggered," Apple told The Washington Post in a statement.

Apple automatically assigns every iPhone a random identifier that helps mask the user's identity, the company says. That allows Siri to learn users' quirks. And if a user disables Siri, the data associated with the device is deleted, according to Apple. If the user wants to start using Siri again, the learning process starts over fresh.

Making sure iPhone users feel protected is important to Apple, which has used customer privacy to differentiate itself from the online advertising-driven business model of other tech companies like Google. And the way Siri handles trigger words, processing them on the device itself, is one of the better approaches from a privacy perspective, according to Rotenberg.

But not all tech companies are forthcoming about how they handle such data, and some consumers are not fully aware of the privacy risks posed by this new generation of listening tech, Rotenberg said. Earlier this year, EPIC sent a complaint to the Department of Justice and the Federal Trade Commission over the privacy policy of Samsung Smart TVs. The voice recognition technology used by the televisions could help users track down good movies, but it was also transmitting voice commands to third parties insecurely, according to security researchers.

In some cases, data collected by digital assistants may end up in places consumers don't expect. Earlier this year, several tech news sites reported that people were being paid pennies by companies to listen to audio that appeared to be from smartphone-based virtual assistants and verify text transcripts, presumably to improve their accuracy.

Apple did not immediately respond to an inquiry about whether it uses such third-party services.

The privacy concerns sparked by digital assistants are a reaction to a trend toward digital documentation that goes back to the dawn of the commercial Web. Much of what we do online can be tracked, sometimes by the Web sites themselves and sometimes by the specialized data-mining trackers that have colonized much of the Web. If you use a computer network at school or work, that can be monitored, too.

The information often becomes part of the fundamental trade-off that powers much of the Web: The data gets used to power targeted advertising that funds the free services (e-mail, social networks and access to news sites) that users have come to expect. And with the rise of social media, consumers started offering up more and more information about themselves for harvest.

Those digital breadcrumbs can be used to draw some very detailed conclusions about a person's life, potentially revealing everything from where they live, what they do and the issues they care about to potential medical conditions and sexual preferences.

The mobile revolution has taken things even further: Devices that learn about consumers are in their pockets, complete with cameras, microphones, address books and programs that promise to use the digital world to help people navigate the real world.

Now with the looming "Internet of Things," those sorts of capabilities are moving into even the most mundane of devices -- from refrigerators to Barbie dolls -- enveloping consumers in a Web of convenience that could connect nearly every aspect of their lives.

Siri appears to be at the center of how Apple is approaching this new era -- even the new Apple TV comes with a button you can press to use Siri's voice control features to help you find a show. And given Apple's commitment to not selling data about its users to advertisers, some may be optimistic about the position the company is staking out in this new market.

But even if Apple and other companies don't share the data created by this new tech, that information will still exist, and it's hard to predict how it might be used down the line. So by engaging with this new wave of technology, consumers are adding to the data troves that increasingly define their lives -- and ultimately trusting that corporations will keep that data secure.