Jolly Roger
2019-07-27 20:41:23 UTC
<https://techcrunch.com/2015/09/11/apple-addresses-privacy-questions-about-hey-siri-and-live-photo-features/>
Being able to say the phrase at any time to activate Siri is convenient,
but it raises some questions about what Apple means by 'listening' and
whether any of that audio is recorded.
Hey Siri is an optional feature that is enabled by an opt-in step in iOS
9's setup. You can choose never to enable it. If you do enable it,
nothing is ever recorded in any way before the feature is triggered.
"In no case is the device recording what the user says or sending that
information to Apple before the feature is triggered," says Apple.
Instead, audio from the microphone is continuously compared against the
model, or pattern, of your personal way of saying 'Hey Siri' that you
recorded during setup of the feature. Hey Siri requires a match to both
the 'general' Hey Siri model (how your iPhone thinks the words sound)
and the 'personalized' model of how you say it. This is to prevent other
people's voices from triggering your phone's Hey Siri feature by
accident.
Until that match happens, no audio is ever sent off of your iPhone. All
of that listening and processing happens locally.
"The 'listening' audio, which will be continuously overwritten, will be
used to improve Siri's response time in instances where the user
activates Siri," says Apple. The keyword there being 'activates Siri.'
Until you activate it, the patterns are matched locally, and the buffer
of sound being monitored (from what I understand, just a few seconds) is
being erased, un-sent and un-used -- and unable to be retrieved at any
point in the future.
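The scheme described above — a short rolling buffer that is continuously overwritten, with audio leaving the device only after both the general and the personalized model match — can be sketched roughly like this. To be clear, this is a hypothetical illustration, not Apple's actual code: the buffer size, frame rate, and the `general_model`/`personal_model` callables are all assumptions standing in for the real acoustic models.

```python
from collections import deque

BUFFER_SECONDS = 2       # assumed; Apple only says "a few seconds"
FRAMES_PER_SECOND = 100  # hypothetical frame rate

class WakeWordDetector:
    """Rolling buffer plus two-stage match, as the article describes it."""

    def __init__(self, general_model, personal_model):
        # Fixed-size buffer: the oldest frames are silently overwritten,
        # never stored anywhere else.
        self.buffer = deque(maxlen=BUFFER_SECONDS * FRAMES_PER_SECOND)
        self.general_model = general_model    # "how the words sound"
        self.personal_model = personal_model  # "how *you* say them"

    def feed(self, frame):
        """Feed one audio frame; return True only on a full two-stage match."""
        self.buffer.append(frame)  # oldest frame falls off the far end
        audio = list(self.buffer)
        # Both models must agree before anything would leave the device.
        if self.general_model(audio) and self.personal_model(audio):
            return True   # only at this point is audio sent to the server
        return False      # otherwise the buffer just keeps being overwritten
```

The point of the `deque(maxlen=...)` is that discarding old audio is not a separate cleanup step — it is a property of the data structure, which matches the "continuously overwritten" claim in Apple's statement.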
Of course, as has always been the case with Siri, once a match is made
and a Siri command is sent off to Apple, it's associated with your
device using a random identifier, not your Apple ID or another
personalized piece of info. That information is then 'approved' for use
in improving the service, because you've made an explicit choice to ask
Apple's remote servers to answer a query.
"If a user chooses to turn off Siri, Apple will delete the User Data
associated with the user's Siri identifier, and the learning process
will start all over again," says Apple.
Meanwhile, at Amazon and others...
<https://www.bloomberg.com/news/articles/2019-04-10/is-anyone-listening-to-you-on-alexa-a-global-team-reviews-audio>
Amazon Workers Are Listening to What You Tell Alexa
A screenshot reviewed by Bloomberg shows that the recordings sent to the
Alexa reviewers don't provide a user's full name and address but are
associated with an account number, as well as the user's first name and
the device's serial number.
Occasionally the listeners pick up things Echo owners likely would
rather stay private: a woman singing badly off key in the shower, say,
or a child screaming for help. The teams use internal chat rooms to
share files when they need help parsing a muddled word--or come across
an amusing recording.
<https://www.dailymail.co.uk/sciencetech/article-6910791/Alexa-listening-conversations.html>
Alexa IS listening to your conversations: Web giant ADMITS clips are
analysed by Amazon workers - including your most intimate moments
<https://www.dailymail.co.uk/sciencetech/article-6956531/Amazon-employees-listening-Alexa-recordings-customers-live.html>
Amazon employees listening to your Alexa recordings can also easily find
customers' home addresses, report claims
--
E-mail sent to this address may be devoured by my ravenous SPAM filter.
I often ignore posts from Google. Use a real news client instead.
JR