Why, oh why, are we allowing our privacy to be trampled so? That’s a question that we should all be asking ourselves in 2019.
Social media, and the internet as a whole, is a powerful tool whose impact we likely haven’t fully grasped. We are now functionally omniscient beings, able to reach into our pockets and learn just about anything, a step forward for humanity the likes of which we have never experienced before. Need a recipe for parmesan-crusted salmon? Ask the internet. Trying to make a bomb? Also, just ask the internet.
Freedom of information is fantastic, but it comes with a responsibility that we haven’t yet grasped. And by “we”, I mean everyone – including the corporate megaliths around which the rest of the world wide web orbits.
That means that Google, Amazon, Facebook, Apple, and others are constantly, unabashedly harvesting your data in an attempt to make a buck. This sounds nefarious, because it is, and we have led ourselves to this slaughter by continuing to use products from these corporations without a care in the world about what’s buried in those lengthy, fine-print “terms and conditions”.
As it turns out, our digital complacency has allowed these companies to exploit us in ways we would never have imagined, including allowing our phones to listen to us at our most vulnerable and intimate moments.
Apple contractors regularly hear confidential medical information, drug deals, and recordings of couples having sex, as part of their job providing quality control, or “grading”, the company’s Siri voice assistant, the Guardian has learned.
Although Apple does not explicitly disclose it in its consumer-facing privacy documentation, a small proportion of Siri recordings are passed on to contractors working for the company around the world. They are tasked with grading the responses on a variety of factors, including whether the activation of the voice assistant was deliberate or accidental, whether the query was something Siri could be expected to help with and whether Siri’s response was appropriate.
Apple says the data “is used to help Siri and dictation … understand you better and recognise what you say”.
Here’s where it gets even scarier:
A whistleblower working for the firm, who asked to remain anonymous due to fears over their job, expressed concerns about this lack of disclosure, particularly given the frequency with which accidental activations pick up extremely sensitive personal information.
Siri can be accidentally activated when it mistakenly hears its “wake word”, the phrase “hey Siri”. Those mistakes can be understandable – a BBC interview about Syria was interrupted by the assistant last year – or less so. “The sound of a zip, Siri often hears as a trigger,” the contractor said. The service can also be activated in other ways. For instance, if an Apple Watch detects it has been raised and then hears speech, Siri is automatically activated.
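The activation behavior described above boils down to a couple of simple rules: wake when the detector believes it heard “hey Siri” with enough confidence, or, on an Apple Watch, wake on a raise gesture followed by speech. Here is a minimal, purely illustrative Python sketch of that logic. Apple’s real detector is a proprietary on-device model, and every name here (AudioEvent, siri_should_activate, the 0.5 threshold) is invented for the example.

```python
# Hypothetical sketch only; not Apple's code. It models the activation
# behavior described in the Guardian's reporting.

from dataclasses import dataclass

@dataclass
class AudioEvent:
    heard_wake_word: bool   # the detector *believes* it heard "hey Siri"
    confidence: float       # wake-word detector confidence, 0.0 to 1.0
    wrist_raised: bool      # Apple Watch raise gesture detected
    speech_present: bool    # speech heard right after the gesture

def siri_should_activate(event: AudioEvent, threshold: float = 0.5) -> bool:
    """Return True when the assistant would start recording.

    An imperfect detector with a permissive threshold is exactly what
    produces the false triggers described above: a zip or the word
    "Syria" can score just above the cutoff and wake the assistant.
    """
    if event.heard_wake_word and event.confidence >= threshold:
        return True
    # Raise-to-speak on the watch: no wake word required at all.
    if event.wrist_raised and event.speech_present:
        return True
    return False

# A zip misheard as "hey Siri" with middling confidence still activates:
print(siri_should_activate(AudioEvent(True, 0.55, False, False)))  # True
# A raised watch while anyone nearby is talking also activates:
print(siri_should_activate(AudioEvent(False, 0.0, True, True)))    # True
```

The takeaway from the sketch is that every false positive becomes a recording, and under the grading program described above, some fraction of those recordings ends up in front of a human contractor.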
The news comes after months of concern over Amazon’s Alexa-powered smart-home devices, which have been linked to a number of embarrassing and potentially damaging leaks of recordings made within the supposed sanctity and sovereignty of our own abodes.