
imaginedsoldier: the-tired-tenor: tankies: Me:...



imaginedsoldier:

the-tired-tenor:

tankies:

Me: *crying*

Alexa: This seems sad, now playing Despacito

Y’all need to have a greater degree of 1) healthy suspicion of Alexa and other corporate surveillance devices (“personal assistants”), and 2) understanding of how dangerous this kind of algorithm is in the hands of a multinational company (or anyone, for that matter).

To begin with, that data is both available for sale and able to be subpoenaed by the government. Alexa’s records and recordings have already been used in criminal trials. In the US, a digital record of your emotional patterns can be used to deny you housing and jobs, and to rule on your ability to exercise your basic rights. Psychiatric stigma and misdiagnosis can already be wielded against you in legal disputes; in that light, a listening device capable of identifying signs of distress for the purpose of marketing to you should be all the more clearly concerning.

Moreover, we have already seen algorithms like this used on Facebook and other “self-reporting” (read: user input) sites to identify the onset of a manic episode [1] [2] [3], which has subsequently been linked to identifying vulnerable (high-spending) periods in order to target ads at those users, perhaps most famously to sell tickets to Vegas (described in a TED Talk by techno-sociologist Zeynep Tufekci, where she more generally discusses how algorithms shape our online experiences to suggest and reinforce biases).

The notes on this post are super concerning: we are being marketed to under the guise of having our emotional needs attended to, by the same people who inflicted that emptiness on us, and everyone is just memeing.

