‘Unethical’? Facebook Denies Selling Ads Based On Users’ Emotions
Yesterday ‘The Australian’ published a report claiming that the social media giant uses algorithms that allow it to target advertising at emotionally vulnerable users, including those as young as 14.
The report is based on a confidential internal document describing how, by monitoring posts, pictures and interactions, Facebook’s real-time algorithms would be able to identify “moments when young people need a confidence boost.”
The report claims that Facebook allows advertisers to target an audience based on its algorithmic understanding of users’ emotional states, from feeling “stressed”, “defeated”, “overwhelmed”, “anxious”, “nervous”, “stupid”, “silly”, “useless”, to even feeling like a “failure”. Furthermore, the document stated that this detailed information is “shareable under non-disclosure agreement only.”
TechCrunch contacted Facebook to confirm the newspaper’s claims, and a spokesperson responded that the report was “misleading”, denying any practice of providing advertisers a way to target users according to their emotional well-being.
The spokesperson added that “the premise of this article is misleading. We do not offer tools to target people based on their emotional state. The analysis done by an Australian researcher was intended to help marketers understand how people express themselves on Facebook. It was never used to target ads and was based on data that was anonymous and aggregated.”
However, Facebook has yet to deny the existence of the document, or the social giant’s ability to infer its users’ emotional states from their online activity.
Facebook also reportedly apologized earlier, stating that it had begun an internal investigation in order to “understand the process failure and improve our oversight.” Facebook added, “We will undertake disciplinary and other processes as appropriate.”
This is not the first time that Facebook has come into the limelight for potentially ‘unethical’ practices. Back in 2014, reports emerged that Facebook had, without any consent, performed an experiment on around 700,000 users to see whether it could influence their feelings. The exercise, carried out in 2012, involved showing users more positive or negative news in order to influence their mood.
Last year, the company also faced criticism for its ad targeting practices once it became known that it had been allowing advertisers to target based on a user’s ethnicity – raising further concerns about discriminatory advertising. Following the negative press, Facebook removed the feature for specific categories, namely housing, employment and credit-related ads.