Why you shouldn’t trust Facebook so blindly
Would you tell Facebook you’re happy to see all the bared flesh it can show you? And that the more gratuitous violence it pumps into your News Feed the better?
Finding out where a person's 'line' lies on viewing potentially controversial types of content is now on Facebook's product roadmap, as explicitly stated by CEO Mark Zuckerberg in a lengthy blog post last week, not-so-humbly entitled 'Building a global community'.
Make no mistake, this is a huge shift from the one-size-fits-all 'community standards' Facebook has peddled for years, a policy that has crashed into controversies of its own when, for example, the site disappeared an iconic Vietnam war photograph of a naked child fleeing a napalm attack.
In last week's wordy essay, Zuckerberg generally tries to promote the grandiose notion that Facebook's future role is to be the glue holding the fabric of global society together, even as he fails to flag the obvious paradox: that technology which helps amplify misinformation and prejudice might not be so great for social cohesion after all. In the same piece, the Facebook CEO sketches out an impending change to community standards that will see the site actively ask users to set a 'personal tolerance threshold' for viewing various types of less-than-vanilla content.
On this Zuckerberg writes:
The idea is to give everyone in the community options for how they would like to set the content policy for themselves. Where is your line on nudity? On violence? On graphic content? On profanity? What you decide will be your personal settings. We will periodically ask you these questions to increase participation and so you don’t need to dig around to find them. For those who don’t make a decision, the default will be whatever the majority of people in your region selected, like a referendum. Of course you will always be free to update your personal settings anytime.
With a broader range of controls, content will only be taken down if it is more objectionable than the most permissive options allow. Within that range, content should simply not be shown to anyone whose personal controls suggest they would not want to see it, or at least they should see a warning first. Although we will still block content based on standards and local laws, our hope is that this system of personal controls and democratic referenda should minimize restrictions on what we can share.
A subsequent paragraph caveats that Facebook's in-house AI cannot yet automatically identify every type of (potentially) problematic content. Though the engineer in Zuck is apparently keeping the flame of possibility alive by declining to state the obvious: that … continue reading here.
Content Source: TechCrunch
Image Source: EDM Tunes, She Leads Africa