This is a summary post of an article published open access in Information, Communication & Society.
What reactions and rationales do Facebook users have when confronted with their algorithmically generated profiles? We surveyed 292 US-based users, directed them to the “Your Interests” and “Your Categories” sections of their Facebook profiles, and then asked open-text questions such as “What kind of data do you think Facebook has about you?” and “How do you think Facebook inferred these categories?” Buried deep in the user preferences, these lists of inferred interests and categories reflect Facebook’s attempts at transparency while substantially simplifying the practice of algorithmic profiling.
We found a mixed picture of emotional reactions and a broad range of folk theories, dominated by uncertainty and speculation about Facebook’s practices. Some participants expressed surprise at the extent of personalized attributes offered to advertisers. An opposing theme was “algorithmic disillusionment”: a sense of disenchantment, of feeling underwhelmed, or of being merely amused rather than worried.
As three participants put it:

“I think the amount of data Facebook collects is terrifying.”

“It says some of my interests are motherhood (I’m not a parent) and cats (I hate cats, dogs are where it’s at).”

“I think Facebook probably infers my political viewpoints based off of half the people I went to school with constantly posting political stuff all the time, including my age group and the Twin Cities is overall liberal too. Not sure how Facebook determines who is multicultural. I have black family members and a few close black friends that I interact with on Facebook, so I suppose that’s why.”
Facebook is often portrayed as extremely data-rich, powerful, technologically sophisticated, and manipulative (e.g., in ‘The Social Dilemma’ but also in academic papers), so many users may have developed an image of Facebook more advanced than what they were actually confronted with. A dominant theme in users’ responses was uncertainty about what data Facebook has and how ads are allocated, which relates to concepts such as privacy cynicism.
These user narratives, based on incomplete or unavailable information, may help corporations continue to normalize their data practices and blur the lines of what users think is permissible under various regulations, including data protection law. Perceptions of algorithmic profiling are confined to what users know, or think they know, about platforms’ inner workings. Would disillusionment change with better information, or when more personally significant decisions based on inaccurate information become evident?
Social media platforms are increasingly a critical social infrastructure: users face a difficult choice between giving up an important part of their social life and having their expectations of privacy and personal autonomy violated.