The wicked problem that is today’s social ecosystem

erin malone
Nov 18, 2021

It is generally agreed upon by experts across a variety of disciplines[1] that the current dominance of a handful of giant social media companies — Facebook, Twitter, Google (YouTube), Reddit — is unacceptable and untenable. Something has to give. Research outside and within the platforms[2] shows that they have contributed to misinformation about Covid and about the US elections; incited violence in countries like Myanmar and in the US after the election, including the January 6th insurrection; and contributed to the loss of self-esteem and self-worth of a generation of girls who hold themselves up to influencers in unrealistic ways.[3]

The promise of online community — the ability to get together and share interests with others-like-you from around the world — that seemed so bright in the early 1990s has come about in all the wrong ways. Unfortunately, the extremists, conspiracy theorists, anti-vaxxers and quacks have all found each other as well, and they seem to scream louder than the rest of us. The tools we have created for ease of sharing and communicating have left us open to manipulation. Bad actors are using these spaces to spread hate and misinformation at scales and volumes no one really predicted.

The platforms are working at a scale of users and scale of engagement never imagined when I co-wrote Designing Social Interfaces the first time in 2008/9. Our goal as authors was to create more usable spaces that encouraged people to share with others and provide tools for designers to create welcoming safe spaces.

Escalating spiral of hate: a trigger provokes a response, the desire to share that response spreads it, and the shared response in turn triggers more negative reactions. Social media platforms grow and thrive on this engagement with reactionary and incendiary content.

Unfortunately, our emphasis on reducing friction for people, increasing virality and engagement, became a trap for a business model out-of-control at the expense of the people who fuel it. The collection of data — demographics plus all those interests bringing people together — provided further incentives for the platforms to double down on increasing engagement. Clicks sell ads and rage incites engagement.

We soon saw algorithms taking over homepage feeds, displacing posts by people we care about and posts ordered by recency. Interest- and click-driven "more like this" recommendations replaced posts from friends and family. Settings that tell the system you want to see items chronologically, or only from friends you have flagged, are continually ignored in the service of promoting items that incite rage and anger, further triggering people to lash out and creating a vicious cycle of hostility and toxicity.

The question now being asked in conferences, in webinars, in classrooms, in civil society organizations, and in legislatures around the world is: how do we fix this? Do we tear it all down and start over, knowing what can happen? Do we break these companies apart into smaller, more manageable, high-touch, personally moderated spaces? Do we throw more computing power at them? More moderators? We know that moderating the posts that make it through the tech is harmful to human moderators. They burn out, they commit suicide, they suffer from PTSD. People are horrible, and this first line of defense suffers because of it.[4]

We see that when groups are de-platformed, they pop up on other, smaller platforms and don't really dissolve. But being kicked off the major platforms can throttle their amplification even if it doesn't dissolve the group. Societal fears and issues underlie these groups, and until those are addressed, the groups aren't going away. Holding ISPs and hosting services accountable is another way to slow these groups down.

Some have proposed breaking the big platforms into their smaller components and open sourcing elements like identity[5] (take your identity with you wherever you go and present it how you see fit); the social graph (make the graph portable too, so that as you move from place to place you bring your network with you and take it away when you leave[6]); and user-driven algorithms (let users decide what's important to them and what they would like amplified in their recommendations and in their feeds).
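To make the "user-driven algorithms" idea concrete, here is a minimal, hypothetical sketch of what a user-controlled feed ranker could look like. Nothing here reflects any real platform's API; the `Post`, `FeedPreferences`, and scoring formula are all illustrative assumptions. The key point is that the weights belong to the user, not the platform.

```python
from dataclasses import dataclass, field
import time

@dataclass
class Post:
    author: str
    topics: set
    timestamp: float  # seconds since epoch

@dataclass
class FeedPreferences:
    # Weights are chosen and adjusted by the user, not by the platform.
    recency_weight: float = 1.0
    friend_weight: float = 2.0
    topic_weights: dict = field(default_factory=dict)  # e.g. {"gardening": 1.5}
    friends: set = field(default_factory=set)

def score(post, prefs, now=None):
    """Score one post against the user's own stated preferences."""
    now = now if now is not None else time.time()
    hours_old = max((now - post.timestamp) / 3600, 0.0)
    s = prefs.recency_weight / (1.0 + hours_old)  # newer posts score higher
    if post.author in prefs.friends:
        s += prefs.friend_weight                  # boost people the user chose
    # Add the user's weight for each topic the post covers (default 0).
    s += sum(prefs.topic_weights.get(t, 0.0) for t in post.topics)
    return s

def rank_feed(posts, prefs, now=None):
    """Order the feed by the user's preferences, highest score first."""
    return sorted(posts, key=lambda p: score(p, prefs, now), reverse=True)
```

In this sketch, setting `friend_weight` high and leaving engagement-style topic weights at zero approximates the "posts from people I care about" feed users keep asking for; a transparent formula like this could also be inspected or swapped out entirely, which is the portability argument in miniature.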

Creating an anti-hate social pattern library as a counter to traditional pattern libraries offers recommendations for how a specific interaction might mitigate hate and, in most cases, includes supporting research for the interaction direction[7]. Using design techniques and design patterns extends some of the responsibility for making social experiences safer beyond policy and into the realm of the user experience designer and developer. A pattern library won't solve all the problems, but it will slow the speed at which hate and harassment are amplified and bring ethical considerations into the design process.

Social platforms and experiences are just one part of the ecosystem. Societal concerns, education, early interventions and counter-speech all need to be considered when thinking about how to make online hate less pervasive. Offline work needs to happen earlier in the cycle to prevent some of these beliefs from taking root in the first place. Trauma and harm to impressionable children and adolescents need to be addressed so that the backlash in the form of hate and bullying stops before it can start.

As Donella Meadows wrote in her essay Dancing with Systems,
“The future can’t be predicted, but it can be envisioned and brought lovingly into being. Systems can’t be controlled, but they can be designed and redesigned. We can’t surge forward with certainty into a world of no surprises, but we can expect surprises and learn from them and even profit from them. We can’t impose our will upon a system. We can listen to what the system tells us, and discover how its properties and our values can work together to bring forth something much better than could ever be produced by our will alone.”[8]

The solutions to this wicked problem lie across disciplines, across areas of expertise, and across countries with varying laws, and it will take concerted effort in all of these dimensions to see change at scale.

[1] Sociologists, psychologists, policy experts, lawyers, academics, researchers among the many disciplines studying social media.
[2] Facebook whistleblower Frances Haugen recently shared internal research showing that the platform's policies and interactions were hurting its users. David Pierce and Anna Kramer, "Here Are All the Facebook Papers Stories," Protocol — The people, power and politics of tech, October 25, 2021, https://www.protocol.com/facebook-papers.
[3] Charles Riley, “Instagram Says It’s Working on Body Image Issue after Report Details ‘Toxic’ Effect on Teen Girls,” CNN, Business (CNN, September 15, 2021), https://www.cnn.com/2021/09/15/tech/instagram-teen-girls/index.html.
[4] Casey Newton, “The Secret Lives of Facebook Moderators in America,” The Verge (The Verge, February 25, 2019), https://www.theverge.com/2019/2/25/18229714/cognizant-facebook-content-moderator-interviews-trauma-working-conditions-arizona.
[5] Kaliya Young, "Thoughts on 'Universal ID,'" Identity Woman, October 14, 2020, https://identitywoman.net/thoughts-on-universal-id/.
[6] There are ongoing discussions as part of the BlueSky community about social graph portability, universal identity, and other deconstructed components and elements in the creation and use of social experiences. Unfortunately, many of these discussions have been going on for as long as we have been creating social and community software. They run up against commercial enterprises that trade data about their users for profit; the information contained in identity and the social graph currently holds immense capital value for these businesses.
[7] ADL’s Social Pattern Library, http://socialpatterns.adl.org, November 18, 2021. Designed specifically to help mitigate hate and harassment in online social platforms and to provide designed micro-interactions as a reference and toolset for UX designers and developers.
[8] Donella Meadows, “Dancing with Systems,” The Academy for Systems Change, n.d., https://donellameadows.org/archives/dancing-with-systems/.


erin malone

I co-wrote Designing Social Interfaces. I like to make models to explain complex systems. I design things. I take a lot of pictures.