Two Serious Issues for the Future of Social Media

Last year, I was working as a social media director in the personal development industry. In January, we were running a month-long course dedicated to making the most of 2020 and the new decade that had just dawned.

My team and I decided to use a Facebook group to create an element of community, where course participants could engage with each other, and where we could share content.

The group was private— each membership request was reviewed after the applicant answered questions specific to the program we were running. It quickly swelled to more than 55,000 members, with moderation left up to our social media coordinator and me.

I have never been more woefully unprepared for a job in my life than I was moderating that group, and it presented a myriad of uncomfortable and, at times, seemingly unanswerable questions.

When the devastating Australian wildfires broke out in early January, the group was flooded with heartbreaking stories and photos— more than 500 of them in the course of two days. The participants were a global group, so many of them were personally affected.

And then came the complaints:

The posts were depressing. They weren’t about meeting goals. There were too many of them.

We had our first tough decision to make: while the fires were undoubtedly horrifying, the purpose of the group we’d made was to support program participants and their goals for 2020. We ultimately decided to add a rule that posts must stay on topic.

Then came the angry messages.

We were “soulless, heartless people.” We “didn’t care about Australia or wildlife.” We were callous, uncaring, and biased. That was the perception of us.

Shortly after the wildfires, Kobe Bryant tragically passed away. As tributes poured in, heated discussions of past accusations of sexual assault appeared in the comments.

Again, we had to remove posts and ban members who hadn’t followed our group rules. But then we were accused of being inconsistent and biased— some posts that broke the rules remained up. The truth was that our small team couldn’t possibly catch everything— over 2,000 new posts were coming in a day.

Another post expressed anger that the anniversary of the Holocaust was not getting the same amount of news coverage as Kobe’s death, and those comments quickly turned hostile too.

The woman who posted it personally messaged me on Facebook, called me anti-Semitic for taking down the post, and said she’d be bringing her story to the media.

No one understands the difficulties of content moderation better than those who have had to do it— which is why the events of this week came as a shock, but not a surprise.

Predictable Surprises

The day of the tragedy at the US Capitol building that left five dead is burned into my mind alongside September 11th.

In the aftermath of this week, much like in the weeks following the collapse of the Twin Towers, many people are shocked and horrified, but not surprised.

The ADL released a detailed report in the days before the riot that catalogued the extremist groups openly planning to be there, the specific threats being discussed, and the potential for violence.

We now know that these attacks were planned in the open. Yes, on dark corners of the web, but also on places like Twitter, Facebook, and YouTube.

The threat of more violence in the upcoming weeks led Twitter to permanently ban Donald Trump’s account, with Facebook and Instagram suspending him until the inauguration.

But is it enough? Without fundamental change from top to bottom, will we be here in another four years (or another week)?

I don’t know the answer to that. But I do know two areas of focus we need to address going forward: the serious role of content moderation, and the unilateral, unchecked decision-making power of social media platforms.

Underfunded and Under-Resourced

One of the most important jobs in social media has been largely ignored, undervalued, and outsourced.

Social media managers and content moderators are the first line of defense on the battlefield of social media. The comparison isn’t a coincidence— many of them develop PTSD from what they see.

Following the heartbreaking reporting by The Verge on the realities of working in these moderation centers, Facebook agreed to pay $52 million to moderators for unsafe working conditions that damaged their mental health.

The settlement included other changes, like weekly sessions with mental health professionals, improved tools that mute audio and change video to black and white, and screening applicants for emotional resiliency.

These are steps in the right direction, but more needs to be done to take seriously the role of content moderation in social media.

Jobs in social media typically command little respect, both within the companies that depend on them and among the broader public. How many times have you seen a social media mess-up blamed on the intern?

Contracted content moderators at Facebook were making a little over $28,000 a year, a fraction of the median $280,000 Facebook employees make in salary, stock, and bonuses.

Content moderation is a serious job, and it’s time we take it as seriously as we do sales, marketing, or engineering.

Too Big to Fail

In the coming weeks, there will likely be investigations into how Wednesday’s events were both planned in plain sight and able to occur.

To some extent, we already know— this isn’t a new phenomenon.

In a 2018 report, Facebook admitted that they were unable to prevent the spread of disinformation used to “foment division and incite offline violence” in Myanmar.

Though Facebook and most other social media platforms have made major changes following the 2016 US election, those changes haven’t moved the needle. Social media is still being used to spread false information and plan violent acts.

Facebook, with 2.2 billion members, is larger than any country in the world. Decisions made in a boardroom in California impact the majority of businesses around the world in some way.

It’s time to admit that the problem is too big and complex for tech CEOs to solve on their own, and that we shouldn’t want them to.

I applaud Twitter’s decision to permanently ban Donald Trump from its platform and wish it had been done sooner. But there are other valid concerns about the unilateral power CEOs like Mark Zuckerberg and Jack Dorsey hold in making these life-or-death decisions.

And I really do mean life or death, because for many around the world, it is.
