Think You Can Be a Content Moderator? Test Your Skills With This Game

A simple mobile-web game taught me a valuable lesson this week: Although I enjoy dishing out value judgments on social media, I don’t want to get paid to do it as a content moderator. 

The game in question, Moderator Mayhem, is from startup-advocacy group Engine, which commissioned policy-gaming shops (yes, they are a thing) Copia Gaming and Leveraged Play to develop it for a specific audience: policymakers suggesting or weighing proposals to increase the liability of social-media companies for removing too much user content or leaving too much up, who often seem unaware of how hard content moderation can be in practice.

The game puts you in the role of a content moderator for a fictional site called TrustHive, which lets people review anything—businesses, places, movies, politicians, whatever. Your job is to assess reports of alleged policy violations, and either approve the posts in question with a Tinder-esque swipe right or tap of a green checkmark button, or take them down by swiping left or tapping a red “x” button. And don’t dawdle: Complaints about violations of TrustHive’s 18 categories of prohibited content keep coming.

Moderator Mayhem screenshots (Credit: Rob Pegoraro)

Some of these are easy calls, like a “photo of the kettle reveals naked photographer reflection” description that calls out a violation of the no-nudity rule.

More often, the decision isn’t obvious. Tapping an eye-icon “Look Closer” button subjects you to a two-second wait but can surface useful content, such as when an allegation of a bigotry-policy breach (“Review of a club uses multiple racial slurs”) yielded this helpful detail: “The review is someone recounting how someone at the club shouted those slurs at her.”

Sometimes the rules don’t quite cover the case at hand. Does a review of a celebrity violate the harassment policy if it says “I heard he is a drug addict and cheated on his girlfriend”? 

Unlike in the real and often underpaid world of content moderation, you don’t get an eyeful of actual prohibited content, so Moderator Mayhem should not leave you with a case of PTSD.

In higher levels of the game, you have to judge appeals of your prior decisions and approve or reject preliminary calls by TrustHive’s new AI-based content-moderation system. The game shows it getting many things wrong, such as when it flags a review of a math book for containing CSAM (child sexual abuse material) because it includes the word “sexagesimal.” That’s a reference to base-60 math (which I just learned myself).
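That failure mode looks a lot like naive substring matching against a blocklist. As a rough illustration only (the game never reveals how its fictional AI works, and the blocklist term and function names below are hypothetical), a filter like this flags “sexagesimal” simply because a prohibited string appears inside an unrelated word:

```python
import re

# Hypothetical sketch of naive blocklist filtering, the kind of logic that
# would flag "sexagesimal" as a violation. Not the game's actual moderation code.
BLOCKLIST = {"sex"}  # placeholder standing in for a prohibited keyword


def naive_flag(review_text: str) -> bool:
    """Flag a review if any blocklisted term appears anywhere in the text,
    even inside a longer, unrelated word (the classic false-positive trap)."""
    text = review_text.lower()
    return any(term in text for term in BLOCKLIST)


def word_boundary_flag(review_text: str) -> bool:
    """Flag only whole-word matches, which avoids this particular mistake."""
    text = review_text.lower()
    return any(re.search(rf"\b{re.escape(term)}\b", text) for term in BLOCKLIST)


review = "Great textbook on sexagesimal (base-60) arithmetic."
print(naive_flag(review))          # True  -- false positive, like TrustHive's AI
print(word_boundary_flag(review))  # False -- whole-word matching lets it through
```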

The AI system also seemed to go astray with most copyright-violation calls, although many of the human-filed reports of intellectual-property fouls in the game also ignored fair-use principles.

If you botch a call, your manager may appear with a scolding message like “Hmm… Not the decision I would have made.” He may also advise you of external developments among users (“Some users are quitting TrustHive and moving to platforms that have more relaxed policies”) and policymakers (“Apparently, the governor has called our CEO and demanded that we delete the reviews critical of him. Our CEO says we need to take them down, or we’ll be in trouble with state regulators.”).

The cards representing new reports keep stacking up as your time runs out. In one round in which I judged only nine reports—the most I managed to grind out in a level was 34—the manager warned me that excessive contemplation can be a career-limiting move. 

“You’re making excellent decisions but you’re way too slow,” this bearded, bespectacled fellow’s message read. “Speed up if you want to keep your job.”

After playing this through to the end and cranking through 97 reviews, I somehow earned a promotion to content-moderation management. It felt like winning a pie-eating contest in which the prize was more pie.
