Goals

 

Transparent and responsive moderation

The most immediate goal of this section is for TikTok to address the mass-reporting problem and stop the serious livelihood harm caused by its automated report-processing systems. In service of that goal we are asking for:

A more transparent and responsive reports and appeals system involving humans.

  • Tell users how automated moderation operates.

    • What is the level of active human moderation?

    • What is the current threshold that warrants human moderation?

  • Tell users what the training/expectations are for human mods.

    • How is that training being delivered, evaluated, and enforced?

  • Reduce TikTok administrators’ ability to heighten scrutiny on specific targeted accounts for reasons of personal distaste or malice (as reported by multiple moderators in public interviews)

  • Roll back changes that have meant users can no longer consistently attach photos or files to reports and appeals

  • When accounts receive violations for content, always alert them with a notification (or email), and always give them a specific, named section and subsection of the community guidelines of which they’re found in violation

  • Establish a high standard of well-being practices for human mods.

Additionally, TOCA is asking to know how the issues caused by previous Auto-R policies have been addressed, and which functions of that system are still in use. The nature of the Auto-R system meant that marginalized users were heavily suppressed, and we need to know what has been done to compensate for that systemic suppression.

 

Better accessibility tools

TikTok is still a highly inaccessible app. The rollout of captions has not resolved its accessibility problems, and has exposed new ones. Our ultimate goal in this section:

Regularly consult with accessibility experts and strive to constantly improve accessibility for people with a wide variety of cognitive and physical disabilities.

We are not experts, but many of us are disabled and deal with these issues on the app daily. In our opinion, the following are obvious improvements to begin with:

  • Use network analysis to identify the cohorts most specifically affected by a given accessibility issue, and prioritize them in new feature rollout.

  • Video editing improvements:

    • Allow typing timestamps, as well as using sliders, for timing video elements.

    • Allow pausing in main editing screen for all users.

    • Allow clips to be split within TikTok—allow users to cut individual shots into multiple clips.

  • Caption improvements: 

    • Make captioned videos download with embedded .scc files rather than either burning in or leaving out the captions.

    • Make captions work on browser and other embeds of TikTok.

    • Lift the character limit in the caption box.

    • First: Allow users to enter captions even if they can’t auto-generate captions in their language, and to manually set the timing on their captions. 

    • Ultimately: provide auto captioning in all languages for which there is a vendor providing this service; provide captioning that meets the regulatory requirements in any area where TikTok operates, worldwide (do not only provide expanded utility in places where it’s illegal not to)

    • Allow videos to be captioned, or for captions to be edited, after video publication (it's fine if the new captions have to go through a review process)

    • Allow users to edit clips in stitches as they do in other videos.
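The network-analysis ask at the top of this list can be sketched in code. This is a minimal illustration under stated assumptions, not TikTok’s actual method: it assumes cohorts have already been identified by some community-detection pass over the follow graph, and simply ranks them by the share of members known to encounter a given accessibility issue. All names and data are hypothetical.

```python
from collections import defaultdict

def prioritize_cohorts(memberships, affected_users):
    """Rank cohorts by the fraction of members hitting an accessibility issue.

    memberships: list of (user, cohort) pairs from a prior
    community-detection pass (hypothetical input shape).
    affected_users: set of users known to be affected.
    """
    affected = defaultdict(int)
    sizes = defaultdict(int)
    for user, cohort in memberships:
        sizes[cohort] += 1
        if user in affected_users:
            affected[cohort] += 1
    # Most-affected cohorts first: these get the fix rolled out earliest.
    return sorted(sizes, key=lambda c: affected[c] / sizes[c], reverse=True)

memberships = [("a", 1), ("b", 1), ("c", 2), ("d", 2), ("e", 2)]
print(prioritize_cohorts(memberships, {"a", "b", "c"}))
# cohort 1 (2/2 affected) ranks ahead of cohort 2 (1/3 affected): [1, 2]
```

The point of the sketch is only that prioritization can be driven by measured impact per cohort rather than by overall account size.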

 

Improved feature rollout

TikTok’s rollout of new features is frustrating at best and makes the app less accessible at worst. Our baseline ask is:

New features that provide functionality for the audience of the user who receives them should be visible to that audience, regardless of whether they've been granted the feature for their own pages

  • Statements on the existence and scale of feature rollouts should be published

  • Transparency about whether and when features are rolled back or suffer failures

  • Take steps to ensure that TikTok and organizations TikTok partners with are in clear alignment on the planned rollout for any features on which they’re collaborating.

 

Stop banning sex workers

A TikTok moderator stated in an interview (since made unlisted on YouTube) that TikTok is actively trying to push all sex workers off the platform, regardless of the content they post there. TikTok has since changed the community guidelines in ways that can be interpreted to fit a wide array of situations, in an effort to force every sex worker out. This endangers their livelihoods and pushes them into more dangerous spaces, all for doing consensual and legal work. We are asking for the following:

  • Clear, specific parameters on what constitutes "Content that depicts, promotes, or glorifies sexual solicitation, including offering or asking for sexual partners, sexual chats or imagery, sexual services, premium sexual content, or sexcamming"

  • A commitment that having links to consensual sex work content reachable from the bio link will not constitute a violation of this policy

  • A commitment that referring to the fact that a user does sex work does not constitute sexual solicitation

 

Minor safety

TikTok, though it allows younger users on the app, fails to create friction around those users accessing more adult content. We cannot blame children for acting as we did in online spaces as children; instead, we need the app to be more proactive in its practices. We are demanding:

TikTok must assemble a team of experts specifically dedicated to minor safety

  • Create a system for age verification

  • A dedicated report system for time-sensitive minor safety issues, with specialized human moderators

  • Allow users to set their content to not push out to minors, on an account level or on an individual video level

  • Reconcile age verification across the app, so that a user’s age is stored and referenced as a single variable per account, and users who are actively paid by the creator fund can’t be kicked out of the marketplace or banned from going live for being under 18
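The “single age variable” ask in the last bullet is essentially a data-model change: every feature gate should read one canonical, verified birthdate per account instead of keeping its own copy. A minimal sketch, in which all names and thresholds are illustrative assumptions rather than TikTok’s internals:

```python
from datetime import date

class Account:
    """One verified birthdate per account; every gate reads it."""
    def __init__(self, birthdate: date):
        self.birthdate = birthdate  # stored once, referenced everywhere

    def age_on(self, today: date) -> int:
        # Subtract one year if the birthday hasn't happened yet this year.
        before_birthday = (today.month, today.day) < (self.birthdate.month, self.birthdate.day)
        return today.year - self.birthdate.year - before_birthday

def can_go_live(account: Account, today: date) -> bool:
    return account.age_on(today) >= 18

def can_join_marketplace(account: Account, today: date) -> bool:
    return account.age_on(today) >= 18

acct = Account(date(2004, 6, 1))
print(can_go_live(acct, date(2023, 1, 1)), can_join_marketplace(acct, date(2023, 1, 1)))
```

Because both gates derive from the same stored value, they can never disagree about whether an account is under 18, which is the failure mode the bullet describes.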

 

Content preferences

Within TikTok’s account settings, users can set language preferences, but those preferences do not appear to be respected, and the initial categories of video TikTok presents to new users do not appear to encompass all of the categories TikTok sorts users and content into.

  • Allow users to set multiple language feeds for content preferences; serve content from all language spheres they’ve selected on their FYF

    • Accommodate for differences in userbase size within any given language when ranking videos relative to each other in FYF rankings, so that languages with small populations of speakers are not drowned out for users who want to hear them

  • Allow users to access a report of the tags associated with their accounts; allow users to manually add or remove tags to their preferences.
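The language-normalization ask above can be illustrated with a toy ranking function: score each video’s engagement relative to the size of its own language sphere, so that videos in small languages are not drowned out by raw-count comparisons against much larger audiences. The field names and the per-capita normalization are assumptions for illustration, not TikTok’s actual ranking.

```python
def normalized_feed(videos, audience_sizes):
    """Order videos by engagement per potential viewer in their language.

    videos: list of (video_id, language, raw_engagement) tuples.
    audience_sizes: active users (or speakers) per language (hypothetical).
    """
    def per_capita(video):
        _, lang, engagement = video
        return engagement / audience_sizes[lang]
    return [vid for vid, _, _ in sorted(videos, key=per_capita, reverse=True)]

videos = [("v1", "en", 90_000), ("v2", "cy", 1_200)]
sizes = {"en": 1_000_000, "cy": 10_000}
print(normalized_feed(videos, sizes))
# per-capita: v2 = 0.12, v1 = 0.09, so the small-language video surfaces first
```

On raw counts alone the Welsh-language video would never compete; normalizing by audience size is one simple way to keep multi-language feeds balanced.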

 

Transparency

There are many issues that don’t fall into a specific category that users would simply like additional information on. These asks correspond to standard practices elsewhere in online media.

Greater transparency around:

  • Security of user information in the Creator Marketplace.

  • Pay rates: what does TikTok consider the normal CPM range to be? Why does CPM fluctuate so dramatically on some days?

  • Surveys: pop-ups asking questions about the nature of a video (Is this informational? Appropriate for children? Is this entertainment? Was this annoying? …). Who is this information for: the platform? Advertisers? What are all the categories?

  • Information about the existence and mechanics of derived or unlisted tags (not necessarily in exhaustive detail but enough that users can understand the mechanics)

 

Company culture

The existence of the Auto-R system, and continued issues with mass reporting ending in the ban of the reported user rather than of the accounts doing the reporting, have exposed a number of issues within the culture of TikTok as a whole. We are demanding the following:

  • Transparency about who writes policies; specific teams named as authors of documents like the community guidelines;

  • Incorporate mechanisms into the recommendation engine that identify and counteract the effects of systemic bias on content performance;

  • A public commitment that TikTok will not under any circumstances approach problems of mistreatment of marginalized communities by limiting the potential success of the targeted communities;

  • A public commitment that TikTok believes the only acceptable approach to dealing with harassment, bullying, mass-reporting, abuse, &c. is to place limits on the abusive individuals and groups;

  • A public commitment to work with sensitivity readers and editors in communications regarding marginalized groups.