Apr 2021

There at least needs to be a generic "other content warnings apply" to let creators head their own episodes with the warnings while still marking them mature, because not all mature content can be compartmentalised into a set of options.

Edit to add: if we're doing discussions, grouping truly offensive language such as slurs with casual curse words could lead to some issues as well.

Tbh, I'm not even sure what this warning is supposed to be in reference to. Suicide and self harm, and eating disorders, are mental health conditions, yet they're separate categories, and emotional abuse is also separate, so I'm not sure what this category is supposed to entail.

Maybe a concise list instead? Severe panic attacks, PTSD, major depressive episodes, etc.

Oh that's fair. I agree with Saint though. We could have a generic "other content warning" option and have a comment somewhere encouraging the creator to specify in their episode :blush:

Hmm we will have to keep the list relatively simple. The reason we have separate sections for Suicide/self harm and Eating disorders is due to the extreme triggering nature for many survivors.

The individual selections can be combined for further clarity, like Mental Health + Suicide, for example.

It's EVEN worse than that! There is nudity too!
Nudity and sexual violence in the same section! 🤬
It's a downright dangerous message given here!
I hope it's just a mistake and that Tapas will correct that.

Sexual Violence is completely different and a 100% stand-alone category in my book.

I would change 'Cursing/Profanity' to 'Slurs and Pejoratives' since I don't believe people should have to tag mature for having general swear words like f*ck in their episode.

Sexual content and nudity should be a separate category from sexual violence....

Despite both being sexual in nature, their contexts are quite the opposite: one is consensual while the other isn't.

This could lead to 2 scenarios:

People who are not comfortable with rape-related content avoiding all sexual scenes.

Or those same people ending up witnessing triggering content due to vagueness in the categories.

My perception only, but I think you're just trying to create categories based on app restrictions. If so, why not make that clear and explain to people what's ok and what isn't by app standards, instead of making this dumpster fire of categories ._.

So far I have:

1 - remove "Sexual Violence" from "Sexual Content and Nudity, sexual violence"
2 - add "Sexual Violence" as its own option.
3 - Remove the word "Issues" from Mental Health

Joanne, I'm disagreeing on the cursing, as for some creators and readers this is actually a concern.

They’re the ones that decided the categories were necessary in the first place. Prior to this, people would mark it mature as necessary and put up their own trigger warnings. Putting up groupings that are stigmatizing and potentially triggering is making it worse.

I personally think making nudity an inherently sexual thing is illogical and dangerous, but at this point it is a lesser worry. Let's at least give sexual violence its own category (or at least separate it from consensual sex).

It's unfortunate that this is your perception. Our old mature filter system hid series behind a notification wall without telling readers the reason for the filter. The episode could contain extreme violence, or a curse word. With that, it made the filter nearly useless to readers and creators - an update was long overdue and necessary.

With all respect. And this comes from a happy "jens" that always backs up what Tapas does.

This one you HAVE to change ASAP