
Gotta second that. Violence is not necessarily abusive. A war between soldiers is different from an alcoholic parent beating up their kid and/or guilt-tripping them.

Seconding @nathanKmcwilliams's suggestion to separate Physical Violence from Abuse.

Could I propose Abuse be its own category? So like: Abuse - Physical and/or Emotional.

On cursing, I suppose I would then like to see "Slurs and Pejoratives" added to "Cursing/Profanity".

Alright, so far we have:

1 - Remove "Sexual Violence" from "Sexual Content and Nudity, sexual violence".
2 - Add "Sexual Violence" as its own option.
3 - Remove the word "Issues" from "Mental Health Issues".
4 - Remove "Physical Violence" from "Physical Violence, Abuse, including emotional abuse".
5 - Add "Physical Violence" as a new option.
6 - Rename "Abuse, including emotional abuse" to "Abuse - Physical and/or Emotional".

Question: by nudity, does the app mean the private parts of the body, like boobs and genitalia?

I think it needs a separate category, because there are non-sexual reasons for nudity. Here are some examples:

- A model posing for a painter
- Tribeswomen who don't cover their breasts because they don't have such a taboo
- Changing clothes
- Nudists
- Customs of certain cultures or rituals

Well, those are the ones off the top of my head :sweat_smile:, there may be way more...

That one is hard. As I understand it, it's up to the individual reader who flags it, and then you have to appeal to Tapas that it was a mistake.

I personally think Mental Health should be either entirely reworded or expanded upon. I would assume it’s referring to intense breakdowns, or panic attacks, or PTSD flashbacks, etc. But it’s not clear at all, and I would hope it’s not referring to general anxiety.

Can some things like self-harm and suicide encompass ideation as well as depictions and actions?

And I agree that sexual violence, dub-con, and non-con should be their own thing, separated out. Examples for mental health crises/concerns/topics would be beneficial.

I feel like "crisis/crises" would make the most sense.

'Cause it's also what medical professionals use as a broad term for mental health issues that reach that level, which is the level that would be triggering for many people.

I also agree with this. And in an attempt to be helpful (and hopefully not come across as too particular), this would be a more ideal list (to me at least):

Consensual Sexual Content & Nudity
Sexual Violence, Sexual Assault & Dubious Consent
Graphic Physical Content: Gore & Bodily Fluids
Graphic Physical Content: Violence & Abuse
Psychological or Emotional Abuse
Depictions of Mental or Emotional Distress (<--I fixed it, I hope)
Self-Harm & Suicide (including ideation)
Substance Abuse & Addiction
Eating Disorders
Hate Speech, Bigoted Sentiments & Slurs
*Excessive Profanity or Cursing


*[A/N: Kind of agreed this was unnecessary though]

@ratique

Thank you. (I'm out of likes :cry:)

It took some time, but I researched a few lists of Content Warnings and Trigger Warnings and made it fit what was already there and what was suggested.

Would these include references to, or implied instances of, these things? I always put content and trigger warnings for even implied abuse or assault (like a quick flashback or feeling that doesn't actually reference or depict sexual assault, but implies that's what happened). I don't know if everyone does that, though.

Also, I like the rewrite of mental or emotional distress, but this is so broad. Is there a certain level this needs to be at?

I think also, like, you don't need a mature filter for every type of mental health condition, which is what is bothering people here. Basically, I think the need for a "Mental Health" filter is kind of rare.

If you have a comic with like...someone going through a depression because of a breakup...or anxiety, that doesn't need to be under a mature filter, you know? Not unless it actually steps into violence, suicide, and self-harm, which are separate filters.

Personally, I write CW or TW for things as simple as...if I have guns or something in my story, so I'm on the side of trigger warnings for anyone who is sensitive to that. But a mature filter is different. When the mature filter can potentially remove a comic from the app, that's a problem, and "Mental Health" is too vague in a sea of comics that are largely about mental health.

@littlelilylee5683 and @rajillustration I think all that is just up to the discretion of the creator, whether to tag or not and what to list it as.

Tapas isn't requiring mature tags aside from the obvious things that will get them in trouble with the app, so visual depictions of nudity and sex and certain levels of realistic violence have to be tagged. But anything else is simply a tool for the creator to use for the benefit of their particular audience.

Why is mental health a mature label, though? It seems pretty vague. Anyone under 18 can handle that as a topic, I'm assuming? Like, why is that even an option?

Also, if the app is rating things based on content guidelines put out by organizations that monitor that stuff, why bother going through this pointless mess instead of going off the actual rating guidelines that already exist for film and other media? You're never going to make every user content.

Is there a way to tag my whole series mature so I won't worry about a little splatter of red on a panel?

My fantasy novels have plenty of sword fighting; that's physical violence, and people kind of expect that. They don't expect a rape scene. Oh, no no no... this is not good.

I would separate Graphic Violence from Abuse...

Violence is not necessarily abusive. A war between soldiers, or a duel to the death, is different from an alcoholic parent beating up their kid and/or guilt-tripping them.

Nope. Usually people make a note in the description that the story is 18+ or whatever warnings it might need. You already don't need the mature filter for every occurrence of blood/violence; it's specifically for more mature/excessive occurrences. (Personally, I like that we're allowed to use our discretion on whether something needs the filter or not! But I get that it also causes concern, since the line is often not clear.)

Why don't we just split it up into Language, Violence, Substance Use, Nudity, and Sex like most rating systems, then let the user write down what specifically in that scene is mature, and/or just have an "Other" option that lets the user write it in? It simplifies things. We will be here all day if we try to list everything that is a dicey topic.

A lot of these aren't mature labels so much as trigger warnings for people who may want to avoid this content. Very heavy mental health distress may be something people want to avoid, or at least read in a better state of mind.

Yeah, Ratique said allowing a write-in option would cause internal issues, but I don't see why we can't have it specifically for display purposes, for our audience. It could be really useful for letting creators add clarity and context!

But oh well. It's still nice to have options

I mean, we could have a limited number of options and then a dialogue box where the user can write whatever they want, with no impact on the mature label itself. Idk, I'm not tech savvy, but something better than this.

No, I agree! It's clearly a do-able thing without internal conflict, even though internal conflict was the reason given for not doing it.
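
To illustrate how that could work (a hypothetical TypeScript sketch, not anything Tapas has actually described; every name here is made up): the fixed options drive the mature label, while the free-text note is display-only and never touches the labeling logic.

```typescript
// Hypothetical data model: a fixed set of warning options drives the
// mature label, while a free-text creator note is display-only.

type WarningOption =
  | "sexual_content"
  | "sexual_violence"
  | "graphic_violence"
  | "abuse"
  | "self_harm_or_suicide"
  | "substance_abuse"
  | "profanity";

interface EpisodeWarnings {
  // Only these checkboxes feed the mature label and any filtering.
  options: WarningOption[];
  // Shown to readers for context; ignored by the labeling logic.
  creatorNote?: string;
}

function isMature(warnings: EpisodeWarnings): boolean {
  // The creator note is deliberately not consulted here.
  return warnings.options.length > 0;
}

// Example: the note adds context without changing the label.
const episode: EpisodeWarnings = {
  options: ["graphic_violence"],
  creatorNote: "Brief battlefield scene, nothing gratuitous.",
};
console.log(isMature(episode)); // true
```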

@joannekwan OK, that's really good to hear point blank (since there are so many rumors about what happens when you've got mature stuff on the app), and I agree that this should be up to user discretion. I just hope the way we're presenting it currently doesn't make it seem like you have to put a mature filter on every drop of blood or discussion of something heavy.

We get people popping onto the forums sometimes who really don't know if their story is too violent or not, when it's like...innocent stick figures punching people. I think people new to the site or new to creating stories don't really know where the line is, because we've made it understandably pretty vague.

I agree, and I have other trigger warnings that I mark mature and specify at the top of the episode. But I guess marking every episode mature where a character overthinks, or questions, or has fears and needs support because he has anxiety seems extreme. Many people suffer from anxiety (me included), but it is quite normal. I'm just wondering where the line is. Or if I should be marking everything as mature, because almost all of my characters have some kind of trauma or condition that impacts the way they live their lives.

Maybe it would be better to tag an entire series as "contains mature content" and then indicate what it contains at the front. I know many of us write it in the description, but it would suck to be invested in a story and then, 30 episodes in, not be able to continue because there is a trigger. Then, of course, still mark sexual content, any form of abuse, self-harm, and the other more serious things per episode so readers have a warning.

Agreed! LOL
I think a notice alongside the mature toggle saying what must be tagged (visual nudity, sex, realistic violence) and that the rest is left to creator discretion would be nice, but I can understand if the UI designers think that would add too much clutter.

I don't think I explained myself well enough.
Tapas obviously does not review everything; we all know that is impossible.
On many comics/novels, especially long-form ones, there will necessarily be a few pages that do not fit the app's extremely restrictive conditions.
If those pages are marked mature for a specific reason, they will likely be subjected to more or less scrutiny depending on why they are marked mature, exposing the creator to problems they would not have had otherwise.

Now, one could say that I'm just upset it would be easier to find out who isn't following the rules, and that people should follow the rules to begin with.

But come on... the content on the app is NOT safe for 13-year-olds, and non-sexual nudity is dangerous to no one. This is all a comedy. We're forced to play it, so we're playing it.

If the warnings are actually used to remove EVERYTHING that the app does not allow, it will have catastrophic consequences. And the app would still not be safe for kids.