My personal one is a trend I've noticed in stories: whenever a religious society with weird customs shows up, it will inevitably be revealed to be controlled by an evil cult that uses fear, dogma, and oppression to keep people ignorant and compliant.
Like, if the religion has a rule that you shouldn't touch blue things because the color blue is sacred, you can safely bet there will be severe punishment for touching or possessing anything blue, most likely execution. The laws about "having wrong ideas" will be similarly severe and unforgiving.
If the same thing were applied to a fictional portrayal of Christianity, for example, you'd have a scene where a character loudly proclaims "the Earth revolves around the Sun!", and instead of people just going "TF are you saying, weirdo, everybody knows that's not true," a couple of guards would instantly appear to drag the guy off to prison so he could later be executed for blasphemy.
Bonus points for having a character who starts out as a naive true believer and, over the course of the story, is set free and realizes how wrong and deceitful their culture really is.
Why can't we, for a change, have a culture that's just honestly weird and doesn't have any dark dystopian secrets behind it?
...and sorry for the thread necro.