I hate seeing trauma on screen. Like queer pain, rape, hell, even Black pain annoys the hell out of me sometimes, and I'm white. I subscribe to the notion that we as people already have to deal with so much shit in the real world, and I really don't want to watch another "the world sucks and you can't fix it" movie. And no, Mysterious Skin doesn't take such a nihilistic approach to the world; it's more about the incident that shaped two boys' lives. But does seeing queer trauma add to existing trauma?

I know that recently we have been getting better queer films about happiness, like Happiest Season (which was really good and surprisingly nuanced) or Love, Simon. Both are still very white, but they are stupid cheesy movies about being queer and finding love, and after growing up hearing about how queer people can't find love, it feels really validating to me to see it happen in such a stereotypical way. The cheesy romance films all about a will-they-won...