About two and a half years ago, I thought I had genital warts. I didn’t tell anyone, I didn’t ask any of my friends or family for advice, I just called Planned Parenthood to go get tested. Turns out, I had a (non-sexually transmitted) viral infection called molluscum that is super common in babies and toddlers. Strangely, no one knew how I had gotten it, but I was treated and it went away. Now I say all of this to mention that I was TERRIFIED of the idea that I had genital warts. There is such a negative stigma around STIs and STDs, but there isn’t a negative stigma around having sex itself? That doesn’t make sense.
It’s perfectly socially acceptable to go out to the club and go home with someone random. Sometimes people literally go out with the sole purpose of going home with someone. But God forbid you get an STD or STI! That is just so looked down upon, and I feel like it is something that should be normalized, or at least destigmatized to some degree.
And then there’s the fact that AIDS used to be called GRID (gay-related immune deficiency)… That STI was literally named after the fact that it seemed to predominantly affect gay men. I think that’s BS. Everyone has sex. EVERYONE. Even many asexual people have had sex at least once in their lifetime. Why is the fact that STIs and STDs are a potential outcome of that seen in such a bad light? People don’t get shunned when they get cold sores or mono from making out with someone random… Why are people so judged when they get the clap from someone random?
I am just tired of living in a world where people are constantly trying to put themselves on a tier above others. It’s exhausting and I don’t have the energy to keep up. Ask yourself why! Why are we so afraid of contracting an STD or STI? Most of the common ones are treatable, so why? Are we scared of what it will say about us as a person? All it says is that we are sexually active, and that is something that’s super normalized… so why aren’t STDs?