The USA does not have a rape culture; it is just a term used by feminists to make people look over and say, "Oh, what is that?" so they can then talk about why they waited over 10 years to speak about when someone touched them. It is not real in the US; it is more of a thing in Middle Eastern countries, where there is a real culture of it. Change my mind.