Have you ever noticed that women on TV always have their anatomy hanging out, but the men are always dressed?
Why do women have to walk around half-dressed and cold while men walk around properly dressed and warm?
Why, in a sunset beach scene, is the woman in a bikini and the man in pants and a sweater?
I'm very angry about this today. It's aggravating my depression. I want answers as to why women are treated like dirt by the media.
This is breaking my heart today. No matter what we do, we are always treated like *****s.