I'm really not seeing this. What's so degrading about showing one's breasts?
American society has made it seem like it's such a big deal for a woman to show her breasts. Although I have never shown mine in public at any kind of event, I don't see the big deal. I certainly do look at other women who show theirs, though. If I were in a safe place where I knew that no guys viewing my breasts would take it upon themselves to feel them, I would go for it. In fact, next month I will be in Ontario, Canada, where women are allowed to go topless, and I have actually thought about doing it. We'll see!