I am pretty sure most of you have realized that society's views on sex have changed. Today it works the opposite way around: instead of getting to know someone first and having sex with them later, you have sex with them first and then get to know them....if the sex is good. I hate that, because afterwards you have some women who get their feelings hurt, and you have guys who think relationships should be based on just sex, when that is entirely not true. When did sex become something that is not emotional?