I have never been someone who was interested in sociology. Before I took this class, I did not even really know what sociology was. This class helped me learn more about controversial issues facing our society today. My favorite discussion in this class was about body image and gender roles in society. I found it very interesting and informative that when a specific job title is mentioned, a person subconsciously pictures either a man or a woman, whichever is generally found in that role. Society also sets different expectations for each gender. These expectations have changed as we have advanced, but will the expectations ever be equal?