I have an honest question. I live in a conservative part of the country, so maybe that's why I feel this way, but do we live in a time where everyone is butthurt over everything? And I don't mean that in a conservative or liberal way; I mean in general.
I feel like, as a country, we went from "everyone has the right and freedom to do and believe whatever they want" to "if you don't do things exactly the way I want, I'm going to throw a fit until I get my way."
I don't know when all of this changed, but it's kind of tiring to me.