When did 'politically correct' become a bad thing? I mean, if an institution seeks to update itself with an eye toward greater fairness, why would it be criticized, with snideness, for trying to be politically correct? Isn't that a compliment?
When did Christianity become the National Religion? I missed the memo.
And when did Christianity come to mean conservative? Jesus was a liberal, people!
And since when did conservative mean 'we care what you do in the privacy of your own home'? The conservatives I knew growing up believed wholeheartedly that their private lives were none of the government's business. When did that change?
No wonder it is so hard to have a rational discussion anymore. Somebody has taken these words and packed them with agendas, misinformation, and skewed meanings.
Some days I think the greatest threat to this country is our ever-decreasing standards in education. Less money for guns and more for teachers, IMHO, is the solution.