I'm American, but to be honest, I find the U.S. to be a very scary place these days. We have politicians who want to make the rich richer, and to hell with everyone else. Religion has taken over our country, and many of the things done in the name of God have very little to do with actual religion. Yeah, I think there was a time when we were the absolute greatest country in the world, but we have since put the needs of a minute few over the needs of all Americans. And that is indeed scary.