Wait... isn't America already a "christian" nation (may you all have peace)? I mean, no, of course, it isn't (its "works" often prove that it really never was)... but for the sake of argument, doesn't it already consider itself such? I mean, wasn't it supposedly "founded" on so-called "christian" principles? Isn't it the God [of the Bible... well, okay, the OT] that is mentioned on American money and in the anthem sung at the opening of virtually every sports event held in the country? Isn't it the "christian" Bible that most still "swear" on (when they are called to swear)... and that the country's leaders bank their various oaths on?
I'm confused: I mean... I can see America heading toward NOT being a so-called "christian" nation... but I think it's still got a long way to go before it gets there. Shoot, I have more people giving me the "evil eye" because I don't belong to a church (and not necessarily their church, but just a church!)... than I do because I still believe in God (although, some of those tend to throw some pretty "evil" eyes my way, as well...).
Oh, and isn't this propaganda, too? You know, since it's introduced with the word "if" and all? I mean, I'm just sayin'...
Again, peace to you all!
A slave of Christ,