Being raised in a religion that preached a massive, theocratic holocaust for the majority of mankind can leave you with more than a few mental and emotional scars. I remember being terrified on New Year's Eve in 1986 (I was 9 years old) because of all the talk that that year was the "International Year of Peace." I was conditioned to look at world events as signs that the end is near. Sound familiar?
Well, times are bad. Are they the worst ever? Perhaps, at least in terms of dangers to the overall survival of our species. Interestingly, I do not see humans as being any more "evil" the way the Society does. If anything, the enlightened world is more humane. Was there talk of "human rights" or "women's rights" a couple hundred years ago? "War crimes?" Was that even in humanity's vocabulary before the twentieth century? If you were black and living in the South in the 1950s, you could not eat at the same restaurant as a white person. In this area, times are better, far better.
But still, things are not good. Like most everyone here, I get a little twinge of fear whenever I hear the news, especially any talk of "peace and security" (btw... is there really any other way to word this concept? I don't think so).
I've read "Signs of the Last Days -- When" by Carl Olof Jonsson. It's a good read, but not great; his Christian belief makes a secular understanding of the subject impossible. Does anybody have thoughts on this matter from a historical/scientific point of view, as well as a psychological one? Any other good books you can think of? Thoughts?