Are there any religions out there that actually think any religion is good as long as you worship?
Or do they all think the other religions are under the control of Satan/the Devil?
Funny, I knew the JWs felt that way but never really thought about the other religions.
Are any of them humble enough to admit they don't have all the truth and aren't any better than any other religion?
That's the one I want to join.
Can someone correct the spelling of "there's" to "theirs" in the title?