I know that many feel the Bible teaches equality between the sexes. But if we take an honest look, without any fear of God, what do we find?
I think the Bible has played a prominent role in that inequality. The Bible also condoned slavery, including the right to beat one's slave severely, even to the point of death, as long as the slave survived a day after the beating. It has also played a role in justifying wars said to have "god's" backing. So what will an honest look show?