There is some island off the coast of ...Africa? where the roles of men and women are reversed. Men tend the children and the households, and women take care of property and business and work outside the home. The documentary I was watching showed the men lined up, dressed in costume and makeup, waiting for the women to choose them. It also showed groups of married folk: the made-up men sitting at tables discussing child care and the callousness of their mates, while the women were at other tables smoking and talking business. It was like bizarro world. The point seemed to be that men and women can slip easily into each other's roles; it is a matter of enculturation. One point made was that the women could be just as overbearing as the men in other societies, and the men could seem just as passive.
I don't think the wt society would be much better off with the roles reversed. The foundation the society rests on is the problem: fundamentalism, the results of too much power being in the hands of a few, and genders set in opposition to each other, one subservient to the other.
I think men and women are much more alike than different. It would be nice if we could all appreciate each other, the power and ability all humans possess, and the biological differences that make us unique.