General Discussion
In reply to the discussion: Where have societies' views of women come from?
Honeycombe8
Men were bigger, stronger, and controlled everything. Women had babies and needed men to protect and provide for them, especially since they had children to take care of. So men took over women and their children, like property. Later, marriage was invented by men as a way of ensuring the children were his (at least in theory).
As the years passed, physical strength became less important and mental strength more important. Societies became enlightened, the concept of human rights came along, women began to ask for or demand more rights, and so on.
Those cultures or countries that still treat women like chattel are stuck in time and are not particularly advanced, it seems to me. They don't have the concept of human rights, at least as regards women and children.
Religion plays a part, but it also enhanced a woman's position in society, in some ways and in some cultures. Christianity, anyway. There was a Virgin Mother, and Mary was included in Jesus' inner circle. Men were told to treat women respectfully, etc. Women were accepted as pastors in some churches. Again, that was in theory.