Patriarchal religions, like Judaism and Christianity, established and upheld the 'man's world.'
When I was 35, all of a sudden I thought maybe it'd be nice to knit a sweater.
I see my role as a scholar announcing that women's feelings of unworthiness and insecurity can often be traced to training in a male-oriented religion, and I'm trying to envision a richer spiritual life for both sexes.
Is it really true that religion makes people more kindly, generous, or loving? History tends to disprove this. The worst wars, the most vicious Inquisitions, the cruelest pogroms and persecutions, were both fomented and supported by religion.
Marriage finally became acceptable to the churches when laws were established that made it a means of depriving women of income and property, and of reducing wives to the equivalent of slaves.