I realized a lot of this a long while back, but it's much more relevant now. So I'm going to go through what I believed (and many still do), what I learned, and why it's important to our age.

I grew up in a Southern, conservative area... still very Democrat at the time, though that would shift... the party, not the culture. It was pretty universally believed that a man should take care of a woman... providing for her... defending her... and making it so she "didn't have to" work. Her place, it was believed, was in the home, raising children. I know you might be saying it's a lot like that now... and I'll get to that. Back then, the idea of a woman working was a sign of shame... this was before it became the norm for both to work. Indeed, as women began to work, shows like Laverne and Shirley and others featuring career women were viewed by many as showcasing "Feminists" who were avoiding their household responsibilities and ruining families. If a lot of ...