I do like the message of this article, but there's no need to hate on feminism for it all.
Everyone needs to depend on themselves, period. What world do people live in where they expect someone else to take care of them?
I learned early on, from day one of adulthood, that you can't really count on anyone to take care of you; you have to learn to take care of yourself.
We women have often heard and internalized the message that you can't rely on a man, that we have to learn to take care of ourselves. So why would men expect us to take care of them?
I do believe we can be cheerleaders for others, emotionally supporting them and being there for them, but only to a point. A person has to take responsibility for their own life and go out and make the changes they want for themselves. Never rely on someone else for your happiness, your financial income, or anything else.
It is nice when people show up and are there for you; take it while you can. But all adults need to take care of themselves, period. Don't expect anyone to take care of you; take care of yourself. When someone offers you help, take it, but realize they may try to hold it against you later, so be careful who you let help you.
I think this is a message everyone needs: people have to take responsibility for their own lives.
As a woman, I have been taught this my whole life. I'm curious why this is so gendered. Are men not taught this? Are they taught that women will take care of them financially? Are they taught that women will take care of them, period?