As a girl I did think well of feminism, but like you I sort of saw it as a non-issue until I entered the workforce.
In school I was more focused on what I could do to build myself up as a potential employee, and I didn't want to think about the ways women were limited.
Once I was working, in a field I had a degree in from a prestigious university, I saw men half as educated and half as experienced as me getting promotions and being groomed for promotions, while I was given lame excuses for why I was to stay put. I found myself having to justify why I should get to keep my job, one that I performed well at consistently, while men with no experience, many of whom didn't even care about the job, were elevated as more talented than me.
I remember one such man who spent an entire meeting on his phone playing a game, while I, with more experience than him, spent the entire meeting taking notes and contributing to solutions; he got promoted over me. True, I had plenty of ways to grow as an employee, and I benefited from working on them, but I saw men who didn't care at all, had less experience, and had less success, and it was simply assumed they would be better at anything than me.
Although I love my current field, I majored and studied in a field where there were few women (I studied mathematics), yet found myself constantly corralled into teaching. It was always assumed that as a woman I was a natural nurturer, and that the best thing to do with an educated woman was to have her work with children. Many times in my career I was pushed toward working with younger children while my male colleagues were pushed toward high school. True, I am great with children of all ages, but I often heard, "Oh, you just look like such a nice, safe lady! I'm sure the little ones take to you so well!"