I can think of plenty. From the men who told you to "smile because you look pretty", to the ones who tell you that you should wear heels and dresses, to the ones who tell you that you should be "nicer"/ask more nicely... The list goes on.
And there are plenty of articles written by men about what women "need" to do in order to make life easier for men.
And then you have the men who tell other men what they have to do. For example: "Be an alpha", "Don't be a pussy", "You need to buy this game!" Etc, etc.
I think the reason you can't think of men doing that is that it's so common you just put it in the "background" of your mind.