Therapy?

Not really. Men have been trained by society that women's needs are more important than theirs.
Huh.
I have never heard this perspective before.
Most women I know would argue that 'men' (not necessarily 'a man', which might be a distinction worth discussing) are taught to be dismissive of women's feelings (and, to a degree, all feelings).

Men are taught to sublimate their feelings into work, then sports, then drink, and nowadays video games.
Putting the feelings of women and kids on par with our own would be a positive development.