Feminism | Definition of Feminism by Merriam-Webster
https://www.merriam-webster.com/dictionary/feminism
Definition of feminism:
1 : the theory of the political, economic, and social equality of the sexes.
2 : organized activity on behalf of women's rights and interests.
Attacking what you know nothing about is a clear sign of idiocy. Attacking feminism is a failure to understand that it is, by definition, about the equality of the sexes.
Being a misogynist is ultimately a form of violence against women. The resort of the ignorant is always violence.