We live in a society that tells people how to avoid being raped instead of telling everyone to just not FUCKING rape people. We see anti-rape ads in mass media all the time, and what's the usual message?
It's not so much about teaching men, but about not saying "well, it's your fault" when it happens to a woman. Blame the assailant, not the victim. I don't know if you can teach men not to rape; some are sick mental cases.