To interject a little bit of seriousness in here: having worked at a company owned by them before, I am not at all surprised by this. In my experience, management completely does not give a shit about the employees and does whatever is best for themselves and their "buddies." For example, not giving the woman enough training to be able to just do her job. Secondly, once she brings enough attention to something they don't like, they take any action they can to punish her. I won't get into the details here, but my questioning of all the bullshit policies the management enforced where I worked ultimately led to me leaving the company.
Now don't get me wrong, I'm not taking any "feminist" side here (I'm male myself). I'm just saying that given how their management thinks and acts, I am utterly shocked that they are still in business. Case in point: being purposefully told to do things incorrectly just so they'd have better figures to show off to the government and media.