Train an ML system on a corpus labeled according to stereotypes, and it will reproduce those stereotypes. If a random sampling of images of the populace shows a correlation between long hair and being female, it's perfectly reasonable for a filter like this - which after all is just a toy - to treat long hair as a feminine attribute.
Coding a rule that long hair is feminine into a classifier (or a generator trained by such a classifier) would be sexist. A classifier that learns the correlation on its own is merely imprecise - someone might well refuse to use it for that reason - but it's not particularly useful to label it "sexist". Deploying such a classifier (or generator) in certain applications, however, could certainly be a sexist decision.
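To make the distinction concrete, here's a minimal sketch in Python. Everything in it is made up for illustration - the synthetic hair-length distributions, the 20 cm threshold, and the `rule_based` helper are assumptions, not anyone's real system - but it shows where the stereotype enters in each case:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic "population" in which hair length correlates with the
# female label, mimicking a biased sample of a real-world corpus.
# (Distributions are invented for the sketch.)
n = 10_000
is_female = rng.random(n) < 0.5
hair_length_cm = np.where(
    is_female,
    rng.normal(30, 10, n),  # longer on average
    rng.normal(10, 5, n),
)
X = hair_length_cm.reshape(-1, 1)

# Case 1: a hand-coded rule - the stereotype is an explicit design decision.
def rule_based(hair_cm):
    return hair_cm > 20  # "long hair => female", written by a person

print("rule says female for 25 cm hair:", rule_based(25.0))

# Case 2: a learned classifier - the same correlation picked up from data,
# with no one having written it down.
clf = LogisticRegression().fit(X, is_female)
print("learned weight on hair length:", clf.coef_[0][0])  # positive:
# longer hair pushes the prediction toward "female"
```

Both end up treating long hair as evidence of "female"; the difference the paragraph above draws is that in the first case the mapping was authored, and in the second it was inferred from the sample.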