Glasses is one variable. Then there's facial hair, tinted contact lenses, hair color, tan/no tan, makeup, lighting, growth (maturation, obesity, dieting); the list goes on and on.
AI is like idiot unevolved children playing with wooden blocks: they can say which blocks match, but good luck transferring that to the real world, where everything isn't a wooden block.
The "but if only we had better data" is a crock because the real problem is that they don't actually know what constitutes a robust identifier system. People and animals have a system tested over millions of years in the real world; AI facial has literally a handful of years of real world exposure.
I haven't seen it, but there might be research that has attempted to quantify just how many data points are used by people to identify faces.
My bet is that it is a lot more than the relative handful present AI systems use: distance between eyes, nose size, jaw angle, etc. And also a lot more quantitative: attractive faces are symmetrical down to the millimeter level, so clearly human pattern recognition is taking that into account, even if subconsciously.
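To make the "handful of measurements" point concrete, here's a toy sketch (entirely hypothetical landmark coordinates and feature choices, not any real system's code) of what a sparse geometric feature set looks like, plus a crude millimeter-level left/right symmetry score of the kind the argument says humans may be picking up on:

```python
import math

def dist(a, b):
    # Euclidean distance between two 2D points.
    return math.hypot(a[0] - b[0], a[1] - b[1])

# Hypothetical 2D landmarks for one face, in millimeters,
# with x = 0 at the face's vertical midline.
landmarks = {
    "left_eye":  (-31.0,   0.0),
    "right_eye": ( 31.5,   0.2),
    "nose_tip":  (  0.3, -45.0),
    "left_jaw":  (-60.0, -90.0),
    "right_jaw": ( 60.5, -89.0),
}

def feature_vector(lm):
    # The sparse geometric features the post describes:
    # eye spacing, eye-to-nose distances, jaw width.
    return [
        dist(lm["left_eye"], lm["right_eye"]),
        dist(lm["left_eye"], lm["nose_tip"]),
        dist(lm["right_eye"], lm["nose_tip"]),
        dist(lm["left_jaw"], lm["right_jaw"]),
    ]

def asymmetry_mm(lm):
    # Mirror each left-side landmark across the vertical midline
    # and measure how far it lands from its right-side partner.
    # A perfectly symmetrical face scores 0.0.
    pairs = [("left_eye", "right_eye"), ("left_jaw", "right_jaw")]
    errors = []
    for left, right in pairs:
        lx, ly = lm[left]
        errors.append(dist((-lx, ly), lm[right]))
    return sum(errors) / len(errors)

print(feature_vector(landmarks))
print(asymmetry_mm(landmarks))
```

Four numbers and one symmetry score: a vanishingly small description compared to whatever a human visual system extracts, which is exactly the point.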