A pox on people behaving stereotypically
The paper lays out the following:
"Intuitively, the goal is to show ads that particular users are likely to engage with, even in cases where the advertiser does not know a priori which users are most receptive to their message. To accomplish this, the platforms build extensive user interest profiles and track ad performance to understand how different users interact with different ads. This historical data is then used to steer future ads towards those users who are most likely to be interested in them, and to users like them."
Which is exactly what you would expect effective advertising to do: target based on individual interests.
However, the paper then shifts from the individual to the group with the following argument:
"However, in doing so, the platforms may inadvertently cause ads to deliver primarily to a skewed subgroup ... if these “valuable” user demo-graphics are strongly correlated with protected classes, it could lead to discriminatory ad delivery"
So, if the targeted individuals happen to be strongly correlated with a protected class, the result is deemed "discriminatory" targeting. As the paper continues:
"For example, ads targeting the same audience but that include a creative that would stereotypically be of the most interest to men (e.g., bodybuilding) can deliver to over 80% men, and those that include a creative that would stereotypically be of the most interest to women (e.g., cosmetics) can deliver to over 90% women. Similarly, ads referring to cultural content stereotypically of most interest to black users (e.g., hip-hop) can deliver to over 85% black users, and those referring to content stereotypically of interest to white users (e.g., country music) can deliver to over 80% white users"
Which, to me, suggests the algorithms are working correctly; the problem is people engaging in "stereotypical" patterns of behaviour, which the algorithms are picking up on.
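To make that point concrete, here is a minimal toy sketch (not the platforms' actual system) of engagement-optimised delivery. All the names and probabilities are illustrative assumptions: interest in fitness content is made to correlate with gender, clicking depends only on interest, and the "platform" ranks segments purely by observed click-through rate. Gender never enters the ranking, yet delivery ends up heavily skewed, which is the behaviour the paper describes.

```python
import random
from collections import defaultdict

random.seed(0)

# Hypothetical population. Interest in fitness content is correlated with gender,
# but clicking depends only on interest -- gender is never used by the "platform".
# The probabilities below are illustrative assumptions, not measured values.
def make_user():
    gender = random.choice(["man", "woman"])
    interested = random.random() < (0.60 if gender == "man" else 0.15)
    return {"gender": gender, "interested": interested}

def clicks(user):
    return random.random() < (0.12 if user["interested"] else 0.01)

users = [make_user() for _ in range(50_000)]

# Phase 1: exploration. Show a bodybuilding ad to a random sample and record
# click-through rate per interest segment (a stand-in for the "user interest profile").
stats = defaultdict(lambda: {"impressions": 0, "clicks": 0})
for user in random.sample(users, 5_000):
    seg = stats[user["interested"]]
    seg["impressions"] += 1
    seg["clicks"] += clicks(user)

ctr = {segment: s["clicks"] / s["impressions"] for segment, s in stats.items()}

# Phase 2: exploitation. Spend the remaining budget on the segment with the
# higher observed CTR -- a crude stand-in for engagement-optimised delivery.
best_segment = max(ctr, key=ctr.get)
delivered_to = [u for u in users if u["interested"] == best_segment][:10_000]

men = sum(u["gender"] == "man" for u in delivered_to)
print(f"Delivery skew: {men / len(delivered_to):.0%} of impressions go to men,")
print("even though gender never entered the ranking -- only past engagement did.")
```

In this sketch the skew comes entirely from the simulated users' behaviour: because the interested segment is mostly men, optimising for engagement alone reproduces roughly the 80%-plus figures quoted above without the algorithm ever touching a protected attribute.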