Targeted online ads found to show gender bias

Credit: Anne-Sophie Kim/Layout Manager

Today, more than 3 billion people are connected to the World Wide Web, making the digital realm an ideal place for marketers and advertisers to promote their products, services, and opinions. Ever since the first clickable web ad was sold in 1993 by Global Network Navigator to a San Francisco law firm, online advertising has been advancing, to the point where online ads can be customized to fit a user’s interests and needs. But as online ads have become more specifically tailored to each user, researchers suggest that ad systems may unfairly target users of particular demographics.

A recent study conducted by a team of researchers including Anupam Datta, an associate professor of computer science and electrical and computer engineering at Carnegie Mellon University, focused on Google ads shown to users with different ad settings. Datta’s study of Google online ads is part of an overarching goal to learn more about the scientific foundations of security and privacy. Datta started the project with Amit Datta, a doctoral student in Carnegie Mellon’s electrical and computer engineering department, and Michael Tschantz, a former doctoral student at Carnegie Mellon University who is currently a researcher at the International Computer Science Institute at the University of California, Berkeley.

In their study, the research group used AdFisher, a tool they designed to run automated experiments on customized ad settings. The tool ran 1,000 browsers, simulating 500 female users and 500 male users. The gender of each simulated user was set on Google’s Ad Settings page, which is meant to give users some control over the ads they receive online. Before the experiment, the researchers ensured that gender was the only difference between the two groups.
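The balanced two-group design described above can be sketched in a few lines of Python. This is a hypothetical model of the setup, not AdFisher’s actual code; the field names and the `make_agents` helper are invented for illustration.

```python
def make_agents(n_per_group=500):
    """Build two groups of simulated browsing agents that differ only in
    the gender recorded on their ad-settings profile (a hypothetical
    model of the study's design, not AdFisher's actual implementation).
    """
    agents = []
    for gender in ("female", "male"):
        for i in range(n_per_group):
            agents.append({
                "agent_id": f"{gender}-{i}",
                "gender": gender,   # the single treatment variable
                "ads_seen": [],     # filled in as pages are visited
            })
    return agents

agents = make_agents()
print(len(agents))  # 1000 agents, 500 per group, identical except for gender
```

The point of the design is that any systematic difference in the ads the two groups later collect can be attributed to the gender setting alone.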

The browsers were then directed to the top 100 employment websites listed on Alexa, an Amazon-owned company that provides web traffic data services. A built-in feature of AdFisher enables the browsers to collect ads on any of the visited websites. The website of a popular newspaper was chosen for most of the experiments, as it provided the most Google ads per page reload.

“AdFisher then used machine learning and statistical analysis to identify the differences among the ads served to the males and females,” Anupam Datta explained in an email interview with The Tartan.

The results showed that an ad from a career coaching service, promising high-paying executive-level jobs, was targeted significantly more toward men than women. The study determined that male users were shown the ad 1,816 times, while female users were shown it only 311 times.
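A gap like 1,816 versus 311 impressions can be checked for statistical significance with a permutation test, the general kind of analysis AdFisher’s pipeline relies on. The sketch below is illustrative only: the per-browser impression counts are invented toy numbers, not the study’s data, and the function is a simplified stand-in for AdFisher’s actual machinery.

```python
import random

def permutation_test(group_a, group_b, n_perm=10_000, seed=0):
    """Two-sided permutation test on the difference in mean ad counts.

    Repeatedly shuffles the group labels and returns the fraction of
    shuffles whose mean difference is at least as extreme as the
    observed one (an empirical p-value).
    """
    rng = random.Random(seed)
    observed = abs(sum(group_a) / len(group_a) - sum(group_b) / len(group_b))
    pooled = list(group_a) + list(group_b)
    n_a = len(group_a)
    extreme = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        diff = abs(sum(pooled[:n_a]) / n_a
                   - sum(pooled[n_a:]) / (len(pooled) - n_a))
        if diff >= observed:
            extreme += 1
    return extreme / n_perm

# Invented per-browser impression counts, for illustration only:
male_counts = [4, 5, 3, 6, 4, 5, 3, 4, 6, 5]
female_counts = [1, 0, 1, 2, 0, 1, 1, 0, 2, 1]
p = permutation_test(male_counts, female_counts)
print(f"empirical p-value: {p:.4f}")
```

A small p-value means a gap that large is very unlikely to arise by chance if gender had no effect on ad serving.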

“We were surprised and concerned by this finding,” Anupam Datta said.

While AdFisher’s results detect discrimination in Google’s ad targeting, they do not provide a full picture of why it happens. Datta and his colleagues have several theories about what caused the phenomenon. Google’s targeting algorithm may have learned that males are more likely to click on a particular ad than females, and then used that information to target the ad accordingly. It is also possible that the advertisers themselves chose to target males over females, a request that Google’s massive, automated advertising system could easily carry out. The researchers believe it is unlikely that Google deliberately programmed its advertising ecosystem to favor male users over female users.

The researchers also noted the possibility that the results were influenced by Google’s security measures to detect robots. “Indeed, it is possible that Google suspected that the browser activities were conducted by bots rather than humans in our experiments and therefore treated them differently,” Datta said. “However, we consider it unlikely that they decided to serve ads in a discriminatory manner after making such a determination.”

The root of the issue lies within what are called “black boxes,” algorithms or other programs whose inner workings are unknown.

“In today’s society, important decisions in many sectors are being made by a combination of algorithms and humans who act from within black boxes,” Datta stated. “Their operation is not transparent to the people who are subjected to these decisions, making it difficult to hold them accountable for undesirable outcomes.”

To study the lack of transparency in Google’s ad system, the researchers had one group of simulated users visit top websites related to substance abuse while another group remained idle. The substance-abuse visitors were then shown ads for alcohol and drug rehabilitation centers, while Datta and his colleagues noticed that the control group did not receive any of those ads.

Furthermore, the visits to substance-abuse websites had no effect on the Google Ad Settings page, perhaps for the same reasons behind the apparent gender discrimination in targeted ads. The researchers suspect, however, that these results may be due to remarketing, in which an advertisement from a visited website continues to follow the user around the Internet.

While Google did not officially address these findings, the company has recently reworded its ad settings page to indicate the limits of users’ control over what is targeted to them.

Even as Google utilizes personal data to produce more relevant advertising, its lack of transparency may leave its systems open to unintentional abuse, such as helping to maintain the gender gap in salaries. “Online advertisements and recommendations are a gateway to opportunities, and this form of differential treatment based on gender can unfairly limit opportunities,” Datta explained.

Currently, the researchers are working with Microsoft to develop tools that can offer more internal oversight of information processing ecosystems, particularly the ad ecosystem.

Datta expects more work to be done on holding companies accountable when concerning behaviors, including discrimination, arise from their information systems. His group believes that AdFisher will help identify such violations in black-box information systems in future studies.

“Our research findings provide the first proof of existence of discrimination in online behavioral advertising,” Datta stated, “thus demonstrating that the problem exists in practice, not just in theory.”