Accessibility matters: Pathways to a more inclusive future
Innovation Issue 36: Spring 2022

Meet the Expert 

A search for fairness: Counteracting gender bias in AI

[Image: small white cube-shaped beads with letters that spell out "non-binary"]


Our society now relies on the web daily for almost everything, from recipes to critical health information. Users depend on search engines to find relevant and accurate information, but search results can be tainted by biases such as gender stereotypes.

Electrical and computer engineering professor Ebrahim Bagheri leads the Responsible Artificial Intelligence (RAI) training initiative at Toronto Metropolitan University, where he and his team are researching methods to counteract such biases. He explains that there are at least two ways search engines can learn biases. The first is through training data: AI-driven search systems are trained on large volumes of human-generated content taken from the web, including Reddit comments, Wikipedia entries and social media posts. “If there are negative stereotypes relating to certain ethnicities or genders in the content used for training, those stereotypes get picked up quite quickly by these AI systems,” said professor Bagheri, who is the Canada Research Chair in Social Information Retrieval.
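To see how such associations can surface, consider a toy illustration (not professor Bagheri's actual method): if word embeddings learned from web text place "nurse" closer to "she" than to "he", any system built on those embeddings inherits that association. The vectors below are hand-made stand-ins purely for demonstration; real embeddings would come from a model trained on billions of words.

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity between two vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Hand-made 3-d vectors standing in for embeddings learned from web text.
emb = {
    "he":       np.array([0.9, 0.1, 0.0]),
    "she":      np.array([0.1, 0.9, 0.0]),
    "nurse":    np.array([0.2, 0.8, 0.1]),
    "engineer": np.array([0.8, 0.2, 0.1]),
}

def gender_lean(word):
    """Positive values lean toward 'he', negative toward 'she' in this toy space."""
    return cosine(emb[word], emb["he"]) - cosine(emb[word], emb["she"])

for w in ("nurse", "engineer"):
    print(f"{w}: {gender_lean(w):+.2f}")  # nurse leans 'she', engineer leans 'he'
```

In a real pipeline, the same cosine-similarity probe is one common way researchers audit trained embeddings for stereotypical associations before the model ever serves a query.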

Another way such systems absorb bias is through the way search engines retrain themselves to improve user satisfaction. In this process, a search engine uses past user clicks on search results to decide what future users are shown. Users introduce their own biases when clicking on results, and these are gradually incorporated into the engine’s ranking decisions. The consequences of these biases becoming ingrained in search results can be significant. While any individual carries their own biases, professor Bagheri notes that a single person can only influence the people around them; biases in search systems can have a much wider effect. Consider, he says, a search engine serving over five billion searches per day: that is five billion impressions on users that can be shaped by such biases.
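The feedback loop can be sketched in a few lines. The simulation below is a hypothetical toy model, not the behaviour of any real search engine: two equally relevant documents start tied, but a small click bias toward one of them, compounded by the extra exposure the top slot receives, widens the gap with every retraining round.

```python
# Toy model of the click feedback loop: bias amplification over retraining.
true_relevance = {"A": 0.5, "B": 0.5}   # the documents are equally relevant
click_bias = {"A": 0.55, "B": 0.45}     # users slightly prefer A (a stereotype)
scores = {"A": 1.0, "B": 1.0}           # the engine's learned ranking scores

for round_num in range(1, 6):
    # Rank by current score; the top slot gets far more exposure.
    ranking = sorted(scores, key=scores.get, reverse=True)
    exposure = {ranking[0]: 0.8, ranking[1]: 0.2}
    # Clicks ~ exposure x biased click propensity; the engine folds them
    # back into its scores, reinforcing the initial bias.
    for doc in scores:
        scores[doc] += exposure[doc] * click_bias[doc]
    print(round_num, {d: round(s, 2) for d, s in scores.items()})
```

Running this shows document A pulling further ahead each round even though both documents are equally relevant, which is the amplification professor Bagheri describes.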

“The problem is that these biases are now intensified and deployed at scale, so there needs to be a way to systematically address them,” said professor Bagheri. His group is experimenting with two approaches to this issue. The first is developing processes to make the data used to train search AI fair, essentially “de-biasing” the training data. The second is building bias-aware methods that anticipate and compensate for the fact that search results can be skewed. Both approaches have delivered successful results, maintaining search engine performance and result relevance while increasing fairness.
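As an illustration of the second idea, one simple form of bias-aware ranking scores each document by relevance minus a weighted penalty for gendered skew, with the weight acting as a tunable knob between pure relevance and fairness. The sketch below is a hypothetical minimal instance of that pattern, not the team's published method; the document IDs and scores are invented for illustration.

```python
# Hypothetical bias-aware re-ranker: relevance minus a weighted skew penalty.
LAMBDA = 0.3  # fairness weight (tunable); 0 recovers pure relevance ranking

docs = [
    # (doc_id, relevance score, gender-skew score in [0, 1])
    ("d1", 0.92, 0.70),
    ("d2", 0.90, 0.10),
    ("d3", 0.85, 0.40),
]

def combined(doc):
    """Blend relevance with a penalty for gendered skew."""
    _, rel, skew = doc
    return (1 - LAMBDA) * rel - LAMBDA * skew

reranked = sorted(docs, key=combined, reverse=True)
for doc in reranked:
    doc_id, rel, skew = doc
    print(f"{doc_id}: relevance={rel:.2f} skew={skew:.2f} "
          f"combined={combined(doc):.2f}")
```

With these numbers, the slightly less relevant but far less skewed d2 overtakes d1, while overall relevance barely drops, which mirrors the trade-off professor Bagheri describes next.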

“If you approach this problem in a systematic and principled way, you don’t have to hurt search result effectiveness in favour of reducing stereotypical biases. You can maintain the same levels of effectiveness while addressing these biases,” said professor Bagheri. He says future steps include tapping social science experts to explore the concept of gender – something he notes is fluid, but tends to be programmed in a binary way by the AI community. Another consideration is that there are cases where search results should be somewhat biased, he says, giving the example of a disease that afflicts only one biological sex.

Professor Bagheri’s research team includes engineering graduate students Negar Arabzadeh and Shirin Seyedsalehi, information technology management graduate student Amin Bigdeli, and his collaborator, professor Morteza Zihayat from the Ted Rogers School of Management. Some of their recent work on gender bias and information retrieval can be read in the proceedings of the European Conference on Information Retrieval (ECIR) and the 44th International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR).


