• Bias

    Measuring Fairness in Ranked Lists

    Information access systems such as search engines and recommender systems suffer from fairness and bias issues. These systems usually present retrieved results to users as ranked lists, and items with similar relevance may receive unequal exposure depending on their demographic group membership. In this project, we focus on measuring fairness in ranked output by conducting the following analyses (an illustrative exposure-based metric is sketched after the list):

    1. Describing existing fair ranking metrics using a unified notation.
    2. Identifying the limitations of the existing metrics and gaps in the fair ranking metrics research area.
    3. Conducting sensitivity analyses of the fair ranking metrics.
    4. Designing a fair ranking metric with broader applicability.
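
    As a concrete illustration of the kind of metric these analyses cover, the sketch below computes a simple exposure-based disparity under a logarithmic position-discount model. The function name, the discount choice, and the comparison of exposure share against relevance share are illustrative assumptions for this sketch, not the metric being designed in this project.

    ```python
    import math
    from collections import defaultdict

    def exposure_disparity(ranking, groups, relevance):
        """Illustrative exposure-based unfairness measure (a sketch).

        ranking:   list of item ids, best-ranked first
        groups:    dict mapping item id -> demographic group label
        relevance: dict mapping item id -> graded relevance (>= 0)

        Each rank position k (1-based) contributes exposure 1/log2(k+1).
        A group's share of total exposure is compared with its share of
        total relevance; the returned value is the largest absolute gap.
        """
        exposure = defaultdict(float)
        for k, item in enumerate(ranking, start=1):
            exposure[groups[item]] += 1.0 / math.log2(k + 1)

        rel_mass = defaultdict(float)
        for item in ranking:
            rel_mass[groups[item]] += relevance[item]

        total_exp = sum(exposure.values())
        total_rel = sum(rel_mass.values())

        return max(
            abs(exposure[g] / total_exp - rel_mass[g] / total_rel)
            for g in exposure
        )

    # Example: two equally relevant items from different groups; the
    # top-ranked item receives more exposure, so the disparity is > 0.
    ranking = ["a", "b"]
    groups = {"a": "G1", "b": "G2"}
    relevance = {"a": 1.0, "b": 1.0}
    print(exposure_disparity(ranking, groups, relevance))
    ```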
  • Stereotype

    Stereotypes in Information Access Systems

    Information access systems like search engines and recommender systems may perpetuate social stereotypes and reinforce them through their results. In this project, we aim to address this issue by identifying and measuring stereotypes in retrieved results. We are currently focusing on the tendency of retrieved results to replicate and reinforce gender stereotypes associated with children's products. We hope this research will contribute toward developing a safe web environment for children.
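
    As a rough illustration of how a stereotype signal in retrieved results might be quantified, the sketch below scores a result list by counting gendered terms in item texts. The word lists, the scoring rule, and the function name are hypothetical placeholders, not the measurement approach developed in this project.

    ```python
    import re

    # Hypothetical word lists; an actual study would rely on validated lexicons.
    FEMININE = {"girl", "girls", "princess", "doll", "pink"}
    MASCULINE = {"boy", "boys", "truck", "superhero", "blue"}

    def gender_association(result_texts):
        """Crude gender-association score for a result list, in [-1, 1].

        Positive values mean the retrieved texts lean toward the masculine
        word list, negative values toward the feminine one, and 0 indicates
        balance (or no gendered terms at all).
        """
        fem = masc = 0
        for text in result_texts:
            tokens = re.findall(r"[a-z']+", text.lower())
            fem += sum(t in FEMININE for t in tokens)
            masc += sum(t in MASCULINE for t in tokens)
        total = fem + masc
        return 0.0 if total == 0 else (masc - fem) / total

    # Example: a small result list for a children's-product query.
    results = ["Pink princess doll playset", "Toy truck for boys"]
    print(gender_association(results))
    ```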