NUST Institutional Repository

Invisible in Plain Sight

Show simple item record

dc.contributor.author Manzoor, Rohan
dc.date.accessioned 2024-02-12T05:04:00Z
dc.date.available 2024-02-12T05:04:00Z
dc.date.issued 2024
dc.identifier.other 360818
dc.identifier.uri http://10.250.8.41:8080/xmlui/handle/123456789/42172
dc.description Supervisor: Dr. Wajahat Hussain en_US
dc.description.abstract Recently, computer vision datasets and models have been diagnosed with multiple social biases, including gender, race, and age. Exposing these biases has proven to be the first step in diluting their effects on algorithms. In this work, we discover and expose the entire chain of differently-abled bias in generative models, well-known general-purpose datasets, and widely used search engines. Social bias (gender, race, and age) discovery is aided by readily available human attribute detectors trained on large datasets. Developing automatic methods for the discovery of concepts with limited representation (training data), e.g., disability bias, is an interesting challenge. We propose a novel framework for efficient discovery of biases related to underrepresented groups. Finally, we motivate the creation of large-scale differently-abled datasets using a real-life example of hotel booking websites. Our experiments reveal that although hotels gain an advantage in search rankings by leveraging claims of inclusive amenities, the browsing experience of differently-abled individuals is not catered for. We hope our findings will guide users, researchers, and tech giants to tackle this bias in the same way as gender, race, and age stereotypes. en_US
dc.language.iso en en_US
dc.publisher School of Electrical Engineering and Computer Sciences (SEECS), NUST en_US
dc.title Invisible in Plain Sight en_US
dc.type Thesis en_US


This item appears in the following Collection(s)

  • MS [881]
