NUST Institutional Repository

Smart Annotation Approaches for Medical Imaging

dc.contributor.author Mariam, Komal
dc.date.accessioned 2023-07-13T11:50:16Z
dc.date.available 2023-07-13T11:50:16Z
dc.date.issued 2020
dc.identifier.other 205069
dc.identifier.uri http://10.250.8.41:8080/xmlui/handle/123456789/34626
dc.description Supervisor: Dr. Wajahat Hussain en_US
dc.description.abstract Unavailability of large training datasets is a bottleneck that needs to be overcome to realise the true potential of deep learning in histopathology applications. Although slide digitisation via whole slide imaging scanners has increased the speed of data acquisition, labelling of virtual slides requires a substantial time investment from pathologists. This makes generating large, correctly labelled slide datasets an expensive, time-consuming and laborious exercise. Eye-gaze annotations have the potential to speed up the slide labelling process. This work explores the viability of eye-gaze labelling, as compared to conventional hand-based labelling techniques, for training object detectors. A low-cost gaze tracking device is used to track the gaze of a pathologist working with virtual slides on a computer screen. Challenges associated with gaze-based labelling, and techniques for refining the coarse gaze data for subsequent object detection, are also discussed. Results demonstrate that gaze-based labelling can save the pathologist valuable time, while the performance of deep learning models trained to detect Keratin Pearls in oral cancer Whole Slide Images (WSI) using gaze annotations is comparable to that of models trained on hand annotations. en_US
dc.language.iso en en_US
dc.publisher School of Electrical Engineering and Computer Science en_US
dc.title Smart Annotation Approaches for Medical Imaging en_US
dc.type Thesis en_US
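
The abstract describes refining coarse gaze data into labels for object detection. Below is a minimal sketch of what such a refinement step could look like: grouping raw gaze samples into fixations with a dispersion threshold and expanding each fixation's extent into a padded bounding box. The function fixations_to_boxes, the threshold values and the dispersion-based grouping are illustrative assumptions, not the method used in the thesis.

    # Hypothetical sketch: turning raw gaze samples into coarse bounding-box
    # labels for an object detector. The parameter values and the
    # dispersion-threshold fixation grouping are illustrative assumptions,
    # not the thesis method.
    from dataclasses import dataclass
    from typing import List, Tuple

    @dataclass
    class GazeSample:
        x: float       # slide/screen x coordinate in pixels
        y: float       # slide/screen y coordinate in pixels
        t_ms: float    # timestamp in milliseconds

    def fixations_to_boxes(samples: List[GazeSample],
                           dispersion_px: float = 60.0,
                           min_duration_ms: float = 200.0,
                           box_pad_px: float = 40.0
                           ) -> List[Tuple[float, float, float, float]]:
        """Group consecutive gaze samples into fixations using a dispersion
        threshold, then expand each fixation's extent into a padded box
        (x_min, y_min, x_max, y_max) usable as a coarse annotation."""
        boxes = []
        window: List[GazeSample] = []
        for s in samples:
            window.append(s)
            xs = [p.x for p in window]
            ys = [p.y for p in window]
            # Dispersion = spread of the current window in both axes.
            if (max(xs) - min(xs)) + (max(ys) - min(ys)) > dispersion_px:
                # Spread exceeded: close the previous fixation if it lasted long enough.
                closed = window[:-1]
                if closed and closed[-1].t_ms - closed[0].t_ms >= min_duration_ms:
                    cx = [p.x for p in closed]
                    cy = [p.y for p in closed]
                    boxes.append((min(cx) - box_pad_px, min(cy) - box_pad_px,
                                  max(cx) + box_pad_px, max(cy) + box_pad_px))
                window = [s]
        # Flush a trailing fixation, if any.
        if window and window[-1].t_ms - window[0].t_ms >= min_duration_ms:
            xs = [p.x for p in window]
            ys = [p.y for p in window]
            boxes.append((min(xs) - box_pad_px, min(ys) - box_pad_px,
                          max(xs) + box_pad_px, max(ys) + box_pad_px))
        return boxes

    if __name__ == "__main__":
        # Synthetic gaze trace: a dwell near (500, 400), then a saccade away.
        trace = [GazeSample(500 + i % 5, 400 + i % 3, i * 10.0) for i in range(30)]
        trace += [GazeSample(900, 800, 300.0 + i * 10.0) for i in range(5)]
        print(fixations_to_boxes(trace))

Running the example prints one box around the synthetic dwell near (500, 400); in practice such coarse boxes would still need refinement before serving as training labels for a Keratin Pearl detector.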


This item appears in the following Collection(s)

  • MS [882]
