NUST Institutional Repository

Evaluating Transfer Performance of Self Supervised Models for Vehicle Re-Identification


dc.contributor.author Abbasi, Maria Waseem
dc.date.accessioned 2023-08-07T09:52:23Z
dc.date.available 2023-08-07T09:52:23Z
dc.date.issued 2023
dc.identifier.other 320677
dc.identifier.uri http://10.250.8.41:8080/xmlui/handle/123456789/35732
dc.description Supervisor: Dr. Khawar Khurshid en_US
dc.description.abstract In recent times, self-supervised visual representation learning has made considerable advancements. However, few studies determine the extent to which features learned through self-supervised pre-training on the ImageNet dataset generalize to distinct downstream tasks. The scarcity of labelled data is one of the major challenges in vehicle re-identification, so we perform this analytical study on the downstream task of vehicle re-identification. Our architecture addresses this problem in two steps to assess the transfer performance of an SSL method on the vehicle re-identification task. The first step is to pre-train the self-supervised model, in our case SimCLRv2, on ImageNet; this pre-training phase leverages unlabelled data to initialize the model and learn representations. The second step is to use the pre-trained weights from the first step to perform the vehicle re-identification task. In this step, we train fully connected layers on top of the frozen features acquired from the pre-trained model. We use the VeRi-Wild dataset to further train and test the model, and we experiment both with and without pre-training. Our experiments provide evidence that self-supervised pre-training results in quicker convergence and less training time compared to training from scratch. The evaluation metric we use is mean Average Precision (mAP%). This strategy helps us achieve more accurate results by using both the learned representations and the few annotations present in the data. en_US
dc.language.iso en en_US
dc.publisher School of Electrical Engineering and Computer Sciences (SEECS), NUST en_US
dc.subject Self-Supervised, Vehicle Re-Identification, Transfer Learning, Pretraining, Representation learning en_US
dc.title Evaluating Transfer Performance of Self Supervised Models for Vehicle Re-Identification en_US
dc.type Thesis en_US
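The abstract names mean Average Precision (mAP%) as the evaluation metric for re-identification over frozen embeddings. A minimal sketch of how that metric is commonly computed for re-ID follows; this is an illustration, not the thesis code, and the function names and cosine-similarity ranking are assumptions:

```python
import numpy as np

def average_precision(ranked_relevance):
    """AP for one query: ranked_relevance is a 0/1 sequence ordered by
    decreasing similarity to the query (1 = same vehicle identity)."""
    rel = np.asarray(ranked_relevance, dtype=float)
    if rel.sum() == 0:
        return 0.0
    cum_hits = np.cumsum(rel)                       # hits seen up to rank k
    precision_at_k = cum_hits / (np.arange(len(rel)) + 1)
    return float((precision_at_k * rel).sum() / rel.sum())

def mean_average_precision(query_feats, gallery_feats, query_ids, gallery_ids):
    """mAP over all queries: rank the gallery by cosine similarity of
    (frozen) embeddings, marking gallery items that share the query's
    vehicle ID as relevant."""
    q = query_feats / np.linalg.norm(query_feats, axis=1, keepdims=True)
    g = gallery_feats / np.linalg.norm(gallery_feats, axis=1, keepdims=True)
    sims = q @ g.T                                  # cosine similarity matrix
    gallery_ids = np.asarray(gallery_ids)
    aps = []
    for i in range(len(q)):
        order = np.argsort(-sims[i])                # gallery ranked best-first
        rel = (gallery_ids[order] == query_ids[i]).astype(int)
        aps.append(average_precision(rel))
    return float(np.mean(aps))
```

With perfectly separated embeddings the metric reaches 1.0; reported as a percentage this corresponds to the mAP% figures quoted in re-ID papers.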



This item appears in the following Collection(s)

  • MS [375]

