NUST Institutional Repository

Hierarchical Softmax for Fine-Grained Classification


dc.contributor.author Mohammad Uzair, Fatima Hassan
dc.date.accessioned 2021-01-05T11:21:44Z
dc.date.available 2021-01-05T11:21:44Z
dc.date.issued 2019
dc.identifier.uri http://10.250.8.41:8080/xmlui/handle/123456789/20548
dc.description Supervisor: Dr. Faisal Shafait en_US
dc.description.abstract Softmax is normally applied after the last fully connected layer of a convolutional neural network to obtain class probabilities. As the number of classes grows, classification accuracy decreases. Moreover, multiple datasets cannot be trained jointly when their label sets are not mutually exclusive. We propose a joint training algorithm that combines multiple datasets into a hierarchical structure and uses hierarchical softmax as the output layer. We use this technique to train on detection and classification datasets, COCO and ImageNet, together: YOLO is trained on over 9000 classes drawn from the COCO detection dataset and the ImageNet classification dataset. The trained model is able to predict classes that never appeared in the training data. Testing on the 200 detection classes of ImageNet, we achieve 19% mean Average Precision (mAP); of these 200 classes, the model was trained only on the 44 that are also present in the COCO detection dataset. On the remaining 156 classes, which were never in the training set, we achieve 16% mAP. en_US
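The hierarchical (WordTree-style) softmax described in the abstract computes a softmax over each group of sibling labels, then multiplies conditional probabilities down the tree to obtain absolute class probabilities. The following is a minimal illustrative sketch, not the thesis's implementation; the toy tree, node names, and function names are invented for this example.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a flat list of logits."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

# Toy label tree (hypothetical): parent[i] is the index of node i's parent,
# or -1 for top-level nodes.
# 0=animal, 1=vehicle (top level); 2=dog, 3=cat under animal; 4=car under vehicle.
PARENT = [-1, -1, 0, 0, 1]

def hierarchical_probs(logits, parent):
    """Softmax over each sibling group, then multiply conditional
    probabilities along the path to the root."""
    n = len(parent)
    cond = [0.0] * n
    for p in set(parent):
        group = [i for i in range(n) if parent[i] == p]
        for i, v in zip(group, softmax([logits[i] for i in group])):
            cond[i] = v                      # P(node | parent)
    prob = [0.0] * n
    for i in range(n):                       # parents precede children here
        prob[i] = cond[i] if parent[i] == -1 else cond[i] * prob[parent[i]]
    return prob

logits = [2.0, 1.0, 0.5, 0.2, 1.5]
probs = hierarchical_probs(logits, PARENT)
# The leaf probabilities (dog, cat, car) sum to 1, since every sibling
# group is itself a proper distribution.
```

Because every node's probability is conditioned on its parent, detection labels from COCO (e.g. "dog") and fine-grained classification labels from ImageNet can share one output layer without being forced to be mutually exclusive.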
dc.publisher SEECS, National University of Sciences and Technology, Islamabad en_US
dc.subject Computer Science en_US
dc.title Hierarchical Softmax for Fine-Grained Classification en_US
dc.type Thesis en_US



This item appears in the following Collection(s)

  • BS [211]
