NUST Institutional Repository

Natural Language Modeling for Sentence Generation through Deep Learning

Show simple item record

dc.contributor.author Bashir, Sarmad
dc.date.accessioned 2023-08-19T13:49:41Z
dc.date.available 2023-08-19T13:49:41Z
dc.date.issued 2019
dc.identifier.other 170755
dc.identifier.uri http://10.250.8.41:8080/xmlui/handle/123456789/36970
dc.description Supervisor: Dr. Seemab Latif en_US
dc.description.abstract With the advancement of technology and the establishment of Recurrent Neural Networks (RNNs), various Natural Language Generation (NLG) tasks have achieved tremendous success, such as image caption generation, neural machine translation, and abstractive summarization. However, the inclusion of pre-specified lexical constraints in sentences to increase their quality remains a new and little-studied task in NLG. Lexical constraints take the form of words that must appear in the output sentences, for example to mitigate the out-of-domain image tag (constraint) issue in image caption generation. Moreover, spoken dialogue systems tend to generate universal replies that lack specific information; pre-specified constraints can therefore be incorporated into replies to make them more realistic. However, existing methods allow the inclusion of lexical constraints only during the decoding process, which increases the architecture's complexity exponentially or linearly with the number of constraints, and some approaches can handle only a single constraint. To this end, this thesis proposes a neural probabilistic architecture, based on a backward/forward language model and a word embedding substitution method, that can cater to multiple constraints to generate fluent and coherent sentences. Moreover, we split the sequence on the Part-of-Speech verb category so that the backward generative model can exploit the word's positional information. Analysis of the proposed architecture for generating lexically constrained sentences shows that it outperforms previous methods on the perplexity evaluation metric. Human evaluation also shows that the generated constrained sentences are close to human-written sentences in terms of fluency. en_US
dc.language.iso en en_US
dc.publisher School of Electrical Engineering and Computer Science NUST SEECS en_US
dc.subject Recurrent neural networks, natural language generation, language models, lexical constraints, word embedding en_US
dc.title Natural Language Modeling for Sentence Generation through Deep Learning en_US
dc.type Thesis en_US


This item appears in the following Collection(s)

  • MS [434]
