Demystifying Text Generation Approaches

Abstract

Natural Language Processing (NLP) is a subfield of Artificial Intelligence focused on enabling computers to understand and process human languages, bringing them closer to a human-level understanding of language. The main emphasis in the task of text generation is to produce text that is semantically and syntactically sound, coherent, and meaningful. At a high level, the dominant technique has been to train end-to-end neural network models consisting of an encoder that produces a hidden representation of the input text, followed by a decoder that generates the target text. The various techniques, models, and algorithms used for text generation are discussed in the following subsections. In the field of text generation, researchers' main focus has been on the Hidden Markov Model (HMM) and Long Short-Term Memory (LSTM) units, which are used to generate sequential text. This paper also discusses the limitations of the Hidden Markov Model as well as the richness of Long Short-Term Memory units.
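As a minimal illustration of the Markov-style sequential generation discussed in this paper, the sketch below implements a first-order Markov chain over words (the fully observed special case of an HMM) in Python. It is not the paper's implementation; the function names, the toy corpus, and the choice of a first-order chain are all illustrative assumptions.

```python
import random
from collections import defaultdict

def build_model(words, order=1):
    """Count observed transitions between consecutive word states
    (a first-order Markov chain when order=1)."""
    model = defaultdict(list)
    for i in range(len(words) - order):
        state = tuple(words[i:i + order])
        model[state].append(words[i + order])
    return model

def generate(model, seed, length=10, rng=random):
    """Sample a word sequence by repeatedly following a transition
    drawn from those observed after the current state."""
    out = list(seed)
    state = tuple(seed)
    for _ in range(length):
        choices = model.get(state)
        if not choices:
            break  # state never seen in training data; stop early
        out.append(rng.choice(choices))
        state = tuple(out[-len(seed):])
    return out

# Illustrative toy corpus, not from the paper.
corpus = "the cat sat on the mat and the cat slept".split()
model = build_model(corpus)
print(" ".join(generate(model, ["the"], length=5)))
```

Because each next word depends only on the current state, such a model captures local word co-occurrence but not long-range dependencies, which is precisely the limitation of HMM-style generators that motivates LSTM-based approaches later in the paper.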

Keywords

ANN, HMM, LSTM, Natural Language Processing, RNN

  • Language & Pages

    English, 15-19

  • Classification

    DDC Code: 006.3 LCC Code: Q335