Demystifying Text Generation Approaches

Article Fingerprint
Research ID O2209

Abstract

Natural Language Processing (NLP) is a subfield of Artificial Intelligence focused on enabling computers to understand and process human languages, bringing them closer to a human-level understanding of language. The main emphasis in the task of text generation is to produce semantically and syntactically sound, coherent, and meaningful text. At a high level, the prevailing technique has been to train end-to-end neural network models consisting of an encoder that produces a hidden representation of the input text, followed by a decoder that generates the target. Various techniques and models are used for text generation, and the algorithms used to generate text are discussed in the following sections. In this field, researchers have focused in particular on the Hidden Markov Model (HMM) and Long Short-Term Memory (LSTM) units, which are used to generate sequential text. This paper also discusses the limitations of the Hidden Markov Model as well as the richness of Long Short-Term Memory units.
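To make the HMM approach mentioned above concrete, the following is a minimal sketch of generating a word sequence by sampling from a Hidden Markov Model. The states, vocabulary, and probability tables are illustrative assumptions for this sketch, not values from the paper.

```python
import random

# Illustrative HMM: hidden part-of-speech states emit visible words.
# All probabilities below are made up for demonstration purposes.
start_p = {"NOUN": 0.8, "VERB": 0.2}
trans_p = {
    "NOUN": {"NOUN": 0.3, "VERB": 0.7},
    "VERB": {"NOUN": 0.9, "VERB": 0.1},
}
emit_p = {
    "NOUN": {"dogs": 0.5, "cats": 0.5},
    "VERB": {"run": 0.6, "sleep": 0.4},
}

def sample(dist):
    """Draw one key from a {outcome: probability} dict."""
    outcomes, weights = zip(*dist.items())
    return random.choices(outcomes, weights=weights)[0]

def generate(length):
    """Walk the HMM: at each step emit a visible word from the
    current hidden state, then transition to the next state."""
    state = sample(start_p)
    words = []
    for _ in range(length):
        words.append(sample(emit_p[state]))
        state = sample(trans_p[state])
    return words

print(" ".join(generate(5)))
```

Because each transition depends only on the current hidden state, the model has no long-range memory, which is the core limitation contrasted with LSTMs in this paper.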

Conflict of Interest

The authors declare no conflict of interest.

Ethical Approval

Not applicable.

Data Availability

The datasets used in this study are openly available at [repository link] and the source code is available on GitHub at [GitHub link].

Funding

This work did not receive any external funding.

Related Research

  • Classification

    DDC Code: 006.3 LCC Code: Q335

  • Version of record

    v1.0

  • Issue date

    13 February 2023

  • Language

    English

Open Access
Research Article
CC-BY-NC 4.0
LJRCST Volume 23, Issue 1, Pg. 15-19