Deep learning approaches to text production

Author(s)

    • Narayan, Shashi
    • Gardent, Claire

Bibliographic Information

Deep learning approaches to text production

Shashi Narayan, Claire Gardent

(Synthesis lectures on human language technologies, 44) (Synthesis collection of technology)

Springer, c2022

  • pbk


Note

Reprint. Originally published: Morgan & Claypool, c2020

Includes bibliographical references (p. 139-173)

Description and Table of Contents

Description

Text production has many applications. It is used, for instance, to generate dialogue turns from dialogue moves, verbalise the content of knowledge bases, or generate English sentences from rich linguistic representations, such as dependency trees or abstract meaning representations. Text production is also at work in text-to-text transformations such as sentence compression, sentence fusion, paraphrasing, sentence (or text) simplification, and text summarisation. This book offers an overview of the fundamentals of neural models for text production. In particular, we elaborate on three main aspects of neural approaches to text production: how sequential decoders learn to generate adequate text, how encoders learn to produce better input representations, and how neural generators account for task-specific objectives. Indeed, each text-production task raises a slightly different challenge (e.g., how to take the dialogue context into account when producing a dialogue turn, how to detect and merge relevant information when summarising a text, or how to produce a well-formed text that correctly captures the information contained in some input data in the case of data-to-text generation). We outline the constraints specific to some of these tasks and examine how existing neural models account for them. More generally, this book considers text-to-text, meaning-to-text, and data-to-text transformations. It aims to provide the audience with a basic knowledge of neural approaches to text production and a roadmap to get them started with the related work. The book is mainly targeted at researchers, graduate students, and industry practitioners interested in text production from different forms of input.

Table of Contents

List of Figures.- List of Tables.- Preface.- Introduction.- Pre-Neural Approaches.- Deep Learning Frameworks.- Generating Better Text.- Building Better Input Representations.- Modelling Task-Specific Communication Goals.- Data Sets and Challenges.- Conclusion.- Bibliography.- Authors' Biographies.

by "Nielsen BookData"

Details

  • NCID
    BC17463908
  • ISBN
    9783031010453
  • Country Code
    sz
  • Title Language Code
    eng
  • Text Language Code
    eng
  • Place of Publication
    [Cham]
  • Pages/Volumes
    xxiv, 175 p.
  • Size
    24 cm