Epilogue: Frontiers of NLP in the Deep Learning Era

Li Deng and Yang Liu

Abstract In the first part of this epilogue, we summarize the book holistically from two perspectives. The first, task-centric perspective ties together and categorizes the wide range of NLP techniques discussed in the book in terms of general machine learning paradigms. In this way, the majority of sections and chapters of the book can be naturally clustered into four classes: classification, sequence-based prediction, higher-order structured prediction, and sequential decision-making. The second, representation-centric perspective distills insights from a holistic analysis of the book's chapters from cognitive science viewpoints and in terms of two basic types of natural language representations: symbolic and distributed representations. In the second part of the epilogue, we survey the most recent progress on deep learning in NLP (mainly from the later part of 2017, not covered in earlier chapters). Based on our review of these rapid recent advances, we then enrich our earlier discussion of the research frontiers of NLP in Chap. 1 by addressing future directions: exploiting the compositionality of natural language for generalization, unsupervised and reinforcement learning for NLP and their intricate connections, meta-learning for NLP, and weak-sense and strong-sense interpretability for NLP systems based on deep learning.

L. Deng (B)
Citadel, Chicago & Seattle, USA
e-mail: [email protected]

Y. Liu
Tsinghua University, Beijing, China
e-mail: [email protected]

© Springer Nature Singapore Pte Ltd. 2018
L. Deng and Y. Liu (eds.), Deep Learning in Natural Language Processing, https://doi.org/10.1007/978-981-10-5209-5_11

11.1 Introduction

Natural language processing (NLP) is one of the most important technologies of our information age, constituting a crucial branch of artificial intelligence through its aim of understanding complex natural language in both spoken and text forms. The history of NLP is nothing short of fascinating, with three major waves closely paralleling those in the development of artificial intelligence. The current rising wave of NLP has been propelled by deep learning over the past few years. As of the time of writing this epilogue, in November 2017, we see the many deep learning and neural network methods presented in this book expanding in multiple directions, with no sign of slowing down. Since we started this book project about one year ago, the NLP field has witnessed significant advances in both methods and applications, many empowered by deep learning. For example, unsupervised learning methods have very recently emerged in the literature, e.g., (Lample et al. 2017; Artetxe et al. 2017; Liu et al. 2017; Radford et al. 2017). In addition, excellent tutorial and survey materials have been published recently, offering new insights into numerous deep learning methods and comprehensive state-of-the-art results for NLP, e.g., (Goldberg 2017; Young et al. 2017; Couto 2017; Shoham et al. 2017). These new developments and li