Relational Sequence Learning


1 CSAIL, Massachusetts Institute of Technology,
  32 Vassar Street, Cambridge, MA 02139-4307, USA
  [email protected]
2 Departement Computerwetenschappen, K.U. Leuven,
  Celestijnenlaan 200A - bus 2402, B-3001 Heverlee, Belgium
  {Luc.DeRaedt,Bernd.Gutmann}@cs.kuleuven.be
3 Machine Learning Lab, Institute for Computer Science, University of Freiburg,
  Georges-Koehler Allee, Building 079, 79110 Freiburg, Germany
  {landwehr,karwath}@informatik.uni-freiburg.de

Abstract. Sequential behavior and sequence learning are essential to intelligence. Often the elements of sequences exhibit an internal structure that can elegantly be represented using relational atoms. Applying traditional sequential learning techniques to such relational sequences requires one either to ignore the internal structure or to live with a combinatorial explosion of the model complexity. This chapter briefly reviews relational sequence learning and describes several techniques tailored towards learning directly from relational sequences, such as local pattern mining techniques, (hidden) Markov models, conditional random fields, dynamic programming and reinforcement learning.

1 Introduction

The activities humans perform are by their very nature sequential. Sequences occur in many activities ranging from reasoning to language understanding, in everyday skills as well as in complex problem solving. The ability to learn from such sequences is therefore essential to artificial intelligence, and sequence learning techniques can be applied in many domains such as planning, reasoning, robotics, user modeling, natural language processing, speech recognition, adaptive control, activity recognition, information extraction, and computational biology. This explains why learning from sequences has received a lot of attention in the past few decades. Learning tasks investigated include classification, prediction, local pattern mining, labeling, alignment, transduction, and density and policy estimation.

One major dimension along which to differentiate sequential learning techniques is the complexity of the language they employ to describe sequences and models. At one extreme are learning approaches that assume a propositional language. The simplicity of a propositional language allows such methods to represent the model in matrix form: cells typically denote the transition probabilities among symbols. In turn, efficient matrix operations can be used to devise efficient algorithms. At the other end of the spectrum, (probabilistic) relational systems accept descriptions of complex, structured sequence elements and generate relationally structured models. They typically have access to background knowledge and allow for a more compact description of the entities, but often also incur a higher computational complexity. This chapter reviews several relational sequence learning techniques that build on ideas developed on both sides of the spectrum. They fill

L. De Raedt et al. (Eds.): Probabilistic ILP 2007, LNAI 4911, pp. 28–55, 2008. © Springer-Verlag Berlin Heidelberg 2008
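The propositional, matrix-form representation mentioned above can be sketched as follows. This is an illustrative example rather than code from the chapter; the function and symbol names are our own, and only a first-order Markov model over a flat alphabet is shown:

```python
# A hypothetical sketch (not from the chapter): a first-order Markov model
# over a propositional alphabet, stored as a plain transition matrix whose
# cell (i, j) holds the estimated probability of symbol j following symbol i.
def transition_matrix(sequences, alphabet):
    """Estimate P(next | current) by counting adjacent symbol pairs."""
    idx = {s: i for i, s in enumerate(alphabet)}
    n = len(alphabet)
    counts = [[0.0] * n for _ in range(n)]
    for seq in sequences:
        for cur, nxt in zip(seq, seq[1:]):
            counts[idx[cur]][idx[nxt]] += 1.0
    # Normalise each row into a probability distribution.
    for row in counts:
        total = sum(row)
        if total > 0:
            for j in range(n):
                row[j] /= total
    return counts

seqs = [["a", "b", "a", "c"], ["a", "b", "b"]]
M = transition_matrix(seqs, ["a", "b", "c"])
```

Flattening relational atoms such as move(a, b) into one symbol per ground atom would keep this matrix representation usable, but the alphabet, and with it the matrix, grows combinatorially with the number of constants; avoiding that blow-up is precisely what the relational techniques reviewed in this chapter aim at.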