Transfer Entropy in Neuroscience



Abstract. Information transfer is a key component of information processing, next to information storage and modification. Information transfer can be measured by a variety of directed information measures, of which transfer entropy is the most popular and most principled one. This chapter presents the basic concepts behind transfer entropy in an intuitive fashion, including graphical depictions of the key concepts. It also includes a special section devoted to the correct interpretation of the measure, especially with respect to concepts of causality. The chapter further provides an overview of estimation techniques for transfer entropy and pointers to popular open source toolboxes, and it introduces recent extensions of transfer entropy that serve to estimate the delays involved in information transfer in a network. By touching upon alternative measures of information transfer, such as Massey's directed information and Runge's momentary information transfer, it may serve as a frame of reference for more specialised treatments and as an overview of the field of information transfer studies in general.
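Although this excerpt does not reproduce the chapter's formal definitions, it may help to state Schreiber's transfer entropy up front. For a source process X and a target process Y with history embeddings of length l and k, the transfer entropy from X to Y is

TE_{X→Y} = Σ p(y_{t+1}, y_t^{(k)}, x_t^{(l)}) · log [ p(y_{t+1} | y_t^{(k)}, x_t^{(l)}) / p(y_{t+1} | y_t^{(k)}) ].

As a minimal, hypothetical illustration of the plug-in ("histogram") estimation approach that the chapter surveys (this is not code from the chapter; the function name and interface are our own, and it assumes two discrete time series with history length k = l = 1):

```python
from collections import Counter
from math import log2

def transfer_entropy(source, target):
    """Plug-in estimate of transfer entropy source -> target, in bits.

    Assumes discrete-valued sequences of equal length and uses
    history length 1 for both source and target (k = l = 1).
    """
    # Count joint occurrences of (next target, past target, past source).
    joint = Counter()
    for t in range(len(target) - 1):
        joint[(target[t + 1], target[t], source[t])] += 1
    n = sum(joint.values())

    # Marginal counts needed for the two conditional probabilities.
    yn_yp = Counter()   # (y_{t+1}, y_t)
    yp_xp = Counter()   # (y_t, x_t)
    yp = Counter()      # (y_t,)
    for (y_next, y_past, x_past), c in joint.items():
        yn_yp[(y_next, y_past)] += c
        yp_xp[(y_past, x_past)] += c
        yp[y_past] += c

    # TE = sum over states of p(joint) * log2( p(yn|yp,xp) / p(yn|yp) ).
    te = 0.0
    for (y_next, y_past, x_past), c in joint.items():
        p_joint = c / n
        p_cond_full = c / yp_xp[(y_past, x_past)]
        p_cond_hist = yn_yp[(y_next, y_past)] / yp[y_past]
        te += p_joint * log2(p_cond_full / p_cond_hist)
    return te
```

For a sanity check, if a random binary series x drives y with a one-step lag (y[t] = x[t-1]), the estimate of TE(x → y) approaches 1 bit, while TE(y → x) stays near zero. Real applications would use the dedicated toolboxes mentioned above, which handle longer histories, continuous data, and bias correction.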

Michael Wibral, MEG Unit, Brain Imaging Center, Goethe University, Heinrich-Hoffmann Strasse 10, 60528 Frankfurt am Main, Germany, e-mail: [email protected]
Raul Vicente, Max-Planck Institute for Brain Research, 60385 Frankfurt am Main, Germany, e-mail: [email protected]
Michael Lindner, School of Psychology and Clinical Language Science, University of Reading, e-mail: [email protected]

M. Wibral et al. (eds.), Directed Information Measures in Neuroscience, Understanding Complex Systems, © Springer-Verlag Berlin Heidelberg 2014. DOI: 10.1007/978-3-642-54474-3_1

1 Introduction

This chapter introduces transfer entropy, which to date is arguably the most widely used directed information measure, especially in neuroscience. The presentation of


the basic concepts behind transfer entropy and a special section devoted to the correct interpretation of the measure are meant to prepare the reader for more in-depth treatments in the chapters that follow. The chapter should also serve as a frame of reference for these more specialised treatments and present an overview of the field of studies in information transfer. In this sense, it may be treated as both an opening and a closing chapter to this volume. Since its introduction by Paluš [55] and Schreiber [60], transfer entropy has proven extremely useful in a wide variety of application scenarios, ranging from neuroscience [69, 73, 55, 66, 67, 68, 8, 1, 4, 6, 7, 17, 19, 40, 47, 52, 59, 62, 34, 22, 36, 5, 63, 27, 35], physiology [11, 13, 12], climatology [57], and complex systems theory [44, 45, 40] to other fields such as economics [32, 29]. This wide variety of application fields suggests that transfer entropy measures a useful and fundamental quantity for understanding complex systems, especially those that can be conceptualized as some kind of network of interacting agents or processes. It is the pur