Efficient Estimation of Information Transfer

Abstract. Any measure of interdependence can lose much of its appeal due to a poor choice of its numerical estimator. Information-theoretic functionals are particularly sensitive to this problem, especially when applied to noisy signals of only a few thousand data points or fewer. Unfortunately, this is a common scenario in applications to electrophysiology data sets. In this chapter, we will review the state-of-the-art estimators based on nearest-neighbor statistics for information transfer measures. Nearest-neighbor techniques are more data-efficient than naive partition or histogram estimators and rely on milder assumptions than parametric approaches. However, they also come with limitations and several parameter choices that influence the numerical estimation of information-theoretic functionals. We will describe step by step the efficient estimation of transfer entropy for a typical electrophysiology data set, and how the multi-trial structure of such data sets can be used to partially alleviate the problem of non-stationarity.

1 Introduction

Inferring interdependencies between subsystems from empirical data is a common task across different fields of science. In neuroscience, the subsystems between which we would like to infer an interdependency can consist of a set of stimuli and a region of the brain [1], two regions of the brain [2], or even two frequency bands registered at the same brain region [3]. An important characterization of directed dependency is the information transfer between subsystems, especially when describing the information processing capabilities of a system [4, 5]. The success of this task crucially depends not only on the quality of the data but also on the numerical estimator of the interdependency measure [6].

In this chapter we will review the different stages in obtaining a numerical estimate of information transfer, as measured by transfer entropy, from a typical electrophysiology data set. Specifically, in Section 2 we explain why transfer entropy is used as a quantifier of information transfer. In Section 3, we describe different strategies to estimate transfer entropy, along with their advantages and drawbacks. Section 4 explains step by step the procedure to numerically estimate transfer entropy from nearest-neighbor statistics, covering everything from the choice of embedding parameters for the raw time series to the testing of statistical significance. In Section 5, we illustrate how to integrate multi-trial information to improve the temporal resolution of transfer entropy. Finally, in Section 6 we briefly summarize the chapter.
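For reference, the quantity whose estimation this chapter develops is transfer entropy as defined by Schreiber. Writing $\mathbf{x}_t^{(k)}$ and $\mathbf{y}_t^{(l)}$ for the $k$- and $l$-dimensional histories of the target $X$ and the source $Y$, the transfer entropy from $Y$ to $X$ reads

\[
TE_{Y \to X} \;=\; \sum_{x_{t+1},\, \mathbf{x}_t^{(k)},\, \mathbf{y}_t^{(l)}} p\bigl(x_{t+1}, \mathbf{x}_t^{(k)}, \mathbf{y}_t^{(l)}\bigr)\, \log \frac{p\bigl(x_{t+1} \mid \mathbf{x}_t^{(k)}, \mathbf{y}_t^{(l)}\bigr)}{p\bigl(x_{t+1} \mid \mathbf{x}_t^{(k)}\bigr)},
\]

i.e. the conditional mutual information $I(X_{t+1}; \mathbf{Y}_t^{(l)} \mid \mathbf{X}_t^{(k)})$, which is the form exploited by the nearest-neighbor estimators discussed in the following sections.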
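To make the embedding step concrete before Section 4 treats it in detail, here is a minimal Python sketch of a Takens-style delay embedding; the function name delay_embed and its parameters dim and tau are illustrative choices of ours, not notation from this chapter:

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Takens-style delay embedding of a 1-D time series.

    Returns an array of shape (n, dim) whose row for time t is
    [x[t], x[t - tau], ..., x[t - (dim - 1) * tau]].
    """
    x = np.asarray(x, dtype=float)
    n = len(x) - (dim - 1) * tau
    if n <= 0:
        raise ValueError("time series too short for this embedding")
    # Column i holds the series delayed by i * tau samples.
    return np.column_stack(
        [x[(dim - 1 - i) * tau + np.arange(n)] for i in range(dim)]
    )
```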
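As a rough sketch of the nearest-neighbor estimation itself, transfer entropy can be framed as a conditional mutual information and estimated with the digamma formula of the Kraskov-type estimator of Frenzel and Pompe. The sketch below reuses delay_embed from above; the function names, default parameters, and alignment conventions are our own illustrative assumptions, not the chapter's reference implementation:

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma

def cmi_knn(a, b, c, k=4):
    """Frenzel-Pompe estimator of I(A; B | C) in nats.

    a, b, c are 2-D arrays with one row per sample; distances use the
    maximum norm.  In practice a tiny jitter is often added to the
    data beforehand to break distance ties.
    """
    joint = np.hstack([a, b, c])
    # Distance to the k-th neighbor in the full joint space
    # (index 0 of the query result is the point itself).
    eps = cKDTree(joint).query(joint, k + 1, p=np.inf)[0][:, -1]

    def count(marginal):
        tree = cKDTree(marginal)
        # Neighbors strictly within eps of each sample, excluding it.
        return np.array(
            [len(tree.query_ball_point(pt, e - 1e-12, p=np.inf)) - 1
             for pt, e in zip(marginal, eps)]
        )

    n_ac = count(np.hstack([a, c]))
    n_bc = count(np.hstack([b, c]))
    n_c = count(c)
    return digamma(k) + np.mean(
        digamma(n_c + 1) - digamma(n_ac + 1) - digamma(n_bc + 1)
    )

def transfer_entropy(x, y, k_hist=1, l_hist=1, tau=1, k=4):
    """TE(Y -> X) estimated as I(x_{t+1}; y-history | x-history)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xe = delay_embed(x[:-1], k_hist, tau)   # target history X_t^(k)
    ye = delay_embed(y[:-1], l_hist, tau)   # source history Y_t^(l)
    m = min(len(xe), len(ye))
    future = x[len(x) - m:].reshape(-1, 1)  # x_{t+1}, aligned in time
    return cmi_knn(future, ye[-m:], xe[-m:], k=k)
```

For example, transfer_entropy(x, y, k_hist=3, tau=2) would estimate the transfer entropy with a three-sample target history sampled every second time step.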
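One common way to assess statistical significance, treated more carefully later in the chapter, is to compare the observed estimate against surrogate data. The sketch below uses circular shifts of the source signal as surrogates, a standard choice that preserves the source's autocorrelation while destroying its time-locked relation to the target (trial shuffling is usually preferable when multiple trials are available). It assumes the transfer_entropy function sketched above and is an illustration, not the chapter's prescribed procedure:

```python
import numpy as np

def te_permutation_test(x, y, n_perm=200, seed=0, **te_kwargs):
    """One-sided surrogate test for TE(Y -> X)."""
    rng = np.random.default_rng(seed)
    te_obs = transfer_entropy(x, y, **te_kwargs)
    null = np.empty(n_perm)
    for i in range(n_perm):
        # Circularly shift the source by a random, nonzero offset.
        shift = int(rng.integers(1, len(y) - 1))
        null[i] = transfer_entropy(x, np.roll(y, shift), **te_kwargs)
    # Add-one p-value: fraction of surrogates at least as large.
    p_value = (np.sum(null >= te_obs) + 1) / (n_perm + 1)
    return te_obs, p_value
```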