Koopman Operator Framework for Time Series Modeling and Analysis



Amit Surana1

Received: 16 January 2017 / Accepted: 27 December 2017 © Springer Science+Business Media, LLC, part of Springer Nature 2018

Abstract We propose an interdisciplinary framework for time series classification, forecasting, and anomaly detection by combining concepts from Koopman operator theory, machine learning, and linear systems and control theory. At the core of this framework is nonlinear dynamic generative modeling of time series using the Koopman operator, which is an infinite-dimensional but linear operator. Rather than working with the underlying nonlinear model, we propose two simpler linear representations, or model forms, based on Koopman spectral properties. We show that these model forms are invariants of the generative model and can be identified directly from data using techniques for computing Koopman spectral properties, without requiring explicit knowledge of the generative model. We also introduce different notions of distance on the space of such model forms, which is essential for model comparison and clustering. We employ the space of Koopman model forms equipped with a distance, in conjunction with classical machine learning techniques, to develop a framework for automatic feature generation for time series classification. The forecasting/anomaly detection framework is based on using Koopman model forms together with classical linear systems and control approaches. We demonstrate the proposed framework on human activity classification and on time series forecasting/anomaly detection in a power grid application.

Keywords Koopman operator · Dynamical systems and control · Machine learning · Nonlinear time series modeling and analysis

Mathematics Subject Classification 37N99 · 93E12 · 93E10 · 47N99

Communicated by Clancy Rowley and Ioannis Kevrekidis.
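As an illustration of the data-driven identification step described above, the following minimal sketch implements a basic Dynamic Mode Decomposition (DMD), one standard technique for approximating Koopman eigenvalues and modes from measured snapshot data. The use of DMD, the delay-embedding construction, and all function and variable names are illustrative assumptions for this sketch and not necessarily the specific algorithms or notation adopted in the paper.

import numpy as np

def dmd(X, Y, r=None):
    # Basic Dynamic Mode Decomposition: given snapshot matrices X and Y
    # whose columns satisfy y_k ≈ A x_k for an unknown linear operator A,
    # estimate eigenvalues and modes of A, which approximate Koopman
    # spectral properties of the underlying dynamics.
    U, s, Vh = np.linalg.svd(X, full_matrices=False)          # reduced SVD of X
    if r is not None:                                         # optional rank truncation
        U, s, Vh = U[:, :r], s[:r], Vh[:r, :]
    Atilde = U.conj().T @ Y @ Vh.conj().T @ np.diag(1.0 / s)  # projected operator
    eigvals, W = np.linalg.eig(Atilde)                        # approximate Koopman eigenvalues
    modes = Y @ Vh.conj().T @ np.diag(1.0 / s) @ W            # modes lifted to the original space
    return eigvals, modes

# Usage: delay-embed a scalar time series into snapshot pairs and inspect
# the estimated spectrum (illustrative synthetic data).
t = np.linspace(0.0, 10.0, 500)
series = np.sin(2 * np.pi * 0.5 * t) + 0.5 * np.cos(2 * np.pi * 1.3 * t)
d = 20                                                        # delay-embedding dimension
H = np.column_stack([series[i:i + d] for i in range(len(series) - d)])
X, Y = H[:, :-1], H[:, 1:]                                    # snapshot pairs (x_k, x_{k+1})
eigvals, _ = dmd(X, Y, r=6)
print("Approximate Koopman eigenvalues:", eigvals)

The eigenvalues of the projected operator approximate a finite subset of the Koopman spectrum; the model forms and distances built on such spectral objects are developed in the body of the paper.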

Amit Surana
[email protected]

1 United Technologies Research Center, 411 Silver Lane, East Hartford, CT 06118, USA



1 Introduction

Recent advances in ubiquitous sensing, networking, storage, and computing technology are leading to the emergence of new paradigms such as the Internet of Things, the Industrial Internet, and Cloud Robotics (Kehoe et al. 2015). For instance, the Internet of Things (IoT) is a concept in which RFID tags, wearable sensors, and inexpensive processors are incorporated into a vast array of robots and physical objects, from inventory items to household appliances, allowing them to communicate and share information (Atzori et al. 2010). These paradigms generate high-volume, high-velocity, heterogeneous data streams, or time series, that call for novel algorithms capable of mining the data for situational awareness and actionable insight. While time series analysis has traditionally been studied in fields such as econometrics/statistics and systems and control, the majority of current research on big data is concentrated in the machine learning community. In this paper, we outline how recent progress in data-driven dynam