Time Series Encodings with Temporal Convolutional Networks



1 TH Köln – University of Applied Sciences, Gummersbach, Germany
{markus.thill,wolfgang.konen}@th-koeln.de
2 LIACS, Leiden University, Leiden, The Netherlands
[email protected]

Abstract. The training of anomaly detection models usually requires labeled data. We present in this paper a novel approach for anomaly detection in time series which is trained unsupervised, using a convolutional approach coupled with an autoencoder framework. After training, only a small amount of labeled data is needed to adjust the anomaly threshold. We show that our new approach outperforms several other state-of-the-art anomaly detection algorithms on a Mackey-Glass (MG) anomaly benchmark. At the same time, our autoencoder is capable of learning interesting representations in latent space. Our new MG anomaly benchmark allows the creation of an unlimited amount of anomaly benchmark data with steerable difficulty. In this benchmark, the anomalies are well-defined, yet difficult for the human eye to spot.

Keywords: Time series representations · Temporal convolutional networks · Autoencoder · Anomaly detection · Unsupervised learning · Mackey-Glass time series · Chaos

1 Introduction

For the operation of large machines in companies or other critical systems in society, it is usually necessary to record and monitor specific machine or system health indicators over time. In the past, the recorded time series were often evaluated manually or by simple heuristics (such as threshold values) to detect abnormal behavior. With the more recent advances in the fields of ML (machine learning) and AI (artificial intelligence), ML-based anomaly detection algorithms are becoming increasingly popular for many tasks such as health monitoring and predictive maintenance. Supervised algorithms need labeled training data, which are often cumbersome to obtain and to maintain in real-world applications. Yet, unsupervised anomaly detection remains a challenging task to this day. In this paper we propose a novel autoencoder architecture for sequences (time series) which is based on temporal convolutional networks [3], and we show its efficacy in unsupervised learning tasks. Our experiments show that the architecture can learn interesting representations of sequences in latent space.
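To make the architectural idea concrete, the following is a minimal sketch of a TCN-style sequence autoencoder in PyTorch. It is illustrative only: the layer widths, dilation rates, latent dimension, and training loop are assumptions for illustration and do not reproduce the exact architecture of the paper.

# Minimal sketch of a TCN-style sequence autoencoder (illustrative only; the
# exact architecture of the paper may differ). Assumes univariate input of
# shape (batch, 1, seq_len).
import torch
import torch.nn as nn


class CausalConv1d(nn.Module):
    """1D convolution with left padding so that no future values are used."""
    def __init__(self, in_ch, out_ch, kernel_size, dilation):
        super().__init__()
        self.pad = (kernel_size - 1) * dilation
        self.conv = nn.Conv1d(in_ch, out_ch, kernel_size, dilation=dilation)

    def forward(self, x):
        x = nn.functional.pad(x, (self.pad, 0))  # pad on the left only
        return self.conv(x)


class TCNAutoencoder(nn.Module):
    """Dilated causal convolutions compress the series into a small latent
    code; a mirrored decoder reconstructs the input."""
    def __init__(self, channels=16, latent_dim=8, kernel_size=3):
        super().__init__()
        self.encoder = nn.Sequential(
            CausalConv1d(1, channels, kernel_size, dilation=1), nn.ReLU(),
            CausalConv1d(channels, channels, kernel_size, dilation=2), nn.ReLU(),
            CausalConv1d(channels, channels, kernel_size, dilation=4), nn.ReLU(),
            nn.Conv1d(channels, latent_dim, 1),   # 1x1 conv to latent channels
        )
        self.decoder = nn.Sequential(
            nn.Conv1d(latent_dim, channels, 1), nn.ReLU(),
            CausalConv1d(channels, channels, kernel_size, dilation=4), nn.ReLU(),
            CausalConv1d(channels, channels, kernel_size, dilation=2), nn.ReLU(),
            CausalConv1d(channels, 1, kernel_size, dilation=1),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))


# Unsupervised training: minimize reconstruction error on (mostly normal) data.
model = TCNAutoencoder()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
x = torch.randn(32, 1, 256)   # stand-in for a batch of sliding windows
for _ in range(10):
    opt.zero_grad()
    loss = loss_fn(model(x), x)
    loss.backward()
    opt.step()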


The idea of unsupervised anomaly learning is based on the assumption that in real-world tasks the overwhelming part of the time-series data will be normal. Without the need to label the data, we train a model that learns the normal behavior, i.e., it assigns a low score to normal and a higher score to anomalous data. Finally, only a small fraction of labeled data is needed to find a suitable threshold for the anomaly score; this threshold can also be fine-tuned in operation with an already trained model. For the initial benchmarking and comparison of our algorithm, we introduce a new synthetic benchmark based on Mackey-Glass (MG) time series.
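As a sketch of how such a threshold could be chosen from a small labeled subset (illustrative only, not the exact procedure of the paper), the Python snippet below assumes per-window anomaly scores, e.g. reconstruction errors of the autoencoder; the candidate grid and the F1 criterion are assumptions for illustration.

# Illustrative threshold selection: after unsupervised training, anomaly
# scores are computed for all windows, and a small labeled subset is used
# only to pick the decision threshold (not to train the model itself).
import numpy as np


def pick_threshold(scores, labels, n_candidates=200):
    """Choose the threshold maximizing F1 on a small labeled score set.
    scores: anomaly scores (higher = more anomalous); labels: 0 normal, 1 anomaly."""
    candidates = np.quantile(scores, np.linspace(0.0, 1.0, n_candidates))
    best_thr, best_f1 = candidates[0], -1.0
    for thr in candidates:
        pred = scores >= thr
        tp = np.sum(pred & (labels == 1))
        fp = np.sum(pred & (labels == 0))
        fn = np.sum(~pred & (labels == 1))
        f1 = 2 * tp / max(2 * tp + fp + fn, 1)
        if f1 > best_f1:
            best_thr, best_f1 = thr, f1
    return best_thr


# Toy usage: normal scores cluster low, a few labeled anomalies score high.
rng = np.random.default_rng(0)
scores = np.concatenate([rng.normal(1.0, 0.3, 500), rng.normal(3.0, 0.5, 10)])
labels = np.concatenate([np.zeros(500, dtype=int), np.ones(10, dtype=int)])
print(f"chosen threshold: {pick_threshold(scores, labels):.2f}")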