Learning-Theoretic Methods in Vector Quantization

The principal goal of data compression (also known as source coding) is to replace data by a compact representation in such a manner that from this representation the original data can be reconstructed either perfectly or with high enough accuracy. …
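As a rough illustration of the lossy case described above, the following minimal Python sketch (not taken from the volume) implements a nearest-neighbor vector quantizer: each source vector is encoded by the index of its closest codeword and reconstructed from that codeword alone. The Gaussian data, the eight-codeword random codebook, and the squared-error distortion measure are illustrative assumptions.

import numpy as np

def encode(data, codebook):
    # Map each source vector to the index of its nearest codeword
    # (squared Euclidean distance); these indices are the compact representation.
    dists = ((data[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
    return dists.argmin(axis=1)

def decode(indices, codebook):
    # Reconstruct each vector as the codeword its index points to;
    # the reconstruction is only approximate, hence the coding is lossy.
    return codebook[indices]

# Illustrative assumptions: Gaussian source vectors and a random codebook
# of 8 codewords, i.e. 3 bits per 2-dimensional source vector.
rng = np.random.default_rng(0)
data = rng.normal(size=(1000, 2))
codebook = rng.normal(size=(8, 2))

indices = encode(data, codebook)
reconstruction = decode(indices, codebook)
distortion = ((data - reconstruction) ** 2).sum(axis=1).mean()
print(f"average squared-error distortion: {distortion:.3f}")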



Series Editors: The Rectors
Manuel Garcia Velarde - Madrid
Mahir Sayir - Zurich
Wilhelm Schneider - Wien

The Secretary General
Bernhard Schrefler - Padua

Former Secretary General
Giovanni Bianchi - Milan

Executive Editor
Carlo Tasso - Udine

The series presents lecture notes, monographs, edited works and proceedings in the fields of Mechanics, Engineering, Computer Science and Applied Mathematics. The purpose of the series is to make known in the international scientific and technical community results obtained in some of the activities organized by CISM, the International Centre for Mechanical Sciences.

INTERNATIONAL CENTRE FOR MECHANICAL SCIENCES COURSES AND LECTURES - No. 434

PRINCIPLES OF NONPARAMETRIC LEARNING

EDITED BY
LASZLO GYORFI
BUDAPEST UNIVERSITY OF TECHNOLOGY AND ECONOMICS


Springer-Verlag Wien GmbH

This volume contains 13 illustrations

This work is subject to copyright. All rights are reserved, whether the whole or part of the material is concerned, specifically those of translation, reprinting, re-use of illustrations, broadcasting, reproduction by photocopying machines or similar means, and storage in data banks. © 2002 by Springer-Verlag Wien. Originally published by Springer-Verlag Wien New York in 2002.

SPIN 10880232

In order to make this volume available as economically and as rapidly as possible the authors' typescripts have been reproduced in their original forms. This method unfortunately has its typographical limitations but it is hoped that they in no way distract the reader.

ISBN 978-3-211-83688-0 ISBN 978-3-7091-2568-7 (eBook) DOI 10.1007/978-3-7091-2568-7

PREFACE

This volume contains the course material of the summer school PRINCIPLES OF NONPARAMETRIC LEARNING, held at the International Centre for Mechanical Sciences (CISM), Udine, Italy, July 9-13, 2001. Modern nonparametric methods have become the most important tools in various fields of application: pattern recognition, density and regression function estimation, data compression, on-line learning, and prediction. The common feature in these problems is that some unknown underlying system generates data and the best action is to be learnt from these data. The purpose of the course was to teach the basic principles of nonparametric inference with emphasis on the cited areas.

The volume consists of six independent, but closely related areas of nonparametric learning:

• The first chapter summarizes the basics of statistical learning theory and pattern recognition, including a self-contained study of empirical risk minimization, Vapnik-Chervonenkis theory, complexity regularization, and error estimation.

• The second chapter reviews nonparametric regression function estimation, such as local averaging estimates (partitioning, kernel, and nearest neighbor estimates), empirical risk minimization applied to spline estimates, and penalized least squares estimates.

• In Chapter 3 the theory of on-line prediction of individual sequences is summarized. The performance of predictors under various loss functions is discussed in detail and the res…