Reduced-Complexity Deterministic Annealing for Vector Quantizer Design



Kemal Demirciler
Institute of Advanced Technology Research and Development, Eastern Mediterranean University, 6 Bozova Sokak, Yenisehir, Lefkosa, Mersin 10, Turkey
Email: [email protected]

Antonio Ortega
Integrated Media Systems Center, Department of Electrical Engineering, University of Southern California, 3740 McClintock Avenue, Los Angeles, CA 90089-2564, USA
Email: [email protected]

Received 16 September 2004; Recommended for Publication by John Sorensen

This paper presents a reduced-complexity deterministic annealing (DA) approach for vector quantizer (VQ) design by using soft information processing with simplified assignment measures. Low-complexity distributions are designed to mimic the Gibbs distribution, where the latter is the optimal distribution used in the standard DA method. These low-complexity distributions are simple enough to facilitate fast computation, but at the same time they can closely approximate the Gibbs distribution to result in near-optimal performance. We have also derived the theoretical performance loss at a given system entropy due to using the simple soft measures instead of the optimal Gibbs measure. We use the derived result to obtain optimal annealing schedules for the simple soft measures that approximate the annealing schedule for the optimal Gibbs distribution. The proposed reduced-complexity DA algorithms have significantly improved the quality of the final codebooks compared to the generalized Lloyd algorithm and standard stochastic relaxation techniques, both with and without the pairwise nearest neighbor (PNN) codebook initialization. The proposed algorithms are able to evade the local minima, and the results show that they are not sensitive to the choice of the initial codebook. Compared to the standard DA approach, the reduced-complexity DA algorithms can operate over 100 times faster with negligible performance difference.
For example, for the design of a 16-dimensional vector quantizer at a rate of 0.4375 bit/sample for a Gaussian source, the standard DA algorithm achieved 3.60 dB performance in 16 483 CPU seconds, whereas the reduced-complexity DA algorithm achieved the same performance in 136 CPU seconds. Beyond VQ design, the DA techniques are applicable to problems such as classification, clustering, and resource allocation.

Keywords and phrases: deterministic annealing, complexity reduction, vector quantization, stochastic relaxation, Gibbs distribution, codebook initialization.
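The soft assignment at the heart of standard DA is the Gibbs distribution over codevectors at a given temperature: each training vector is associated with every codevector in proportion to an exponential of its (negative) distortion. A minimal NumPy sketch of this measure, with illustrative function and variable names not taken from the paper:

```python
import numpy as np

def gibbs_assignments(x, codebook, T):
    """Soft assignment probabilities of input vector x to each codevector
    at temperature T, i.e. the Gibbs distribution used by standard DA.
    (Illustrative sketch; squared-error distortion assumed.)"""
    d = np.sum((codebook - x) ** 2, axis=1)  # distortion to each codevector
    d = d - d.min()                          # shift to avoid exp underflow
    p = np.exp(-d / T)
    return p / p.sum()                       # normalize to a distribution
```

As T decreases toward zero the distribution concentrates on the nearest codevector, recovering the hard (nearest-neighbor) assignment; the reduced-complexity measures discussed in the paper replace the exponentials with cheaper approximations.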

1. INTRODUCTION

Vector quantization is a source coding technique that approximates blocks (or vectors) of input data by one of a finite number of prestored vectors in a codebook. The challenge is to find the set of vectors (or quantization levels) such that a given criterion for the total distortion between the actual source and the quantized source is as small as possible under a constraint on the overall rate [1]. Since distortion depends on the codebook design, vector quantizer design is a key optim
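As a concrete illustration of the encoding step described above, a nearest-neighbor vector quantizer maps each input block to the index of the closest codevector under squared-error distortion. A minimal sketch (names are illustrative, not from the paper):

```python
import numpy as np

def vq_encode(vectors, codebook):
    """Map each input vector to the index of its nearest codevector
    under squared-error distortion. vectors: (n, k); codebook: (m, k)."""
    # pairwise squared distances, shape (n, m), via broadcasting
    d = np.sum((vectors[:, None, :] - codebook[None, :, :]) ** 2, axis=2)
    return np.argmin(d, axis=1)  # index of nearest codevector per input
```

Codebook design then amounts to choosing the codevectors so that the expected distortion of this mapping is minimized at the allowed rate, which is the optimization problem the DA methods in this paper address.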