Asymmetric Joint Source-Channel Coding for Correlated Sources with Blind HMM Estimation at the Receiver



Javier Del Ser
Centro de Estudios e Investigaciones Técnicas de Gipuzkoa (CEIT), Parque Tecnológico de San Sebastián, Paseo Mikeletegi, N48, 20009 Donostia, San Sebastián, Spain
Email: [email protected]

Pedro M. Crespo
Centro de Estudios e Investigaciones Técnicas de Gipuzkoa (CEIT), Parque Tecnológico de San Sebastián, Paseo Mikeletegi, N48, 20009 Donostia, San Sebastián, Spain
Email: [email protected]

Olaia Galdos
Centro de Estudios e Investigaciones Técnicas de Gipuzkoa (CEIT), Parque Tecnológico de San Sebastián, Paseo Mikeletegi, N48, 20009 Donostia, San Sebastián, Spain
Email: [email protected]

Received 25 October 2004; Revised 17 May 2005

We consider the case of two correlated sources, S1 and S2. The correlation between them has memory and is modelled by a hidden Markov chain. The paper studies the problem of reliable communication of the information sent by the source S1 over an additive white Gaussian noise (AWGN) channel when the output of the other source S2 is available as side information at the receiver. We assume that the receiver has no a priori knowledge of the correlation statistics between the sources. In particular, we propose the use of a turbo code for joint source-channel coding of the source S1. The joint decoder uses an iterative scheme in which the unknown parameters of the correlation model are estimated jointly within the decoding process. It is shown that reliable communication is possible at signal-to-noise ratios close to the theoretical limits set by the combination of the Shannon and Slepian-Wolf theorems.

Keywords and phrases: distributed source coding, hidden Markov model parameter estimation, Slepian-Wolf theorem, joint source-channel coding.
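As a reading aid (in our own notation, which may differ from the paper's exact expressions): in the asymmetric setup, where S2 is available losslessly at the decoder, combining the Slepian-Wolf and Shannon theorems yields a necessary condition for reliable communication of S1. If the encoder maps k source bits into n uses of a real-valued AWGN channel (overall rate r = k/n), then

r · H(S1 | S2) ≤ C(Es/N0) = (1/2) log2(1 + 2 Es/N0)   [bits per channel use],

and the theoretical SNR limit referred to in the abstract is the Es/N0 at which this condition holds with equality.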

1. INTRODUCTION

Communication networks are multiuser communication systems; their performance is therefore best understood when they are viewed as resource-sharing systems. In the particular centralized scenario where several users intend to send their data to a common destination (e.g., an access point in a wireless local area network), the receiver may exploit the existing correlation among the transmitters, either to reduce power consumption or to gain immunity against noise. In this context, we consider the system shown in Figure 1. The outputs of two correlated binary sources {X_k, Y_k}, k ≥ 1, are separately encoded, and the encoded sequences are sent through two different channels to a joint decoder. The only requirement imposed on the random process {X_k, Y_k} is that it be ergodic. Notice that this includes the situation where the process is modelled by a hidden Markov model (HMM), which is the case analyzed in this paper. If the channels are noiseless, the problem reduces to one of distributed data compression. The Slepian-Wolf theorem [1] (proven to be extensible to ergodic sources in [2]) states that the achievable compression region (see Figure 2) is given by

R1 ≥ H(S1 | S2),
R2 ≥ H(S2 | S1),
R1 + R2 ≥ H(S1, S2).
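These bounds can be illustrated numerically. The sketch below is our own toy example, not taken from the paper: the "correlation noise" Z_k = X_k XOR Y_k is drawn from a two-state hidden Markov chain (all parameter values are assumptions), and the memoryless corner point h2(P(Z = 1)) is evaluated as an upper bound on H(X | Y).

```python
import math
import random

def h2(p):
    """Binary entropy function in bits."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Hypothetical two-state hidden Markov "correlation noise" Z_k,
# with Y_k = X_k XOR Z_k (parameters are ours, not from the paper).
STAY = 0.95                  # probability of remaining in the current state
P_ONE = {0: 0.01, 1: 0.30}   # P(Z_k = 1 | hidden state)

def simulate(n, seed=0):
    """Draw n samples of Z_k from the two-state hidden Markov chain."""
    rng = random.Random(seed)
    state, z = 0, []
    for _ in range(n):
        if rng.random() > STAY:
            state = 1 - state
        z.append(1 if rng.random() < P_ONE[state] else 0)
    return z

z = simulate(200_000)
p_hat = sum(z) / len(z)  # empirical marginal P(Z_k = 1)

# For X_k i.i.d. uniform and Y_k = X_k XOR Z_k, the memoryless bound is
# H(X | Y) <= h2(p_hat); the memory in Z_k can only lower the true
# conditional entropy rate, which is what Slepian-Wolf coding exploits.
print(f"P(Z=1) ~ {p_hat:.3f}, memoryless bound H(X|Y) <= {h2(p_hat):.3f} bits")
```

With the assumed symmetric state transitions, both hidden states are equally likely in steady state, so P(Z = 1) is close to (0.01 + 0.30) / 2 = 0.155.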

This is an