Weight Enumeration
There is a remarkable relation between the weight enumerator of a linear code and the weight enumerator of the dual code. The relation was first discovered by F. J. MacWilliams. The proof we give here is based on an idea due to A. M. Gleason.
Lecture Notes in Mathematics 201
Jacobus H. van Lint, California Institute of Technology, Pasadena, CA/USA
Coding Theory
Springer-Verlag Berlin Heidelberg GmbH, 1971
AMS Subject Classifications (1970): 94A10
ISBN 978-3-540-05476-4
ISBN 978-3-662-20712-3 (eBook)
DOI 10.1007/978-3-662-20712-3
Let ε > 0 and R < C. We shall show that for all sufficiently large n there exist codes of length n and rate R for which the probability of incorrect decoding is < ε.
Before giving the main idea of the proof we isolate a number of the more technical details to be used in the main part of the proof. If a code word is transmitted over the channel then the probability of a specific error pattern (at the receiver) with w errors is p^w q^(n-w) (where q := 1 - p), i.e. this probability depends only on the number of errors.
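In code, the probability of receiving y when x was sent over the binary symmetric channel depends only on the Hamming distance between x and y. A minimal sketch (the function name and example words are ours, not the book's):

```python
def channel_prob(x, y, p):
    """P(y | x) on a BSC with crossover probability p:
    p^w * (1-p)^(n-w), where w is the number of flipped positions."""
    n = len(x)
    w = sum(a != b for a, b in zip(x, y))
    return p**w * (1 - p)**(n - w)

# the channel is symmetric: P(y | x) = P(x | y)
x, y = (0, 1, 0, 1), (1, 1, 0, 0)
assert channel_prob(x, y, 0.1) == channel_prob(y, x, 0.1)
```

The symmetry checked in the last line is exactly the remark P(y|x) = P(x|y) used below.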
Remark that P(y|x) = P(x|y). The number of errors in received words is a random variable w with expected value np and variance np(1 - p). Let b := (2np(1 - p)/ε)^(1/2) and ρ := ⌊np + b⌋; then by the Bienaymé–Chebyshev inequality

(1.4.4)    Prob(w > np + b) ≤ np(1 - p)/b² = ½ε.
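As a sanity check on (1.4.4), one can simulate the error weight w of the BSC and compare the empirical tail frequency with the Chebyshev bound ½ε. This is only an illustration; the parameter values n, p, ε below are ours:

```python
import math
import random

def tail_estimate(n, p, eps, trials=5000, seed=1):
    """Empirical Prob(w > np + b) for the BSC error weight w,
    with b = (2*n*p*(1-p)/eps)**0.5 as in (1.4.4)."""
    rng = random.Random(seed)
    b = math.sqrt(2 * n * p * (1 - p) / eps)
    threshold = n * p + b
    # w is Binomial(n, p): each of the n positions is flipped with probability p
    exceed = sum(
        1
        for _ in range(trials)
        if sum(rng.random() < p for _ in range(n)) > threshold
    )
    return exceed / trials, eps / 2

freq, chebyshev_bound = tail_estimate(n=200, p=0.05, eps=0.1)
assert freq <= chebyshev_bound  # Chebyshev guarantees the tail is at most eps/2
```

In practice the empirical tail is far below ½ε; Chebyshev is a crude but sufficient bound for the proof.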
By (1.2.4) the number of points in a sphere S_ρ(x) is

(1.4.5)    |S_ρ(x)| = Σ_{w=0}^{ρ} (n choose w) ≤ ½n (n choose ρ) ≤ ½n · n^n / (ρ^ρ (n-ρ)^(n-ρ)).

The last inequality follows from the binomial expansion of n^n = (ρ + (n-ρ))^n, in which (n choose ρ) ρ^ρ (n-ρ)^(n-ρ) is one of the terms.
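Both inequalities in (1.4.5) can be verified exactly with integer arithmetic; the values n = 100, ρ = 30 below are an arbitrary illustration (they satisfy ρ < n/2, as ρ ≈ np does whenever p < ½):

```python
import math

def sphere_size(n, rho):
    """|S_rho(x)|: number of words of {0,1}^n within Hamming distance rho of x."""
    return sum(math.comb(n, w) for w in range(rho + 1))

n, rho = 100, 30
size = sphere_size(n, rho)

# middle bound of (1.4.5): |S_rho(x)| <= (n/2) * C(n, rho)
assert 2 * size <= n * math.comb(n, rho)

# last bound: C(n, rho) * rho^rho * (n-rho)^(n-rho) <= n^n,
# since the left-hand side is one term of the expansion of (rho + (n-rho))^n
assert math.comb(n, rho) * rho**rho * (n - rho)**(n - rho) <= n**n
```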
We have

(1.4.6)    (ρ/n) log (ρ/n) = (p + b/n){log p + log (1 + b/(np))} = p log p + O(n^(-1/2)),

and analogously

(1 - ρ/n) log (1 - ρ/n) = q log q + O(n^(-1/2)).
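The O(n^(-1/2)) estimate in (1.4.6) can be checked numerically: with b = (2np(1-p)/ε)^(1/2) and ρ = ⌊np + b⌋, the quantity (ρ/n) log(ρ/n) approaches p log p as n grows. A small sketch with illustrative parameters:

```python
import math

def entropy_term_error(n, p=0.05, eps=0.1):
    """|(rho/n) log(rho/n) - p log p| for rho = floor(np + b), b as in (1.4.4)."""
    b = math.sqrt(2 * n * p * (1 - p) / eps)
    rho = math.floor(n * p + b)
    x = rho / n
    return abs(x * math.log(x) - p * math.log(p))

# the deviation shrinks as n grows, roughly like n^{-1/2}
errs = [entropy_term_error(n) for n in (10**3, 10**4, 10**5, 10**6)]
assert all(e2 < e1 for e1, e2 in zip(errs, errs[1:]))
```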
We introduce two auxiliary functions.

(1.4.7)    For u, v ∈ {0,1}^n let f(u, v) := 0 if d(u, v) > ρ, and f(u, v) := 1 if d(u, v) ≤ ρ.
If x_i is any vector in the code and y ∈ {0,1}^n we define

(1.4.8)    g_i(y) := 1 - f(y, x_i) + Σ_{j≠i} f(y, x_j).
Remark that g_i(y) = 0 if x_i ∈ S_ρ(y) and no other code word is in this sphere, and otherwise g_i(y) ≥ 1. Now we come to the main part of our proof.
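The auxiliary functions (1.4.7) and (1.4.8) translate directly into code. The tiny code and radius below are made-up examples, chosen only to exercise the remark that g_i(y) = 0 exactly when x_i is the unique code word in S_ρ(y):

```python
def d(u, v):
    """Hamming distance between two binary tuples."""
    return sum(a != b for a, b in zip(u, v))

def f(u, v, rho):
    """(1.4.7): 1 if d(u, v) <= rho, else 0."""
    return 1 if d(u, v) <= rho else 0

def g(i, y, code, rho):
    """(1.4.8): g_i(y) = 1 - f(y, x_i) + sum over j != i of f(y, x_j)."""
    return 1 - f(y, code[i], rho) + sum(
        f(y, code[j], rho) for j in range(len(code)) if j != i
    )

# toy code of length 5 with rho = 1 (illustrative values)
code = [(0, 0, 0, 0, 0), (1, 1, 1, 1, 1)]
rho = 1
y = (0, 0, 0, 0, 1)              # within distance 1 of x_0 only
assert g(0, y, code, rho) == 0   # x_0 alone lies in S_rho(y): no decoding error
assert g(1, y, code, rho) >= 1   # decoding y as x_1 would be an error
```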
We choose the code words x_1, ..., x_M at random from {0,1}^n and use these to transmit M possible messages. The decoding rule for the receiver is as follows. If y is received and there is exactly one code word x_i with distance ≤ ρ to y, then y is decoded into x_i. Otherwise an error is declared (or one could always take x_1 in this case). Again, let P_i denote the probability of incorrect decoding when x_i is transmitted. We have, by the remark following (1.4.8),

P_i ≤ Σ_{y ∈ {0,1}^n} P(y|x_i) g_i(y) = Σ_y P(y|x_i){1 - f(y, x_i)} + Σ_y Σ_{j≠i} P(y|x_i) f(y, x_j).

The first sum on the right-hand side is just the probability that y is not in S_ρ(x_i). This number does not depend on x_i but only on ρ. Call it a_ρ. By (1.4.4) we have a_ρ ≤ ½ε, and therefore

P_i ≤ ½ε + Σ_{y ∈ {0,1}^n} Σ_{j≠i} P(y|x_i) f(y, x_j).
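The decoding rule above is easy to simulate: pick M code words at random, transmit one through a BSC, and decode a received y to x_i only if x_i is the unique code word within distance ρ of y. The sketch below (all names and parameter values are ours) implements only the rule itself, not the averaging argument that completes the proof:

```python
import math
import random

def sphere_decoder(code, y, rho):
    """Decode y to the unique code word within distance rho; None means error declared."""
    near = [x for x in code if sum(a != b for a, b in zip(x, y)) <= rho]
    return near[0] if len(near) == 1 else None

def simulate(n=100, M=8, p=0.05, eps=0.1, trials=300, seed=7):
    """Empirical probability of incorrect decoding for one random code."""
    rng = random.Random(seed)
    b = math.sqrt(2 * n * p * (1 - p) / eps)
    rho = math.floor(n * p + b)
    code = [tuple(rng.randint(0, 1) for _ in range(n)) for _ in range(M)]
    errors = 0
    for _ in range(trials):
        x = rng.choice(code)
        y = tuple(bit ^ (rng.random() < p) for bit in x)  # BSC with crossover p
        if sphere_decoder(code, y, rho) != x:
            errors += 1
    return errors / trials

p_err = simulate()
assert 0.0 <= p_err <= 0.1  # with M small relative to 2^n, errors are rare
```

With M this small relative to 2^n the spheres around the code words almost never overlap, so nearly all decoding errors come from the event w > ρ, whose probability (1.4.4) bounds by ½ε.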