Application to Classifier Design
This chapter is devoted to a description of postsupervised classifier design using fuzzy clustering. We first derive a modified fuzzy c-means clustering algorithm by slightly generalizing the objective function and introducing some simplifications. The k-harmonic means clustering [177, 178, 179, 119] is reviewed from the point of view of fuzzy c-means. In the algorithm derived from the iteratively reweighted least squares (IRLS) technique, membership functions are variously chosen and parameterized. Experiments on several well-known benchmark data sets show that the classifier using a newly defined membership function outperforms well-established methods, i.e., the support vector machine (SVM), the k-nearest neighbor classifier (k-NN), and learning vector quantization (LVQ). The classifier with the modified FCM also improves performance and efficiency with respect to storage requirements and classification speed.
6.1 Unsupervised Clustering Phase

Clustering is used as the unsupervised phase of classifier design. We first recapitulate the three kinds of objective functions of Chapter 2, i.e., the standard, entropy-based, and quadratic-term-based fuzzy c-means clustering. The optimization problem of the standard method is:

\bar{U} = \arg\min_{U \in U_f} J_{\mathrm{fcm}}(U, \bar{V}),   (6.1)

where

J_{\mathrm{fcm}}(U, V) = \sum_{i=1}^{c} \sum_{k=1}^{N} (u_{ki})^m D(x_k, v_i),  (m > 1),   (6.2)

under the constraint:

U_f = \{\, U = (u_{ki}) : \sum_{j=1}^{c} u_{kj} = 1, \; 1 \le k \le N; \; u_{ki} \in [0, 1], \; 1 \le k \le N, \; 1 \le i \le c \,\}.   (6.3)
D(x_k, v_i) denotes the squared distance between x_k and v_i, so the standard objective function is the weighted sum of squared distances.

S. Miyamoto et al.: Algorithms for Fuzzy Clustering, STUDFUZZ 229, pp. 119–155, 2008. © Springer-Verlag Berlin Heidelberg 2008, springerlink.com
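The objective (6.2) under constraint (6.3) can be evaluated directly. Below is a minimal NumPy sketch; the function name, array shapes, and the choice of squared Euclidean distance for D(x_k, v_i) are our own illustrative assumptions, not the book's notation.

```python
import numpy as np

def fcm_objective(X, V, U, m=2.0):
    """Standard fuzzy c-means objective J_fcm(U, V) of (6.2).

    X : (N, p) array of data points x_k
    V : (c, p) array of cluster centers v_i
    U : (N, c) array of memberships u_ki, each row summing to 1 (constraint (6.3))
    m : fuzzifier, m > 1
    """
    # squared Euclidean distances D(x_k, v_i), shape (N, c)
    D = ((X[:, None, :] - V[None, :, :]) ** 2).sum(axis=2)
    return float((U ** m * D).sum())

# small example: two well-separated groups in one dimension
X = np.array([[0.0], [1.0], [9.0], [10.0]])
V = np.array([[0.5], [9.5]])
U = np.array([[0.9, 0.1], [0.9, 0.1], [0.1, 0.9], [0.1, 0.9]])
assert np.allclose(U.sum(axis=1), 1.0)   # rows satisfy (6.3)
J = fcm_objective(X, V, U, m=2.0)
```

Raising the memberships to the power m > 1 is what makes the weighting "fuzzy": misassigned points with small u_ki contribute little to J_fcm.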
The following objective function is used for the entropy-based method:

J_{\mathrm{efc}}(U, V) = \sum_{i=1}^{c} \sum_{k=1}^{N} u_{ki} D(x_k, v_i) + \nu \sum_{i=1}^{c} \sum_{k=1}^{N} u_{ki} \log u_{ki},  (\nu > 0).   (6.4)
The objective function of the quadratic-term-based method is:

J_{\mathrm{qfc}}(U, V) = \sum_{i=1}^{c} \sum_{k=1}^{N} u_{ki} D(x_k, v_i) + \frac{\nu}{2} \sum_{i=1}^{c} \sum_{k=1}^{N} (u_{ki})^2,  (\nu > 0).   (6.5)
Combining these three, the objective function can be written as:

J(U, V) = \sum_{i=1}^{c} \sum_{k=1}^{N} (u_{ki})^m D(x_k, v_i) + \nu \sum_{i=1}^{c} \sum_{k=1}^{N} K(u),   (6.6)
where both m and \nu are fuzzifiers. When m > 1 and \nu = 0, (6.6) is the standard objective function. When m = 1 and K(u) = u_{ki} \log u_{ki}, (6.6) is the objective function of the entropy-based method; the resulting algorithm is similar to the EM algorithm for normal mixture or Gaussian mixture models whose covariance matrices are the unit matrix and whose cluster volumes are equal. When m = 1 and K(u) = \frac{1}{2}(u_{ki})^2, (6.6) is the objective function of the quadratic-term-based method.

6.1.1 A Generalized Objective Function
From the above considerations, we can generalize the standard objective function. Let m > 1 and K(u) = (u_{ki})^m; then (6.6) is the objective function J_{\mathrm{gfc}}(U, V), from which we can easily derive the necessary conditions for optimality. We consider minimization of (6.6) with respect to U and V.
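The unified form (6.6) can be sketched with K as a pluggable term, which makes the special cases listed above explicit. The function signature and test values below are our own illustration (again assuming squared Euclidean distance for D), not an implementation from the book.

```python
import numpy as np

def unified_objective(X, V, U, m=1.0, nu=0.0, K=None):
    """Combined objective (6.6): sum_ik (u_ki)^m D(x_k, v_i) + nu * sum_ik K(u_ki)."""
    # squared Euclidean distances D(x_k, v_i), shape (N, c)
    D = ((X[:, None, :] - V[None, :, :]) ** 2).sum(axis=2)
    J = (U ** m * D).sum()
    if K is not None and nu > 0.0:
        J += nu * K(U).sum()
    return float(J)

# Special cases recovered by the choice of m, nu, and K:
#   m > 1, nu = 0               -> standard J_fcm of (6.2)
#   m = 1, K(u) = u log u       -> entropy-based J_efc of (6.4)
#   m = 1, K(u) = (1/2) u^2     -> quadratic-term-based J_qfc of (6.5)
#   m > 1, K(u) = u^m           -> generalized J_gfc of this section
X = np.array([[0.0], [2.0]])
V = np.array([[0.0], [2.0]])
U = np.array([[0.8, 0.2], [0.2, 0.8]])
J_efc = unified_objective(X, V, U, m=1.0, nu=0.5, K=lambda u: u * np.log(u))
J_qfc = unified_objective(X, V, U, m=1.0, nu=0.5, K=lambda u: 0.5 * u ** 2)
```

Note that the entropy term u log u is negative for u in (0, 1), so it rewards fuzzier memberships, while the quadratic term penalizes large memberships; both act as regularizers controlled by \nu.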