Structure Level Adaptation for Artificial Neural Networks

3.2 Function Level Adaptation
3.3 Parameter Level Adaptation
3.4 Structure Level Adaptation
  3.4.1 Neuron Generation
  3.4.2 Neuron Annihilation
3.5 Implementation
3.6 An Illustrative Example
3.7 Summary



THE KLUWER INTERNATIONAL SERIES IN ENGINEERING AND COMPUTER SCIENCE

KNOWLEDGE REPRESENTATION, LEARNING AND EXPERT SYSTEMS

Consulting Editor: Tom Mitchell, Carnegie Mellon University

UNIVERSAL SUBGOALING AND CHUNKING OF GOAL HIERARCHIES, J. Laird, P. Rosenbloom, A. Newell, ISBN: 0-89838-213-0
MACHINE LEARNING: A Guide to Current Research, T. Mitchell, J. Carbonell, R. Michalski, ISBN: 0-89838-214-9
MACHINE LEARNING OF INDUCTIVE BIAS, P. Utgoff, ISBN: 0-89838-223-8
A CONNECTIONIST MACHINE FOR GENETIC HILLCLIMBING, D. H. Ackley, ISBN: 0-89838-236-X
LEARNING FROM GOOD AND BAD DATA, P. D. Laird, ISBN: 0-89838-263-7
MACHINE LEARNING OF ROBOT ASSEMBLY PLANS, A. M. Segre, ISBN: 0-89838-269-6
AUTOMATING KNOWLEDGE ACQUISITION FOR EXPERT SYSTEMS, S. Marcus, Editor, ISBN: 0-89838-294-7
MACHINE LEARNING, META-REASONING AND LOGICS, P. B. Brazdil, K. Konolige, ISBN: 0-7923-9047-4
CHANGE OF REPRESENTATION AND INDUCTIVE BIAS, D. P. Benjamin, ISBN: 0-7923-9055-5
KNOWLEDGE ACQUISITION: SELECTED RESEARCH AND COMMENTARY, S. Marcus, Editor, ISBN: 0-7923-9062-8
LEARNING WITH NESTED GENERALIZED EXEMPLARS, S. L. Salzberg, ISBN: 0-7923-9110-1
INCREMENTAL VERSION-SPACE MERGING: A General Framework for Concept Learning, H. Hirsh, ISBN: 0-7923-9119-5
COMPETITIVELY INHIBITED NEURAL NETWORKS FOR ADAPTIVE PARAMETER ESTIMATION, M. Lemmon, ISBN: 0-7923-9086-5

STRUCTURE LEVEL ADAPTATION FOR ARTIFICIAL NEURAL NETWORKS

by Tsu-Chang Lee
Stanford University / Cadence Design Systems

foreword by Joseph W. Goodman
Stanford University


Springer Science+Business Media, LLC

Library of Congress Cataloging-in-Publication Data

Lee, Tsu-Chang, 1961-

Structure level adaptation for artificial neural networks / by Tsu-Chang Lee; foreword by Joseph W. Goodman.
p. cm. -- (The Kluwer international series in engineering and computer science. Knowledge representation, learning, and expert systems)
Includes bibliographical references and index.
ISBN 978-1-4613-6765-9
ISBN 978-1-4615-3954-4 (eBook)
DOI 10.1007/978-1-4615-3954-4
1. Neural networks (Computer science) I. Title. II. Series.
QA76.87 .L43 1991    91-2251 CIP

Copyright © 1991 by Springer Science+Business Media New York. Originally published by Kluwer Academic Publishers in 1991. Softcover reprint of the hardcover 1st edition 1991. All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, mechanical, photo-

this is defined by the following:

F, called the State Transition Function (STF), is a Parameterized Function Tree (PFT) with modulating parameter structure W. F is used to generate the Composite State (CS), Q, of this neuron, with Q = F(x | W), where the vector x = (x_1, x_2, ..., x_m) ∈ F_I^m designates the list of input arguments for F (F_I is the field corresponding to the input variables of this neuron). The set of input arguments X = {x_1, x_2, ..., x_m} is called the Receptive Field (RF) of this neuron. The function tree F is specified as follows:

F = (f (f_1 ...) (f_2 ...) ...)

where each parenthesized subtree (f_1 ...), (f_2 ...), ... beneath the root function f is one of its child subtrees C_1, C_2, ....
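The definition above can be made concrete with a small sketch: a function-tree node evaluates its children (which are either input indices drawn from the receptive field or further subtrees), then applies its own function, modulated by local parameters, to produce the composite state Q. This is a minimal illustration under assumed names (`Node`, `weighted_sum`, etc.), not the author's implementation.

```python
# A minimal sketch of a Parameterized Function Tree (PFT): nested function
# nodes with modulating parameters W that map an input vector x (the
# receptive field) to a composite state Q. All names here are illustrative
# assumptions, not taken from the book's code.

import math
from dataclasses import dataclass, field
from typing import Callable, List, Union

@dataclass
class Node:
    """A function-tree node: applies `fn` to its children's values,
    modulated by the node's local parameters `w`."""
    fn: Callable[[List[float], List[float]], float]
    w: List[float] = field(default_factory=list)
    # A child is either another Node or an int index into the input vector x
    children: List[Union["Node", int]] = field(default_factory=list)

    def eval(self, x: List[float]) -> float:
        args = [x[c] if isinstance(c, int) else c.eval(x)
                for c in self.children]
        return self.fn(args, self.w)

def weighted_sum(args: List[float], w: List[float]) -> float:
    return sum(a * wi for a, wi in zip(args, w))

def sigmoid(args: List[float], w: List[float]) -> float:
    return 1.0 / (1.0 + math.exp(-args[0]))

# Example tree F = (sigmoid (weighted_sum x_1 x_2)): a single neuron as a PFT
F = Node(sigmoid, children=[Node(weighted_sum, w=[0.5, -0.25], children=[0, 1])])
Q = F.eval([1.0, 2.0])  # composite state Q = F(x | W) for x = (1.0, 2.0)
```

With x = (1.0, 2.0) the inner node computes 0.5*1.0 - 0.25*2.0 = 0.0, so Q = sigmoid(0) = 0.5. Structure-level adaptation, in this picture, corresponds to adding or removing subtrees rather than only adjusting the w values.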