Methods of Descent for Nondifferentiable Optimization
Lecture Notes in Mathematics 1133
Krzysztof C. Kiwiel
Methods of Descent for Nondifferentiable Optimization
Springer-Verlag Berlin Heidelberg New York Tokyo
Author Krzysztof C. Kiwiel Systems Research Institute, Polish Academy of Sciences ul. Newelska 6, 01-447 Warsaw, Poland
Mathematics Subject Classification: 49-02, 49D37, 65-02, 65K05, 90-02, 90C30 ISBN 3-540-15642-9 Springer-Verlag Berlin Heidelberg New York Tokyo ISBN 0-387-15642-9 Springer-Verlag New York Heidelberg Berlin Tokyo
This work is subject to copyright. All rights are reserved, whether the whole or part of the material is concerned, specifically those of translation, reprinting, re-use of illustrations, broadcasting, reproduction by photocopying machine or similar means, and storage in data banks. Under § 54 of the German Copyright Law, where copies are made for other than private use, a fee is payable to "Verwertungsgesellschaft Wort", Munich.
© by Springer-Verlag Berlin Heidelberg 1985 Printed in Germany Printing and binding: Beltz Offsetdruck, Hemsbach/Bergstr. 2146/3140-543210
PREFACE

This book is about numerical methods for problems of finding the largest or smallest values which can be attained by functions of several real variables subject to several inequality constraints. If such problems involve continuously differentiable functions, they can be solved by a variety of methods well documented in the literature. We are concerned with more general problems in which the functions are locally Lipschitz continuous, but not necessarily differentiable or convex. More succinctly, this book is about numerical methods for nondifferentiable optimization.

Nondifferentiable optimization, also called nonsmooth optimization, has many actual and potential applications in industry and science. For this reason, a great deal of effort has been devoted to it during the last decade. Most research has gone into the theory of nonsmooth optimization, while surprisingly few algorithms have been proposed, these mainly by C. Lemarechal, R. Mifflin and P. Wolfe. Frequently such algorithms are conceptual, since their storage and work per iteration grow without bound in the course of the calculations. Also, their convergence properties are usually weaker than those of classical methods for smooth optimization problems.

This book gives a complete state-of-the-art account of general-purpose methods of descent for nonsmooth minimization. The methods use piecewise linear approximations to the problem functions, constructed from several subgradients evaluated at certain trial points. At each iteration, a search direction is found by solving a quadratic subproblem, and then a line search produces both the next improved approximation to a solution and a new trial point so as to detect gradient discontinuities. The algorithms converge to points satisfying necessary optimality conditions. They are also widely applicable, since they require only a weak semismoothness hypothesis on the problem functions which is likely to hold in most applications.
A unifying theme of this book is the use of subgradient selection and aggregation techniques.
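The descent scheme outlined above can be illustrated with a toy one-dimensional sketch. This is not Kiwiel's algorithm itself but a simplified proximal bundle variant under invented assumptions: the test function f(x) = max(x, -2x) and all names are hypothetical, the quadratic direction-finding subproblem is solved exactly by enumerating the stationary points and breakpoints of the piecewise-linear model, and a simple descent test stands in for the full line search.

```python
# Simplified 1-D proximal bundle sketch (illustrative only, not Kiwiel's method).
# We minimize f(x) = max(x, -2x), a convex nonsmooth function with minimum at x = 0,
# using a cutting-plane model built from subgradients evaluated at trial points.

def f(x):
    return max(x, -2.0 * x)

def subgrad(x):
    # One subgradient of f at x (any element of the subdifferential will do).
    return 1.0 if x >= 0 else -2.0

def prox_step(bundle, x, u):
    """Solve min_y  max_j [f_j + g_j*(y - x_j)] + (u/2)*(y - x)^2 exactly.

    The objective is convex piecewise quadratic, so its minimizer is either a
    stationary point of one quadratic piece or a breakpoint of the max.
    """
    lines = [(g, fj - g * xj) for (xj, fj, g) in bundle]   # (slope, intercept)
    model = lambda y: max(g * y + b for (g, b) in lines)   # piecewise-linear model
    obj = lambda y: model(y) + 0.5 * u * (y - x) ** 2
    cands = [x - g / u for (g, _) in lines]                # per-piece stationary points
    for i in range(len(lines)):
        for j in range(i + 1, len(lines)):
            gi, bi = lines[i]
            gj, bj = lines[j]
            if gi != gj:                                   # breakpoints of the max
                cands.append((bj - bi) / (gi - gj))
    y = min(cands, key=obj)
    return y, model(y)

def bundle_descent(x, u=1.0, m_L=0.1, tol=1e-8, max_iter=50):
    bundle = [(x, f(x), subgrad(x))]
    for _ in range(max_iter):
        y, my = prox_step(bundle, x, u)
        v = f(x) - my                          # predicted decrease (nonnegative)
        if v <= tol:
            break                              # model certifies near-optimality
        if f(x) - f(y) >= m_L * v:
            x = y                              # serious step: sufficient decrease
        bundle.append((y, f(y), subgrad(y)))   # null or serious: enrich the model
    return x
```

Starting from x = 3, the iterates step through the smooth region until a trial point on the other side of the kink adds the second linear piece; the model then matches f and the predicted decrease drops to zero at the minimizer x = 0.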