This focused monograph presents a study of subgradient algorithms for constrained minimization problems in a Hilbert space. The book is of interest to experts in applications of optimization to engineering and economics. The goal is to obtain a good approximate solution of the problem in the presence of computational errors. The discussion takes into account the fact that each iteration of an algorithm consists of several steps and that, in general, the computational errors of different steps are different. The book is especially useful for the reader because it contains solutions to a number of difficult and interesting problems in numerical optimization. The subgradient projection algorithm is one of the most important tools in optimization theory and its applications. An optimization problem is described by an objective function and a set of feasible points. For this algorithm, each iteration consists of two steps: the first requires calculating a subgradient of the objective function; the second requires calculating a projection onto the feasible set. The computational errors of these two steps are different. The book shows that the algorithm discussed generates a good approximate solution if all the computational errors are bounded from above by a small positive constant. Moreover, if the computational errors of the two steps of the algorithm are known, one can determine an approximate solution and how many iterations are needed to obtain it. In addition to their mathematical interest, the generalizations considered in this book have significant practical meaning.
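To make the two-step structure concrete, the following is a minimal Python sketch of a projected subgradient iteration in which each of the two steps carries its own bounded error. The toy problem, the random perturbation model, and all names (projected_subgradient, eps_grad, eps_proj) are illustrative assumptions for this page, not the book's notation or its error analysis.

import numpy as np

def projected_subgradient(subgrad, project, x0, steps, eps_grad=0.0, eps_proj=0.0, seed=0):
    # Two-step iteration x_{k+1} = P_C(x_k - a_k * v_k), where v_k is a subgradient
    # of the objective at x_k.  eps_grad and eps_proj bound the simulated
    # computational errors of the subgradient step and the projection step
    # (an illustrative perturbation model only).
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for a in steps:
        v = subgrad(x) + eps_grad * rng.uniform(-1.0, 1.0, size=x.shape)            # step 1: inexact subgradient
        x = project(x - a * v) + eps_proj * rng.uniform(-1.0, 1.0, size=x.shape)    # step 2: inexact projection
    return x

# Toy example: minimize f(x) = |x1| + |x2| over the box [1, 3] x [-2, 2];
# the exact minimizer is (1, 0).
subgrad = lambda x: np.sign(x)                            # a subgradient of the l1-norm
project = lambda y: np.clip(y, [1.0, -2.0], [3.0, 2.0])   # exact projection onto the box
steps = [1.0 / np.sqrt(k + 1.0) for k in range(500)]      # diminishing step sizes
x = projected_subgradient(subgrad, project, [3.0, 2.0], steps, eps_grad=1e-3, eps_proj=1e-3)
print(x)  # close to (1, 0), up to the step-size and error bounds

The point of the sketch is only to show why the two error bounds enter separately: the subgradient error is amplified by the step size before the projection, while the projection error is added directly to the iterate.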
Alexander J. Zaslavski is a senior researcher at the Technion - Israel Institute of Technology. He was born in Ukraine in 1957 and received his PhD in Mathematical Analysis in 1983 from the Institute of Mathematics, Novosibirsk. He is the author of 26 research monographs and more than 600 research papers, and the editor of more than 70 edited volumes and journal special issues. He is the Founding Editor and Editor-in-Chief of the journal Pure and Applied Functional Analysis and Editor-in-Chief of the journal Communications in Optimization Theory. His research areas include nonlinear functional analysis, control theory, optimization, calculus of variations, dynamical systems theory, game theory and mathematical economics.
Contents
1. Introduction.- 2. Nonsmooth Convex Optimization.- 3. Extensions.- 4. Zero-sum Games with Two Players.- 5. Quasiconvex Optimization.- References.
Reviews
"The book is rigorously written, and organized taking into account the cursiveness of reading. The long proofs of the theorems are placed in annexes to chapters, in order to emphasize the importance of every result in a generating methodology of studying and solving problems." (Gabriela Cristescu, zbMATH 1464.90063, 2021)