Title | Nonlinear Conjugate Gradient Methods for Unconstrained Optimization
Author | Neculai Andrei
Overview | An explicit and thorough treatment of the conjugate gradient algorithms for unconstrained optimization, their properties and convergence. A clear illustration of the numerical performances of the algorithms d…
Series Title | Springer Optimization and Its Applications
Description | Two approaches are known for solving large-scale unconstrained optimization problems: the limited-memory quasi-Newton method (truncated Newton method) and the conjugate gradient method. This is the first book to detail conjugate gradient methods, showing their properties and convergence characteristics as well as their performance in solving large-scale unconstrained optimization problems and applications. Comparisons to the limited-memory and truncated Newton methods are also discussed. Topics studied in detail include: linear conjugate gradient methods, standard conjugate gradient methods, acceleration of conjugate gradient methods, hybrid methods, modifications of the standard scheme, memoryless BFGS preconditioned methods, and three-term methods. Other conjugate gradient methods, with clustering of the eigenvalues or with minimization of the condition number of the iteration matrix, are also treated. For each method, the convergence analysis, the computational performance, and comparisons with other conjugate gradient methods are given. The theory behind the conjugate gradient algorithms presented as a methodology is developed with a clear, rigorous, and friendly exposition; the reader will g…
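The description above names the family of methods the book develops. As a rough orientation only, the sketch below implements one standard nonlinear conjugate gradient iteration, the Polak-Ribière-Polyak (PRP+) choice of the conjugacy parameter with an Armijo backtracking line search, applied to the extended Rosenbrock test function. It is a minimal illustration under assumed parameter values (`tol`, `max_iter`, the Armijo constant, the test function `rosenbrock`), not a reproduction of the book's specific algorithms, which include standard, hybrid, three-term, and memoryless BFGS preconditioned variants with stronger line-search conditions.

```python
# Minimal sketch: nonlinear conjugate gradient with the PRP+ conjugacy
# parameter and an Armijo backtracking line search. Illustrative only;
# the test function and all constants below are assumptions for the demo.
import numpy as np

def rosenbrock(x):
    """Extended Rosenbrock test function and its gradient."""
    f = np.sum(100.0 * (x[1:] - x[:-1]**2)**2 + (1.0 - x[:-1])**2)
    g = np.zeros_like(x)
    g[:-1] = -400.0 * x[:-1] * (x[1:] - x[:-1]**2) - 2.0 * (1.0 - x[:-1])
    g[1:] += 200.0 * (x[1:] - x[:-1]**2)
    return f, g

def nonlinear_cg(fg, x0, tol=1e-6, max_iter=20000):
    x = x0.copy()
    f, g = fg(x)
    d = -g                                    # first direction: steepest descent
    for k in range(max_iter):
        if np.linalg.norm(g, np.inf) <= tol:  # gradient-based stopping test
            break
        # Armijo backtracking line search along d
        alpha, c1 = 1.0, 1e-4
        gTd = g @ d
        while True:
            f_new, g_new = fg(x + alpha * d)
            if f_new <= f + c1 * alpha * gTd or alpha < 1e-12:
                break
            alpha *= 0.5
        x_new = x + alpha * d
        # PRP+ conjugacy parameter, truncated at zero (automatic restart)
        beta = max(g_new @ (g_new - g) / (g @ g), 0.0)
        d = -g_new + beta * d
        if g_new @ d >= 0.0:                  # safeguard: keep d a descent direction
            d = -g_new
        x, f, g = x_new, f_new, g_new
    return x, f, k

x_opt, f_opt, iters = nonlinear_cg(rosenbrock, np.full(10, -1.2))
print(f"f* = {f_opt:.3e} after {iters} iterations")
```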
Publication Date | 2020 (Book)
Keywords | conjugate gradient method; conjugate gradient algorithm; quasi-Newton method; steepest descent method; B…
Edition | 1
doi | https://doi.org/10.1007/978-3-030-42950-8 |
isbn_softcover | 978-3-030-42952-2 |
isbn_ebook | 978-3-030-42950-8
issn_series | 1931-6828 (Print); 1931-6836 (Electronic)
copyright | The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Switzerland AG