Titlebook: Nonlinear Conjugate Gradient Methods for Unconstrained Optimization; Neculai Andrei; Book 2020; The Editor(s) (if applicable) and The Author(s)

Views: 8762 | Replies: 47
Posted on 2025-3-21 18:33:28 | Show all posts | Reading mode
Title: Nonlinear Conjugate Gradient Methods for Unconstrained Optimization
Editor: Neculai Andrei
Overview: An explicit and thorough treatment of the conjugate gradient algorithms for unconstrained optimization, their properties and convergence. A clear illustration of the numerical performances of the algorithms d
Series: Springer Optimization and Its Applications
Description: Two approaches are known for solving large-scale unconstrained optimization problems: the limited-memory quasi-Newton method (truncated Newton method) and the conjugate gradient method. This is the first book to detail conjugate gradient methods, showing their properties and convergence characteristics as well as their performance in solving large-scale unconstrained optimization problems and applications. Comparisons to the limited-memory and truncated Newton methods are also discussed. Topics studied in detail include: linear conjugate gradient methods, standard conjugate gradient methods (their common scheme is sketched after this record), acceleration of conjugate gradient methods, hybrid methods, modifications of the standard scheme, memoryless BFGS preconditioned methods, and three-term methods. Other conjugate gradient methods, based on clustering the eigenvalues or on minimizing the condition number of the iteration matrix, are also treated. For each method, the convergence analysis, the computational performance, and the comparisons versus other conjugate gradient methods are given. The theory behind the conjugate gradient algorithms, presented as a methodology, is developed with a clear, rigorous, and friendly exposition; the reader will g
Publication: Book, 2020
Keywords: conjugate gradient method; conjugate gradient algorithm; quasi-Newton method; steepest descent method; B
Edition: 1
DOI: https://doi.org/10.1007/978-3-030-42950-8
ISBN (softcover): 978-3-030-42952-2
ISBN (eBook): 978-3-030-42950-8
Series ISSN: 1931-6828 | Series E-ISSN: 1931-6836
Copyright: The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Switzerland AG
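
For orientation, the standard nonlinear conjugate gradient scheme shared by the methods listed in the description (the sketch referred to there; this is the well-known general form, not a quotation from the book) is

\[
x_{k+1} = x_k + \alpha_k d_k, \qquad d_{k+1} = -g_{k+1} + \beta_k d_k, \qquad d_0 = -g_0,
\]

where $g_k = \nabla f(x_k)$, the stepsize $\alpha_k$ is determined by a line search, and the choice of the scalar $\beta_k$ (Hestenes-Stiefel, Fletcher-Reeves, Polak-Ribiere-Polyak, Dai-Yuan, and so on) distinguishes one conjugate gradient method from another.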
Posted on 2025-3-22 15:15:35 | Show all posts
Conjugate Gradient Methods as Modifications of the Standard Schemes
...unconstrained optimization problems. These methods have good convergence properties and their iterations do not involve any matrices, making them extremely attractive for solving large-scale problems.
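
To make the matrix-free character of these iterations concrete, here is a minimal sketch of a nonlinear conjugate gradient loop in Python, using the Polak-Ribiere-Polyak "plus" update and a simple Armijo backtracking line search; the function nonlinear_cg, the tolerances, and the test problem are illustrative choices, not the specific modified schemes treated in this chapter.

import numpy as np

def nonlinear_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    # Only the vectors x, g, d are stored; no matrices appear in the iteration.
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        slope = g @ d
        if slope >= 0.0:                     # safeguard: restart if d is not a descent direction
            d = -g
            slope = g @ d
        # Armijo backtracking line search for the stepsize alpha.
        alpha, fx = 1.0, f(x)
        while f(x + alpha * d) > fx + 1e-4 * alpha * slope and alpha > 1e-12:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Polak-Ribiere-Polyak "plus" update (beta reset to zero when negative).
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Illustrative usage on a small separable test function.
if __name__ == "__main__":
    f = lambda x: (x[0] - 1.0) ** 2 + 10.0 * (x[1] + 2.0) ** 2
    grad = lambda x: np.array([2.0 * (x[0] - 1.0), 20.0 * (x[1] + 2.0)])
    print(nonlinear_cg(f, grad, np.zeros(2)))   # should approach [1, -2]

Replacing the formula for beta (and adding restarts, preconditioning, or extra terms) is exactly where the modified schemes of this chapter depart from this baseline.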
Posted on 2025-3-22 17:28:43 | Show all posts
Linear Conjugate Gradient Algorithm
The linear conjugate gradient algorithm is dedicated to minimizing convex quadratic functions (or solving linear algebraic systems of equations with positive definite matrices). This algorithm was introduced by Hestenes and Stiefel (1952).
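
For reference, here is a minimal Python sketch of the classical Hestenes-Stiefel linear conjugate gradient iteration for a symmetric positive definite system A x = b; the stopping tolerance and the random test matrix are illustrative choices.

import numpy as np

def linear_cg(A, b, x0=None, tol=1e-10, max_iter=None):
    # Conjugate gradient for A x = b with A symmetric positive definite,
    # equivalently minimizing the convex quadratic 0.5*x^T A x - b^T x.
    n = b.shape[0]
    x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float)
    r = b - A @ x                      # residual (negative gradient of the quadratic)
    d = r.copy()                       # first search direction
    rs_old = r @ r
    for _ in range(max_iter or 10 * n):
        if np.sqrt(rs_old) <= tol:
            break
        Ad = A @ d
        alpha = rs_old / (d @ Ad)      # exact one-dimensional minimizer along d
        x = x + alpha * d
        r = r - alpha * Ad
        rs_new = r @ r
        d = r + (rs_new / rs_old) * d  # new direction, A-conjugate to the previous ones
        rs_old = rs_new
    return x

# Illustrative usage on a small random symmetric positive definite system.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    M = rng.standard_normal((50, 50))
    A = M @ M.T + 50.0 * np.eye(50)
    b = rng.standard_normal(50)
    x = linear_cg(A, b)
    print(np.linalg.norm(A @ x - b))   # residual norm, close to zero

In exact arithmetic this iteration terminates in at most n steps, which is the property that motivates the nonlinear extensions discussed in the rest of the book.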
Posted on 2025-3-23 01:07:35 | Show all posts
General Convergence Results for Nonlinear Conjugate Gradient Methods
Posted on 2025-3-23 05:40:35 | Show all posts
Acceleration of Conjugate Gradient Algorithms
It is common knowledge that in conjugate gradient algorithms the search directions tend to be poorly scaled, and consequently the line search must perform more function evaluations in order to obtain a suitable stepsize.
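
One way to counteract the poor scaling, in the spirit of this chapter though not necessarily with the book's exact acceleration formulas, is to rescale an accepted trial stepsize by the minimizer of a one-dimensional quadratic model built from two gradient evaluations. Below is a minimal sketch; the function accelerated_step and the badly scaled test problem are hypothetical illustrations.

import numpy as np

def accelerated_step(grad, x, d, alpha):
    # Rescale the trial stepsize alpha along d by the minimizer of a quadratic
    # model of phi(t) = f(x + t*alpha*d), using the gradients at t = 0 and t = 1.
    g0 = grad(x)
    gz = grad(x + alpha * d)
    a = alpha * (g0 @ d)               # phi'(0); negative when d is a descent direction
    b = alpha * ((gz - g0) @ d)        # phi'(1) - phi'(0), an estimate of the curvature
    t = -a / b if b > 1e-16 else 1.0   # quadratic-model minimizer, with a fallback
    return x + (t * alpha) * d         # only one extra gradient, no extra function values

# Illustrative usage: one rescaled steepest-descent step on a badly scaled quadratic.
if __name__ == "__main__":
    grad = lambda x: np.array([x[0], 100.0 * x[1]])   # gradient of 0.5*(x1^2 + 100*x2^2)
    x = np.array([1.0, 1.0])
    d = -grad(x)                       # poorly scaled steepest-descent direction
    print(accelerated_step(grad, x, d, alpha=0.005))

Because the extra information is a single gradient at the trial point, the rescaling improves the stepsize without the additional function evaluations that a longer line search would require.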