Titlebook: Convex Optimization with Computational Errors; Alexander J. Zaslavski; Book 2020; Springer Nature Switzerland AG 2020

Views: 26009 | Replies: 49
Posted 2025-3-21 16:29:14
Title: Convex Optimization with Computational Errors
Author: Alexander J. Zaslavski
Video: http://file.papertrans.cn/238/237847/237847.mp4
Overview: Studies the influence of computational errors in numerical optimization, for minimization problems on unbounded sets, and time zero-sum games with two players. Explains that for every algorithm its iteration consists of several steps.
Series: Springer Optimization and Its Applications
Description: The book is devoted to the study of approximate solutions of optimization problems in the presence of computational errors. It contains a number of results on the convergence behavior of algorithms in a Hilbert space, which are known as important tools for solving optimization problems. The research presented in the book is the continuation and further development of the author's 2016 book, Numerical Optimization with Computational Errors, Springer, 2016. Both books study algorithms taking into account computational errors, which are always present in practice. The main goal is, for a known computational error, to find out what approximate solution can be obtained and how many iterates one needs for this. The main difference between this new book and the 2016 book is that the present book takes into consideration the fact that, for every algorithm, an iteration consists of several steps and that the computational errors for different steps are, in general, different. This fact, which was not taken into account in the previous book, is indeed important in practice. For example, the subgradient projection algorithm consists of two steps: the first is a calculation of a subgradient of the objective function, while in the second a projection on the feasible set is calculated.
Publication date: Book, 2020
Keywords: convex optimization; mathematical programming; computational error; nonlinear analysis; solving real-wor
Edition: 1
DOI: https://doi.org/10.1007/978-3-030-37822-6
ISBN (softcover): 978-3-030-37824-0
ISBN (ebook): 978-3-030-37822-6
Series ISSN: 1931-6828
Series E-ISSN: 1931-6836
Copyright: Springer Nature Switzerland AG 2020
Publication information is being updated.

Posted 2025-3-21 23:24:06
Subgradient Projection Algorithm. This chapter studies the subgradient projection algorithm for minimization of convex functions and for computing saddle points of convex–concave functions, under the presence of computational errors. The problem is described by an objective function and a set of feasible points. For this algorithm each iteration consists of two steps. The first step is a calculation of a subgradient of the objective function, while in the second one we calculate a projection on the feasible set.
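As a concrete illustration (not the author's code), the two-step iteration with a separate bounded error in each step can be sketched as follows; the function names, step-size rule, and uniform error model are illustrative assumptions:

```python
import numpy as np

def subgradient_projection(x0, subgrad, project, steps, lr,
                           delta1=0.0, delta2=0.0, seed=0):
    """Sketch of the subgradient projection iteration with per-step errors.

    delta1 bounds the error of step 1 (subgradient evaluation);
    delta2 bounds the error of step 2 (projection). The bounds are
    modeled here as uniform perturbations -- an illustrative assumption.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for t in range(steps):
        # step 1: subgradient of the objective, computed inexactly
        g = subgrad(x) + delta1 * rng.uniform(-1, 1, size=x.shape)
        # step 2: projection on the feasible set, computed inexactly
        x = project(x - lr(t) * g) + delta2 * rng.uniform(-1, 1, size=x.shape)
    return x

# usage: minimize f(x) = |x| over the box [-1, 1]; the minimizer is 0
x_hat = subgradient_projection(
    x0=np.array([0.9]),
    subgrad=np.sign,                          # a subgradient of |x|
    project=lambda v: np.clip(v, -1.0, 1.0),
    steps=200,
    lr=lambda t: 1.0 / (t + 1),
    delta1=1e-4, delta2=1e-4,
)
```

With the diminishing step size, the iterates oscillate around 0 with amplitude on the order of the current step size plus the accumulated per-step error.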
Posted 2025-3-22 01:39:12
Gradient Algorithm with a Smooth Objective Function. This chapter studies the gradient algorithm for minimization of convex functions with a smooth objective function, under the presence of computational errors. The problem is described by an objective function and a set of feasible points. For this algorithm each iteration consists of two steps. The first step is a calculation of a gradient of the objective function, while in the second one we calculate a projection on the feasible set. In each of these two steps there is a computational error.
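A minimal sketch of this two-step iteration, with an explicit error term attached to each step; the quadratic objective and unit-ball feasible set below are illustrative assumptions:

```python
import numpy as np

def projected_gradient(x0, grad, project, steps, lr, delta1=0.0, delta2=0.0):
    """Projected gradient sketch: inexact gradient (delta1), inexact projection (delta2)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        g = grad(x) + delta1               # step 1: gradient, off by at most delta1
        x = project(x - lr * g) + delta2   # step 2: projection, off by at most delta2
    return x

# usage: minimize f(x) = ||x - a||^2 over the unit ball; with a = (2, 0)
# the exact minimizer is (1, 0)
a = np.array([2.0, 0.0])
proj_ball = lambda v: v / max(1.0, np.linalg.norm(v))
x_hat = projected_gradient(
    x0=np.zeros(2),
    grad=lambda x: 2.0 * (x - a),
    project=proj_ball,
    steps=50,
    lr=0.25,
    delta1=1e-9, delta2=1e-9,
)
```

Because the gradient is smooth, a constant step size suffices here, and the final accuracy is limited only by the injected per-step errors.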
Posted 2025-3-22 04:43:17
Continuous Subgradient Method. This chapter studies the continuous subgradient method for minimization of convex functions and for computing saddle points of convex–concave functions, under the presence of computational errors. The problem is described by an objective function and a set of feasible points. For this algorithm we need a calculation of a subgradient of the objective function and a calculation of a projection on the feasible set. In each of these steps there is a computational error.
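The continuous-time dynamics can be simulated with a simple forward-Euler discretization; this particular numerical scheme and the toy problem are illustrative assumptions, not the method analyzed in the chapter:

```python
import numpy as np

def continuous_subgradient_euler(x0, subgrad, project, t_end, dt):
    """Forward-Euler sketch of the trajectory x'(t) in -∂f(x(t)),
    kept feasible by projecting after every Euler step."""
    x = np.asarray(x0, dtype=float)
    n_steps = int(round(t_end / dt))
    for _ in range(n_steps):
        x = project(x - dt * subgrad(x))
    return x

# usage: f(x) = |x| on the interval [-2, 2]; the trajectory slides
# toward the minimizer 0 at unit speed
x_hat = continuous_subgradient_euler(
    x0=np.array([1.5]),
    subgrad=np.sign,
    project=lambda v: np.clip(v, -2.0, 2.0),
    t_end=2.0,
    dt=0.01,
)
```

After the trajectory reaches the minimizer, the discretized iterate can oscillate within one step size dt of it, which is the discrete analogue of a bounded computational error.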
Posted 2025-3-22 20:34:39
PDA-Based Method for Convex Optimization. Each iteration of this algorithm consists of two steps. In each of these two steps there is a computational error. In general, these two computational errors are different. We show that our algorithm generates a good approximate solution if all the computational errors are bounded from above by a small positive constant. Moreover, if we know the computational errors for each step of the algorithm, we can find out what approximate solution can be obtained and how many iterates one needs for this.
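The qualitative claim, that accuracy degrades gracefully with the per-step error bound, can be checked numerically on a toy problem; the quadratic objective and worst-case constant error below are illustrative assumptions:

```python
def gradient_descent_with_error(x0, steps, lr, delta):
    """Gradient descent on f(x) = x**2 where every gradient evaluation
    is corrupted by a constant error of magnitude delta (worst case)."""
    x = x0
    for _ in range(steps):
        g = 2.0 * x + delta   # exact gradient is 2x; delta is the per-step error
        x -= lr * g
    return x

# the iterates settle near -delta/2, so the final error scales with delta
x_small = gradient_descent_with_error(1.0, steps=200, lr=0.1, delta=1e-3)
x_large = gradient_descent_with_error(1.0, steps=200, lr=0.1, delta=1e-1)
```

Here the fixed point of the perturbed iteration is -delta/2, so knowing the error bound for each step tells us exactly what accuracy is attainable, which mirrors the chapter's point.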
Posted 2025-3-23 03:08:49
A Projected Subgradient Method for Nonsmooth Problems. For this class of problems, an objective function is assumed to be convex but a set of admissible points is not necessarily convex. Our goal is to obtain an ε-approximate solution in the presence of computational errors, where ε is a given positive number.
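To illustrate the setting of a convex objective over a nonconvex admissible set, one can project onto the unit sphere, which is not convex; this toy instance and all names in it are assumptions for illustration:

```python
import numpy as np

def projected_subgradient(x0, subgrad, project, steps, lr):
    """Projected subgradient sketch; `project` may map onto a nonconvex set."""
    x = project(np.asarray(x0, dtype=float))
    for _ in range(steps):
        x = project(x - lr * subgrad(x))
    return x

# usage: minimize the convex f(x) = ||x - a||^2 over the unit sphere
# (a nonconvex set); with a = (2, 0) the minimizer is (1, 0)
a = np.array([2.0, 0.0])
sphere = lambda v: v / np.linalg.norm(v)   # nearest point on the sphere (v != 0)
x_hat = projected_subgradient(
    x0=np.array([0.0, 1.0]),
    subgrad=lambda x: 2.0 * (x - a),
    project=sphere,
    steps=100,
    lr=0.1,
)
```

On this instance the iterates converge to the constrained minimizer even though the feasible set is nonconvex; in general only an ε-approximate solution can be guaranteed.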