Fixate posted on 2025-3-21 16:29:14

Book title: Convex Optimization with Computational Errors

Impact Factor: http://impactfactor.cn/2024/if/?ISSN=BK0237847
Impact Factor subject ranking: http://impactfactor.cn/2024/ifr/?ISSN=BK0237847
Online attention: http://impactfactor.cn/2024/at/?ISSN=BK0237847
Online attention subject ranking: http://impactfactor.cn/2024/atr/?ISSN=BK0237847
Total citations: http://impactfactor.cn/2024/tc/?ISSN=BK0237847
Total citations subject ranking: http://impactfactor.cn/2024/tcr/?ISSN=BK0237847
Annual citations: http://impactfactor.cn/2024/ii/?ISSN=BK0237847
Annual citations subject ranking: http://impactfactor.cn/2024/iir/?ISSN=BK0237847
Reader feedback: http://impactfactor.cn/2024/5y/?ISSN=BK0237847
Reader feedback subject ranking: http://impactfactor.cn/2024/5yr/?ISSN=BK0237847

黄油没有 posted on 2025-3-21 23:24:06

Subgradient Projection Algorithm — […] of convex–concave functions, under the presence of computational errors. The problem is described by an objective function and a set of feasible points. For this algorithm each iteration consists of two steps. The first step is a calculation of a subgradient of the objective function, while in the second one we calculate a projection on the feasible set.
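The two-step iteration described above is easy to sketch. The toy objective, the error model (a perturbation of norm at most delta in each step), and the step sizes below are my own illustrative choices, not the book's:

```python
import numpy as np

def subgradient_projection(subgrad, project, x0, step_sizes, delta=1e-3, seed=0):
    """Two-step subgradient projection iteration with computational errors:
    a bounded perturbation (each component <= delta) is injected into both
    the subgradient evaluation and the projection."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for t in step_sizes:
        # step 1: inexact subgradient of the objective
        g = subgrad(x) + delta * rng.uniform(-1, 1, size=x.shape)
        # step 2: inexact projection onto the feasible set
        x = project(x - t * g) + delta * rng.uniform(-1, 1, size=x.shape)
    return x

# Toy instance: minimize f(x) = |x1| + |x2| over the unit ball.
f = lambda x: np.abs(x).sum()
subgrad = lambda x: np.sign(x)                       # a subgradient of f
project = lambda x: x / max(1.0, np.linalg.norm(x))  # projection onto the ball
x = subgradient_projection(subgrad, project, [0.9, -0.8], [0.1] * 200)
```

With a constant step and small errors the iterates settle in a neighborhood of the minimizer whose radius is governed by the step size and the error bound, which matches the "good approximate solution" statements in the book.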

认识 posted on 2025-3-22 01:39:12

Gradient Algorithm with a Smooth Objective Function — […] computational errors. The problem is described by an objective function and a set of feasible points. For this algorithm each iteration consists of two steps. The first step is a calculation of a gradient of the objective function, while in the second one we calculate a projection on the feasible set. In each of these two steps there is a computational error.
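The smooth case looks the same with a gradient in place of a subgradient. A minimal sketch, where the quadratic objective, the box constraint, and the error model (perturbations of norm exactly delta) are my own assumptions:

```python
import numpy as np

def projected_gradient(grad, project, x0, step, iters, delta=1e-4, seed=0):
    """Projected gradient method with computational errors: each gradient
    evaluation and each projection is perturbed by a vector of norm delta."""
    rng = np.random.default_rng(seed)
    def noise(shape):
        v = rng.standard_normal(shape)
        return delta * v / np.linalg.norm(v)
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        g = grad(x) + noise(x.shape)                # step 1: inexact gradient
        x = project(x - step * g) + noise(x.shape)  # step 2: inexact projection
    return x

# Toy instance: minimize f(x) = ||x - c||^2 over the box [0, 1]^2,
# with c outside the box; the constrained minimizer is (1, 0).
c = np.array([1.5, -0.5])
grad = lambda x: 2 * (x - c)
project = lambda x: np.clip(x, 0.0, 1.0)
x = projected_gradient(grad, project, [0.2, 0.9], step=0.2, iters=100)
```

Because the objective is smooth and strongly convex, the iterates contract toward the constrained minimizer until the contraction is balanced by the injected errors, so the final accuracy is on the order of delta.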

设想 posted on 2025-3-22 04:43:17

Continuous Subgradient Method — […] convex–concave functions, under the presence of computational errors. The problem is described by an objective function and a set of feasible points. For this algorithm we need a calculation of a subgradient of the objective function and a calculation of a projection on the feasible set. In each of these two steps there is a computational error.



Classify posted on 2025-3-22 20:34:39

PDA-Based Method for Convex Optimization — […] steps. In each of these two steps there is a computational error. In general, these two computational errors are different. We show that our algorithm generates a good approximate solution if all the computational errors are bounded from above by a small positive constant. Moreover, if we know the […]


filial posted on 2025-3-23 03:08:49

A Projected Subgradient Method for Nonsmooth Problems — […] this class of problems, an objective function is assumed to be convex, but a set of admissible points is not necessarily convex. Our goal is to obtain an ε-approximate solution in the presence of computational errors, where ε is a given positive number.
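What makes this chapter's setting unusual is the nonconvex admissible set. A minimal sketch on such a set (the unit circle): the toy objective, the tolerance, and the ε-approximate check at the end are my own illustrative choices:

```python
import numpy as np

def projected_subgradient(subgrad, project, x0, step, iters):
    """Projected subgradient iteration; the admissible set here is the unit
    circle, which is not convex, matching the setting of the chapter."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = project(x - step * subgrad(x))
    return x

# Toy instance: minimize f(x) = x1 over the unit circle; the minimizer
# is (-1, 0) with optimal value -1.
eps = 0.05  # the given positive tolerance
f = lambda x: x[0]
subgrad = lambda x: np.array([1.0, 0.0])
project = lambda x: x / np.linalg.norm(x)  # nearest point on the circle
x = projected_subgradient(subgrad, project, [0.0, 1.0], step=0.1, iters=200)

# ε-approximate solution: near-feasible and near-optimal in value.
is_eps_approx = f(x) <= -1.0 + eps and abs(np.linalg.norm(x) - 1.0) <= eps
```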

Arb853 posted on 2025-3-23 08:50:44

Convex Optimization with Computational Errors, ISBN 978-3-030-37822-6, Series ISSN 1931-6828, Series E-ISSN 1931-6836
View full version: Titlebook: Convex Optimization with Computational Errors; Alexander J. Zaslavski Book 2020 Springer Nature Switzerland AG 2020 convex optimization.ma