AF: Small: Collaborative Research: Distributed Quasi-Newton Methods for Nonsmooth Optimization
Basic Information
- Award number: 1717391
- Principal investigator:
- Amount: $199,800
- Host institution:
- Host institution country: United States
- Project type: Standard Grant
- Fiscal year: 2017
- Funding country: United States
- Project period: 2017-09-01 to 2020-08-31
- Project status: Concluded
- Source:
- Keywords:
Project Abstract
Optimization, which finds the inputs to a mathematical function that produce the minimum output, is a workhorse algorithm behind many of the advances in smart devices and applications in the cloud. As data grows larger and more distributed, new ideas are needed to maintain the speed and accuracy of optimization. Operator splitting, which expresses the function to be minimized as the sum of two convex functions, one smooth and the other non-differentiable, is an idea that has produced new first-order optimization methods. This project explores operator splitting with second-order optimization methods, which converge to the minimum faster. The focus is on large, distributed, and streaming data sets, so that the resulting general-purpose numerical solvers and embedded-systems implementations can support optimization in cyberphysical systems and the Internet of Things. The project prioritizes the active engagement and training of students and researchers, with specific emphasis on the inclusion of women and under-represented minority groups. The project involves collaboration not only across three top-tier American universities but also with the European research institution KU Leuven. Specifically, this research project seeks to interpret existing methods for structured convex optimization (such as the celebrated ADMM algorithm) as gradient methods applied to specific functions arising from the original problem formulation, and to interpret operator-splitting techniques as fixed-point iterations for appropriately selected operators. A key theoretical foundation is the introduction of new envelope functions (smooth upper approximations possessing the same sets of solutions) that can be used as merit functions for variable-metric backtracking line search.
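The operator-splitting idea described above can be illustrated with a minimal forward-backward (proximal gradient) iteration: a gradient step on the smooth term followed by a proximal step on the non-differentiable term. This is a first-order sketch for concreteness, not the project's quasi-Newton methods; the LASSO objective, step size, and function names below are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (the non-differentiable term).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(A, b, lam, step, iters=3000):
    # Minimize f(x) = 0.5 * ||Ax - b||^2 + lam * ||x||_1 by
    # forward-backward splitting: gradient step on the smooth part,
    # then proximal step on the nonsmooth part.
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)                    # gradient of smooth term
        x = soft_threshold(x - step * grad, step * lam)
    return x
```

A standard choice guaranteeing convergence is a step size no larger than the reciprocal of the Lipschitz constant of the smooth term's gradient, i.e. `step = 1 / ||A||_2^2`.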
To conclude, a principal focus of the project is to design distributed asynchronous methods applicable to large-scale multi-agent cyberphysical systems that involve big data and impose stringent real-time constraints on decision-making. In this purview, the goal is to deliver methods that will outperform the current state of the art in terms of (a) speed of computation, (b) scalability to big data sizes, (c) robustness to various types of uncertainty, and, most notably, (d) distributed asynchronous implementation over networks in real time. The merits will be illustrated in the context of applications in signal processing, control, machine learning, and robotics. This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
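The distributed multi-agent setting can be sketched with a generic consensus-based decentralized gradient method, in which each agent mixes its iterate with its neighbors' and then takes a local gradient step. This is a synchronous first-order baseline for illustration only, not the project's asynchronous quasi-Newton methods; the mixing matrix `W`, step size, and function names are assumptions.

```python
import numpy as np

def decentralized_gradient(local_grads, W, x0, step=0.1, iters=300):
    # Each agent i keeps its own copy X[i] of the decision variable.
    # W is a doubly stochastic mixing matrix encoding the network:
    # agents first average with their neighbors (consensus step),
    # then apply a gradient step on their own local objective.
    n = len(local_grads)
    X = np.tile(np.asarray(x0, dtype=float), (n, 1))
    for _ in range(iters):
        X = W @ X                               # consensus (mixing) step
        for i, grad in enumerate(local_grads):
            X[i] -= step * grad(X[i])           # local gradient step
    return X
```

With a constant step size the agents reach only a neighborhood of the minimizer of the sum of local objectives; a diminishing step size yields exact convergence.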
Project Outcomes
- Journal articles: 5
- Monographs: 0
- Research awards: 0
- Conference papers: 0
- Patents: 0
SUCAG: Stochastic Unbiased Curvature-aided Gradient Method for Distributed Optimization
- DOI: 10.1109/cdc.2018.8619336
- Published: 2018-03
- Journal:
- Impact factor: 0
- Authors: Hoi-To Wai; N. Freris; A. Nedić; A. Scaglione
- Corresponding authors: Hoi-To Wai; N. Freris; A. Nedić; A. Scaglione
Distributed Gradient Methods for Convex Machine Learning Problems in Networks: Distributed Optimization
- DOI: 10.1109/msp.2020.2975210
- Published: 2020-05
- Journal:
- Impact factor: 14.9
- Author: A. Nedić
- Corresponding author: A. Nedić
A Push-Pull Gradient Method for Distributed Optimization in Networks
- DOI: 10.1109/cdc.2018.8619047
- Published: 2018-03
- Journal:
- Impact factor: 0
- Authors: Shi Pu; Wei Shi; Jinming Xu; A. Nedić
- Corresponding authors: Shi Pu; Wei Shi; Jinming Xu; A. Nedić
A General Framework for Decentralized Optimization With First-Order Methods
- DOI: 10.1109/jproc.2020.3024266
- Published: 2020-09
- Journal:
- Impact factor: 20.6
- Authors: Ran Xin; Shi Pu; Angelia Nedić; U. Khan
- Corresponding authors: Ran Xin; Shi Pu; Angelia Nedić; U. Khan
Accelerating incremental gradient optimization with curvature information
- DOI: 10.1007/s10589-020-00183-1
- Published: 2018-05
- Journal:
- Impact factor: 2.2
- Authors: Hoi-To Wai; Wei Shi; César A. Uribe; A. Nedić; A. Scaglione
- Corresponding authors: Hoi-To Wai; Wei Shi; César A. Uribe; A. Nedić; A. Scaglione
Other Publications by Angelia Nedich
Other Grants by Angelia Nedich
Collaborative Research: SaTC: CORE: Medium: Foundations of Trust-Centered Multi-Agent Distributed Coordination
- Award number: 2147641
- Fiscal year: 2022
- Amount: $199,800
- Project type: Standard Grant
Collaborative Research: CIF:Medium: Harnessing Intrinsic Dynamics for Inherently Privacy-preserving Decentralized Optimization
- Award number: 2106336
- Fiscal year: 2021
- Amount: $199,800
- Project type: Continuing Grant
Optimization with Uncertainties over Time: Theory and Algorithms
- Award number: 1312907
- Fiscal year: 2013
- Amount: $199,800
- Project type: Standard Grant
Four Mathematical Programming Paradigms with Operations Research Applications
- Award number: 0969600
- Fiscal year: 2010
- Amount: $199,800
- Project type: Standard Grant
Early Concept Grant for Exploratory Research ( EAGER ) Dynamic Traffic Equilibrium Problems: Distributed Algorithms and Error Analysis
- Award number: 0948905
- Fiscal year: 2009
- Amount: $199,800
- Project type: Standard Grant
CAREER: Cooperative Multi-Agent Optimization
- Award number: 0742538
- Fiscal year: 2008
- Amount: $199,800
- Project type: Standard Grant
Similar NSFC Grants
Research and Application of Key Technologies for Cooperative Swarms of Small and Micro Unmanned Systems Based on Ultra-Wideband Technology
- Award number:
- Approval year: 2020
- Amount: ¥570,000
- Project type: General Program
Research on Interference Coordination Based on Cooperative Precoding in Heterogeneous Cloud Small-Cell Networks
- Award number: 61661005
- Approval year: 2016
- Amount: ¥300,000
- Project type: Regional Science Fund Program
Research on Novel Access Theory and Techniques for Dense Small-Cell Base Station Systems
- Award number: 61301143
- Approval year: 2013
- Amount: ¥240,000
- Project type: Young Scientists Fund
Experimental Study of ScFVCD3-9R-Delivered Bcl-6-Targeted Small Interfering RNA for the Treatment of EAMG
- Award number: 81072465
- Approval year: 2010
- Amount: ¥310,000
- Project type: General Program
Research on Sensor Networks Based on Small-World Networks
- Award number: 60472059
- Approval year: 2004
- Amount: ¥210,000
- Project type: General Program
Similar Overseas Grants
Collaborative Research: AF: Small: New Directions in Algorithmic Replicability
- Award number: 2342244
- Fiscal year: 2024
- Amount: $199,800
- Project type: Standard Grant
Collaborative Research: AF: Small: Exploring the Frontiers of Adversarial Robustness
- Award number: 2335411
- Fiscal year: 2024
- Amount: $199,800
- Project type: Standard Grant
NSF-BSF: Collaborative Research: AF: Small: Algorithmic Performance through History Independence
- Award number: 2420942
- Fiscal year: 2024
- Amount: $199,800
- Project type: Standard Grant
Collaborative Research: AF: Small: Structural Graph Algorithms via General Frameworks
- Award number: 2347322
- Fiscal year: 2024
- Amount: $199,800
- Project type: Standard Grant
Collaborative Research: AF: Small: Real Solutions of Polynomial Systems
- Award number: 2331401
- Fiscal year: 2024
- Amount: $199,800
- Project type: Standard Grant