We introduce a surrogate-based black-box optimization method, termed Polynomial-Model-Based Optimization (PMBO). The algorithm alternates polynomial approximation with Bayesian optimization steps, using Gaussian processes to model the error between the objective and its polynomial fit. We describe the algorithmic design of PMBO and compare its performance with that of several other optimization methods on a set of analytic test functions. The results show that PMBO outperforms classic Bayesian optimization and is robust with respect to the choice of its correlation function family and its hyper-parameter settings, which, in contrast, need to be carefully tuned in classic Bayesian optimization. Remarkably, PMBO performs comparably with state-of-the-art evolutionary algorithms such as the Covariance Matrix Adaptation Evolution Strategy (CMA-ES). This finding suggests that PMBO is a compelling choice among surrogate-based optimization methods for low-dimensional optimization problems. Moreover, the simple structure of polynomials makes the inferred surrogate model amenable to interpretation and analysis, providing a macroscopic perspective on the landscape of the objective function.
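To make the alternating scheme concrete, the following Python sketch shows one way such a loop could look, assuming a least-squares polynomial fit (scikit-learn PolynomialFeatures plus LinearRegression), a GaussianProcessRegressor on the residuals, and a lower-confidence-bound rule evaluated over random candidates. The function pmbo_sketch, its parameters, and the acquisition rule are illustrative assumptions, not the authors' implementation.

# Illustrative sketch of a polynomial-plus-GP-residual optimization loop
# (assumed components; not the published PMBO code).
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern


def pmbo_sketch(objective, bounds, n_init=10, n_iter=30, degree=3, kappa=2.0, seed=0):
    """Hypothetical PMBO-style loop: polynomial surrogate + GP on the residual."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    X = rng.uniform(lo, hi, size=(n_init, len(lo)))
    y = np.array([objective(x) for x in X])

    for _ in range(n_iter):
        # 1) Polynomial surrogate of the objective (least-squares fit).
        poly = make_pipeline(PolynomialFeatures(degree), LinearRegression())
        poly.fit(X, y)

        # 2) Gaussian process on the error between objective and polynomial fit.
        residual = y - poly.predict(X)
        gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
        gp.fit(X, residual)

        # 3) Pick the next sample by minimizing a lower confidence bound of the
        #    combined surrogate (polynomial prediction + GP residual model).
        cand = rng.uniform(lo, hi, size=(2048, len(lo)))
        mu_r, std_r = gp.predict(cand, return_std=True)
        lcb = poly.predict(cand) + mu_r - kappa * std_r
        x_next = cand[np.argmin(lcb)]

        # 4) Evaluate the objective and augment the data set.
        X = np.vstack([X, x_next])
        y = np.append(y, objective(x_next))

    best = np.argmin(y)
    return X[best], y[best]


# Example usage: minimize a shifted 2-D quadratic bowl on [-5, 5]^2.
if __name__ == "__main__":
    f = lambda x: float(np.sum((x - 1.0) ** 2))
    x_best, f_best = pmbo_sketch(f, np.array([[-5.0, 5.0], [-5.0, 5.0]]))
    print(x_best, f_best)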