Cost-Sensitive Freeze-Thaw Bayesian Optimization for Efficient Hyperparameter Tuning


This repository contains the official codebase for our NeurIPS 2025 paper,
“Cost-Sensitive Freeze-Thaw Bayesian Optimization for Efficient Hyperparameter Tuning.”


Quick Start

```shell
conda create -n cfbo python=3.11
conda activate cfbo
pip install -r requirements.txt
```

Data

Download the dataset from this Google Drive link and unzip it into this repository.


Learning-Curve (LC) Extrapolator

(Optional) Pretrain the LC extrapolator for transfer learning:

```shell
# BENCHMARK_NAME ∈ ["lcbench", "taskset", "pd1", "odbench"]
python train.py --benchmark_name BENCHMARK_NAME
```

Alternatively, download pretrained checkpoints from this Google Drive link and unzip them into this repository.


Cost-Sensitive Bayesian Optimization

We consider the following utility function: $U(b, \tilde{y}_b) = \tilde{y}_b - \alpha\left(\frac{b}{B}\right)^c$, where:

  • $b$ denotes the currently consumed budget, and $\tilde{y}_b$ denotes the best performance observed up to budget $b$,
  • budget_limit ($B \in \mathbb{N}$): the maximum allowable optimization budget,
  • alpha ($\alpha \in [0,1]$): the penalty coefficient for budget consumption ($\alpha = 0$ recovers conventional BO),
  • c ($c > 0$): controls the curvature of the utility function (e.g., $c=1$ for linear, $c=2$ for quadratic, $c=0.5$ for square-root).
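The utility above can be computed directly. The sketch below is a minimal Python illustration of the formula (the function name and the example values are ours, not from the repository's code):

```python
def utility(y_best, b, budget_limit, alpha=0.5, c=1.0):
    """Cost-sensitive utility U(b, y_b) = y_b - alpha * (b / B)^c.

    y_best:       best performance observed up to budget b
    b:            currently consumed budget
    budget_limit: maximum allowable optimization budget B
    alpha:        penalty coefficient in [0, 1] (alpha = 0 recovers conventional BO)
    c:            curvature (> 0): 1 linear, 2 quadratic, 0.5 square-root
    """
    return y_best - alpha * (b / budget_limit) ** c

# At half the budget (b/B = 0.5), the penalty is alpha * 0.5^c:
u_none = utility(0.9, 50, 100, alpha=0.0)         # no penalty: just y_best
u_lin  = utility(0.9, 50, 100, alpha=0.5, c=1.0)  # 0.9 - 0.5 * 0.5   = 0.65
u_quad = utility(0.9, 50, 100, alpha=0.5, c=2.0)  # 0.9 - 0.5 * 0.25  = 0.775
```

Note that for a fixed `alpha`, a larger `c` defers the penalty toward the end of the budget, so early-budget exploration is punished less.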

Run BO:

```shell
# BENCHMARK_NAME ∈ ["lcbench", "taskset", "pd1", "odbench"]

# DyHPO
python run_bo.py --algorithm dyhpo --benchmark_name BENCHMARK_NAME --alpha ALPHA --c C

# ifBO
python run_bo.py --algorithm ifbo --benchmark_name BENCHMARK_NAME --alpha ALPHA --c C

# CFBO without transfer learning
python run_bo.py --algorithm CFBO --benchmark_name BENCHMARK_NAME --alpha ALPHA --c C

# CFBO with transfer learning
python run_bo.py --algorithm CFBO --benchmark_name BENCHMARK_NAME --alpha ALPHA --c C \
    --model_ckpt ./checkpoints/BENCHMARK_NAME/model.pt
```
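To compare several penalty settings, the commands above can be assembled programmatically. A small sketch (the helper `make_cmd` and the grid values are illustrative, not part of the repository; only the flags documented above are used):

```python
from itertools import product

def make_cmd(algorithm, benchmark, alpha, c, ckpt=None):
    """Assemble a run_bo.py invocation using the flags documented above."""
    cmd = ["python", "run_bo.py",
           "--algorithm", algorithm,
           "--benchmark_name", benchmark,
           "--alpha", str(alpha),
           "--c", str(c)]
    if ckpt is not None:  # pretrained LC-extrapolator checkpoint (transfer learning)
        cmd += ["--model_ckpt", ckpt]
    return cmd

# Sweep a small (illustrative) grid of penalty settings on lcbench.
cmds = [make_cmd("CFBO", "lcbench", a, c)
        for a, c in product([0.25, 0.5], [1, 2])]
```

Each entry in `cmds` can then be launched with `subprocess.run(cmd)`.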

Citation

```bibtex
@inproceedings{CFBO,
    title={Cost-Sensitive Freeze-thaw Bayesian Optimization for Efficient Hyperparameter Tuning},
    author={Lee, Dong Bok and Zhang, Aoxuan Silvia and Kim, Byungjoo and Park, Junhyeon and Adriaensen, Steven and Lee, Juho and Hwang, Sung Ju and Lee, Hae Beom},
    booktitle={The Thirty-Ninth Annual Conference on Neural Information Processing Systems},
    year={2025},
    url={https://openreview.net/pdf?id=ZUb4JpNoJe}
}
```
