Optuna - Hyperparameter Optimization
Optuna is a Python library for automated hyperparameter optimization; it replaces manual tuning with adaptive search over the parameter space.
It shines whenever:
- You have multiple knobs to tune.
- The system returns one final numeric score.
This pattern shows up everywhere: machine learning metrics, simulation outcomes, and trading strategy ROI.
What Is a Hyperparameter?
A hyperparameter is a setting you choose before running an evaluation.
Examples:
- A model’s learning rate or tree depth.
- A trading rule’s lookback window or threshold.
- A simulator’s step size or penalty weight.
Unlike learned parameters (such as model weights), hyperparameters are not fit by the training procedure itself. You set them, run the system, and observe a final score.
Optuna automates this loop, as the sketch after this list shows:
- Propose a set of hyperparameters.
- Run the evaluation.
- Record the score.
- Use the results to propose better values next time.
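Under the hood this loop maps onto Optuna's ask-and-tell interface. Here is a minimal sketch of the loop written out by hand, assuming a hypothetical run_system function that returns the score:
import optuna

study = optuna.create_study(direction="maximize")

for _ in range(20):
    trial = study.ask()                                     # propose hyperparameters
    threshold = trial.suggest_float("threshold", 0.0, 1.0)
    score = run_system(threshold)                           # run the evaluation (placeholder)
    study.tell(trial, score)                                # record the score; it guides later proposals
In practice you rarely write this loop yourself: study.optimize wraps it for you, as the examples below show.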
Multi-Parameter Systems With One Score
Optuna works especially well for systems that require multiple parameters but produce a single final score.
All you need is an objective function that:
- Accepts a trial.
- Suggests several parameters.
- Returns one numeric score to maximize or minimize.
Minimal Example
import optuna

def objective(trial):
    threshold = trial.suggest_float("threshold", 0.0, 1.0)
    window = trial.suggest_int("window", 5, 50)
    # evaluate_model is a placeholder for your own evaluation function.
    score = evaluate_model(threshold=threshold, window=window)
    return score

study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=100)
print(study.best_params)
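The suggest_* family is not limited to plain floats and ints. Two other commonly used forms are categorical choices and log-scaled ranges; the sketch below uses hypothetical parameter names and a placeholder train_and_score function:
def objective(trial):
    # Pick one option from a fixed set of choices.
    kernel = trial.suggest_categorical("kernel", ["rbf", "linear", "poly"])
    # Search a wide range on a log scale (useful for learning rates).
    lr = trial.suggest_float("lr", 1e-5, 1e-1, log=True)
    depth = trial.suggest_int("depth", 2, 12)
    # train_and_score is a placeholder for your own evaluation function.
    return train_and_score(kernel=kernel, lr=lr, depth=depth)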
Example: Trading Strategy ROI Over a Time Series
Trading strategies often depend on several parameters but ultimately produce one score, such as ROI, Sharpe ratio, or drawdown-adjusted return.
Optuna is a great fit for this pattern:
import optuna

def backtest_strategy(prices, window, entry_z, exit_z, fee_bps):
    # Placeholder: implement your backtest here.
    # Return a single score such as ROI.
    return roi_over_time_series(
        prices=prices,
        window=window,
        entry_z=entry_z,
        exit_z=exit_z,
        fee_bps=fee_bps,
    )

def objective(trial):
    window = trial.suggest_int("window", 10, 200)
    entry_z = trial.suggest_float("entry_z", 0.5, 3.0)
    exit_z = trial.suggest_float("exit_z", 0.1, 2.0)
    fee_bps = trial.suggest_float("fee_bps", 0.0, 25.0)
    roi = backtest_strategy(
        prices=load_prices(),
        window=window,
        entry_z=entry_z,
        exit_z=exit_z,
        fee_bps=fee_bps,
    )
    return roi

study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=200)
print("Best ROI:", study.best_value)
print("Best params:", study.best_params)
The key idea is simple: if you can compute one score from many parameters, Optuna can search the space for you.
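Once that objective exists, two optional additions help with repeatable searches and result inspection. The sketch below seeds the default TPE sampler and exports all trials as a pandas DataFrame (trials_dataframe requires pandas); it reuses the objective defined above:
import optuna

# Seed the sampler so repeated runs propose the same sequence of trials.
sampler = optuna.samplers.TPESampler(seed=42)
study = optuna.create_study(direction="maximize", sampler=sampler)
study.optimize(objective, n_trials=200)

# Every trial's parameters, value, and state as a DataFrame.
df = study.trials_dataframe()
print(df.sort_values("value", ascending=False).head())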
Common Use Cases
- Trading strategy backtests
- Machine learning hyperparameters
- Simulation tuning (robotics, physics, finance)
- Cost and path optimization problems