derivative-free-optimization
Updated: March 4, 2026
## SKILL.md Frontmatter

```yaml
---
name: derivative-free-optimization
description: Optimization without gradient information
allowed-tools: Bash, Read, Write, Edit, Glob, Grep
---
```
# Derivative-Free Optimization
## Purpose

Provides optimization capabilities for problems where gradient information is unavailable, unreliable, or too expensive to compute — for example, noisy, discontinuous, or black-box objective functions.
## Capabilities
- Nelder-Mead simplex method
- Powell's method
- Surrogate-based optimization
- Bayesian optimization
- Pattern search methods
- Trust region methods
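As a minimal sketch of the simplex approach, the Nelder-Mead method in `scipy.optimize` (listed under Tools/Libraries below) minimizes a function using only function values. The Rosenbrock test function and tolerance settings here are illustrative choices, not part of this skill's required interface:

```python
# Sketch: minimizing the 2-D Rosenbrock function with Nelder-Mead,
# which polls and reflects a simplex and never touches gradients.
import numpy as np
from scipy.optimize import minimize

def rosenbrock(x):
    # Classic banana-shaped test function; global minimum at (1, 1).
    return (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2

result = minimize(
    rosenbrock,
    x0=np.array([-1.2, 1.0]),            # standard difficult start point
    method="Nelder-Mead",
    options={"xatol": 1e-8, "fatol": 1e-8, "maxiter": 2000},
)
print(result.x)  # should land near [1, 1]
```

The same call with `method="Powell"` or other derivative-free methods swaps the algorithm without changing the objective function.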
## Usage Guidelines

- **Method Selection**: Choose a method based on problem characteristics such as dimensionality, smoothness, noise level, and evaluation budget
- **Function Evaluations**: Treat objective evaluations as expensive and minimize the number of calls
- **Surrogate Models**: Build cheap surrogate approximations of the objective and refine them as new evaluations arrive
- **Exploration-Exploitation**: Balance exploring unsampled regions against exploiting the current best region
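The guidelines above can be illustrated with a self-contained pattern search (compass search): it polls one step along each axis, accepts any improvement (exploitation), and shrinks the step only when no poll point improves, which keeps later polls local while early large steps explore. All names below are illustrative, not from a specific library:

```python
# Sketch of coordinate pattern search (compass search), assuming a
# deterministic objective f and a real-vector starting point x0.
def pattern_search(f, x0, step=0.5, tol=1e-6, max_evals=10_000):
    x = list(x0)
    fx = f(x)
    evals = 1
    while step > tol and evals < max_evals:
        improved = False
        for i in range(len(x)):
            for delta in (step, -step):
                trial = x.copy()
                trial[i] += delta
                f_trial = f(trial)
                evals += 1
                if f_trial < fx:       # accept the first improving poll point
                    x, fx, improved = trial, f_trial, True
                    break
            if improved:
                break
        if not improved:
            step *= 0.5                # no improvement: shrink the pattern
    return x, fx

sphere = lambda v: sum(t * t for t in v)   # minimum at the origin
best_x, best_f = pattern_search(sphere, [2.0, -3.0])
```

The budget parameter `max_evals` caps total function calls, matching the guideline of treating evaluations as expensive.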
## Tools/Libraries
- scipy.optimize
- Optuna
- GPyOpt
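For comparison with the simplex sketch above, a hedged example of Powell's method through the same `scipy.optimize` entry point; Powell performs derivative-free line searches along a set of directions rather than reflecting a simplex. The Booth test function is an illustrative choice:

```python
# Sketch: Powell's method via scipy.optimize on the Booth function,
# whose global minimum is at (1, 3).
from scipy.optimize import minimize

def booth(x):
    return (x[0] + 2 * x[1] - 7) ** 2 + (2 * x[0] + x[1] - 5) ** 2

result = minimize(booth, x0=[0.0, 0.0], method="Powell")
print(result.x)  # should land near [1, 3]
```

Optuna and GPyOpt target the Bayesian-optimization capability instead, proposing trial points from a probabilistic surrogate; their APIs differ, so consult their documentation before wiring them in.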