Updated: March 4, 2026
SKILL.md Frontmatter

  • name: derivative-free-optimization
  • description: Optimization without gradient information
  • allowed-tools: Bash, Read, Write, Edit, Glob, Grep

Derivative-Free Optimization

Purpose

Provides optimization capabilities for problems where gradient information is unavailable, unreliable, or too expensive to compute.

Capabilities

  • Nelder-Mead simplex method
  • Powell's method
  • Surrogate-based optimization
  • Bayesian optimization
  • Pattern search methods
  • Trust region methods
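The pattern search methods listed above can be sketched in a few lines: a compass search polls a fixed step along each coordinate axis in both directions and shrinks the step when no direction improves. The function and parameter names below are illustrative, not taken from any particular library.

```python
def compass_search(f, x0, step=0.5, shrink=0.5, tol=1e-6, max_evals=10_000):
    """Minimize f by polling +/- step along each coordinate axis.

    A minimal sketch of a pattern (compass) search: only function
    comparisons are used, never gradients.
    """
    x = list(x0)
    fx = f(x)
    evals = 1
    while step > tol and evals < max_evals:
        improved = False
        for i in range(len(x)):
            for delta in (step, -step):
                y = list(x)
                y[i] += delta
                fy = f(y)
                evals += 1
                if fy < fx:          # accept the first improving poll point
                    x, fx = y, fy
                    improved = True
                    break
        if not improved:             # no axis direction helped: shrink the pattern
            step *= shrink
    return x, fx


# Usage: minimize a shifted quadratic whose minimum is at (1, -2).
best_x, best_f = compass_search(lambda v: (v[0] - 1) ** 2 + (v[1] + 2) ** 2,
                                x0=[0.0, 0.0])
```

Shrinking only on a full failed poll is what makes the search robust: the step size acts as a proxy for the (unknown) gradient magnitude.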

Usage Guidelines

  1. Method Selection: Choose a method based on problem dimensionality, smoothness, noise level, and evaluation cost
  2. Function Evaluations: Minimize calls to expensive objective functions, e.g. by caching already-evaluated points
  3. Surrogate Models: Build cheap surrogate approximations of the objective and refine them as new evaluations arrive
  4. Exploration-Exploitation: Balance global exploration of the search space against local exploitation of promising regions
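Guideline 2 can be enforced mechanically with a caching wrapper around the objective, so no point is ever evaluated twice. This is a minimal sketch under the assumption that the objective is deterministic; the wrapper name and attributes are illustrative.

```python
from functools import wraps

def cached_objective(f):
    """Wrap an expensive objective so repeated points are not re-evaluated.

    Useful when each call is a simulation or physical experiment; the
    cache is keyed on the (rounded) input tuple and only genuine calls
    are counted.
    """
    cache = {}

    @wraps(f)
    def wrapper(x):
        key = tuple(round(xi, 12) for xi in x)   # tolerate tiny float noise
        if key not in cache:
            cache[key] = f(key)
            wrapper.n_evals += 1                 # count only real evaluations
        return cache[key]

    wrapper.n_evals = 0
    return wrapper


# Usage: the second call with the same point hits the cache.
g = cached_objective(lambda v: sum(xi ** 2 for xi in v))
g([1.0, 2.0])
g([1.0, 2.0])
```

For noisy objectives a cache like this is inappropriate; averaging repeated evaluations is the usual alternative.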

Tools/Libraries

  • scipy.optimize (Nelder-Mead, Powell, differential evolution)
  • Optuna (Bayesian hyperparameter optimization, TPE-based by default)
  • GPyOpt (Gaussian-process Bayesian optimization)
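As a quick illustration of the scipy.optimize entry, both Nelder-Mead and Powell need only function values, so they apply directly when gradients are unavailable. The sketch below uses scipy's built-in Rosenbrock function; the starting point and tolerances are illustrative.

```python
from scipy.optimize import minimize, rosen

x0 = [1.3, 0.7, 0.8]

# Nelder-Mead: simplex-based, no derivatives required.
nm = minimize(rosen, x0, method="Nelder-Mead",
              options={"xatol": 1e-8, "fatol": 1e-8})

# Powell: sequential line searches along conjugate directions.
pw = minimize(rosen, x0, method="Powell")

# Both runs should approach the known minimum at (1, 1, 1).
```

Neither method is guaranteed to find a global minimum; for multimodal objectives, the Bayesian tools above or scipy's `differential_evolution` are better suited.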