Cost Optimizer (Cloud Data Platforms)
// Analyzes and optimizes costs for cloud data platforms
Stars: 384 · Forks: 73 · Updated: March 4, 2026
SKILL.md Frontmatter
name: Cost Optimizer (Cloud Data Platforms)
description: Analyzes and optimizes costs for cloud data platforms
version: 1.0.0
category: Cost Management
skillId: SK-DEA-012
allowed-tools: Read, Write, Edit, Glob, Grep, Bash
Cost Optimizer (Cloud Data Platforms)
Overview
Analyzes and optimizes costs for cloud data platforms (Snowflake, BigQuery, Redshift, Databricks). This skill provides deep expertise in platform-specific cost structures and the optimization strategies that follow from them.
Capabilities
- Snowflake credit analysis and optimization
- BigQuery slot and on-demand optimization
- Redshift node sizing
- Storage cost optimization
- Query cost estimation
- Warehouse scheduling recommendations
- Data lifecycle policy recommendations
- Reserved capacity planning
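As a minimal sketch of the "query cost estimation" capability, the snippet below estimates BigQuery on-demand cost from bytes scanned. The function name and the default rate are illustrative assumptions; verify the current per-TiB price for your region before relying on the numbers.

```python
# Hypothetical sketch: estimate BigQuery on-demand query cost from bytes scanned.
# The default $6.25/TiB rate is an assumption; check your region's current pricing.
TIB = 1024 ** 4  # one tebibyte in bytes

def estimate_on_demand_cost(bytes_scanned: int, price_per_tib: float = 6.25) -> float:
    """Return the estimated USD cost of scanning `bytes_scanned` bytes on demand."""
    return bytes_scanned / TIB * price_per_tib

print(round(estimate_on_demand_cost(5 * TIB), 2))  # 5 TiB scanned -> 31.25
```

The same shape works for Snowflake (credits consumed × credit price) or Databricks (DBUs × DBU rate); only the unit and rate change.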
Input Schema
{
"platform": "snowflake|bigquery|redshift|databricks",
"usageMetrics": "object",
"billingData": "object",
"queryHistory": "object"
}
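A caller might sanity-check a payload against this schema before invoking the skill. The sketch below uses only the field names and platform values shown above; the helper name is hypothetical.

```python
# Sketch: validate a skill input payload against the schema above.
REQUIRED = {"platform", "usageMetrics", "billingData", "queryHistory"}
PLATFORMS = {"snowflake", "bigquery", "redshift", "databricks"}

def validate_input(payload: dict) -> list[str]:
    """Return a list of validation errors; an empty list means the payload is usable."""
    errors = [f"missing field: {f}" for f in sorted(REQUIRED - payload.keys())]
    if payload.get("platform") not in PLATFORMS:
        errors.append(f"unsupported platform: {payload.get('platform')!r}")
    return errors

print(validate_input({"platform": "snowflake"}))
```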
Output Schema
{
"currentCost": "number",
"optimizedCost": "number",
"savings": "percentage",
"recommendations": [{
"category": "string",
"action": "string",
"impact": "number",
"effort": "low|medium|high"
}]
}
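One way to assemble this output is sketched below. It assumes `savings` is expressed as a percentage of current cost, which the schema implies but does not state; the helper name and sample figures are illustrative.

```python
# Sketch: build an output payload matching the schema above.
# Assumption: "savings" is the percent reduction relative to currentCost.
def build_output(current: float, optimized: float, recommendations: list[dict]) -> dict:
    return {
        "currentCost": current,
        "optimizedCost": optimized,
        "savings": round((current - optimized) / current * 100, 1),
        "recommendations": recommendations,
    }

result = build_output(12000.0, 9000.0, [
    {"category": "warehouse", "action": "downsize XL to L", "impact": 3000.0, "effort": "low"},
])
print(result["savings"])  # 25.0
```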
Target Processes
- Data Warehouse Setup
- Query Optimization
- Pipeline Migration
Usage Guidelines
- Provide platform-specific usage metrics
- Include billing data for cost baseline
- Share query history for optimization analysis
- Prioritize recommendations by impact and effort
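The last guideline, prioritizing by impact and effort, could be implemented as a simple weighted sort. The effort weights below are illustrative assumptions, not part of the skill spec.

```python
# Sketch: rank recommendations by impact (e.g. monthly savings) discounted by effort.
# Effort weights are illustrative; tune them to your own risk tolerance.
EFFORT_WEIGHT = {"low": 1.0, "medium": 0.6, "high": 0.3}

def prioritize(recs: list[dict]) -> list[dict]:
    """Sort recommendations so the best impact-per-effort items come first."""
    return sorted(recs, key=lambda r: r["impact"] * EFFORT_WEIGHT[r["effort"]], reverse=True)

recs = [
    {"category": "storage", "action": "lifecycle to cold tier", "impact": 500, "effort": "high"},
    {"category": "compute", "action": "enable auto-suspend", "impact": 400, "effort": "low"},
]
print([r["action"] for r in prioritize(recs)])
```

Note that the lower-impact auto-suspend item ranks first here because it is effectively free to apply, which matches the guideline's intent.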
Best Practices
- Regularly review and optimize warehouse sizes
- Implement auto-suspend and auto-resume policies
- Use clustering and partitioning to reduce scan costs
- Consider reserved capacity for predictable workloads
- Monitor and alert on cost anomalies
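For the last practice, monitoring and alerting on cost anomalies, a minimal threshold check over daily spend is sketched below. The mean-plus-k-standard-deviations rule is one simple choice among many; production systems often use seasonality-aware baselines instead.

```python
# Sketch: flag daily-cost anomalies with a simple mean + k*stdev threshold.
from statistics import mean, stdev

def cost_anomalies(daily_costs: list[float], k: float = 3.0) -> list[int]:
    """Return indices of days whose cost exceeds mean + k standard deviations."""
    mu, sigma = mean(daily_costs), stdev(daily_costs)
    return [i for i, c in enumerate(daily_costs) if c > mu + k * sigma]

costs = [100, 102, 98, 101, 99, 250, 100]
print(cost_anomalies(costs, k=2.0))  # flags the 250 spike at index 5
```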