# Quick Start
This guide walks you through running your first optimization with fugue-evo in under 5 minutes.
## Prerequisites

Ensure fugue-evo is added to your project's dependencies.
## The Problem: Minimize the Sphere Function
The Sphere function is a classic optimization benchmark:
f(x) = x₁² + x₂² + ... + xₙ²
The global minimum is at the origin (all zeros) with a value of 0.
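To make the benchmark concrete, here is a minimal, library-independent sketch of the Sphere function in plain Rust. The `sphere` helper below is illustrative only, not part of fugue-evo (which provides its own `Sphere` benchmark, shown later):

```rust
/// Sphere function: f(x) = x₁² + x₂² + ... + xₙ².
/// Illustrative helper -- not the fugue-evo `Sphere` type.
fn sphere(x: &[f64]) -> f64 {
    x.iter().map(|xi| xi * xi).sum()
}

fn main() {
    // The global minimum is at the origin with value 0.
    println!("{}", sphere(&[0.0, 0.0, 0.0])); // 0
    // Any point away from the origin scores higher: 1² + 2² = 5.
    println!("{}", sphere(&[1.0, 2.0])); // 5
}
```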
## Full Example
Here's the complete code to optimize the Sphere function:
```rust
//! Sphere Function Optimization
//!
//! This example demonstrates basic continuous optimization using the Simple GA
//! to minimize the Sphere function (sum of squares).
//!
//! The Sphere function is a simple unimodal, convex, and separable benchmark
//! that's easy to optimize but useful for verifying the GA is working correctly.

use fugue_evo::prelude::*;
use rand::rngs::StdRng;
use rand::SeedableRng;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    println!("=== Sphere Function Optimization ===\n");

    // Create a seeded RNG for reproducibility
    let mut rng = StdRng::seed_from_u64(42);

    // Define the problem dimension
    const DIM: usize = 10;

    // Create the fitness function (Sphere minimizes to 0 at the origin).
    // fugue-evo maximizes by default, so the benchmark reports the negated
    // sum of squares and the best fitness approaches 0 from below.
    let fitness = Sphere::new(DIM);

    // Define search bounds: each dimension in [-5.12, 5.12]
    let bounds = MultiBounds::symmetric(5.12, DIM);

    // Build and run the Simple GA
    let result = SimpleGABuilder::<RealVector, f64, _, _, _, _, _>::new()
        .population_size(100)
        .bounds(bounds)
        .selection(TournamentSelection::new(3))
        .crossover(SbxCrossover::new(20.0))
        .mutation(PolynomialMutation::new(20.0))
        .fitness(fitness)
        .max_generations(200)
        .build()?
        .run(&mut rng)?;

    // Print results
    println!("Optimization complete!");
    println!("  Best fitness: {:.6}", result.best_fitness);
    println!("  Generations: {}", result.generations);
    println!("  Evaluations: {}", result.evaluations);

    println!("\nBest solution:");
    for (i, val) in result.best_genome.genes().iter().enumerate() {
        println!("  x[{}] = {:.6}", i, val);
    }

    // The optimal solution is at the origin with fitness 0
    let distance_from_optimum: f64 = result
        .best_genome
        .genes()
        .iter()
        .map(|x| x * x)
        .sum::<f64>()
        .sqrt();
    println!("\nDistance from optimum: {:.6}", distance_from_optimum);

    // Show convergence statistics
    println!("\n{}", result.stats.summary());

    Ok(())
}
```
Source: `examples/sphere_optimization.rs`
## Running the Example
Run the example directly:
```bash
cargo run --example sphere_optimization
```
Expected output:

```
=== Sphere Function Optimization ===

Optimization complete!
  Best fitness: -0.000023
  Generations: 200
  Evaluations: 20000

Best solution:
  x[0] = 0.001234
  x[1] = -0.000567
  ...

Distance from optimum: 0.004567
```
## Code Breakdown
### 1. Imports and Setup
```rust
use fugue_evo::prelude::*;
use rand::rngs::StdRng;
use rand::SeedableRng;

let mut rng = StdRng::seed_from_u64(42);
```
The prelude imports everything you need. We use a seeded RNG for reproducibility.
### 2. Define the Problem
```rust
const DIM: usize = 10;
let fitness = Sphere::new(DIM);
let bounds = MultiBounds::symmetric(5.12, DIM);
```

- `DIM`: number of variables to optimize
- `Sphere::new(DIM)`: built-in benchmark function
- `MultiBounds::symmetric(5.12, DIM)`: search space of [-5.12, 5.12] per dimension
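Symmetric bounds simply constrain every gene to the interval [-5.12, 5.12]. As a library-independent illustration (the `clamp_symmetric` helper below is hypothetical, not the fugue-evo API, which applies its own bounds handling internally), keeping a value inside a symmetric interval looks like:

```rust
/// Clamp a gene into the symmetric interval [-limit, limit].
/// Illustrative only; not part of fugue-evo.
fn clamp_symmetric(x: f64, limit: f64) -> f64 {
    x.clamp(-limit, limit)
}

fn main() {
    println!("{}", clamp_symmetric(7.3, 5.12));  // 5.12
    println!("{}", clamp_symmetric(-9.9, 5.12)); // -5.12
    println!("{}", clamp_symmetric(1.0, 5.12));  // 1
}
```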
### 3. Configure the Algorithm
```rust
let result = SimpleGABuilder::<RealVector, f64, _, _, _, _, _>::new()
    .population_size(100)
    .bounds(bounds)
    .selection(TournamentSelection::new(3))
    .crossover(SbxCrossover::new(20.0))
    .mutation(PolynomialMutation::new(20.0))
    .fitness(fitness)
    .max_generations(200)
    .build()?
    .run(&mut rng)?;
```
| Setting | Value | Purpose |
|---|---|---|
| `population_size` | 100 | Number of candidate solutions |
| `selection` | Tournament(3) | Select the best of 3 randomly chosen individuals |
| `crossover` | SBX(20.0) | Simulated Binary Crossover |
| `mutation` | Polynomial(20.0) | Polynomial mutation |
| `max_generations` | 200 | When to stop |
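To build intuition for the selection step, here is a minimal, library-independent sketch of the core of tournament selection. The `tournament_best` helper is hypothetical (fugue-evo's `TournamentSelection` also draws the tournament candidates at random from the population, which is omitted here): given the candidate indices, it returns the one with the highest fitness, since fugue-evo maximizes.

```rust
/// Return the candidate index with the highest fitness.
/// Illustrative only; not the fugue-evo API.
fn tournament_best(fitness: &[f64], candidates: &[usize]) -> usize {
    candidates
        .iter()
        .copied()
        .max_by(|&a, &b| fitness[a].partial_cmp(&fitness[b]).unwrap())
        .unwrap()
}

fn main() {
    // Negated-sphere fitness values for four individuals.
    let fitness = [-3.0, -0.5, -7.2, -1.1];
    // A tournament of size 3 over individuals 0, 2 and 3 picks
    // individual 3, whose fitness (-1.1) is the largest of the three.
    println!("{}", tournament_best(&fitness, &[0, 2, 3])); // 3
}
```

A larger tournament size makes it more likely that a high-fitness individual is among the candidates, which is why raising it increases selection pressure.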
### 4. Analyze Results
```rust
println!("Best fitness: {:.6}", result.best_fitness);
println!("Generations: {}", result.generations);

for (i, val) in result.best_genome.genes().iter().enumerate() {
    println!("  x[{}] = {:.6}", i, val);
}
```
## Understanding the Output
Both the fitness value and the individual solution values should be close to 0: the Sphere function's global minimum sits at the origin with a value of 0. Because fugue-evo maximizes, the reported best fitness is slightly negative and approaches 0 from below as the search converges.
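The distance-from-optimum check in the example can be reproduced in isolation. This is a self-contained sketch, assuming (as the negative best fitness in the expected output suggests) that the reported fitness is the negated sum of squares; the genome values below are hypothetical:

```rust
/// Euclidean distance of a genome from the origin (the Sphere optimum).
fn distance_from_origin(genes: &[f64]) -> f64 {
    genes.iter().map(|x| x * x).sum::<f64>().sqrt()
}

fn main() {
    // Hypothetical near-optimal genome, like the expected output above.
    let genes = [0.001234, -0.000567, 0.000089];
    println!("distance = {:.6}", distance_from_origin(&genes));

    // Under the negated-sum-of-squares assumption, the same distance
    // can be recovered directly from the fitness as sqrt(-fitness).
    let fitness = -genes.iter().map(|x| x * x).sum::<f64>();
    assert!((distance_from_origin(&genes) - (-fitness).sqrt()).abs() < 1e-12);
}
```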
## What if Results Aren't Good?
If the solution isn't converging well:
- Increase population size: More diversity helps exploration
- Increase generations: More time to converge
- Adjust mutation: Higher rates for exploration, lower for exploitation
- Try different selection pressure: Higher tournament size = more exploitation
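These adjustments map directly onto the builder calls already shown. As a sketch using the same APIs as the full example (the specific values here are illustrative, not tuned recommendations), a more exploratory configuration might look like:

```rust
// Illustrative re-tuning of the earlier builder: larger population,
// more generations, and higher selection pressure. Values are
// examples only, not recommendations.
let result = SimpleGABuilder::<RealVector, f64, _, _, _, _, _>::new()
    .population_size(300)                   // more diversity
    .bounds(bounds)
    .selection(TournamentSelection::new(5)) // stronger selection pressure
    .crossover(SbxCrossover::new(20.0))
    .mutation(PolynomialMutation::new(20.0))
    .fitness(fitness)
    .max_generations(500)                   // more time to converge
    .build()?
    .run(&mut rng)?;
```

Note that more exploration trades off against cost: this configuration performs 300 × 500 = 150,000 evaluations versus 20,000 in the original.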
## Next Steps
- **Your First Optimization**: build a custom fitness function
- **Continuous Optimization Tutorial**: a deep dive into real-valued optimization
- **Choosing an Algorithm**: when to use different algorithms