Continuous Optimization Tutorial

This tutorial covers optimizing functions with real-valued variables. You'll learn how to set up, run, and analyze continuous optimization problems.

The Sphere Function

We'll start with the Sphere function, a simple unimodal benchmark:

f(x) = Σᵢ₌₁ⁿ xᵢ² = x₁² + x₂² + ... + xₙ²

Properties:

  • Global minimum: 0 at origin (all zeros)
  • Unimodal: Single optimum, no local minima
  • Convex: Any local minimum is global
  • Separable: Each variable is independent
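
To make the definition concrete, here it is in plain Rust (a standalone sketch, independent of fugue-evo):

fn sphere(x: &[f64]) -> f64 {
    // Sum of squares: minimized at the origin, where f(x) = 0
    x.iter().map(|xi| xi * xi).sum()
}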

Complete Example

//! Sphere Function Optimization
//!
//! This example demonstrates basic continuous optimization using the Simple GA
//! to minimize the Sphere function (sum of squares).
//!
//! The Sphere function is a simple unimodal, convex, and separable benchmark
//! that's easy to optimize, making it useful for verifying the GA is working correctly.

use fugue_evo::prelude::*;
use rand::rngs::StdRng;
use rand::SeedableRng;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    println!("=== Sphere Function Optimization ===\n");

    // Create a seeded RNG for reproducibility
    let mut rng = StdRng::seed_from_u64(42);

    // Define the problem dimension
    const DIM: usize = 10;

    // Create the fitness function (Sphere minimizes to 0 at origin).
    // fugue-evo maximizes by default, so the built-in Sphere negates internally.
    let fitness = Sphere::new(DIM);

    // Define search bounds: each dimension in [-5.12, 5.12]
    let bounds = MultiBounds::symmetric(5.12, DIM);

    // Build and run the Simple GA
    let result = SimpleGABuilder::<RealVector, f64, _, _, _, _, _>::new()
        .population_size(100)
        .bounds(bounds)
        .selection(TournamentSelection::new(3))
        .crossover(SbxCrossover::new(20.0))
        .mutation(PolynomialMutation::new(20.0))
        .fitness(fitness)
        .max_generations(200)
        .build()?
        .run(&mut rng)?;

    // Print results
    println!("Optimization complete!");
    println!("  Best fitness: {:.6}", result.best_fitness);
    println!("  Generations:  {}", result.generations);
    println!("  Evaluations:  {}", result.evaluations);
    println!("\nBest solution:");
    for (i, val) in result.best_genome.genes().iter().enumerate() {
        println!("  x[{}] = {:.6}", i, val);
    }

    // The optimal solution is at the origin with fitness 0
    let distance_from_optimum: f64 = result
        .best_genome
        .genes()
        .iter()
        .map(|x| x * x)
        .sum::<f64>()
        .sqrt();
    println!("\nDistance from optimum: {:.6}", distance_from_optimum);

    // Show convergence statistics
    println!("\n{}", result.stats.summary());

    Ok(())
}

Source: examples/sphere_optimization.rs

Running the Example

cargo run --example sphere_optimization

Understanding the Components

Search Bounds

let bounds = MultiBounds::symmetric(5.12, DIM);

This creates bounds [-5.12, 5.12] for each of the 10 dimensions. The Sphere function is typically tested in this range.
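
To see what symmetric bounds imply, here is a plain-Rust sketch (independent of the MultiBounds type) that samples an initial genome uniformly within [-5.12, 5.12]:

use rand::Rng;

fn random_genome<R: Rng>(rng: &mut R, half_width: f64, dim: usize) -> Vec<f64> {
    // Each gene is drawn uniformly from the symmetric interval [-half_width, half_width]
    (0..dim).map(|_| rng.gen_range(-half_width..=half_width)).collect()
}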

Operators for Real-Valued Optimization

Simulated Binary Crossover (SBX)

SbxCrossover::new(20.0)

SBX creates offspring whose genes are distributed around the parents' values. The distribution index (eta = 20.0) controls the spread:

  • Higher eta → offspring closer to parents (exploitation)
  • Lower eta → more diverse offspring (exploration)
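
The per-gene recombination can be sketched in plain Rust using the textbook SBX formula (this is not the fugue-evo source): the spread factor beta is sampled so that larger eta concentrates it near 1.

use rand::Rng;

fn sbx_pair<R: Rng>(rng: &mut R, p1: f64, p2: f64, eta: f64) -> (f64, f64) {
    // Sample the spread factor beta from the SBX distribution
    let u: f64 = rng.gen();
    let beta = if u <= 0.5 {
        (2.0 * u).powf(1.0 / (eta + 1.0))
    } else {
        (1.0 / (2.0 * (1.0 - u))).powf(1.0 / (eta + 1.0))
    };
    // beta near 1 => children near the parents; larger eta makes this more likely
    let c1 = 0.5 * ((1.0 + beta) * p1 + (1.0 - beta) * p2);
    let c2 = 0.5 * ((1.0 - beta) * p1 + (1.0 + beta) * p2);
    (c1, c2)
}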

Polynomial Mutation

PolynomialMutation::new(20.0)

Adds bounded perturbations to gene values. As with SBX, the distribution index (eta = 20.0) controls the magnitude: higher values produce smaller, more local perturbations.
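
A plain-Rust sketch of the textbook polynomial mutation formula (again, not the fugue-evo source):

use rand::Rng;

fn polynomial_mutate<R: Rng>(rng: &mut R, x: f64, lower: f64, upper: f64, eta: f64) -> f64 {
    // Sample a perturbation delta in [-1, 1]; larger eta concentrates it near 0
    let u: f64 = rng.gen();
    let delta = if u < 0.5 {
        (2.0 * u).powf(1.0 / (eta + 1.0)) - 1.0
    } else {
        1.0 - (2.0 * (1.0 - u)).powf(1.0 / (eta + 1.0))
    };
    // Scale by the bound width and clamp back into [lower, upper]
    (x + delta * (upper - lower)).clamp(lower, upper)
}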

Fitness Function

let fitness = Sphere::new(DIM);

The built-in Sphere computes the sum of squares. Since fugue-evo maximizes by default, the value is negated internally, so maximizing fitness minimizes f(x).
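
Conceptually, that wrapper behaves like this plain-Rust sketch (not the actual fugue-evo source):

fn sphere_fitness(genes: &[f64]) -> f64 {
    // Negated sum of squares: maximizing this minimizes the Sphere function
    -genes.iter().map(|x| x * x).sum::<f64>()
}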

Analyzing Results

println!("Best fitness: {:.6}", result.best_fitness);
println!("Generations:  {}", result.generations);
println!("Evaluations:  {}", result.evaluations);

Key metrics:

  • Best fitness: Should approach 0 (the global minimum)
  • Generations: How many evolutionary cycles completed
  • Evaluations: Total fitness function calls
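
For example, with the settings above (population 100, 200 generations), expect on the order of 100 × 200 = 20,000 evaluations; the exact count depends on details such as initialization and elitism.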

Convergence Statistics

println!("{}", result.stats.summary());

The summary shows:

  • Mean and standard deviation per generation
  • Diversity metrics
  • Improvement trends

Parameter Tuning

Population Size

.population_size(100)

  Size              Trade-off
  Small (20-50)     Fast generations, less diversity
  Medium (100-200)  Good balance
  Large (500+)      More exploration, slower per generation

For Sphere (unimodal), smaller populations work well. Multimodal functions need larger populations.

Operator Parameters

SBX Distribution Index

.crossover(SbxCrossover::new(eta))

  eta    Effect
  1-5    High exploration
  15-25  Balanced
  50+    High exploitation

Mutation Probability

.mutation(PolynomialMutation::new(20.0).with_probability(0.1))

The default probability is 1/dimension. Increase it for more exploration.
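
With DIM = 10 the default works out to 1/10 = 0.1, so the explicit 0.1 above simply matches it; a value like 0.2 doubles the per-gene mutation rate for more exploration.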

Elitism

.elitism(true)
.elite_count(2)

Elitism preserves the best individuals across generations, preventing loss of good solutions.

Common Issues

Premature Convergence

Symptoms: Population converges too quickly, stuck at suboptimal solution

Solutions:

  1. Increase population size
  2. Lower selection pressure (smaller tournament size)
  3. Increase mutation probability
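
Applied to the complete example, the three adjustments might look like this (a sketch reusing the builder calls shown earlier; the exact values are starting points, not prescriptions):

let result = SimpleGABuilder::<RealVector, f64, _, _, _, _, _>::new()
    .population_size(300)                        // 1. larger population
    .bounds(MultiBounds::symmetric(5.12, DIM))
    .selection(TournamentSelection::new(2))      // 2. weaker selection pressure
    .crossover(SbxCrossover::new(20.0))
    .mutation(PolynomialMutation::new(20.0).with_probability(0.2)) // 3. more mutation
    .fitness(Sphere::new(DIM))
    .max_generations(200)
    .build()?
    .run(&mut rng)?;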

Slow Convergence

Symptoms: Fitness improves very slowly

Solutions:

  1. Increase selection pressure
  2. Reduce mutation (more exploitation)
  3. Use elitism to preserve progress
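
In builder terms, the same example can be nudged the other way (a sketch showing only the changed calls):

.selection(TournamentSelection::new(5))          // 1. stronger selection pressure
.mutation(PolynomialMutation::new(20.0).with_probability(0.05)) // 2. less mutation
.elitism(true)                                   // 3. preserve the best individuals
.elite_count(2)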

Exercises

  1. Change dimensions: Modify DIM to 30 and observe how convergence changes
  2. Adjust operators: Try eta values of 5 and 50 for crossover
  3. Compare selections: Replace TournamentSelection with RouletteWheelSelection
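
For exercise 3, the swap is a one-line change to the builder. Note that the constructor shown here is an assumption about the API; check the fugue-evo docs for the exact signature:

.selection(RouletteWheelSelection::new())  // assumed constructor; verify against the actual API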

Next Steps