
Rastrigin function

The Rastrigin function is a non-convex mathematical function commonly employed as a test problem to evaluate the performance of optimization algorithms. It is defined over an n-dimensional domain as
f(\mathbf{x}) = 10n + \sum_{i=1}^{n} \left( x_i^2 - 10 \cos(2\pi x_i) \right),
where \mathbf{x} = (x_1, \dots, x_n) and the constant A = 10 scales the cosine modulation to emphasize its periodic structure. The function features a global minimum of 0 at \mathbf{x} = \mathbf{0}, surrounded by numerous local minima arranged in a highly regular, lattice-like pattern that challenges algorithms to escape deceptive traps.
Originally proposed in 1974 by Latvian mathematician Leonid A. Rastrigin as a two-dimensional problem in his book Systems of Extreme Control, the function was later generalized to higher dimensions, notably by Günter Rudolph in the context of evolution strategies. This generalization has made it a staple in the field of evolutionary computation and global optimization since the 1990s, with its complex landscape—combining quadratic terms for convexity near the origin and cosine oscillations for multimodality—serving to test an algorithm's ability to navigate rugged search spaces without relying on gradients. Typically evaluated over the bounded domain x_i \in [-5.12, 5.12] for all i, the Rastrigin function's deceptive simplicity belies its difficulty, as the number of local minima grows exponentially with dimensionality. In practice, the function's properties have been leveraged in diverse applications, from validating evolution strategies and genetic algorithms to assessing hybrid metaheuristics, with empirical studies confirming its utility in revealing convergence issues in population-based methods. Its periodic nature also allows for analytical insights into optimization dynamics, such as progress rates in evolution strategies, underscoring its enduring role in advancing non-convex optimization research.

Definition

Mathematical Formulation

The Rastrigin function is a multidimensional test function commonly used in global optimization studies. For an n-dimensional input \mathbf{x} = (x_1, \dots, x_n) \in \mathbb{R}^n, it is defined mathematically as f(\mathbf{x}) = 10n + \sum_{i=1}^n \left( x_i^2 - 10 \cos(2\pi x_i) \right). This standard form was originally proposed by Leonid A. Rastrigin in his 1974 book on extremal control systems. The function is typically considered over a bounded search space, such as [-5.12, 5.12]^n, which aligns with the period of the cosine oscillations and ensures the presence of numerous local optima within the domain. Each term in the sum combines a component x_i^2, which contributes a convex parabolic shape centered at the origin, with a cosine perturbation -10 \cos(2\pi x_i), whose amplitude and periodicity introduce regular oscillations that disrupt the global convexity. This structure arises from augmenting a simple sphere function—known for its unimodal, easy-to-optimize nature—with trigonometric terms to simulate periodic barriers, thereby creating a highly multimodal surface suitable for testing optimization algorithms' ability to escape local traps.
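The definition above translates directly into a few lines of vectorized code. Below is a minimal sketch (NumPy assumed; the function name `rastrigin` is our own choice):

```python
import numpy as np

def rastrigin(x, A=10.0):
    """Rastrigin function: f(x) = A*n + sum_i (x_i^2 - A*cos(2*pi*x_i))."""
    x = np.asarray(x, dtype=float)
    return A * x.size + np.sum(x**2 - A * np.cos(2 * np.pi * x))

print(rastrigin([0.0, 0.0]))  # 0.0 at the global minimum
print(rastrigin([4.5, 4.5]))  # 80.5: half-integers sit on ridges between basins
```

Note that at half-integer coordinates the cosine terms equal -1, so the perturbation adds rather than subtracts, producing the ridge value shown in the second call.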

Variants and Generalizations

The Rastrigin function was originally proposed by L. A. Rastrigin in 1974 as a two-dimensional test problem for extremal control systems. This initial formulation focused on capturing multimodal behavior in low dimensions to evaluate optimization methods. It was subsequently generalized to arbitrary n dimensions by Mühlenbein, Schomisch, and Born in 1991, extending its applicability to higher-dimensional search spaces while preserving the characteristic oscillatory landscape. Common variants of the Rastrigin function include shifted versions, where the location of the global minimum is displaced from the origin via an input transformation, thereby testing algorithms' ability to escape origin bias; this form appears as function F9 in the CEC 2005 benchmark suite. Rotated variants introduce non-separability by premultiplying the input vector with an orthogonal matrix, which couples variables and challenges coordinate-descent-based approaches; a combined shifted and rotated form is defined as F10 in the same CEC 2005 suite. These modifications enhance the function's utility in assessing robustness across diverse problem characteristics. Parameter variations typically adjust the amplitude coefficient A (standard value 10) or the angular frequency ω (standard value 2π) within the cosine term, altering the depth and frequency of oscillations to create landscapes with varying numbers of local minima. Such adjustments allow for tailored benchmarking of optimization techniques sensitive to oscillation scale. Higher-dimensional adaptations scale the generalized n-dimensional form to dimensions such as 10, 30, 50, or 100, as implemented in CEC benchmark suites to probe algorithmic performance in large-scale global optimization. These extensions are particularly prevalent in competitions like CEC 2005 and later, where the function's multimodality intensifies with dimensionality.
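A shifted and rotated variant of the kind used in the CEC suites can be sketched as follows. The composition f(R(x - o)) is the standard transformation, but the helper names and the random orthogonal matrix construction here are illustrative choices, not the official CEC implementation:

```python
import numpy as np

def rastrigin(z, A=10.0):
    z = np.asarray(z, dtype=float)
    return A * z.size + np.sum(z**2 - A * np.cos(2 * np.pi * z))

def shifted_rotated_rastrigin(x, shift, rotation, A=10.0):
    """Evaluate f(R @ (x - o)): 'o' displaces the optimum, 'R' couples variables."""
    z = rotation @ (np.asarray(x, dtype=float) - shift)
    return rastrigin(z, A)

rng = np.random.default_rng(0)
n = 5
shift = rng.uniform(-2, 2, n)                  # new optimum location
Q, _ = np.linalg.qr(rng.normal(size=(n, n)))   # random orthogonal matrix
print(shifted_rotated_rastrigin(shift, shift, Q))  # 0.0 at the shifted optimum
```

Evaluating at x = shift recovers the global minimum of 0, confirming that the shift only relocates the optimum while the rotation leaves the function value at the optimum unchanged.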

Properties

Multimodality

The Rastrigin function exhibits strong multimodality, featuring numerous local minima generated by cosine oscillations overlaid on a parabolic surface. This oscillatory component introduces deceptive basins that complicate the identification of the global optimum. The local minima are regularly distributed throughout the search space, positioned near integer coordinates of the variables x_i, which results in a rugged landscape resembling alternating hills and valleys. In the typical bounded domain [-5.12, 5.12]^n, the count of these local minima grows exponentially with the problem dimension n. This structure renders the function particularly challenging for optimization, as it frequently ensnares gradient-based and local search methods in suboptimal traps, thereby serving as a rigorous benchmark for evaluating the efficacy of global optimization algorithms.

Global and Local Minima

The Rastrigin function features a unique global minimum at the origin, \mathbf{x} = (0, 0, \dots, 0), where f(\mathbf{x}) = 0. This location achieves the minimum value because the oscillatory cosine terms reach their maximum of 1 at integer coordinates, particularly zero, while the quadratic terms vanish. Analytically, the global minimum can be confirmed by bounding the function. For the standard formulation f(\mathbf{x}) = 10n + \sum_{i=1}^n [x_i^2 - 10 \cos(2\pi x_i)], each component satisfies x_i^2 - 10 \cos(2\pi x_i) + 10 \geq 0 since x_i^2 \geq 0 and -10 \cos(2\pi x_i) \geq -10, with equality only when x_i = 0 (where \cos(0) = 1). Thus, f(\mathbf{x}) \geq 0, with equality solely at the origin. This structure ensures the global minimum is isolated and verifiable without numerical search. In addition to the global minimum, the function exhibits numerous local minima located approximately at points where each coordinate x_i is near a non-zero integer, such as \pm 1, \pm 2, due to the periodic nature of the cosine terms aligning near these values. For instance, in one dimension, local minima occur near x = \pm 1, \pm 2, \dots, with corresponding function values strictly greater than 0, often close to the global minimum in magnitude but separated in the search space. These local minima arise as solutions to the critical point equation x_i = -10\pi \sin(2\pi x_i) for each coordinate, which cluster near integers beyond zero. The basin of attraction for the global minimum is relatively large, driven by the dominant quadratic terms that funnel trajectories toward the origin over broad regions, whereas the local minima possess shallow basins that are numerous but narrow, scaling exponentially with dimensionality. This contrast highlights the function's deceptive landscape, where local minima trap greedy optimizers but the global basin prevails in wider exploratory searches.
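The claims above can be checked numerically: a dense scan of the basin around x = 1 in the one-dimensional case locates a local minimum slightly inside the integer, with a value strictly greater than the global minimum of 0 (plain NumPy sketch; names are our own):

```python
import numpy as np

def rastrigin_1d(x):
    return x**2 - 10 * np.cos(2 * np.pi * x) + 10

# Dense scan of the basin around x = 1: the local minimum sits just inside
# the integer (pulled toward the origin by the quadratic term) and its
# value stays strictly above the global minimum f(0) = 0.
xs = np.linspace(0.5, 1.5, 200001)
i = np.argmin(rastrigin_1d(xs))
print(xs[i])                # ~0.995, slightly inside 1
print(rastrigin_1d(xs[i]))  # ~0.995, strictly greater than 0
```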

Differentiability and Continuity

The Rastrigin function, defined as f(\mathbf{x}) = 10n + \sum_{i=1}^n \left( x_i^2 - 10 \cos(2\pi x_i) \right) for \mathbf{x} \in \mathbb{R}^n, is continuous on the entire domain \mathbb{R}^n. This property arises from its construction as a finite sum of continuous components: the terms x_i^2, which are polynomials, and the cosine functions \cos(2\pi x_i), both of which are continuous everywhere. Continuity ensures that small changes in the input \mathbf{x} result in correspondingly small changes in the function value, a fundamental requirement for many theoretical analyses in optimization. Beyond continuity, the Rastrigin function is infinitely differentiable, classifying it as a C^\infty (smooth) function across \mathbb{R}^n. This smoothness stems from the fact that polynomials and the cosine function are themselves infinitely differentiable, and sums and compositions of such functions preserve this property. The C^\infty nature allows for the computation of higher-order derivatives if needed, though first-order derivatives suffice for most gradient-based techniques. In benchmark contexts, this regularity contrasts with the function's multimodal landscape, where smooth local irregularities arise from the oscillatory cosine modulation. The gradient of the Rastrigin function is separable, with partial derivatives given by \frac{\partial f}{\partial x_i}(\mathbf{x}) = 2x_i + 20\pi \sin(2\pi x_i) for each i = 1, \dots, n. Thus, the full gradient is \nabla f(\mathbf{x}) = (2x_1 + 20\pi \sin(2\pi x_1), \dots, 2x_n + 20\pi \sin(2\pi x_n)). These expressions follow directly from the chain rule applied to the quadratic and trigonometric terms, confirming the function's differentiability. The availability of an explicit, smooth gradient makes the Rastrigin function amenable to derivative-based optimization algorithms, such as gradient descent or quasi-Newton methods, which rely on local gradient information to navigate the search space.
However, the periodic oscillations introduced by the sine terms in the gradient can produce misleading descent directions, often trapping optimizers in local minima despite the underlying smoothness. This duality—smooth yet deceptive—highlights the function's utility as a benchmark for assessing the robustness of gradient-utilizing solvers against multimodal challenges.
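Since the analytic gradient is explicit, it is easy to validate against central finite differences, a routine sanity check before handing the function to a derivative-based optimizer (NumPy sketch; names are our own):

```python
import numpy as np

def rastrigin(x):
    x = np.asarray(x, dtype=float)
    return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

def rastrigin_grad(x):
    """Analytic gradient: df/dx_i = 2*x_i + 20*pi*sin(2*pi*x_i)."""
    x = np.asarray(x, dtype=float)
    return 2 * x + 20 * np.pi * np.sin(2 * np.pi * x)

# Central finite differences should agree closely with the analytic gradient.
x = np.array([0.3, -1.7, 2.2])
h = 1e-6
fd = np.array([(rastrigin(x + h * e) - rastrigin(x - h * e)) / (2 * h)
               for e in np.eye(3)])
print(np.max(np.abs(fd - rastrigin_grad(x))))  # small (on the order of 1e-8)
```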

Visualization and Analysis

Graphical Representations

The Rastrigin function is commonly visualized in two dimensions using contour plots, which reveal a grid-like arrangement of local minima positioned near integer coordinates, with closed level curves around each minimum superimposed on the overall parabolic shape, illustrating the periodic oscillatory behavior due to the cosine terms. In three dimensions, surface plots provide a more immersive view, portraying the function as a rugged, wavy surface with a prominent central basin representing the global minimum, surrounded by alternating ridges and depressions that correspond to local maxima and minima. This view emphasizes the deceptive landscape that challenges optimization algorithms, as the surface undulates with growing height away from the origin. For dimensions beyond three, direct plotting becomes infeasible, so heatmaps of two-dimensional projections or fixed slices are employed to capture the multimodal structure. These representations show dense patterns of hot and cool spots, highlighting clusters of local minima arranged in a lattice-like fashion across the projected space, which underscores the function's regular yet numerous deceptive attractors. Such graphical representations make the Rastrigin function's multimodal properties immediately apparent, aiding in the intuitive understanding of its optimization challenges. To generate these plots, tools like MATLAB utilize built-in functions such as contour for 2D level curves and surf for 3D surfaces, often within optimization toolboxes. Similarly, Python's Matplotlib library enables comparable visualizations through its contourf and plot_surface methods, integrated in frameworks like pymoo for benchmark functions.
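A minimal Matplotlib sketch along these lines might look as follows; the Agg backend and the output filename are arbitrary choices for headless, scripted use:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend so the script runs without a display
import matplotlib.pyplot as plt

# Evaluate the 2-D Rastrigin function on a grid over the standard domain.
x = np.linspace(-5.12, 5.12, 400)
X, Y = np.meshgrid(x, x)
Z = 20 + X**2 + Y**2 - 10 * np.cos(2 * np.pi * X) - 10 * np.cos(2 * np.pi * Y)

fig, ax = plt.subplots(figsize=(5, 4))
cs = ax.contourf(X, Y, Z, levels=50, cmap="viridis")
fig.colorbar(cs, label="f(x, y)")
ax.set_xlabel("x")
ax.set_ylabel("y")
ax.set_title("Rastrigin function (contour)")
fig.savefig("rastrigin_contour.png", dpi=120)
```

A 3D view can be obtained from the same grid by replacing the contourf call with `ax.plot_surface(X, Y, Z)` on a `projection="3d"` axes.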

Behavior in Low Dimensions

In the one-dimensional case, the Rastrigin function simplifies to f(x) = x^2 - 10 \cos(2\pi x) + 10, defined over the standard search interval [-5.12, 5.12]. This reduction highlights the function's inherent multimodality, with 11 local minima (including the global one) distributed regularly across the interval, creating a series of oscillatory "valleys" superimposed on the parabolic x^2 trend. The global minimum occurs at x = 0, where f(0) = 0, while the local minima are positioned near integer values, slightly displaced toward the origin by the influence of the quadratic term that dominates far from the origin. These local minima arise from the balance between the smooth quadratic growth and the rapid oscillations of the cosine term, which has a period of 1, leading to roughly one potential minimum per unit interval within the bounded domain. This structure makes the 1D case a foundational example for understanding how the function traps gradient-based optimizers in suboptimal points, yet the limited number of extrema allows relatively straightforward enumeration or grid search to locate the optimum. The two-dimensional formulation represents the original proposal by Rastrigin in 1974, where the function was introduced as f(x, y) = 20 + x^2 + y^2 - 10 \cos(2\pi x) - 10 \cos(2\pi y) to test extremal control systems. Despite being separable—composed additively of univariate terms—the 2D landscape features axis-aligned minima arranged on a Cartesian grid near integer pairs (k, l) for integers k, l, resulting in a count equal to the product of the 1D minima counts (11^2 = 121 local optima) within the domain [-5.12, 5.12]^2. This lattice-like pattern facilitates intuitive visualization of search trajectories, such as how they might follow coordinate axes toward deceptive basins, but the sheer volume of traps already poses significant challenges for exhaustive search.
Behavior in these low dimensions provides critical intuition for scalability in higher dimensions, foreshadowing the curse of dimensionality: the exponential proliferation of local minima, approximately proportional to m^n where m \approx 11 is the 1D count and n is the dimension, transforms the landscape into an increasingly rugged and deceptive terrain that overwhelms local search strategies. In contrast to higher-dimensional instances, low-dimensional versions of the function permit manual optimization or simple techniques like grid sampling, as the finite and predictable arrangement of minima enables systematic coverage of the space without excessive computational cost, underscoring the function's role in demonstrating dimensionality-dependent difficulty.
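The 1D minima count quoted above can be verified by scanning for sign changes of the derivative f'(x) = 2x + 20\pi \sin(2\pi x) across the interval (NumPy sketch):

```python
import numpy as np

# Count interior local minima of the 1-D Rastrigin function on [-5.12, 5.12]
# by locating sign changes of f'(x) = 2x + 20*pi*sin(2*pi*x) from - to +.
xs = np.linspace(-5.12, 5.12, 1_000_001)
d = 2 * xs + 20 * np.pi * np.sin(2 * np.pi * xs)
minima = np.sum((d[:-1] < 0) & (d[1:] >= 0))
print(minima)  # 11: one basin near each integer from -5 to 5
```

Half-integers host the local maxima instead (there the cosine term reaches its most unfavorable value), so no additional minima appear between the integer basins.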

Applications

Benchmark Function in Optimization

The Rastrigin function serves as a prominent benchmark in optimization research, designed to assess algorithms' proficiency in navigating highly multimodal landscapes to escape local minima and reach the global optimum. Its structure, featuring numerous regularly distributed local optima, challenges methods to balance exploration and exploitation effectively. This function is incorporated into key benchmark suites, such as the IEEE Congress on Evolutionary Computation (CEC) 2005 and 2013 competitions on real-parameter optimization, where it functions as a core test problem among 25 and 28 functions, respectively. It also appears in the Black-Box Optimization Benchmarking (BBOB) suite as function f_{15}, tailored for evaluating derivative-free and black-box optimizers across scalable dimensions. Algorithm performance on the Rastrigin function is commonly measured by success rate (proportion of runs reaching the global optimum within a tolerance), convergence speed (rate of error reduction over iterations), and computational efficiency via the number of function evaluations needed. These metrics highlight the function's utility in comparing optimizer robustness under controlled conditions. In comparison to other benchmarks, the Rastrigin presents greater difficulty than the unimodal Sphere function, which lacks local minima and allows straightforward gradient-based convergence, but it is less challenging than the Griewank function due to the latter's non-separability from the product term, which introduces deceptive couplings and irregular optima placement. The Rastrigin's separable, cosine-modulated landscape maintains a predictable global structure, aiding visibility of the optimum amid local traps.
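For side-by-side experiments, the three benchmarks compared here are easily implemented together; all share a global minimum of 0 at the origin but differ in modality and separability (NumPy sketch using the standard textbook formulas):

```python
import numpy as np

def sphere(x):
    """Unimodal baseline: f(x) = sum x_i^2."""
    return np.sum(np.asarray(x, dtype=float)**2)

def rastrigin(x):
    """Separable, highly multimodal: cosine modulation on the sphere."""
    x = np.asarray(x, dtype=float)
    return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

def griewank(x):
    """Non-separable: the product term couples all variables."""
    x = np.asarray(x, dtype=float)
    i = np.arange(1, x.size + 1)
    return 1 + np.sum(x**2) / 4000 - np.prod(np.cos(x / np.sqrt(i)))

# All three evaluate to 0 at the origin.
for f in (sphere, rastrigin, griewank):
    print(f.__name__, f(np.zeros(5)))
```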

Role in Evolutionary Algorithms

The Rastrigin function serves as a challenging benchmark for evaluating the performance of evolution strategies (ES), particularly in assessing convergence rates and progress toward the global minimum amid its numerous local optima. In analyses of the intermediate multi-recombinative (μ/μ_I, λ)-ES, researchers have derived progress rates to quantify the algorithm's advancement on this function, revealing linear convergence in initial phases followed by slowdowns near local minima and acceleration closer to the global optimum. These studies, using large population sizes such as μ=1500 and λ=3000, demonstrate that theoretical approximations align closely with simulations when population fluctuations are minimized, highlighting the function's utility in probing ES dynamics over multiple generations. Further investigations into the (μ/μ_I, λ)-ES on the Rastrigin function have focused on convergence properties, introducing aggregated progress rate measures that depend on the distance to the optimum. For moderate mutation strengths, a distance-dependent slowdown emerges, complicating global search, while small mutation strengths lead to stagnation or escape conditions that trap solutions in suboptimal regions. To counter these challenges, scaling population sizes proportionally to the dimensionality—such as increasing μ and λ—enhances global convergence rates, as validated through experimental runs on 100-dimensional instances. A 2022 analysis at the Parallel Problem Solving from Nature (PPSN) conference specifically examined mutation strengths, showing that normalized mutation strengths (σ*) in the range [0,1] yield varying progress depending on the distance to the optimum, with optimal regimes avoiding premature stagnation. In genetic algorithms (GAs), the Rastrigin function is commonly employed to demonstrate and tune optimization solvers, such as MATLAB's ga() function, which minimizes the two-dimensional variant through selection, crossover, and mutation.
Typical implementations use default population sizes of around 50-100 individuals, achieving solutions near the global minimum (e.g., objective values ≈2.5 after 100 generations in typical runs), though multiple executions are needed due to run-to-run variability. This setup illustrates the GA's ability to navigate the function's oscillatory landscape, with parameters like crossover fraction (0.8) and mutation rate (0.01) directly influencing escape from local minima. Addressing premature convergence in GAs on the Rastrigin function often involves tuning mutation and crossover rates to balance exploration and exploitation, as explored in foundational studies using 10-dimensional instances with string lengths of 130 bits. Experiments varying crossover probabilities (p_c from 0.0 to 0.9) and mutation rates (p_m from 0.1/l to 1.0/l, where l is the string length) across population sizes up to 2000 reveal that higher mutation aids exploration to overcome traps, while moderate crossover promotes effective recombination, improving overall metrics like time to reach the optimum. These interactions underscore the function's role in refining GA parameter strategies for robust handling of deceptive landscapes.
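As a toy illustration of the ES setting discussed above (not the analyzed (μ/μ_I, λ)-ES with its derived progress rates), here is a minimal intermediate-recombination evolution strategy on the 2-D Rastrigin function; the population sizes, fixed sigma-decay schedule, and seed are arbitrary choices:

```python
import numpy as np

def rastrigin(x):
    return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

def es_mu_lambda(n=2, mu=20, lam=80, sigma=1.0, gens=200, seed=1):
    """Toy (mu/mu_I, lambda)-ES sketch: intermediate recombination
    (parent centroid) plus isotropic Gaussian mutation with decaying sigma."""
    rng = np.random.default_rng(seed)
    parent = rng.uniform(-5.12, 5.12, n)       # recombinant (centroid) start
    best_x, best_f = parent, rastrigin(parent)
    for _ in range(gens):
        offspring = parent + sigma * rng.normal(size=(lam, n))
        fitness = np.apply_along_axis(rastrigin, 1, offspring)
        elite = offspring[np.argsort(fitness)[:mu]]
        parent = elite.mean(axis=0)            # intermediate recombination
        if fitness.min() < best_f:
            best_f = fitness.min()
            best_x = offspring[np.argmin(fitness)]
        sigma *= 0.97                          # crude deterministic decay
    return best_x, best_f

x_best, fval = es_mu_lambda()
print(fval)  # typically close to 0 on this easy 2-D instance
```

Real ES implementations use self-adaptive or cumulative step-size control rather than a fixed decay; the fixed schedule here only keeps the sketch short.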

History

Origin

The Rastrigin function was introduced by Leonid A. Rastrigin, a Soviet scientist renowned for his contributions to optimization and random search methods. Born on July 23, 1929, Rastrigin specialized in extremal control and adaptive systems during his career at research institutions in Riga, Latvia. His work emphasized randomized approaches to solving complex optimization problems in engineering and cybernetics. Rastrigin proposed the function in 1974 as part of his research into optimization landscapes. Initially formulated as a two-dimensional test problem, it served to illustrate challenges in locating global extrema amid numerous local minima. This proposal appeared in his book Systems of Extreme Control, published by Nauka in Moscow, where it was used to analyze the performance of random search strategies in high-dimensional search spaces. The original context centered on theoretical studies of extremal control systems, particularly how randomized and adaptive algorithms navigate rugged fitness surfaces in optimization theory. Rastrigin's development drew from his earlier explorations of random search techniques, aiming to model real-world engineering problems where deterministic methods often fail. The publication, written in Russian, remained primarily within Soviet-era literature until English translations and citations in Western journals facilitated its wider adoption in the late 1980s and beyond.

Subsequent Developments

Following its initial proposal as a two-dimensional test problem, the Rastrigin function was generalized to arbitrary dimensions by Heinz Mühlenbein, Michael Schomisch, and Johannes Born in 1991, enabling its use for evaluating optimization algorithms in higher-dimensional spaces up to 400 dimensions. This extension, detailed in their work on parallel genetic algorithms, transformed the function into a standard benchmark by introducing scalable complexity through summed sinusoidal terms across dimensions, facilitating broader testing of global search capabilities. In the 1990s, the generalized Rastrigin function gained prominence in benchmark libraries for evolutionary computation, appearing in early test suites for genetic algorithms and evolution strategies to assess performance on deceptive landscapes. Its adoption accelerated with inclusions in competitions like the IEEE Congress on Evolutionary Computation (CEC), starting from special sessions in the early 2000s, such as the 2008 large-scale global optimization competition where rotated and shifted variants were used to evaluate scalability. By 2025, the function has been referenced in thousands of optimization papers, underscoring its enduring role in algorithm validation across CEC events and beyond. Recent theoretical advancements have focused on progress rate analyses for evolution strategies on the Rastrigin function, providing insights into convergence dynamics under multimodality. In 2022, Omeradžić and Beyer derived a first-order progress rate for intermediate multi-recombinative strategies, incorporating noisy order statistics to approximate mutation-induced variance and compare theoretical predictions with empirical simulations. Building on this, subsequent 2023 studies extended the analysis to population sizing models, revealing that larger populations mitigate trapping in local minima by enhancing exploration rates on highly multimodal instances. Additionally, explorations of fractal-like properties in Rastrigin variants have emerged in 2024 decomposition frameworks, using space-partitioning techniques to exploit self-similar structures for improved optimization in continuous problems.
