
Solver Primer

Michael Waite edited this page Mar 16, 2026 · 3 revisions

This page provides a high-level overview of the principles behind spreadsheet optimization. It is intended as background for users who are new to Solver, and can be skipped by those already familiar with the fundamentals.


What Is Spreadsheet Optimization?

At its core, spreadsheet optimization is about finding the best values for a set of cells in order to achieve a desired outcome — such as maximizing profit, minimizing cost, or hitting a specific target — while respecting any rules or limits that apply. The "best" is defined mathematically by an Objective Function, represented by a single cell whose formula depends on other cells that Solver is free to adjust.

In practical terms, think of it as asking Excel: "What inputs should I use to get the best possible output, given these restrictions?"

The Three Building Blocks

Every Solver problem is defined by three elements:

Decision Variables are the cells whose values Solver is allowed to change during the optimization. These are the "knobs" that Solver turns while searching for the best solution. For example, in a portfolio allocation problem, the Decision Variables might represent the percentage of funds invested in each security.

The Objective is a single cell containing a formula that Solver seeks to optimize. You tell Solver whether to Maximize, Minimize, or drive the Objective to a specific target value. The Objective cell's formula must depend, directly or indirectly, on the Decision Variables — otherwise, changing them would have no effect on the outcome.

Constraints are conditions that any valid solution must satisfy. They represent real-world limits such as budgets, capacities, or regulatory requirements. For example, a constraint might require that the total portfolio allocation sums to 100%, or that a risk measure stays below a given threshold. While constraints are technically optional, most real problems have them, and well-chosen constraints help Solver find a solution more efficiently by narrowing the search space.
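The three building blocks can be sketched outside of Excel as well. The brute-force search below labels each piece explicitly; the three-asset portfolio numbers, the 5% allocation grid, and the risk cap are invented for illustration and are not part of Solver itself.

```python
from itertools import product

# Hypothetical 3-asset portfolio: maximize expected return subject to
# allocations summing to 100% and a risk limit. All numbers are
# illustrative, not drawn from Solver or this wiki.
returns = [0.05, 0.08, 0.12]   # expected return per asset
risks   = [0.02, 0.05, 0.10]   # risk contribution per asset
RISK_CAP = 0.06

best_alloc, best_obj = None, float("-inf")
# Decision Variables: the allocation to each asset, searched in 5% steps.
steps = [x / 100 for x in range(0, 101, 5)]
for alloc in product(steps, repeat=3):
    # Constraint: total allocation is 100% (with a float-rounding tolerance).
    if abs(sum(alloc) - 1.0) > 1e-9:
        continue
    # Constraint: portfolio risk stays at or under the cap.
    if sum(a * r for a, r in zip(alloc, risks)) > RISK_CAP + 1e-9:
        continue
    # Objective: expected portfolio return, to be maximized.
    obj = sum(a * r for a, r in zip(alloc, returns))
    if obj > best_obj:
        best_alloc, best_obj = alloc, obj

print(best_alloc, round(best_obj, 4))
```

Solver, of course, does not enumerate a grid like this; the point is only that every optimization problem decomposes into the same three labeled pieces.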

The Solution Space

The Solution Space (or Feasible Region) is the set of all possible combinations of Decision Variable values that satisfy the constraints. Solver's job is to search this space to find the combination that yields the best Objective value.

A small, tightly constrained solution space is generally easier and faster to search than a large, loosely constrained one. This is why it is good practice to apply reasonable bounds to each Decision Variable, even when the problem does not strictly require them. Without bounds, Solver may waste iterations exploring extreme or nonsensical regions of the space.

Linear vs. Nonlinear Problems

The nature of the relationships between your Decision Variables and the Objective determines what kind of optimization problem you have, which in turn dictates how Solver should approach it.

A Linear problem is one where the Objective and all Constraints are linear functions of the Decision Variables — meaning they involve only addition, subtraction, and multiplication by constants. Linear problems are the most well-behaved: they have no local optima distinct from the global optimum, so any solution found is guaranteed to be globally optimal, and the LP Simplex method can find one efficiently and reliably.
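One informal way to spot a linear (strictly, affine) relationship is the midpoint property: a linear formula's value at the midpoint of two inputs equals the average of its values at those inputs, while any curvature breaks the equality. The probe below is a hypothetical heuristic for illustration, not a Solver feature:

```python
# Heuristic check (illustrative, not a Solver feature): an affine
# function satisfies f((x + y) / 2) == (f(x) + f(y)) / 2 for all inputs.
def looks_affine(f, samples, tol=1e-9):
    return all(
        abs(f((x + y) / 2) - (f(x) + f(y)) / 2) <= tol
        for x in samples
        for y in samples
    )

linear    = lambda x: 3 * x - 7   # addition and constant multiples only
nonlinear = lambda x: x * x       # a product of variables: curvature

pts = [-2.0, -0.5, 0.0, 1.0, 4.0]
print(looks_affine(linear, pts))      # True: linear formulas pass
print(looks_affine(nonlinear, pts))   # False: curvature fails the test
```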

A Nonlinear problem involves curved or otherwise non-linear relationships — for example, products of Decision Variables, exponentials, or trigonometric functions. These problems can have multiple peaks and valleys (local optima) in the solution space, making them more challenging. The GRG Nonlinear method handles smooth nonlinear problems well, but it may converge to a local optimum rather than the global optimum depending on the starting values of the Decision Variables.
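The start-dependence of gradient-based methods can be seen with a minimal descent sketch (not GRG itself, which is far more sophisticated) on a one-variable function that has two minima:

```python
# f(x) = x**4 - 3*x**2 + x has a local minimum near x ≈ 1.13 and the
# global minimum near x ≈ -1.30 (function chosen purely for illustration).
def grad(x):
    return 4 * x**3 - 6 * x + 1   # derivative of f

def descend(x, rate=0.01, steps=5000):
    # Plain gradient descent: repeatedly step "downhill".
    for _ in range(steps):
        x -= rate * grad(x)
    return x

from_right = descend(2.0)    # settles at the local minimum near 1.13
from_left  = descend(-2.0)   # settles at the global minimum near -1.30
print(round(from_right, 2), round(from_left, 2))
```

Both runs "succeed" in the sense of reaching a valley; only one of them reaches the deepest valley.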

A Non-smooth problem contains discontinuities, IF statements, or lookup functions that create abrupt changes in the Objective or Constraint surfaces. These are the hardest to solve because gradient-based methods cannot navigate the jumps. The Evolutionary method, which is inspired by genetic algorithms, is designed for these types of problems.

Choosing the Right Solving Method

Solver provides three solving engines, and selecting the right one is critical to getting a good result:

LP Simplex is the method of choice for linear problems. It is fast, deterministic, and always finds the global optimum if one exists. If your model uses only addition and multiplication by constants — no IF statements, no VLOOKUP, no exponents — try LP Simplex first.

GRG Nonlinear (Generalized Reduced Gradient) works well for smooth nonlinear problems. It uses gradient information to navigate the solution surface efficiently. However, because nonlinear problems can have multiple local optima, the solution GRG finds may depend on the starting values of your Decision Variables. Running the optimization from several different starting points can help increase confidence that the result is near-global.

Evolutionary is a population-based search method suited for non-smooth or highly irregular problems. It does not rely on gradients, so it can handle discontinuities and non-differentiable functions. The trade-off is that it is slower and does not guarantee an optimal solution — it finds a "good" solution given enough time and iterations.

The Importance of Starting Values

For nonlinear and evolutionary problems, the initial values of the Decision Variables matter. GRG Nonlinear, in particular, follows the gradient "downhill" (or "uphill" for maximization) from wherever it starts. If the solution surface has multiple peaks and valleys, different starting points can lead to different solutions.

A practical strategy is to use domain knowledge to choose reasonable starting values. If no such knowledge is available, running Solver multiple times with randomized starting values (a technique sometimes called multi-start) can improve the chances of finding a global optimum.
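A multi-start run can be sketched as follows; the function, descent routine, and start range are illustrative stand-ins for a real Solver model:

```python
import random

# Multi-start sketch: run a simple local descent from several random
# starting values and keep the best result found.
def f(x):
    return x**4 - 3 * x**2 + x   # two minima; global one near x ≈ -1.30

def descend(x, rate=0.01, steps=5000):
    for _ in range(steps):
        x -= rate * (4 * x**3 - 6 * x + 1)   # derivative of f
    return x

random.seed(42)                               # reproducible starts
starts = [random.uniform(-3.0, 3.0) for _ in range(20)]
candidates = [descend(x0) for x0 in starts]
best = min(candidates, key=f)                 # keep the lowest objective
print(round(best, 2))
```

Any single start may land in the wrong basin, but the best of twenty restarts is very likely to be the global minimum.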

For LP Simplex problems, starting values do not affect the final answer — the method always converges to the same global optimum regardless of where it begins.

Convergence, Precision, and Iteration Limits

Solver uses several settings to determine when to stop searching:

Convergence controls how small the relative change in the Objective must be over several consecutive iterations before Solver declares that it has found a solution. A tighter convergence setting (smaller value) demands more precision but takes longer.

Constraint Precision determines how closely a constraint must be satisfied. For example, if a constraint requires a cell to equal 1.0 and the precision is set to 0.000001, then a value of 1.0000005 would be considered acceptable.

Max Iterations and Max Time are safety limits that prevent Solver from running indefinitely on difficult problems. If Solver hits either limit before converging, it stops and reports that the iteration or time limit was reached — which does not necessarily mean the current solution is poor, only that Solver was not given enough room to confirm optimality.
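The stopping logic described above might be sketched like this. It is an illustration of the idea, not Solver's actual implementation; in particular, the 1 added to the denominator is an assumption made here to keep the relative-change test stable when the objective nears zero.

```python
# Stop when the relative change in the objective drops below a
# convergence tolerance, or when the iteration cap is hit.
def optimize(step_fn, x0, convergence=1e-4, max_iterations=100):
    x, prev_obj = x0, None
    for i in range(max_iterations):
        x, obj = step_fn(x)
        if prev_obj is not None:
            rel_change = abs(obj - prev_obj) / (1 + abs(prev_obj))
            if rel_change < convergence:
                return x, "converged", i + 1
        prev_obj = obj
    return x, "iteration limit reached", max_iterations

# A toy step that halves the distance to the minimum of (x - 2)**2.
def step(x):
    x = x + 0.5 * (2 - x)
    return x, (x - 2) ** 2

x, status, iters = optimize(step, x0=10.0)
print(status, round(x, 4))
```

Hitting the iteration limit here, as in Solver, would not mean the current `x` is bad — only that the loop was not allowed enough iterations to confirm convergence.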

Interpreting Results

After Solver finishes, it reports a result code indicating what happened. The most common outcomes are:

A solution was found that satisfies all constraints and meets the convergence criteria. This is the ideal outcome, but for nonlinear problems, remember that it may be a local rather than global optimum.

Solver could not find a feasible solution, meaning no combination of Decision Variable values could satisfy all constraints simultaneously. This usually indicates that the constraints are too restrictive or contradictory. Reviewing and relaxing one or more constraints is typically the remedy.

Solver stopped early due to hitting the maximum iteration count, maximum time limit, or because the user halted execution. The current solution may still be useful — it simply has not been verified as optimal.

Tips for Building Good Models

A few guidelines can make the difference between a model that solves cleanly and one that frustrates:

Keep the model structure as simple as possible. Prefer linear relationships where the problem naturally allows them — linear models are faster and more reliable.

Bound your Decision Variables. Even rough upper and lower limits help Solver by reducing the search space dramatically.

Avoid unnecessary non-smoothness. An IF statement in a formula that feeds the Objective or a Constraint cell can turn an otherwise smooth problem into a non-smooth one, forcing a switch from GRG to the slower Evolutionary method.

Test with known answers. If possible, set up a small version of your problem where you know the correct solution, and verify that Solver finds it. This builds confidence before scaling up.

Provide good starting values. For nonlinear problems, starting close to the expected solution reduces the risk of converging to a poor local optimum.

Next Topic

A Walkthrough Example
