# Optimisation Run Flexibility

It is possible to change the configuration of the algorithm part way through the optimization process, or even to switch algorithms completely. This allows an optimization process to be configured to be more explorative early on, to search the whole design space, and more exploitative later, to home in on exact optimal solutions. Doing so requires using Platypus algorithms directly, instead of the algorithm wrappers provided through the BESOS optimizer module.
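The explore-then-exploit idea can be sketched independently of BESOS and Platypus. The toy optimizer below is purely illustrative (the `objective`, `explore`, and `exploit` functions are hypothetical stand-ins, not Platypus APIs): phase one samples the whole design space, and phase two is seeded with phase one's population and perturbs the best points found so far.

```python
import random

random.seed(0)

def objective(x):
    # Toy objective with a single minimum at x = 2
    return (x - 2) ** 2

def explore(n_evals, low=-10.0, high=10.0):
    """Phase 1: sample widely across the whole design space."""
    return [random.uniform(low, high) for _ in range(n_evals)]

def exploit(seeds, n_evals, sigma=0.1):
    """Phase 2: perturb the best seed points to home in on the optimum."""
    best = sorted(seeds, key=objective)[:5]
    return [random.gauss(random.choice(best), sigma) for _ in range(n_evals)]

# Run phase 1, then carry its population into phase 2 (the "injection" step)
population = explore(200)
refined = exploit(population, 200)

print(min(objective(x) for x in population))
print(min(objective(x) for x in refined))
```

The same hand-off structure appears below with real Platypus algorithms: the second phase is seeded with the first phase's population rather than starting from scratch.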

```
import platypus
from besos import eppy_funcs as ef, optimizer
from besos.evaluator import EvaluatorEP
from besos.parameters import expand_plist
from besos.problem import EPProblem
```

First we create an example problem (see the BESOS documentation for details).

```
idf = ef.get_idf()
parameters = expand_plist(
    {
        "NonRes Fixed Assembly Window": {
            "UFactor": (0.1, 5),
            "Solar Heat Gain Coefficient": (0.01, 0.99),
        },
        "Mass NonRes Wall Insulation": {"Thickness": (0.01, 0.09)},
    }
)
objectives = ["Electricity:Facility", "Gas:Facility"]
problem = EPProblem(parameters, objectives)
evaluator = EvaluatorEP(problem, idf)
```

Next we set up NSGA-II as the first algorithm, a good general-purpose multi-objective genetic algorithm. The `to_platypus` shortcut converts the Evaluator object to a `platypus.Problem` object.

```
platypus_problem = evaluator.to_platypus()
algorithm = platypus.NSGAII(problem=platypus_problem)
```

Now we can run the algorithm for many evaluations and pause it at some point. Use the **stop button** at the top of the notebook to interrupt the following cell. Note: the output of the next cells will vary from run to run, due to the randomness of the underlying algorithm and the length of time the cell is allowed to run.

```
try:
    algorithm.run(10)
except KeyboardInterrupt:
    print("Algorithm interrupted")
algorithm.population[:5]
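The `try`/`except` above relies on a general Python pattern: the notebook's stop button raises `KeyboardInterrupt` inside the running cell, and catching it leaves everything computed so far, including the algorithm's partially evolved population, intact. A minimal stdlib sketch (the `step` function and `state` list are hypothetical, with the interrupt simulated rather than triggered by the stop button):

```python
state = []

def step(i):
    # Simulate pressing the stop button on the fourth iteration
    if i == 3:
        raise KeyboardInterrupt
    state.append(i)

try:
    for i in range(10):
        step(i)
except KeyboardInterrupt:
    print("Algorithm interrupted")

print(state)  # the work done before the interrupt survives → [0, 1, 2]
```

Because `KeyboardInterrupt` is caught rather than propagated, the cell finishes normally and the partially evolved `algorithm.population` remains available to the next cell.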

```
[Solution[1.0908340504277432,0.16551557505479184,0.08624645903497696|1790121927.5090928,2112480396.5036483|0],
Solution[3.669368008669323,0.4536590257731561,0.02927202264111818|1888165256.2375371,2864447993.238228|0],
Solution[0.4048842481345535,0.21011098820673654,0.0820871977282966|1811316457.5204759,1985648992.8552308|0],
Solution[4.51833839590032,0.05906519561246457,0.028176520417913818|1774638001.8351984,3011098878.738435|0],
Solution[0.9949394089562864,0.9225128363028766,0.04547968363721608|2082915722.0566134,2208464014.055323|0]]
```

Now we want to continue from where the first algorithm left off, running `EpsMOEA` for 10 evaluations. To make the population carry over, we use the `InjectedPopulation` generator, then run the second algorithm.

If we had let the first algorithm finish, we could use `algorithm.result` instead of `algorithm.population` to use the solutions found by the first algorithm as a starting point for the next.

```
generator = platypus.InjectedPopulation(algorithm.population)
alg2 = platypus.EpsMOEA(
    problem=platypus_problem, generator=generator, epsilons=3, population_size=10
)
alg2.run(10)
```
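The idea behind an injection generator can be sketched in plain Python: hand back the seed solutions first, and only fall back to fresh random samples once the seeds run out. This is an illustrative stand-in, not the Platypus `InjectedPopulation` implementation:

```python
import random

random.seed(1)

def injected_population(seeds, low=0.0, high=1.0):
    """Return a generator function that yields the injected seed solutions
    first, then random samples from the design space (illustration only)."""
    remaining = list(seeds)

    def generate():
        if remaining:
            return remaining.pop(0)
        return random.uniform(low, high)

    return generate

gen = injected_population([0.25, 0.75])
population = [gen() for _ in range(4)]
print(population[:2])  # the two injected seeds come first → [0.25, 0.75]
```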

Now we convert the solutions to a dataframe using the BESOS helper function and display them.

```
optimizer.solutions_to_df(alg2.result, problem, parts=["inputs", "outputs"])
```

| | UFactor | Solar Heat Gain Coefficient | Thickness | Electricity:Facility | Gas:Facility | pareto-optimal |
|---|---|---|---|---|---|---|
| 0 | 0.120118 | 0.021326 | 0.083123 | 1.767805e+09 | 1.932621e+09 | True |
| 1 | 0.258848 | 0.821173 | 0.087239 | 2.023985e+09 | 1.865141e+09 | True |
| 2 | 4.476737 | 0.044301 | 0.068937 | 1.743882e+09 | 2.693421e+09 | True |
| 3 | 0.163817 | 0.966123 | 0.077487 | 2.098469e+09 | 1.852508e+09 | True |
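The conversion that `solutions_to_df` performs is, in spirit, a matter of zipping each solution's input and output values against the parameter and objective names. A hedged stdlib sketch using the first two rows above (the tuple layout and names here are illustrative, not the BESOS implementation):

```python
# Column names from the problem: three parameters, then two objectives
columns = [
    "UFactor", "Solar Heat Gain Coefficient", "Thickness",
    "Electricity:Facility", "Gas:Facility",
]

# Each solution pairs an inputs tuple with an outputs tuple
solutions = [
    ((0.120118, 0.021326, 0.083123), (1.767805e9, 1.932621e9)),
    ((0.258848, 0.821173, 0.087239), (2.023985e9, 1.865141e9)),
]

# One dict per solution, keyed by column name -- one table row each
rows = [dict(zip(columns, inputs + outputs)) for inputs, outputs in solutions]
print(rows[0]["UFactor"])
```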