.. doc/guide.rst
The optimization class is created using the following call:

.. code-block:: python

    from pyoptsparse import Optimization

    optProb = Optimization("name", objconFun)

The general template of the objective and constraint function is as follows:
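A minimal runnable sketch of such a function (the group name ``xvars`` and the particular objective/constraint expressions are illustrative assumptions, not part of the original):

.. code-block:: python

    import numpy as np

    def objconFun(xdict):
        """Evaluate objective and constraints from the design-variable
        dictionary; return the funcs dictionary and a failure flag."""
        x = xdict["xvars"]  # "xvars" is a hypothetical variable-group name
        funcs = {}
        funcs["obj"] = np.sum(x ** 2)       # illustrative objective value
        funcs["con"] = [np.sum(x) - 1.0]    # illustrative constraint values
        fail = False                        # set True if the evaluation failed
        return funcs, fail

The keys of ``funcs`` must match the names used when adding objectives and constraints to the problem.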
If the Optimization problem is unconstrained, ``funcs`` will contain only the objective.

Design Variables
++++++++++++++++
The simplest way to add a single continuous variable with no bounds (side constraints) and an initial value of 0.0 is
to simply call :meth:`~pyoptsparse.pyOpt_optimization.Optimization.addVar`:
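As a hedged sketch (the problem name, objective function, and variable name ``xvar`` are illustrative, not prescribed by pyOptSparse):

.. code-block:: python

    from pyoptsparse import Optimization

    def objconFun(xdict):
        # illustrative objective using the single variable "xvar"
        funcs = {"obj": xdict["xvar"] ** 2}
        return funcs, False

    optProb = Optimization("example", objconFun)
    # one continuous variable, unbounded, with the default initial value 0.0
    optProb.addVar("xvar")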
The ``scale`` keyword will result in the following:

The purpose of the scale factor is to ensure that design variables of widely different magnitudes can be used in the same optimization.
It is desirable to have the magnitude of all variables within an order of magnitude or two of each other.

The :meth:`~pyoptsparse.pyOpt_optimization.Optimization.addVarGroup` call is similar to
:meth:`~pyoptsparse.pyOpt_optimization.Optimization.addVar` except that it adds a group of one or more variables.
These variables are then returned as a numpy array within the x-dictionary.
For example, to add 10 variables with no lower bound, and a scale factor of 0.1:
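One way to write that call (a sketch; the group name ``xvars`` and the upper bound are assumptions added for illustration):

.. code-block:: python

    import numpy as np
    from pyoptsparse import Optimization

    def objconFun(xdict):
        funcs = {"obj": np.sum(xdict["xvars"] ** 2)}
        return funcs, False

    optProb = Optimization("example", objconFun)
    # 10 variables, no lower bound, initial value 0.0, scale factor 0.1;
    # the upper bound of 10.0 is purely illustrative
    optProb.addVarGroup("xvars", 10, lower=None, upper=10.0, value=0.0, scale=0.1)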
Constraints
+++++++++++

The simplest way to add a single constraint with no bounds (i.e., not a very useful constraint!) is
to use the function :meth:`~pyoptsparse.pyOpt_optimization.Optimization.addCon`:
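A hedged sketch of such a call (the function and the names ``xvar``/``con`` are illustrative):

.. code-block:: python

    from pyoptsparse import Optimization

    def objconFun(xdict):
        x = xdict["xvar"]
        funcs = {"obj": x ** 2, "con": x + 1.0}
        return funcs, False

    optProb = Optimization("example", objconFun)
    optProb.addVar("xvar")
    # a single constraint; with no lower/upper bounds it restricts nothing
    optProb.addCon("con")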
Consider the optimization problem given below:

The ``X``'s denote which parts of the Jacobian have non-zero values.
pyOptSparse does not determine the sparsity structure of the Jacobian automatically;
it must be specified by the user during calls to :meth:`~pyoptsparse.pyOpt_optimization.Optimization.addCon` and :meth:`~pyoptsparse.pyOpt_optimization.Optimization.addConGroup`.
By way of example, the code that generates the hypothetical optimization problem is as follows:
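A hedged reconstruction of the idea (all names and sizes here are illustrative): the sparsity is declared by listing, via the ``wrt`` keyword, which variable groups each constraint group depends on:

.. code-block:: python

    import numpy as np
    from pyoptsparse import Optimization

    def objconFun(xdict):
        # hypothetical objective/constraints over two variable groups
        x, y = xdict["x"], xdict["y"]
        funcs = {
            "obj": np.sum(x ** 2) + np.sum(y ** 2),
            "con1": x,  # depends only on the "x" group
            "con2": y,  # depends only on the "y" group
        }
        return funcs, False

    optProb = Optimization("sparse example", objconFun)
    optProb.addVarGroup("x", 3)
    optProb.addVarGroup("y", 3)
    # declare which variable groups each constraint group depends on
    optProb.addConGroup("con1", 3, wrt=["x"])
    optProb.addConGroup("con2", 3, wrt=["y"])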
By way of example, the call instead may be as follows:

.. code-block:: python

    from scipy import sparse

    jac = sparse.lil_matrix((3, 3))
    jac[0, 0] = 1.0
    jac[1, 1] = 4.0
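A Jacobian assembled this way is passed to ``addConGroup`` through its ``jac`` keyword, keyed by variable-group name. A sketch under assumed names (the third diagonal entry and the function are illustrative):

.. code-block:: python

    from scipy import sparse
    from pyoptsparse import Optimization

    def objconFun(xdict):
        x = xdict["x"]
        funcs = {"obj": (x ** 2).sum(), "con": x}
        return funcs, False

    optProb = Optimization("jac example", objconFun)
    optProb.addVarGroup("x", 3)

    jac = sparse.lil_matrix((3, 3))
    jac[0, 0] = 1.0
    jac[1, 1] = 4.0
    jac[2, 2] = 9.0  # illustrative; completes the 3x3 diagonal

    # the ``jac`` dictionary maps variable-group name to sparsity pattern
    optProb.addConGroup("con", 3, wrt=["x"], jac={"x": jac})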
Objectives
++++++++++

Each optimization will require at least one objective to be added.
This is accomplished using the call to :meth:`~pyoptsparse.pyOpt_optimization.Optimization.addObj`:
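For instance (a sketch; the names are illustrative except for the convention that the key passed to ``addObj`` matches a key returned in the ``funcs`` dictionary):

.. code-block:: python

    from pyoptsparse import Optimization

    def objconFun(xdict):
        funcs = {"obj": xdict["xvar"] ** 2}
        return funcs, False

    optProb = Optimization("example", objconFun)
    optProb.addVar("xvar")
    # register the key "obj" (returned by objconFun) as the objective
    optProb.addObj("obj")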
Once this function is constructed, users can pass its function handle to the optimizer when it is called.
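Such a ``sens`` function might look like the following (a hedged sketch; the group names and the particular derivative expressions are assumptions). It receives the design-variable dictionary and the ``funcs`` dictionary, and returns a nested dictionary of derivatives along with a failure flag:

.. code-block:: python

    import numpy as np

    def sens(xdict, funcs):
        """Hypothetical gradients for an objective "obj" = sum(x**2)
        and a constraint "con" = sum(x) over a group "xvars"."""
        x = xdict["xvars"]
        funcsSens = {
            "obj": {"xvars": 2.0 * x},               # d(obj)/d(xvars)
            "con": {"xvars": np.ones((1, x.size))},  # d(con)/d(xvars)
        }
        fail = False
        return funcsSens, fail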
Optimizer Instantiation
+++++++++++++++++++++++
The first, and most explicit, approach is to directly import the optimizer class:

.. code-block:: python

    from pyoptsparse import SLSQP

    opt = SLSQP(options=options)

However, in order to easily switch between different optimizers without having to import each class, a convenience function called
:meth:`~pyoptsparse.pyOpt_optimizer.OPT` is provided.
It accepts a string argument in addition to the usual options, and instantiates the optimizer object based on the string:

.. code-block:: python

    from pyoptsparse import OPT

    opt = OPT("SLSQP", options=options)

Note that the name of the optimizer is case-insensitive, so ``slsqp`` can also be used.
This makes it easy to, for example, choose the optimizer from the command line, or more generally to select the optimizer using strings without preemptively importing all classes.
Calling the Optimizer
+++++++++++++++++++++

The optimization is started by invoking the ``__call__`` method of the optimizer object with the optimization problem as an argument.
For example, to use finite differencing, the call would look like:

.. code-block:: python

    sol = opt(optProb, sens="FD")

To provide analytic gradients, the call would look like:

.. code-block:: python

    sol = opt(optProb, sens=sens)

Some of the optimizers also have additional options that can be passed in.
See the optimizer-specific documentation page for more details.

Postprocessing
++++++++++++++
The result of the optimization is returned in a :class:`pyoptsparse.pyOpt_solution.Solution` object.
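As an end-to-end sketch, assuming the built-in SLSQP optimizer is available (the small quadratic problem here is an assumption for illustration), the solution's design variables and objective can be inspected afterwards:

.. code-block:: python

    from pyoptsparse import OPT, Optimization

    def objconFun(xdict):
        x = xdict["x"]
        # illustrative quadratic with minimum at x = 2
        funcs = {"obj": (x[0] - 2.0) ** 2}
        return funcs, False

    optProb = Optimization("quadratic", objconFun)
    optProb.addVarGroup("x", 1, value=0.0)
    optProb.addObj("obj")

    opt = OPT("SLSQP")
    sol = opt(optProb, sens="FD")

    print(sol.xStar)  # dictionary of optimal design variables
    print(sol.fStar)  # optimal objective value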