
Commit b9ff2c3

Add autolink (#452)
* add sphinx autolink
* cleanup docs
* simplify links
* minor cleanup
* add return type
* update requirements.txt
* remove intersphinx
Parent: b687be3

7 files changed: +55, -32 lines
doc/advancedFeatures.rst

Lines changed: 2 additions & 2 deletions

@@ -27,7 +27,7 @@ To turn the history recording on, use the ``storeHistory`` attribute when invoking

 .. code-block:: python

-    sol = opt(optProb, sens=sens, storeHistory="<your-history-file-name>.hst", ...)
+    sol = opt(optProb, sens=sens, storeHistory="history_file.hst")


 Hot start
@@ -75,7 +75,7 @@ To enable this feature, use the ``timeLimit`` option when invoking the optimizer

 .. code-block:: python

-    sol = opt(optProb, sens=sens, timeLimit=24 * 3600, ...)
+    sol = opt(optProb, sens=sens, timeLimit=24 * 3600)

 Note that the attribute takes the maximum wall time *in seconds* as an integer number.
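Since both hunks touch keywords of the same ``__call__``, here is a hedged sketch of the two options used together, assuming ``opt``, ``optProb``, and ``sens`` are already set up as described elsewhere in these docs:

```python
# Sketch only: opt, optProb, and sens are assumed to be defined as in the docs.
sol = opt(
    optProb,
    sens=sens,
    storeHistory="history_file.hst",  # write the optimization history here
    timeLimit=24 * 3600,  # maximum wall time in seconds, as an integer
)
```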

doc/api/optimizer.rst

Lines changed: 1 addition & 5 deletions

@@ -3,9 +3,5 @@
 Optimizer
 ---------

-.. currentmodule:: pyoptsparse.pyOpt_optimizer
-
-.. autoclass:: Optimizer
+.. automodule:: pyoptsparse.pyOpt_optimizer
    :members:
-
-.. autofunction:: pyoptsparse.pyOpt_optimizer.OPT

doc/conf.py

Lines changed: 6 additions & 0 deletions

@@ -35,3 +35,9 @@

 # bibtex
 bibtex_bibfiles = ["pyoptsparse.bib"]
+
+# autolink
+extensions.extend(["sphinx_codeautolink"])
+codeautolink_concat_default = True
+codeautolink_warn_on_missing_inventory = True
+codeautolink_warn_on_failed_resolve = True
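For context, sphinx-codeautolink links names inside ``code-block`` examples to their API documentation. A commented copy of the new settings, with my reading of each option (the comments are editorial, not part of the commit):

```python
# Register the extension alongside those already listed in conf.py.
extensions.extend(["sphinx_codeautolink"])

# Treat the code examples in each document as one continuous session by
# default, so names defined in an earlier block resolve in later blocks.
codeautolink_concat_default = True

# Raise Sphinx warnings, rather than silently leaving code unlinked, when an
# object is missing from the inventory or a name fails to resolve.
codeautolink_warn_on_missing_inventory = True
codeautolink_warn_on_failed_resolve = True
```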

doc/guide.rst

Lines changed: 34 additions & 15 deletions

@@ -22,6 +22,7 @@ The optimization class is created using the following call:

 .. code-block:: python

+    from pyoptsparse import Optimization
     optProb = Optimization("name", objconFun)

 The general template of the objective and constraint function is as follows:
@@ -47,7 +48,7 @@ If the Optimization problem is unconstrained, ``funcs`` will contain only the objective
 Design Variables
 ++++++++++++++++
 The simplest way to add a single continuous variable with no bounds (side constraints) and initial value of 0.0 is
-to simply call :meth:`addVar <pyoptsparse.pyOpt_optimization.Optimization.addVar>`:
+to simply call :meth:`~pyoptsparse.pyOpt_optimization.Optimization.addVar`:

 .. code-block:: python

@@ -77,8 +78,8 @@ The ``scale`` keyword will result in the following:
 The purpose of the scale factor is ensure that design variables of widely different magnitudes can be used in the same optimization.
 It is desirable to have the magnitude of all variables within an order of magnitude or two of each other.

-The :meth:`addVarGroup <pyoptsparse.pyOpt_optimization.Optimization.addVarGroup>` call is similar to
-:meth:`addVar <pyoptsparse.pyOpt_optimization.Optimization.addVar>` except that it adds a group of 1 or more variables.
+The :meth:`~pyoptsparse.pyOpt_optimization.Optimization.addVarGroup` call is similar to
+:meth:`~pyoptsparse.pyOpt_optimization.Optimization.addVar` except that it adds a group of 1 or more variables.
 These variables are then returned as a numpy array within the x-dictionary.
 For example, to add 10 variables with no lower bound, and a scale factor of 0.1:

@@ -91,7 +92,7 @@ Constraints
 +++++++++++

 The simplest way to add a single constraint with no bounds (i.e., not a very useful constraint!) is
-to use the function :meth:`addCon <pyoptsparse.pyOpt_optimization.Optimization.addCon>`:
+to use the function :meth:`~pyoptsparse.pyOpt_optimization.Optimization.addCon`:

 .. code-block:: python

@@ -148,7 +149,7 @@ Consider the optimization problem given below::

 The ``X``'s denote which parts of the Jacobian have non-zero values.
 pyOptSparse does not determine the sparsity structure of the Jacobian automatically,
-it must be specified by the user during calls to :meth:`addCon <pyoptsparse.pyOpt_optimization.Optimization.addCon>` and :meth:`addConGroup <pyoptsparse.pyOpt_optimization.Optimization.addConGroup>`.
+it must be specified by the user during calls to :meth:`~pyoptsparse.pyOpt_optimization.Optimization.addCon` and :meth:`~pyoptsparse.pyOpt_optimization.Optimization.addConGroup`.
 By way of example, the code that generates the hypothetical optimization problem is as follows:

 .. code-block:: python
@@ -182,6 +183,7 @@ By way of example, the call instead may be as follows:

 .. code-block:: python

+    from scipy import sparse
     jac = sparse.lil_matrix((3, 3))
     jac[0, 0] = 1.0
     jac[1, 1] = 4.0
@@ -224,7 +226,7 @@ Objectives
 ++++++++++

 Each optimization will require at least one objective to be added.
-This is accomplished using a the call to :meth:`addObj <pyoptsparse.pyOpt_optimization.Optimization.addObj>`:
+This is accomplished using a the call to :meth:`~pyoptsparse.pyOpt_optimization.Optimization.addObj`:

 .. code-block:: python

@@ -262,12 +264,6 @@ For example, if the optimization problem has one objective ``obj``, two constraints

     {"obj": {"xvars": [1, 2, 3]}, "con": {"xvars": [[4, 5, 6], [7, 8, 9]]}}

-Once this function is constructed, users can pass its function handle to the optimizer when it's called via:
-
-.. code-block:: python
-
-    sol = opt(optProb, sens=sens, ...)
-

 Optimizer Instantiation
 +++++++++++++++++++++++
@@ -278,17 +274,40 @@ The first, and most explicit approach is to directly import the optimizer class,

     from pyoptsparse import SLSQP

-    opt = SLSQP(...)
+    opt = SLSQP(options=options)

 However, in order to easily switch between different optimizers without having to import each class, a convenience function called
-:meth:`OPT <pyoptsparse.pyOpt_optimizer.OPT>` is provided.
+:meth:`~pyoptsparse.pyOpt_optimizer.OPT` is provided.
 It accepts a string argument in addition to the usual options, and instantiates the optimizer object based on the string:

 .. code-block:: python

     from pyoptsparse import OPT

-    opt = OPT("SLSQP", ...)
+    opt = OPT("SLSQP", options=options)

 Note that the name of the optimizer is case-insensitive, so ``slsqp`` can also be used.
 This makes it easy to for example choose the optimizer from the command-line, or more generally select the optimizer using strings without preemptively importing all classes.
+
+Calling the Optimizer
++++++++++++++++++++++
+
+The optimization is started by invoking the ``__call__`` function of the optimizer object with the optimization problem as an argument.
+For example, to use finite difference, the call would look like:
+
+.. code-block:: python
+
+    sol = opt(optProb, sens="FD")
+
+To provide analytic gradients, the call would look like:
+
+.. code-block:: python
+
+    sol = opt(optProb, sens=sens)
+
+Some of the optimizers also have additional options that can be passed in.
+See the optimizer-specific documentation page for more details.
+
+Postprocessing
+++++++++++++++
+The result of the optimization is returned in a :class:`pyoptsparse.pyOpt_solution.Solution` object.
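Taken together, the rewritten guide now covers problem setup, optimizer instantiation, the call itself, and postprocessing. A minimal end-to-end sketch of that workflow (the toy problem and every name in it are illustrative, not taken from the commit):

```python
import numpy as np
from pyoptsparse import OPT, Optimization

def objconFun(xdict):
    """Toy objective/constraint function following the guide's template."""
    x = xdict["xvars"]
    funcs = {}
    funcs["obj"] = np.sum(x**2)       # objective registered via addObj
    funcs["con"] = [np.sum(x) - 1.0]  # constraint registered via addConGroup
    fail = False
    return funcs, fail

optProb = Optimization("sketch", objconFun)
optProb.addVarGroup("xvars", 3, value=1.0, scale=0.1)  # no bounds
optProb.addConGroup("con", 1, upper=0.0)
optProb.addObj("obj")

opt = OPT("SLSQP")             # case-insensitive; "slsqp" also works
sol = opt(optProb, sens="FD")  # finite-difference gradients
```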

doc/quickstart.rst

Lines changed: 6 additions & 6 deletions

@@ -28,8 +28,8 @@ For the TP37, the objective function is a simple analytic function:
 Notes:

 * The ``xdict`` variable is a dictionary whose keys are the names from each
-  :meth:`addVar <pyoptsparse.pyOpt_optimization.Optimization.addVar>` and
-  :meth:`addVarGroup <pyoptsparse.pyOpt_optimization.Optimization.addVarGroup>` call. The line
+  :meth:`~pyoptsparse.pyOpt_optimization.Optimization.addVar` and
+  :meth:`~pyoptsparse.pyOpt_optimization.Optimization.addVarGroup` call. The line

 .. code-block:: python

@@ -45,9 +45,9 @@ Notes:

 creates a list of length 2, which stores the numerical values of the two constraints.
 The ``funcs`` dictionary return must contain keys that match the constraint names from
-:meth:`addCon <pyoptsparse.pyOpt_optimization.Optimization.addCon>` and
-:meth:`addConGroup <pyoptsparse.pyOpt_optimization.Optimization.addConGroup>`
-as well as the objectives from :meth:`addObj <pyoptsparse.pyOpt_optimization.Optimization.addObj>` calls.
+:meth:`~pyoptsparse.pyOpt_optimization.Optimization.addCon` and
+:meth:`~pyoptsparse.pyOpt_optimization.Optimization.addConGroup`
+as well as the objectives from :meth:`~pyoptsparse.pyOpt_optimization.Optimization.addObj` calls.
 This is done in the following calls:

 .. code-block:: python
@@ -87,7 +87,7 @@ This call adds two variables with name ``con``.
 There is no lower bound for the variables and the upper bound is 0.0.

 We must also assign the the key value for the objective using the
-:meth:`addObj <pyoptsparse.pyOpt_optimization.Optimization.addObj>` call:
+:meth:`~pyoptsparse.pyOpt_optimization.Optimization.addObj` call:

 .. literalinclude:: ../examples/tp037.py
    :start-after: # rst begin addObj
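As a reminder of the naming contract these hunks keep pointing at (the canonical version lives in ``examples/tp037.py``; this is a sketch consistent with the quickstart's description, not copied from the repo):

```python
# The "obj" and "con" keys returned here must match the names passed to
# addObj and addConGroup; "xvars" matches the addVarGroup name.
def objfunc(xdict):
    x = xdict["xvars"]
    funcs = {}
    funcs["obj"] = -x[0] * x[1] * x[2]
    funcs["con"] = [  # list of length 2 for the two "con" constraints
        x[0] + 2.0 * x[1] + 2.0 * x[2] - 72.0,
        -x[0] - 2.0 * x[1] - 2.0 * x[2],
    ]
    fail = False
    return funcs, fail
```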

doc/requirements.txt

Lines changed: 1 addition & 0 deletions

@@ -1 +1,2 @@
 sphinx_mdolab_theme>=1.2
+sphinx-codeautolink

pyoptsparse/pyOpt_optimizer.py

Lines changed: 5 additions & 4 deletions

@@ -961,10 +961,11 @@ def getInform(self, infocode=None):

 # List of optimizers as an enum
 Optimizers = Enum("Optimizers", "SNOPT IPOPT SLSQP NLPQLP CONMIN NSGA2 PSQP ALPSO ParOpt")
+"""Special enum containing all possible optimizers"""


-def OPT(optName, *args, **kwargs):
-    """
+def OPT(optName, *args, **kwargs) -> Optimizer:
+    r"""
     This is a simple utility function that enables creating an
     optimizer based on the 'optName' string. This can be useful for
     doing optimization studies with respect to optimizer since you
@@ -974,9 +975,9 @@ def OPT(optName, *args, **kwargs):
     ----------
     optName : str or enum
         Either a string identifying the optimizer to create, e.g. "SNOPT", or
-        an enum accessed via ``pyoptsparse.Optimizers``, e.g. ``Optimizers.SNOPT``.
+        an enum accessed via :class:`~pyoptsparse.pyOpt_optimizer.Optimizers`, e.g. ``pyoptsparse.Optimizers.SNOPT``.

-    \\*args, \\*\\*kwargs : varies
+    \*args, \*\*kwargs : varies
         Passed to optimizer creation.

     Returns
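With the new ``-> Optimizer`` annotation and the ``:class:`` cross-reference, both spellings the docstring mentions type-check and link cleanly. A small usage sketch (which optimizers are importable depends on your install):

```python
from pyoptsparse import OPT, Optimizers

# Both forms create the same optimizer; the string form is case-insensitive.
opt_a = OPT("slsqp")
opt_b = OPT(Optimizers.SLSQP)
```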
