Add technical details to the ARC solver.
kellertuer committed Nov 17, 2023
1 parent a9b1be8 commit a7edf1e
Showing 9 changed files with 27 additions and 17 deletions.
5 changes: 4 additions & 1 deletion docs/.vale.ini
@@ -1,9 +1,12 @@
StylesPath = styles
-MinAlertLevel = error
+MinAlertLevel = warning
Vocab = Manopt

+Packages = Google
+
[formats]
qmd = md

[*.md]
+BasedOnStyles = Vale, Google
TokenIgnores = \
5 changes: 2 additions & 3 deletions docs/src/notation.md
@@ -1,8 +1,7 @@
# Notation

-In this package, we follow the notation introduced in [Manifolds.jl Notation](https://juliamanifolds.github.io/Manifolds.jl/latest/misc/notation.html)
-
-with the following additional notation
+In this package, the notation introduced in [Manifolds.jl Notation](https://juliamanifolds.github.io/Manifolds.jl/latest/misc/notation.html) is used
+with the following additional parts.

| Symbol | Description | Also used | Comment |
|:--:|:--------------- |:--:|:-- |
2 changes: 1 addition & 1 deletion docs/src/plans/index.md
@@ -9,7 +9,7 @@ information is required about both the optimisation task or “problem” at hand
This together is called a __plan__ in `Manopt.jl` and it consists of two data structures:

* The [Manopt Problem](@ref ProblemSection) describes all _static_ data of a task, most prominently the manifold and the objective.
-* The [Solver State](@ref SolverStateSection) describes all _varying_ data and parameters for the solver that is used. This also means that each solver has its own data structure for the state.
+* The [Solver State](@ref sec-solver-state) describes all _varying_ data and parameters for the solver that is used. This also means that each solver has its own data structure for the state.

By splitting these two parts, one problem can be defined and then be solved using different solvers.
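
To illustrate this split (an editorial sketch, not part of the diff; the manifold `Sphere(2)`, the cost `f`, and its gradient below are assumed examples), the same static problem data can be handed to two different solvers, each maintaining its own state:

```julia
using Manopt, Manifolds

M = Sphere(2)                      # example manifold: the unit sphere S²
q = [0.0, 0.0, 1.0]
f(M, p) = distance(M, p, q)^2 / 2  # static data: the cost ...
grad_f(M, p) = -log(M, p, q)       # ... and its Riemannian gradient

# one problem, two solvers; each call creates its own solver state internally
p1 = gradient_descent(M, f, grad_f, rand(M))
p2 = quasi_Newton(M, f, grad_f, rand(M))
```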

2 changes: 1 addition & 1 deletion docs/src/plans/state.md
@@ -1,4 +1,4 @@
-# [The solver state](@id SolverStateSection)
+# [The solver state](@id sec-solver-state)

```@meta
CurrentModule = Manopt
12 changes: 12 additions & 0 deletions docs/src/solvers/adaptive-regularization-with-cubics.md
@@ -54,6 +54,18 @@ StopWhenAllLanczosVectorsUsed
StopWhenFirstOrderProgress
```

+## [Technical Details](@id sec-arc-technical-details)
+
+The [`adaptive_regularization_with_cubics`](@ref) solver requires the following functions
+of a manifold to be available
+
+* A [retraction](https://juliamanifolds.github.io/ManifoldsBase.jl/stable/retractions/); it is recommended to set the [`default_retraction_method`](https://juliamanifolds.github.io/ManifoldsBase.jl/stable/retractions/#ManifoldsBase.default_retraction_method-Tuple{AbstractManifold}) to a favourite retraction. If this default is set, a `retraction_method=` does not have to be specified.
+* If you do not provide an initial regularization parameter `σ`, a [`manifold_dimension`](https://juliamanifolds.github.io/ManifoldsBase.jl/stable/functions/#ManifoldsBase.manifold_dimension-Tuple{AbstractManifold}) is required.
+* By default, the tangent vector storing the gradient is initialized by calling [`zero_vector`](https://juliamanifolds.github.io/ManifoldsBase.jl/stable/functions/#ManifoldsBase.zero_vector-Tuple{AbstractManifold,%20Any})`(M,p)`.
+* [`inner`](https://juliamanifolds.github.io/ManifoldsBase.jl/stable/functions/#ManifoldsBase.inner-Tuple{AbstractManifold,%20Any,%20Any,%20Any})`(M, p, X, Y)` is used within the algorithm step.
+
+Furthermore, within the Lanczos subsolver, generating a random tangent vector (at `p`) using [`rand!`](https://juliamanifolds.github.io/ManifoldsBase.jl/stable/functions/#Base.rand-Tuple{AbstractManifold})`(M, X; vector_at=p)` in place of `X` is required.
+
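As a quick illustration of these requirements (an editorial sketch, not part of the diff above; `Sphere(2)` and the point `p` are assumed examples), each interface function can be exercised directly via Manifolds.jl:

```julia
using Manifolds

M = Sphere(2)                 # the unit sphere S² as an example manifold
p = [1.0, 0.0, 0.0]

default_retraction_method(M)  # a retraction is available and set as the default
manifold_dimension(M)         # used to derive a default regularization parameter σ
X = zero_vector(M, p)         # tangent vector that stores the gradient
Y = rand(M; vector_at = p)    # random tangent vector, as in the Lanczos subsolver
inner(M, p, X, Y)             # the inner product used within the algorithm step
```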
## Literature

```@bibliography
4 changes: 2 additions & 2 deletions docs/src/solvers/gradient_descent.md
@@ -43,9 +43,9 @@ RecordGradientNorm
RecordStepsize
```

-## [Technical Details](@id GradientDescent-Technical-Details)
+## [Technical Details](@id sec-gradient-descent-technical-details)

-The [`gradient_descent`](@ref) solver requires the following functions of your manifold to be available
+The [`gradient_descent`](@ref) solver requires the following functions of a manifold to be available

* A [retraction](https://juliamanifolds.github.io/ManifoldsBase.jl/stable/retractions/); it is recommended to set the [`default_retraction_method`](https://juliamanifolds.github.io/ManifoldsBase.jl/stable/retractions/#ManifoldsBase.default_retraction_method-Tuple{AbstractManifold}) to a favourite retraction;
in this case it does not have to be specified.
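
For context (an editorial sketch, not part of this commit; the Rayleigh-quotient cost, `Sphere(2)`, and the random matrix `A` are assumptions), a minimal call relying on the manifold's default retraction could look like:

```julia
using Manopt, Manifolds
using LinearAlgebra: Symmetric

M = Sphere(2)
A = Symmetric(randn(3, 3))
f(M, p) = p' * A * p                     # Rayleigh quotient on the sphere
grad_f(M, p) = project(M, p, 2 * A * p)  # Euclidean gradient projected onto T_pM

# no retraction_method= is given, so the default retraction of M is used
p_min = gradient_descent(M, f, grad_f, rand(M))
```

The minimizer approximates the eigenvector of `A` belonging to its smallest eigenvalue.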
2 changes: 1 addition & 1 deletion docs/src/solvers/index.md
@@ -31,7 +31,7 @@ The following algorithms are currently available
[Primal-dual Riemannian semismooth Newton Algorithm](@ref PDRSSNSolver) | [`primal_dual_semismooth_Newton`](@ref), [`PrimalDualSemismoothNewtonState`](@ref) (using [`TwoManifoldProblem`](@ref)) | ``f=F+G(Λ\cdot)``, ``\operatorname{prox}_{σ F}`` & diff., ``\operatorname{prox}_{τ G^*}`` & diff., ``Λ``
[Quasi-Newton Method](@ref quasiNewton) | [`quasi_Newton`](@ref), [`QuasiNewtonState`](@ref) | ``f``, ``\operatorname{grad} f`` |
[Steihaug-Toint Truncated Conjugate-Gradient Method](@ref tCG) | [`truncated_conjugate_gradient_descent`](@ref), [`TruncatedConjugateGradientState`](@ref) | ``f``, ``\operatorname{grad} f``, ``\operatorname{Hess} f`` |
-[Subgradient Method](@ref SubgradientSolver) | [`subgradient_method`](@ref), [`SubGradientMethodState`](@ref) | ``f``, ``∂ f`` |
+[Subgradient Method](@ref sec-subgradient-method) | [`subgradient_method`](@ref), [`SubGradientMethodState`](@ref) | ``f``, ``∂ f`` |
[Stochastic Gradient Descent](@ref StochasticGradientDescentSolver) | [`stochastic_gradient_descent`](@ref), [`StochasticGradientDescentState`](@ref) | ``f = \sum_i f_i``, ``\operatorname{grad} f_i`` |
[The Riemannian Trust-Regions Solver](@ref trust_regions) | [`trust_regions`](@ref), [`TrustRegionsState`](@ref) | ``f``, ``\operatorname{grad} f``, ``\operatorname{Hess} f`` |

2 changes: 1 addition & 1 deletion docs/src/solvers/subgradient.md
@@ -1,4 +1,4 @@
-# [Subgradient Method](@id SubgradientSolver)
+# [Subgradient method](@id sec-subgradient-method)

```@docs
subgradient_method
10 changes: 3 additions & 7 deletions docs/styles/Vocab/Manopt/accept.txt
@@ -1,7 +1,6 @@
Absil
Adagrad
-Adjoint
-adjoint
+[A|a]djoint
Armijo
Bergmann
Chambolle
@@ -26,14 +25,11 @@ Lanczos
LineSearches.jl
Manifolds.jl
ManifoldsBase.jl
-manopt
-manopt.org
-Manopt
-Manopt.jl
+[Mm]anopt(:?.org|.jl)?
Munkvold
Mead
Nelder
-parametrising
+[Pp]arametrising
Parametrising
Pock
preconditioner
