Manually fixed documentation files that were incorrectly generated by CodePlex.
alexshtf committed Aug 7, 2017
1 parent d41c170 commit 9e4d509
Showing 26 changed files with 310 additions and 286 deletions.
150 changes: 75 additions & 75 deletions Readme.md
@@ -1,75 +1,75 @@
# Project Description
A library that provides fast, accurate and automatic differentiation (computes derivative / gradient) of mathematical functions.

# Getting AutoDiff
Using NuGet:

```powershell
Install-Package AutoDiff
```

# Using in research papers
If you like the library and it helps you publish a research paper, please cite the paper for which I originally wrote the library: [geosemantic.bib](docs/Home_geosemantic.bib).

# What is it for?
AutoDiff provides a simple and intuitive API for computing function gradients/derivatives along with a fast state-of-the-art algorithm for performing the computation. Such computations are mainly useful in numeric optimization scenarios.

# Code example
```c#
using System;
using AutoDiff;

class Program
{
    public static void Main(string[] args)
    {
        // define variables
        var x = new Variable();
        var y = new Variable();
        var z = new Variable();

        // define our function
        var func = (x + y) * TermBuilder.Exp(z + x * y);

        // prepare arrays needed for evaluation/differentiation
        Variable[] vars = { x, y, z };
        double[] values = { 1, 2, -3 };

        // evaluate func at (1, 2, -3)
        double value = func.Evaluate(vars, values);

        // calculate the gradient at (1, 2, -3)
        double[] gradient = func.Differentiate(vars, values);

        // print results
        Console.WriteLine("The value at (1, 2, -3) is " + value);
        Console.WriteLine("The gradient at (1, 2, -3) is ({0}, {1}, {2})", gradient[0], gradient[1], gradient[2]);
    }
}
```



# Documentation
The [Documentation](doc/Documentation.md) contains some basic tutorials; there is also an [article](http://www.codeproject.com/KB/library/Automatic_Differentiation.aspx) on CodeProject; and the source code contains some examples in addition to the code of the library itself.

# Motivation
There are many open-source and commercial .NET libraries that include numeric optimization among their features (for example, [Microsoft Solver Foundation](http://msdn.microsoft.com/en-us/devlabs/hh145003.aspx), [AlgLib](http://www.alglib.net), [Extreme Optimization](http://www.extremeoptimization.com/), [CenterSpace NMath](http://www.centerspace.net/)). Most of them require the user to supply code that evaluates the function and its gradient. This library saves the work of manually deriving and coding the function's gradient.
Once the developer defines a function, the AutoDiff library can automatically evaluate and differentiate it at any point. This allows easy development and prototyping of applications based on numeric optimization.

# Features
* Fast! See [0.5 vs 0.3 benchmark](doc/0.5-vs-0.3-benchmark.md) and [0.3 benchmark](doc/0.3-benchmark).
* Composition of functions using arithmetic operators, Exp, Log, Power and user-defined unary and binary functions.
* Function gradient evaluation at specified points
* Function value evaluation at specified points
* Uses [Code Contracts](https://docs.microsoft.com/en-us/dotnet/api/system.diagnostics.contracts.contract) for specifying valid parameters and return values
* Computes gradients using the Reverse-Mode AD algorithm in **linear time**!
* Yes, it's faster than numeric approximation for multivariate functions
* You get both high accuracy and speed!

# Used by
* **Andreas Witsch, Hendrik Skubch, Stefan Niemczyk, Kurt Geihs** [Using incomplete satisfiability modulo theories to determine robotic tasks](http://dx.doi.org/10.1109/IROS.2013.6697046) _Intelligent Robots and Systems (IROS), 2013 IEEE/RSJ International Conference_
* **Alex Shtof, Alexander Agathos, Yotam Gingold, Ariel Shamir, Daniel Cohen-Or** [Geosemantic Snapping for Sketch-Based Modeling](http://onlinelibrary.wiley.com/doi/10.1111/cgf.12044/abstract) _Eurographics 2013 proceedings_ ([code repository](https://bitbucket.org/alexshtf/sketchmodeller))
* **Michael Kommenda, Gabriel Kronberger, Stephan Winkler, Michael Affenzeller, Stefan Wagner** [Effects of constant optimization by nonlinear least squares minimization in symbolic regression](http://dl.acm.org/citation.cfm?id=2482691) _Proceeding of the fifteenth annual conference companion on Genetic and evolutionary computation conference companion_
* **Hendrik Skubch**, [Solving non-linear arithmetic constraints in soft realtime environments](http://dl.acm.org/citation.cfm?id=2245293) _Proceedings of the 27th Annual ACM Symposium on Applied Computing_
* [AlicaEngine](http://ros.org/wiki/AlicaEngine) - A cooperative planning engine for robotics. You can see it in action in this [video](http://www.youtube.com/watch?v=HhIrhU19PG4)
* [HeuristicsLab](http://dev.heuristiclab.com) - a framework for heuristic and evolutionary algorithms that is developed by members of the [Heuristic and Evolutionary Algorithms Laboratory (HEAL)](http://heal.heuristiclab.com/)
# Project Description
A library that provides fast, accurate and automatic differentiation (computes derivative / gradient) of mathematical functions.

AutoDiff provides a simple and intuitive API for computing function gradients/derivatives along with a fast state-of-the-art algorithm for performing the computation. Such computations are mainly useful in numeric optimization scenarios.

The library is available via NuGet:

```powershell
Install-Package AutoDiff
```

# Code example
```c#
using System;
using AutoDiff;

class Program
{
    public static void Main(string[] args)
    {
        // define variables
        var x = new Variable();
        var y = new Variable();
        var z = new Variable();

        // define our function
        var func = (x + y) * TermBuilder.Exp(z + x * y);

        // prepare arrays needed for evaluation/differentiation
        Variable[] vars = { x, y, z };
        double[] values = { 1, 2, -3 };

        // evaluate func at (1, 2, -3)
        double value = func.Evaluate(vars, values);

        // calculate the gradient at (1, 2, -3)
        double[] gradient = func.Differentiate(vars, values);

        // print results
        Console.WriteLine("The value at (1, 2, -3) is " + value);
        Console.WriteLine("The gradient at (1, 2, -3) is ({0}, {1}, {2})", gradient[0], gradient[1], gradient[2]);
    }
}
```
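If you evaluate and differentiate the same term at many different points, the tutorials referenced below also describe compiled terms. The following fragment is a sketch only: the `Compile` call and the tuple returned by the compiled term's `Differentiate` are assumptions based on the "AutoDiff revisited - Compiled terms" tutorial, and may differ from the exact API in your version.

```c#
// Continues the example above (x, y, z and func are already defined).
// Assumed API: Compile fixes the variable order once, and the compiled
// term exposes Evaluate/Differentiate calls that avoid re-walking the term tree.
var compiled = func.Compile(x, y, z);

// Evaluate at (1, 2, -3).
double compiledValue = compiled.Evaluate(1, 2, -3);

// Assumed to return both the gradient and the value in one call.
var diffResult = compiled.Differentiate(1, 2, -3);
double[] compiledGradient = diffResult.Item1;
double valueAtPoint = diffResult.Item2;
```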



# Documentation
The [Documentation](docs/Readme.md) contains some basic tutorials; there is also an [article](http://www.codeproject.com/KB/library/Automatic_Differentiation.aspx) on CodeProject; and the source code contains some examples in addition to the code of the library itself.

# Motivation
There are many open-source and commercial .NET libraries that include numeric optimization among their features (for example, [Microsoft Solver Foundation](http://msdn.microsoft.com/en-us/devlabs/hh145003.aspx), [AlgLib](http://www.alglib.net), [Extreme Optimization](http://www.extremeoptimization.com/), [CenterSpace NMath](http://www.centerspace.net/)). Most of them require the user to supply code that evaluates the function and its gradient. This library saves the work of manually deriving and coding the function's gradient.
Once the developer defines a function, the AutoDiff library can automatically evaluate and differentiate it at any point. This allows easy development and prototyping of applications based on numeric optimization.
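To make the optimization scenario concrete, below is a minimal, illustrative gradient-descent loop built only on the `Evaluate` and `Differentiate` calls shown in the code example above. The target function, starting point, step size and iteration count are arbitrary choices made for this sketch, not recommendations.

```c#
using System;
using AutoDiff;

class GradientDescentSketch
{
    public static void Main(string[] args)
    {
        var x = new Variable();
        var y = new Variable();

        // An arbitrary smooth function to minimize.
        var func = TermBuilder.Exp(x * y) + x * x + y * y;

        var vars = new Variable[] { x, y };
        var point = new double[] { 0.5, -0.5 };   // starting point
        const double stepSize = 0.1;              // fixed step size, for illustration only

        for (int i = 0; i < 100; ++i)
        {
            // AutoDiff computes the exact gradient at the current point.
            double[] gradient = func.Differentiate(vars, point);

            // Take a step against the gradient.
            for (int j = 0; j < point.Length; ++j)
                point[j] -= stepSize * gradient[j];
        }

        Console.WriteLine("Approximate minimizer: ({0}, {1})", point[0], point[1]);
        Console.WriteLine("Value there: " + func.Evaluate(vars, point));
    }
}
```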

# Features
* Fast! See the [0.5 vs 0.3 benchmark](docs/0.5-vs-0.3-benchmark.md) and the [0.3 benchmark](docs/0.3-benchmark.md).
* Composition of functions using arithmetic operators, Exp, Log, Power and user-defined unary and binary functions (see the small sketch after this list).
* Function gradient evaluation at specified points
* Function value evaluation at specified points
* Uses [Code Contracts](https://docs.microsoft.com/en-us/dotnet/api/system.diagnostics.contracts.contract) for specifying valid parameters and return values
* Computes gradients using the Reverse-Mode AD algorithm in **linear time**!
* Yes, it's faster than numeric approximation for multivariate functions
* You get both high accuracy and speed!
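As a small sketch of the composition feature above: `TermBuilder.Exp` appears in the code example earlier, while `Log` and `Power` are listed as supported; their exact signatures (`TermBuilder.Log(term)`, `TermBuilder.Power(term, constantExponent)`) are assumptions here. The fragment also assumes `using AutoDiff;` and the program structure of the earlier example.

```c#
// Fresh variables for this fragment.
var x = new Variable();
var y = new Variable();

// Compose arithmetic operators with the built-in Exp, Log and Power
// (Power(term, constant exponent) and Log(term) are assumed signatures).
var g = TermBuilder.Power(x + y, 2) + TermBuilder.Log(y) * TermBuilder.Exp(x);

// The composed term is evaluated and differentiated like any other term.
var vars = new Variable[] { x, y };
var point = new double[] { 1.0, 2.0 };
double value = g.Evaluate(vars, point);
double[] gradient = g.Differentiate(vars, point);
```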

# Using in research papers

If you like the library and it helps you publish a research paper, please cite the paper for which I originally wrote the library: [geosemantic.bib](docs/Home_geosemantic.bib).

# Used by

* **Andreas Witsch, Hendrik Skubch, Stefan Niemczyk, Kurt Geihs** [Using incomplete satisfiability modulo theories to determine robotic tasks](http://dx.doi.org/10.1109/IROS.2013.6697046) _Intelligent Robots and Systems (IROS), 2013 IEEE/RSJ International Conference_
* **Alex Shtof, Alexander Agathos, Yotam Gingold, Ariel Shamir, Daniel Cohen-Or** [Geosemantic Snapping for Sketch-Based Modeling](http://onlinelibrary.wiley.com/doi/10.1111/cgf.12044/abstract) _Eurographics 2013 proceedings_ ([code repository](https://bitbucket.org/alexshtf/sketchmodeller))
* **Michael Kommenda, Gabriel Kronberger, Stephan Winkler, Michael Affenzeller, Stefan Wagner** [Effects of constant optimization by nonlinear least squares minimization in symbolic regression](http://dl.acm.org/citation.cfm?id=2482691) _Proceeding of the fifteenth annual conference companion on Genetic and evolutionary computation conference companion_
* **Hendrik Skubch**, [Solving non-linear arithmetic constraints in soft realtime environments](http://dl.acm.org/citation.cfm?id=2245293) _Proceedings of the 27th Annual ACM Symposium on Applied Computing_
* [AlicaEngine](http://ros.org/wiki/AlicaEngine) - A cooperative planning engine for robotics. You can see it in action in this [video](http://www.youtube.com/watch?v=HhIrhU19PG4)
* [HeuristicsLab](http://dev.heuristiclab.com) - a framework for heuristic and evolutionary algorithms that is developed by members of the [Heuristic and Evolutionary Algorithms Laboratory (HEAL)](http://heal.heuristiclab.com/)
40 changes: 22 additions & 18 deletions docs/0.3 benchmark.md → docs/0.3-benchmark.md
@@ -1,5 +1,5 @@
# The test
The benchmark program (link: [AutodiffBenchmark.zip](0.3 benchmark_AutodiffBenchmark.zip)) tests evaluation and differentiation of a chosen function and reports the timings. The function we tested is a sum of terms, where each term is a squared linear combination of 10 variables with random coefficients. Such a function simulates the squared Laplacian norm of a mesh, where each vertex has on average 10 neighboring vertices. The benchmark program performs the following performance tests:
The benchmark program (link: [AutodiffBenchmark.zip](AutodiffBenchmark.zip)) tests evaluation and differentiation of a chosen function and reports the timings. The function we tested is a sum of terms, where each term is a squared linear combination of 10 variables with random coefficients. Such a function simulates the squared Laplacian norm of a mesh, where each vertex has on average 10 neighboring vertices. The benchmark program performs the following performance tests:
* Construction and compilation of the target function
* "Natively" evaluate the function. That is, without using AutoDiff. Just simple code to directly evaluate it.
* Approximate the gradient using the above native evaluation, by shifting values assigned to variables by a small epsilon.
@@ -15,17 +15,20 @@ Several tests are run for different function sizes. Given a number N, we constru

# Results
The results are summarized in the following table. We measured the time for a single operation of each kind. For example, the _Native eval._ column displays the time for a single Native Evaluation operation. All timings are reported in milliseconds.
|| N || Construct || Native eval. || Approx. Grad. || AD Eval. || AD Diff ||
| 1000 | 82 | 0.036 | 24.23 | 0.39 | 2.128 |
| 2000 | 88 | 0.079 | 104.62 | 1.132 | 6.423|
| 3000 | 207 | 0.092 | 246.64 | 2.256 | 11.477 |
| 4000 | 342 | 0.144 | 457.78 | 3.85 | 17.075 |
| 5000 | 423 | 0.166 | 795.99 | 4.066 | 21.945 |
| 6000 | 581 | 0.2 | 1119.87 | 5.675 | 27.37 |
| 7000 | 568 | 0.268 | 1537.01 | 5.91 | 37.534 |
| 8000 | 757 | 0.265 | 2123.62 | 6.855 | 41.122 |
| 9000 | 902 | 0.329 | 2909.86 | 7.878 | 49.066 |
| 10000 | 1383 | 0.413 | 3926.53 | 8.742 | 52.101|

| N | Construct | Native eval. | Approx. Grad. | AD Eval. | AD Diff. |
| ----- | --------- | ------------ | ------------- | -------- | -------- |
| 1000 | 82 | 0.036 | 24.23 | 0.39 | 2.128 |
| 2000 | 88 | 0.079 | 104.62 | 1.132 | 6.423 |
| 3000 | 207 | 0.092 | 246.64 | 2.256 | 11.477 |
| 4000 | 342 | 0.144 | 457.78 | 3.85 | 17.075 |
| 5000 | 423 | 0.166 | 795.99 | 4.066 | 21.945 |
| 6000 | 581 | 0.2 | 1119.87 | 5.675 | 27.37 |
| 7000 | 568 | 0.268 | 1537.01 | 5.91 | 37.534 |
| 8000 | 757 | 0.265 | 2123.62 | 6.855 | 41.122 |
| 9000 | 902 | 0.329 | 2909.86 | 7.878 | 49.066 |
| 10000 | 1383 | 0.413 | 3926.53 | 8.742 | 52.101 |


# Conclusions
* The time complexity expectations are indeed met. You can see the charts below
@@ -36,9 +39,10 @@ The results are summarized in the following table. We measured the time for a si
Optimizing the Laplacian norm can be done very efficiently using a sparse linear solver, without any iterative methods that require the user to compute gradients, so using AutoDiff is not really beneficial in this scenario. However, the Laplacian norm was chosen for the benchmark because it represents many functions that users may wish to optimize - a sum of terms, each containing a small number of variables.

# Charts
Here are some charts that show that the library meets the expectations.
![](0.3 benchmark_ad-eval.png)
![](0.3 benchmark_ad-grad.png)
![](0.3 benchmark_compile.png)
![](0.3 benchmark_grad-approx.png)
![](0.3 benchmark_native.png)
Here are some charts that show that the library meets the expectations. The X axis is N, while the Y axis is the time, in milliseconds, it took to complete the operation.

![](0.3-benchmark_ad-eval.png)
![](0.3-benchmark_ad-grad.png)
![](0.3-benchmark_compile.png)
![](0.3-benchmark_grad-approx.png)
![](0.3-benchmark_native.png)
File renamed without changes
File renamed without changes
File renamed without changes
File renamed without changes
File renamed without changes
7 changes: 0 additions & 7 deletions docs/0.5 vs 0.3 benchmark.md

This file was deleted.

9 changes: 9 additions & 0 deletions docs/0.5-vs-0.3-benchmark.md
@@ -0,0 +1,9 @@
We used the same benchmark program as on the [0.3 benchmark](0.3-benchmark.md) page, but replaced the AutoDiff library with the newer version. Here are the results. We see a substantial improvement in compilation and differentiation time; however, evaluation time is a bit slower.

In the following charts, the X axis is the number of terms and the Y axis is the time, in milliseconds.

![](0.5-vs-0.3-benchmark_compile.png)
![](0.5-vs-0.3-benchmark_eval.png)
![](0.5-vs-0.3-benchmark_diff.png)

The charts were constructed using the following Excel file: [benchmark-0.5.xlsx](benchmark-0.5.xlsx).
File renamed without changes
File renamed without changes
File renamed without changes
File renamed without changes.
14 changes: 0 additions & 14 deletions docs/Documentation.md

This file was deleted.

14 changes: 14 additions & 0 deletions docs/Readme.md
@@ -0,0 +1,14 @@
# Benchmarks
* [0.3 benchmark](0.3-benchmark.md)
* [0.5 vs 0.3 benchmark](0.5-vs-0.3-benchmark.md)

# AutoDiff tutorial
* [Your first AutoDiff application](firstapp.md)
* [AutoDiff revisited - Compiled terms](autodiff-revisited.md)
* [Solving equations - Newton-Raphson sample](newton-raphson.md)
* [Optimization - simple gradient descent](gradien-descent.md)
* [Optimization - integration with AlgLib](alglib-integration.md)
* [Optimization - integration with ExtremeOptimization](extreme-optimization-integration.md)
* [User-defined functions](user-defined-functions.md)
* TODO: Supported functions survey
* TODO: Vector math