
Automatic MAP methods for sparsity-inducing, non-smooth log-likelihoods #8

Open
brandonwillard opened this issue Mar 12, 2019 · 3 comments
Labels: enhancement, miniKanren

Comments

brandonwillard (Contributor) commented Mar 12, 2019

There's nearly enough in place to start automating the application of proximal methods. I've outlined some of the basic ideas here and here.

To recap: convex/concave log-likelihoods would be identified by matching against a representative table of primitive convex functions and applying their algebraic closure properties. Identifying convexity would also determine which proximal methods apply, because a large set of such methods can be "tabled" in direct connection with many primitive convex functions (e.g. see this paper).
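The table-plus-closure-rules idea can be made concrete with a small sketch. The term encoding, primitive table, and rule set below are hypothetical illustrations, not the project's actual representation:

```python
# Minimal sketch of convexity certification: a table of primitive
# convex functions plus recursive application of closure rules.
# The tuple-based term encoding here is an illustrative assumption.

PRIMITIVE_CONVEX = {"square", "abs", "exp", "neg_log"}

def is_convex(term):
    """Return True if `term` can be certified convex by the table/rules."""
    if isinstance(term, str):
        return term in PRIMITIVE_CONVEX
    op, *args = term
    if op == "sum":    # closure: a sum of convex functions is convex
        return all(is_convex(a) for a in args)
    if op == "scale":  # closure: non-negative scaling preserves convexity
        c, f = args
        return c >= 0 and is_convex(f)
    if op == "max":    # closure: a pointwise max of convex functions is convex
        return all(is_convex(a) for a in args)
    return False       # unknown terms are not certified

# e.g. a negative Gaussian log-likelihood plus an l1 penalty:
term = ("sum", ("scale", 0.5, "square"), ("scale", 1.0, "abs"))
```

Failing to certify a term (returning `False`) does not mean it is non-convex, only that the table and rules cannot establish convexity, which is exactly the conservative behavior needed when selecting proximal methods.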

brandonwillard added and removed the enhancement label Mar 12, 2019
brandonwillard added the enhancement label Jun 1, 2019
brandonwillard (Contributor) commented Jun 4, 2019

FYI: we can add a proximal Langevin sampler to PyMC and extend the aforementioned functionality to posterior sampling.
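One standard formulation of a proximal Langevin sampler is the Moreau-Yosida regularized unadjusted Langevin algorithm (MYULA) of Durmus, Moulines & Pereyra, which may or may not be the variant intended here. A hedged 1-D sketch, with an illustrative toy target (the function names and parameters below are assumptions, not PyMC code):

```python
import math
import random

# Sketch of a MYULA step for a target pi(x) proportional to
# exp(-f(x) - g(x)), with f smooth and g non-smooth but prox-friendly.

def soft(x, t):
    """Proximal operator of t*|.| (soft-thresholding)."""
    return math.copysign(max(abs(x) - t, 0.0), x)

def myula_step(x, grad_f, prox_g, gamma, lam, rng):
    """One MYULA update: a gradient step on f, a Moreau-envelope
    gradient step on g (computed via its prox), plus Gaussian noise."""
    drift = x - gamma * grad_f(x) - (gamma / lam) * (x - prox_g(x, lam))
    return drift + math.sqrt(2.0 * gamma) * rng.gauss(0.0, 1.0)

# Toy target: f(x) = x**2 / 2 (Gaussian), g(x) = |x| (Laplace prior).
rng = random.Random(0)
x = 5.0
for _ in range(500):
    x = myula_step(x, lambda z: z, soft, gamma=0.01, lam=0.1, rng=rng)
```

The point of the Moreau-envelope term is that the non-smooth `g` never needs a gradient: only its proximal operator is evaluated, which is exactly the information the tabling described above would supply.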

twiecki (Member) commented Jun 4, 2019

That'd be awesome!

brandonwillard (Contributor):

The first steps toward this involve constructing miniKanren goals (or facts tables) that identify simple, convex likelihood-plus-penalty terms within a model's log-likelihood graph (e.g. squared-error terms from Gaussian log-likelihoods, l1 penalties from Laplace prior log-likelihoods, etc.).
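The term-to-prox-method identification described above can be sketched in plain Python as a stand-in for actual miniKanren goals (a real implementation would use relational goals over the log-likelihood graph; the term encodings below are hypothetical):

```python
# Plain-Python stand-in for miniKanren goals that relate recognizable
# log-likelihood/penalty terms to the proximal-method family that
# applies to them. Tuple encodings here are illustrative assumptions.

def identify_prox_family(term):
    """Map recognizable penalty/likelihood terms to a prox family."""
    op = term[0]
    if op == "square":  # e.g. squared error from a Gaussian log-likelihood
        return "quadratic-prox"
    if op == "abs":     # e.g. l1 penalty from a Laplace prior log-likelihood
        return "soft-threshold"
    return None         # unrecognized: no tabled prox method applies
```

In the miniKanren version these two branches would become clauses of a relational goal (e.g. via `conde` and `eq`), so the same relation could also run "backwards" to enumerate all terms a given prox family handles.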

This is something we could do now in both PyMC4/TensorFlow and PyMC3/Theano; however, the latter has better graph canonicalization, so the goals would be simpler and more widely applicable.

Once those goals are in place, we will need to create sampler step functions that implement proximal methods. Fortunately, I already have an implementation of the most basic one, the proximal gradient, for use with Theano objects here.
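For reference, the proximal gradient iteration itself is short. A hedged toy sketch (not the Theano implementation referenced above; the problem and step size are illustrative): minimize `0.5*(x - 3)**2 + |x|`, whose exact minimizer is the soft-threshold of 3 at 1:

```python
import math

def soft_threshold(x, t):
    """Proximal operator of t*|.| (the l1 penalty)."""
    return math.copysign(max(abs(x) - t, 0.0), x)

def proximal_gradient(grad_f, prox_g, x0, step, n_iter=200):
    """Iterate x_{k+1} = prox_{step*g}(x_k - step * grad_f(x_k))."""
    x = x0
    for _ in range(n_iter):
        x = prox_g(x - step * grad_f(x), step)
    return x

lam = 1.0
x_star = proximal_gradient(
    grad_f=lambda x: x - 3.0,                       # f(x) = 0.5*(x - 3)**2
    prox_g=lambda v, s: soft_threshold(v, s * lam),  # g(x) = lam*|x|
    x0=0.0,
    step=0.5,
)
# converges to soft_threshold(3.0, 1.0) = 2.0
```

The smooth term only ever contributes a gradient and the non-smooth term only ever contributes a prox evaluation, which is why the convexity/prox tabling above is sufficient to assemble the step function automatically.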

brandonwillard added the miniKanren label Mar 13, 2020