@wulfdewolf wulfdewolf commented Nov 14, 2025

This extends regularization so that a pytree matching the parameter structure can be passed as the regularization strength.
Every parameter can thus be regularized with its own strength.
This will be particularly useful for the PopulationGLM, as it allows regularizing individual neurons differently.
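To illustrate the idea, here is a minimal sketch (not the actual implementation) of a per-parameter L2 penalty where the strengths are a pytree mirroring the parameter structure; the function and parameter names are hypothetical:

```python
import jax
import jax.numpy as jnp

def l2_penalty(params, strengths):
    # strengths mirrors the structure of params; each leaf scales its own term
    terms = jax.tree_util.tree_map(
        lambda p, s: s * 0.5 * jnp.sum(p ** 2), params, strengths
    )
    return sum(jax.tree_util.tree_leaves(terms))

params = {"coef": jnp.array([1.0, 2.0]), "intercept": jnp.array([3.0])}
# a strength of 0.0 on the intercept reproduces the
# "intercepts are not regularized" behaviour mentioned below
strengths = {"coef": 1.0, "intercept": 0.0}
penalty = l2_penalty(params, strengths)  # 0.5 * (1 + 4) = 2.5
```

Setting a leaf's strength to zero excludes that parameter from the penalty, which is how per-neuron (or per-parameter) regularization falls out for free.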

Some details:

  • For the moment, intercepts are not regularized (as before).
  • ElasticNet regularization now also accepts either a single pytree of strengths (with ratios defaulting to 0.5) or a tuple of two pytrees (strengths and ratios), for maximum flexibility.
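A rough sketch of how the ElasticNet case could dispatch on the two accepted forms, assuming the usual elastic-net penalty `s * (r * ||w||_1 + (1 - r)/2 * ||w||_2^2)` per leaf (the function name and exact formula are assumptions, not the PR's actual code):

```python
import jax
import jax.numpy as jnp

def elastic_net_penalty(params, regularizer):
    # regularizer is either a pytree of strengths (ratios default to 0.5)
    # or a (strengths, ratios) tuple of pytrees
    if isinstance(regularizer, tuple):
        strengths, ratios = regularizer
    else:
        strengths = regularizer
        ratios = jax.tree_util.tree_map(lambda _: 0.5, strengths)
    terms = jax.tree_util.tree_map(
        lambda p, s, r: s * (r * jnp.sum(jnp.abs(p))
                             + (1 - r) * 0.5 * jnp.sum(p ** 2)),
        params, strengths, ratios,
    )
    return sum(jax.tree_util.tree_leaves(terms))

params = {"w": jnp.array([1.0, -2.0])}
# single-pytree form: ratio defaults to 0.5
# l1 = 3, l2^2 = 5  ->  0.5 * 3 + 0.5 * 0.5 * 5 = 2.75
penalty = elastic_net_penalty(params, {"w": 1.0})
```

Passing the tuple form, e.g. `({"w": 1.0}, {"w": 0.9})`, would weight the L1 and L2 terms per leaf independently.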

TODO:

  • ElasticNet penalization
  • ElasticNet proximal operator
  • GroupLasso penalization
  • GroupLasso proximal operator
  • tests
  • docs
