
Implement (Smoothed) Finite Difference Approximation of Influence Function #482

Closed
azane wants to merge 62 commits

Conversation

@azane azane (Contributor) commented Jan 7, 2024

*Closed in favor of #501*

WIP pending full vectorization and an integration test of the approximator.

TODO description

Blocked by: #478
Addresses: #469
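
The description above was never filled in, but for context the title refers to approximating an influence function by differencing a functional along a smoothed contamination path: perturb the data distribution toward a kernel-smoothed point mass at the evaluation point and divide by the step size. The snippet below is purely an illustrative sketch and is not this PR's implementation (which targets the repository's torch/Pyro-based tooling); the function name, parameters, and the numpy-based Monte Carlo construction are all assumptions made for the example.

```python
import numpy as np

def smoothed_fd_influence(psi, data, x, eps=0.05, bandwidth=0.1, rng=None):
    """Hypothetical helper: smoothed finite-difference estimate of the influence
    function of the plug-in functional `psi` at the point `x`.

    The influence function is the derivative of psi along the contamination path
    (1 - eps) * P + eps * K_x, where K_x is a Gaussian kernel centered at x; here
    it is approximated by a one-sided finite difference with step size `eps`.
    """
    rng = np.random.default_rng(rng)
    n = len(data)
    # Monte Carlo draw from the contaminated mixture: replace roughly an
    # eps-fraction of the sample with draws from the smoothing kernel N(x, bandwidth^2).
    mixed = np.array(data, dtype=float, copy=True)
    replace = rng.random(n) < eps
    mixed[replace] = rng.normal(loc=x, scale=bandwidth, size=int(replace.sum()))
    # One-sided finite difference of the functional along the contamination path.
    return (psi(mixed) - psi(data)) / eps

# Sanity check: for psi = mean, the influence function is IF(x) = x - E[X],
# so at x = 2.0 with standard normal data the estimate should be close to 2.
rng = np.random.default_rng(0)
data = rng.normal(size=200_000)
print(smoothed_fd_influence(np.mean, data, x=2.0, eps=0.05, rng=1))
```

The bandwidth of the smoothing kernel trades bias for variance: as it shrinks toward zero the perturbation approaches the point mass in the classical Gateaux-derivative definition, while larger bandwidths give smoother, lower-variance but more biased estimates, and `eps` plays the usual finite-difference step-size role.
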

eb8680 and others added 19 commits November 27, 2023 10:13
* initial test against analytic fisher vp (pair coded w/ sam)

* linting

* added check against analytic ate

* added vmap and grad smoke tests

* added missing init

* linting and consolidated fisher tests to one file

* fixed types

* fixing linting errors

* trying to fix type error for python 3.8

* fixing test errors

* added patch to test to prevent from failing when denom is small

* composition issue

* removed missing import

* fixed failing test with seeding

* addressing Eli's comments
* upper bound on cg_iters

* address comment
* initial test against analytic fisher vp (pair coded w/ sam)

* linting

* added check against analytic ate

* added vmap and grad smoke tests

* added missing init

* linting and consolidated fisher tests to one file

* fixed types

* fixing linting errors

* trying to fix type error for python 3.8

* fixing test errors

* added patch to test to prevent from failing when denom is small

* composition issue

* seeded NMC implementation

* linting

* removed missing import

* changed to eli's seedmessenger suggestion

* added failing edge case

* explicitly add max plate argument

* added warning message

* fixed linting error and test failure case from too many cg iters

* eli's contextlib seeding strategy

* removed seedmessenger from test

* randomness should be shared across calls

* switched back to different
…er_vp` (#430)

* hessian vector product formulation for fisher

* ignoring small type error

* fixed linting error
* initial test against analytic fisher vp (pair coded w/ sam)

* linting

* added check against analytic ate

* added vmap and grad smoke tests

* added missing init

* linting and consolidated fisher tests to one file

* fixed types

* fixing linting errors

* trying to fix type error for python 3.8

* fixing test errors

* added patch to test to prevent from failing when denom is small

* composition issue

* seeded NMC implementation

* linting

* removed missing import

* changed to eli's seedmessenger suggestion

* added failing edge case

* explicitly add max plate argument

* added warning message

* fixed linting error and test failure case from too many cg iters

* eli's contextlib seeding strategy

* removed seedmessenger from test

* randomness should be shared across calls

* uncomitted change before branch switch

* switched back to different

* added revised simple model and guide

* added multiple link functions in test

* linting
* batching in linearize and influence

* addressing eli's review

* added optimization for pointwise false case

* fixing lint error
* one step correction

* increased tolerance

* fixing lint issue
…PredictiveLikelihood` (#473)

* sketch batched nmc lpd

* nits

* fix type

* format

* comment

* comment

* comment

* typo

* typo

* add condition to help guarantee idempotence

* simplify edge case

* simplify plate_name

* simplify batchedobservation logic

* factorize

* simplify batched

* reorder

* comment

* remove plate_names

* types

* formatting and type

* move unbind to utils

* remove max_plate_nesting arg from get_traces

* comment

* nit

* move get_importance_traces to utils

* fix types

* generic obs type

* lint

* format

* handle observe in batchedobservations

* event dim

* move batching handlers to utils

* replace 2/3 vmaps, tests pass

* remove dead code

* format

* name args

* lint

* shuffle code

* try an extra optimization in batchedlatents

* add another optimization

* undo changes to test

* remove inplace adds

* add performance test showing speedup

* document internal helpers

* batch latents test

* move batch handlers to predictive

* add bind_leftmost_dim, document PredictiveFunctional and PredictiveModel

* use bind_leftmost_dim in log prob
* documentation

* documentation clean up w/ eli

* fix lint issue
…that properly composes with other Predictive models, some vectorization pending.
@azane azane added the status:WIP (Work-in-progress not yet ready for review) and blocked labels on Jan 7, 2024
@azane azane self-assigned this Jan 7, 2024
@azane azane changed the title from "Az influence finite difference" to "Implement (Smoothed) Finite Difference Approximation of Influence Function" on Jan 7, 2024
@eb8680 eb8680 changed the base branch from staging-robust to master on January 12, 2024 at 15:45
@eb8680 eb8680 (Contributor) commented Jan 12, 2024

Needs merge conflict resolution following #398

@azane azane closed this Jan 12, 2024
Labels: blocked, module:robust, status:WIP (Work-in-progress not yet ready for review)
Projects: None yet
4 participants