
iterative ref: implement and benchmark methods to normalize half maps #25

Open
geoffwoollard opened this issue Mar 28, 2022 · 0 comments
Labels: enhancement (New feature or request)


Inserting slices results in uneven coverage of the voxels. The slices are summed in, but the fact that certain voxels are observed more often does not mean their values should be larger; it only means the certainty in those voxels is higher. Computing the expectation therefore requires "dividing out" the coverage after doing the sum.

One method to "average in" the slices inserts ones and the particle values in parallel, and then divides the "half map" by the "count map". This "dividing out" can be done in a few ways to protect against numerical instability and noise blow-up.
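A minimal Python/NumPy sketch of this idea, for illustration only: the function names, the flat-index representation of where each slice lands, and the three "dividing out" variants (epsilon, count floor, coverage mask) are assumptions, not the reconstructSPI API.

```python
import numpy as np


def insert_slices(volume_shape, slices, voxel_indices):
    """Accumulate slice values and coverage counts in parallel.

    Hypothetical helper: `voxel_indices` gives, for each slice, the flat
    voxel indices its pixels map to after rotation/interpolation.
    """
    sum_map = np.zeros(volume_shape).ravel()
    count_map = np.zeros(volume_shape).ravel()
    for slice_vals, idx in zip(slices, voxel_indices):
        np.add.at(sum_map, idx, slice_vals.ravel())  # add particle values
        np.add.at(count_map, idx, 1.0)               # add ones (coverage)
    return sum_map.reshape(volume_shape), count_map.reshape(volume_shape)


def normalize_epsilon(sum_map, count_map, eps=1e-6):
    """Divide out coverage with a small epsilon to avoid division by zero."""
    return sum_map / (count_map + eps)


def normalize_count_floor(sum_map, count_map, floor=1.0):
    """Clamp counts from below so sparsely covered voxels are not blown up."""
    return sum_map / np.maximum(count_map, floor)


def normalize_masked(sum_map, count_map, min_count=1.0):
    """Only divide where coverage is adequate; leave other voxels at zero."""
    out = np.zeros_like(sum_map)
    covered = count_map >= min_count
    out[covered] = sum_map[covered] / count_map[covered]
    return out
```

The trade-off between variants is the benchmarking question: a small epsilon keeps every observation but can amplify noise in barely covered voxels, while flooring or masking the counts suppresses that blow-up at the cost of biasing or zeroing poorly covered regions.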

TODO:

  1. Implement the various normalization methods to be benchmarked
  2. Do a survey of what other reconstruction software does

See https://github.com/compSPI/reconstructSPI/pull/21/files/33e3229b1dd193f1cecdc7334668b3e991cdf4de..01ceaf5f7f80614fae628a89649908ca11d1a9a0#r836914864

@geoffwoollard added the enhancement label on Mar 28, 2022
@geoffwoollard changed the title from "implement and benchmark methods to normalize half maps" to "iterative ref: implement and benchmark methods to normalize half maps" on Mar 30, 2022