paper changes
FlyingWorkshop committed Sep 20, 2024
1 parent d95a595 commit c89f80c
Showing 3 changed files with 4 additions and 4 deletions.
4 changes: 2 additions & 2 deletions paper.md
@@ -123,11 +123,11 @@ A lengthier discussion of the `EPCA` constructors and math is provided in the [d

# Applications

- To see the benefit of `ExpFamilyPCA.jl`, we recreate Figure 3a from @shortRoy using `CompressedBeliefMDPs.jl` [CITATION PENDING].
+ The practical applications of `ExpFamilyPCA.jl` span several domains that deal with non-Gaussian data. One notable example is in reinforcement learning, specifically in belief state compression for partially observable Markov decision processes (POMDPs). Using Poisson EPCA, the package effectively reduces high-dimensional belief spaces with minimal information loss, as demonstrated by recreating the results of @shortRoy. In this case, Poisson EPCA achieved nearly perfect reconstruction of a $41$-dimensional belief profile using just five basis components [CITE `CompressedBeliefMDPs.jl`].

![](./scripts/kl_divergence_plot.png)

- The graph compares the average KL divergence between the reconstructions and original discrete belief data using a Poisson EPCA and regular PCA (Gaussian EPCA). The original belief profile has $41$ dimensions and Poisson EPCA achieves nearly perfect reconstruction with just $5$ bases. This makes sense since Poisson EPCA minimizes the generalized KL divergence (see [appendix](https://sisl.github.io/ExpFamilyPCA.jl/dev/math/appendix/poisson/)).
+ The package also finds applications in areas such as mass spectrometry and survival analysis, where specific data distributions like the gamma or Weibull may be more appropriate. By minimizing divergences suited to the distribution, `ExpFamilyPCA.jl` provides more accurate and interpretable dimensionality reduction compared to standard PCA.

# Acknowledgments

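To make the Applications paragraph added above more concrete, here is a minimal sketch of how the Poisson EPCA workflow it describes might look with `ExpFamilyPCA.jl`. It assumes the `PoissonEPCA(indim, outdim)` constructor and the `fit!`/`decompress` functions described in the package documentation; the belief matrix, its dimensions, and the `gen_kl` helper are illustrative stand-ins, not the paper's actual data or code.

```julia
using ExpFamilyPCA
using Statistics

# Illustrative stand-in for the discrete belief data: each row is a belief
# over 41 states (the real script derives beliefs from a POMDP maze).
n, indim = 500, 41
beliefs = rand(n, indim)
beliefs ./= sum(beliefs; dims=2)     # normalize rows into probability vectors

outdim = 5                           # number of basis components
epca = PoissonEPCA(indim, outdim)    # EPCA induced by the Poisson family

A = fit!(epca, beliefs)              # fit the model; assumed to return the compressed representation
recon = decompress(epca, A)          # map back to the 41-dimensional belief space

# Generalized KL divergence, the loss Poisson EPCA minimizes:
# D(p, q) = Σᵢ pᵢ log(pᵢ / qᵢ) - pᵢ + qᵢ  (a small ϵ guards against log(0)).
gen_kl(p, q; ϵ=1e-12) = sum(@. p * log((p + ϵ) / (q + ϵ)) - p + q)

avg_kl = mean(gen_kl(beliefs[i, :], recon[i, :]) for i in 1:n)
println("average generalized KL divergence: ", avg_kl)
```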
4 changes: 2 additions & 2 deletions scripts/kl_circular_epca.jl
@@ -46,7 +46,7 @@ beliefs = make_numerical(raw_beliefs, maze)

n, indim = size(beliefs)

- outdims = 1:5
+ outdims = 1:6

kl_divs_poisson_epca = []
kl_divs_gaussian_epca = []
@@ -71,7 +71,7 @@ plot(
plot!(
outdims,
kl_divs_gaussian_epca,
label="Gaussian EPCA",
label="PCA",
yscale=:log10,
marker=:x,
linestyle=:dash,
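The changed lines above sit around the loop (elided in this diff) that fills `kl_divs_poisson_epca` and `kl_divs_gaussian_epca` before plotting. Below is a hypothetical sketch of what such a comparison loop could look like, reusing the `gen_kl` helper and the hedged API assumptions from the earlier sketch. It is not the repository's actual hidden code, and `GaussianEPCA` as the PCA-equivalent constructor is likewise an assumption.

```julia
using ExpFamilyPCA, Statistics   # already loaded in the sketch above

# Hypothetical comparison loop, assuming `beliefs`, `n`, `indim`, `outdims`,
# and the two result vectors from the script context shown in the diff.
for outdim in outdims
    poisson = PoissonEPCA(indim, outdim)    # generalized-KL objective
    gaussian = GaussianEPCA(indim, outdim)  # squared-error objective, i.e. ordinary PCA

    recon_p = decompress(poisson, fit!(poisson, beliefs))
    # PCA reconstructions can dip below zero, so clamp before the KL computation.
    recon_g = max.(decompress(gaussian, fit!(gaussian, beliefs)), 0)

    push!(kl_divs_poisson_epca, mean(gen_kl(beliefs[i, :], recon_p[i, :]) for i in 1:n))
    push!(kl_divs_gaussian_epca, mean(gen_kl(beliefs[i, :], recon_g[i, :]) for i in 1:n))
end
```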
Binary file modified scripts/kl_divergence_plot.png
