Add gl node weight expansions #53

Closed

Conversation

@popsomer (Contributor) commented Jun 23, 2017

The default algorithm for low n is now forward recurrence, because the weights computed via Golub–Welsch become exactly zero long before they should underflow (#52). The corresponding functions laguerreRec and laguerreRecDer can be called with BigFloats, and this could be extended to gausslaguerre.jl as well if, for example, besselroots.jl can handle BigFloats (#22).
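For reference, here is a minimal sketch of such a forward recurrence. This is only an illustration, not the actual laguerreRec/laguerreRecDer code; the function name, the default α = 0, and the derivative identity used are my own choices.

```julia
# Illustrative forward recurrence for the generalized Laguerre polynomial
# L_n^{(α)}(x) and its derivative; works unchanged for Float64 and BigFloat.
function laguerre_rec_sketch(n::Integer, x::T, α::T = zero(x)) where {T<:Real}
    n == 0 && return one(T), zero(T)
    pm1 = one(T)              # L_0^{(α)}(x)
    p   = one(T) + α - x      # L_1^{(α)}(x)
    for k in 1:n-1            # (k+1) L_{k+1} = (2k+1+α-x) L_k - (k+α) L_{k-1}
        pm1, p = p, ((2k + 1 + α - x) * p - (k + α) * pm1) / (k + 1)
    end
    # x L_n' = n L_n - (n+α) L_{n-1} gives the derivative from the same data
    return p, (n * p - (n + α) * pm1) / x
end

laguerre_rec_sketch(20, big"0.3")   # BigFloat in, BigFloat out
```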

method="RH" is removed such that the code is a lot shorter and the function polyAsyRHgen can still be called if expansions of corresponding orthogonal polynomials are needed: I can change the normalisation to something standard if required.

The timings mentioned in README.md should be repeated on the reference machine, as the explicit expansions are extremely fast: for example, the six Newton iterations required to compute t seem to take more than half the total execution time of gausslaguerre. Higher-order asymptotics were not implemented in the Airy region (currently only O(n^{-4}) for the nodes and O(n^{-2/3}) for the weights), because the latter easily underflow, as does the subsampled quadrature rule in reference [7]. I plan to do something similar for Jacobi (and Legendre) and Hermite, and then to open a pull request to chebfun.
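To make the timing remark concrete, here is a hypothetical Newton refinement loop built on the recurrence sketch above. It is not the package's actual inner loop (which iterates on the auxiliary variable t), but it illustrates why a handful of Newton steps per node, each costing an O(n) recurrence evaluation, can outweigh a single explicit expansion evaluation; the fixed count of 6 iterations mirrors the number mentioned above.

```julia
# Hypothetical Newton refinement of one Gauss–Laguerre node: each iteration
# requires a full recurrence evaluation of L_n and its derivative, whereas an
# explicit asymptotic expansion is a single formula evaluation per node.
function newton_refine_sketch(n::Integer, x0; iters::Integer = 6)
    x = x0
    for _ in 1:iters
        p, dp = laguerre_rec_sketch(n, x)
        x -= p / dp           # Newton step toward the nearby root of L_n
    end
    return x
end

newton_refine_sketch(100, 0.0144)  # refine a rough estimate of the smallest node of L_100
```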

@codecov-io commented Jun 23, 2017

Codecov Report

Merging #53 into master will decrease coverage by 2.94%.
The diff coverage is n/a.

Impacted file tree graph

@@            Coverage Diff             @@
##           master      #53      +/-   ##
==========================================
- Coverage   97.72%   94.78%   -2.95%     
==========================================
  Files           8        8              
  Lines        1849     1074     -775     
==========================================
- Hits         1807     1018     -789     
- Misses         42       56      +14
| Impacted Files | Coverage Δ |
| --- | --- |
| src/gausslaguerre.jl | 92.64% <ø> (-5.84%) ⬇️ |
| src/gausslegendre.jl | 97% <0%> (-0.98%) ⬇️ |

Continue to review the full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update c3db719...562cf92.

@daanhb (Member) commented May 29, 2018

There has been some progress on this pull request. A lot of memory allocations were removed; some still remain in the evaluation of special functions (these could be removed by implementing an alternative evaluation method). Even without removing those, this is a lot faster than the previous implementation, which did not use explicit expansions.
The code for the more generalized weight functions was moved to gaussfreud.jl, since those generalizations correspond to Freud-type weights (exp(-x^m) with m greater than 1).
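For orientation, such a Freud-type rule approximates integrals of the form below, assuming (as in the Laguerre case) integration over the half line; the weight handled by gaussfreud.jl may additionally carry a factor x^α:

∫_0^∞ f(x) exp(-x^m) dx ≈ Σ_{k=1}^{n} w_k f(x_k), with m > 1.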

@daanhb (Member) commented May 29, 2018

Some more changes to this pull request are forthcoming, and it is currently quite a bit behind master. Stay tuned.

@ajt60gaibb (Collaborator) commented

@daanhb, do you want to update this PR so that I can take a look and merge?

@daanhb (Member) commented Feb 4, 2019

Please see #4 for an update to this pull request.

@dlfivefifty (Member) commented

Is this PR still active?

@daanhb (Member) commented Apr 8, 2019

No, it was superseded by #69, which is now merged.

@hyrodium (Collaborator) commented

I'll close this PR because it is no longer active.

@hyrodium closed this on Jan 15, 2021