
LSMR further updates #111

Merged: 8 commits merged into master from jh/lsmr-furtherupdate on Jan 6, 2025
Conversation

@Jutho (Owner) commented on Jan 4, 2025

@VictorVanthilt, this is based on your PR and contains an updated LSMR implementation. While it is quite different from the original implementation, your PR was still very helpful in finally pushing me to study and implement this. Major changes compared to PR #109:

  • Because LSMR cannot live up to its promise (minimising ||x|| over all solutions if A has a nontrivial nullspace) when a nonzero starting guess is provided, the ability to specify a starting guess has been disabled. The docstring still mentions how you could manually take a starting guess into account. Comments about this are welcome.
  • Removed the dependence on GKLIterator, which requires restarting after krylovdim steps, and krylovdim cannot become very large. Furthermore, restarting again raises the question of the starting vector. The new implementation follows the LSMR paper almost exactly.
  • Because of the several interface differences with the other linsolve calls (no starting vector, a linear map for which the adjoint also needs to be defined, and the possibility to "regularise" the solution with a parameter lambda), I decided to move this to a separate (and thus new) function called lssolve (for solving least-squares problems). A minimal usage sketch follows below.
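A minimal sketch of what this could look like, assuming lssolve accepts a dense matrix (so its adjoint is available automatically) and follows the (x, info) return convention of the other KrylovKit solvers; the variable names and the shifted-right-hand-side treatment of a starting guess are illustrative, not the authoritative docstring recipe:

```julia
using KrylovKit

# Overdetermined toy problem: more rows than columns, so b is generally not in range(A).
A = randn(100, 20)
b = randn(100)

# Assumed call signature, following the (x, info) convention of the other KrylovKit solvers.
x, info = lssolve(A, b)

# Taking a starting guess x₀ into account manually: solve for the correction Δx against
# the shifted right-hand side b - A * x₀, then add x₀ back.
# Note that ‖x₀ + Δx‖ is then no longer guaranteed to be minimal.
x₀ = randn(20)
Δx, _ = lssolve(A, b - A * x₀)
x_total = x₀ + Δx
```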

@Jutho (Owner, Author) commented on Jan 4, 2025

There is one remaining todo (which could also be dealt with at a later time) and one interface question:

  • the residual r = b - A * x is not determined by default in the LSMR implementation; only its norm is. I currently compute the residual explicitly at the very end (which requires one more matrix-vector product), but the residual can probably be built up during the iterative process via a short recurrence, in the same way as x itself is constructed.
  • there are a number of interesting measures. norm(r) is interesting but cannot be expected to become very small for an overconstrained problem. The actual convergence measure is norm(A'*r) when lambda = 0, and norm(A'*r - lambda^2 * x) otherwise (see the sketch after this list). An estimate for this value is currently stored in info.normres. So in this case, info.normres is not norm(info.residual). Ideas and comments to improve this situation are welcome.
  • Also, the documentation still needs to be updated. This can happen in a separate PR before tagging a new release.
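To illustrate the distinction between the two measures, here is a small self-contained check using a dense reference solution of the regularised problem; the matrix, the value of lambda, and the normal-equations solve are purely for illustration and are not part of the LSMR implementation:

```julia
using LinearAlgebra

# All names here are local to this example; only the formulas r = b - A*x and
# A'*r - λ^2*x come from the discussion above.
A = randn(100, 20)
b = randn(100)
λ = 1e-1

# Dense reference solution of the regularised least-squares problem via the normal equations.
x = (A' * A + λ^2 * I) \ (A' * b)

r = b - A * x           # residual of the original system
norm(r)                 # O(1) for an overconstrained problem; not a useful stopping criterion
norm(A' * r - λ^2 * x)  # ≈ 0 at the regularised solution; this is what info.normres estimates
```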

codecov bot commented on Jan 4, 2025

Codecov Report

Attention: Patch coverage is 94.40000% with 7 lines in your changes missing coverage. Please review.

Project coverage is 86.16%. Comparing base (4d2a06f) to head (c5439cc).
Report is 1 commit behind head on master.

Files with missing lines   Patch %   Lines
src/linsolve/cg.jl          60.00%   4 Missing ⚠️
src/lssolve/lsmr.jl         97.11%   3 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##           master     #111      +/-   ##
==========================================
+ Coverage   85.87%   86.16%   +0.28%     
==========================================
  Files          30       32       +2     
  Lines        3286     3404     +118     
==========================================
+ Hits         2822     2933     +111     
- Misses        464      471       +7     


@Jutho merged commit 478db80 into master on Jan 6, 2025
16 checks passed
@Jutho deleted the jh/lsmr-furtherupdate branch on January 17, 2025