
ExponentialUtilities v1.3 #9

Merged: 10 commits merged into master from the v1.3 branch on Oct 26, 2018

Conversation

@MSeeker1340 (Contributor) commented Oct 17, 2018

Two main jobs to do:

Note that I've refactored expv into two functions, depending on the choice of mode. This is because the code for the two stopping modes is vastly different and they use different keyword arguments.

The operator `A` is expected to support:

* `LinearAlgebra.mul!(y, A, x)` (for computing `y = A * x` in place).
* `Base.size(A, dim)`
* `LinearAlgebra.opnorm(A, p=Inf)`. If this is not implemented or the default implementation is slow, you can manually pass in the operator norm (a rough estimate is fine) using the `opnorm` keyword argument.
* `LinearAlgebra.ishermitian(A)`. If this is not implemented or the default implementation is slow, you can manually pass in the value using the `ishermitian` keyword argument.
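
As a rough illustration, an operator satisfying this interface could look like the following sketch (the `MyOp` wrapper is purely illustrative and not part of the package):

```julia
using LinearAlgebra

# Illustrative wrapper type; `MyOp` is not part of ExponentialUtilities.
struct MyOp{T}
    M::Matrix{T}
end

LinearAlgebra.mul!(y, L::MyOp, x) = mul!(y, L.M, x)            # y = A * x in place
Base.size(L::MyOp, dim) = size(L.M, dim)
LinearAlgebra.opnorm(L::MyOp, p::Real = Inf) = opnorm(L.M, p)  # a rough estimate is fine
LinearAlgebra.ishermitian(L::MyOp) = ishermitian(L.M)
```

If `opnorm` or `ishermitian` is expensive or unavailable for an operator, the corresponding keyword arguments described above can be passed instead, e.g. something like `expv(t, A, b; opnorm = 1.0, ishermitian = true)`.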
Member
You might want to mention the `DiffEqOperator` helper @YingboMa made for this. We should add `ishermitian` and `opnorm` to it, though.

@ChrisRackauckas (Member)

It's confusing that both of them are called `expv!` with different arguments. It'll be fine if it gets appropriate documentation, though.

@ChrisRackauckas (Member)

I like the mode choice in the high-level allocating version.

@jagot (Contributor) commented Oct 18, 2018

I think it looks nice!

@MSeeker1340 (Contributor, Author)

Made some extensive structural changes. `krylov_expv.jl` is renamed to `krylov_phiv_error_estimate.jl`, and the `expv!` function using Saad is renamed to `expv_ee!`. Furthermore, the things related to `stegr` are split into two parts: those independent of exponentiation (basically `StegrWork`) are moved to a new inner module `Stegr`, while those used by `expv_ee!` are merged into `krylov_phiv_error_estimate.jl`.

@jagot would you like to develop `Stegr` into a separate package? Maybe also extend `LinearAlgebra.eigen!` for `SymTridiagonal` matrices, so that user-level code doesn't need to call `stegr!` directly?

BTW, I'd like to drop the initial plan of implementing `phiv_ee!`, since I'm still debating the details and this PR already has enough changes. I'll also need to update OrdinaryDiffEq regarding the `opnorm` change after tagging the v1.3 release. @ChrisRackauckas, for that update, should I keep backward compatibility (i.e. still allow passing in the `opnorm` function instead of a value) for now, so that the update on the OrdinaryDiffEq side is easier?

@MSeeker1340 reopened this on Oct 22, 2018
@jagot (Contributor) commented Oct 22, 2018

In fact, I do not like the change to `expv_ee!`, quite the contrary. As I laid out in #10, I would prefer it to not even have the cache as a separate argument. I think the operator should be tagged with the cache such that the correct algorithm is chosen via dispatch, so you only ever need to write `expv!(b, t, A, x)`.
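
A rough sketch of this dispatch idea (all names here are illustrative, not the package API):

```julia
# Sketch only: the operator is tagged with its cache, and dispatch on the
# cache type selects the algorithm, so the call site is always the same.
struct ErrorEstimateCache end      # stand-in for the stegr!-based cache
struct StandardKrylovCache end     # stand-in for the default Krylov cache

struct TaggedOperator{TA, TC}
    op::TA
    cache::TC
end

expv!(b, t, A::TaggedOperator{<:Any, ErrorEstimateCache}, x) =
    @info "would take the error-estimate (Saad) path"
expv!(b, t, A::TaggedOperator{<:Any, StandardKrylovCache}, x) =
    @info "would take the standard (happy-breakdown) path"
```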

Regarding `stegr!`, I don't think it should go into a separate package, as that would break the functionality as it presently stands. Merely implementing `eigen!` will not work, since more cache is necessary beyond a tridiagonal matrix. I also don't think we need to shun LAPACK functions; just look at e.g. BandedMatrices.jl.

@MSeeker1340 (Contributor, Author)

> In fact, I do not like the change to `expv_ee!`, quite the contrary. As I laid out in #10, I would prefer it to not even have the cache as a separate argument. I think the operator should be tagged with the cache such that the correct algorithm is chosen via dispatch, so you only ever need to write `expv!(b, t, A, x)`.

Fair point.

> Regarding `stegr!`, I don't think it should go into a separate package, as that would break the functionality as it presently stands.

Since we're just extending `stegr!` with `StegrWork`, how is this breaking the current functionality? The original `stegr!` isn't changed, after all.

> Merely implementing `eigen!` will not work, since more cache is necessary beyond a tridiagonal matrix.

Sorry, I'm not making things clear. What I suggest is to add:

```julia
import LinearAlgebra: eigen!

function eigen!(A::SymTridiagonal{T}, sw::StegrWork) where {T <: Real}
    stegr!(A.dv, A.ev, sw)
    # ... wrap the results in an Eigen object
end
```

The intention is that, if you're a power user and want a faster `eigen!`, you can just allocate a `StegrWork` and then proceed to use `eigen!` pretty much the same way as before, without having to switch to `stegr!` in the user code.
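
As a rough usage sketch (the `StegrWork(Float64, n)` constructor is assumed here purely for illustration, and `StegrWork` plus the extended `eigen!` are assumed to be in scope; the actual construction may differ):

```julia
using LinearAlgebra

n  = 100
A  = SymTridiagonal(randn(n), randn(n - 1))
sw = StegrWork(Float64, n)   # assumed constructor, shown only for illustration

F = eigen!(A, sw)            # reuses sw's workspace instead of allocating anew
# F.values / F.vectors as with the ordinary eigen!
```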

@MSeeker1340 reopened this on Oct 23, 2018
@MSeeker1340 (Contributor, Author)

Added backward compatibility for an `opnorm` input that is a function (e.g. what the ExpRK integrators currently pass). OrdinaryDiffEq tests pass on master.
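
One way such backward compatibility can be handled is sketched below; the `_get_opnorm` helper is hypothetical, not the actual implementation, but it shows the idea of accepting either a precomputed value or a callable like `LinearAlgebra.opnorm`:

```julia
using LinearAlgebra

# Hypothetical helper: normalize the `opnorm` keyword to a number, accepting
# either a precomputed value (new behavior) or a callable (old behavior).
_get_opnorm(nrm::Number, A, p) = nrm      # value passed directly
_get_opnorm(nrm, A, p) = nrm(A, p)        # callable, e.g. LinearAlgebra.opnorm

A = randn(4, 4)
_get_opnorm(3.7, A, Inf)      # returns 3.7
_get_opnorm(opnorm, A, Inf)   # evaluates opnorm(A, Inf)
```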

@MSeeker1340 merged commit 9304f45 into master on Oct 26, 2018
@MSeeker1340 deleted the v1.3 branch on October 26, 2018