
Faster version of permentropy / should permentropy live in a lower-level package? #115

Closed
kahaaga opened this issue Jun 11, 2020 · 3 comments

@kahaaga (Member) commented Jun 11, 2020

@Datseris I'm not sure whether this is best posted here or in DelayEmbeddings, but it applies to both, so here goes:

Background

You wanted a method for computing mutual information between two time series in DelayEmbeddings (see this issue). I also use mutual information for computing information-theoretic causality measures in CausalityTools. We therefore discussed potentially having a common API for entropy estimators in CausalityToolsBase.

I now have a ready-ish interface for that, which looks as follows:

Abstract entropy estimators

I define the following abstract type for entropy estimators.

"""
    EntropyEstimator 

The supertype of all entropy estimators. These estimators all have in common 
that they directly or indirectly estimate entropies over some generalized 
delay reconstruction.
"""
abstract type EntropyEstimator end

Symbolic entropy types

There are many different entropy estimators, of which permutation entropy is one. In CausalityTools, I use both quantization-based and permutation-based entropy symbolization. Hence, I define the following types (showing only the parts relevant for permutation entropy):

"""
    SymbolicEntropyEstimator <: EntropyEstimator

The supertype of all symbolic entropy estimators.
"""
abstract type SymbolicEntropyEstimator <: EntropyEstimator end

"""
    PermutationEntropyEstimator <: EntropyEstimator

The supertype of all permutation-based symbolic entropy estimators.
"""
abstract type PermutationEntropyEstimator <: SymbolicEntropyEstimator end

This hierarchy is the starting point for all symbolic permutation-based entropy estimators.
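To illustrate the intended dispatch pattern, a new permutation-based estimator would simply subtype PermutationEntropyEstimator (the variant below is hypothetical, purely for illustration):

struct SymbolicWeighted <: PermutationEntropyEstimator  # hypothetical variant
    b::Real  # logarithm base
    m::Int   # motif length
end

Higher-level functions can then dispatch on the concrete type, e.g. entropy(x, method::SymbolicWeighted), without any other changes.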

Permutation entropy

I implement estimators for several different variants of permutation entropy, among them the regular permutation entropy from Bandt and Pompe (2002). The functions for mutual information and friends dispatch on different entropy estimators, one of which is the SymbolicPermutation estimator:

"""
    SymbolicPermutation(; b::Real = 2, m::Int = 2) <: PermutationEntropyEstimator

A symbolic permutation entropy estimator using motifs of length `m`.
The entropy is computed to base `b` (the default `b = 2` gives the 
entropy in bits).

The motif length must be ≥ 2. By default `m = 2`, which is the shortest 
possible permutation length that retains any meaningful dynamical information.
"""
struct SymbolicPermutation <: PermutationEntropyEstimator
    b::Real
    m::Int 
    
    function SymbolicPermutation(; b::Real = 2, m::Int = 2)
        m >= 2 || throw(ArgumentError("Dimensions of individual marginals must be at least 2. Otherwise, symbol sequences cannot be assigned to the marginals. Got m=$(m)."))

        new(b, m)
    end
end
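For concreteness, construction then looks like this (examples based directly on the definition above):

est = SymbolicPermutation(m = 3)                            # motifs of length 3, entropy in bits
est_nats = SymbolicPermutation(m = 3, b = MathConstants.e)  # entropy in nats
SymbolicPermutation(m = 1)                                  # throws ArgumentError (m must be ≥ 2)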

The function signatures for computing (lagged) mutual information (and friends) between x and y (which can be either time series or Datasets) using permutation entropy are

mutualinformation(x, y, method::SymbolicPermutation)
transferentropy(x, y, embedding, method::SymbolicPermutation)
predictive_asymmetry(x, y, ηs, method::SymbolicPermutation)
# ... and so on

Under the hood, I've defined the entropy method analogously:

import StatsBase.entropy

"""
    entropy(x::Dataset, method::SymbolicPermutation)
    entropy(x::AbstractArray{T, 1}, method::SymbolicPermutation) where T
    entropy(x::AbstractArray{T, 1}, τ::Int, method::SymbolicPermutation) where T
    entropy(x::AbstractArray{T, 1}, emb::GeneralizedEmbedding, method::SymbolicPermutation) where T

Compute the permutation entropy of `x` [^BandtPompe2002]. 

- If `x` is a `D`-dimensional `Dataset`, compute the order-`D` permutation entropy (using motifs of length `D`).
- If `x` is a time series, embed `x` with equidistant lags `0:1:(method.m-1)`, then compute the permutation entropy of order `method.m` (using motifs of length `method.m`).
- If `x` is a time series and a single delay reconstruction lag `τ` is provided, embed `x` with equidistant lags `0:τ:τ*(method.m-1)`, then compute the permutation entropy of order `method.m` (using motifs of length `method.m`).
- If `x` is a time series and a `GeneralizedEmbedding` is provided, compute the permutation entropy of the order dictated by `emb` (i.e. if `emb` gives a 5-dimensional embedding, use motifs of length 5). This method allows more flexibility when nonequidistant embedding lags are to be used.

Note: `τ` may be either positive or negative; its sign only controls whether forwards or backwards embedding vectors are constructed.

[^BandtPompe2002]: Bandt, Christoph, and Bernd Pompe. "Permutation entropy: a natural complexity measure for time series." Physical review letters 88.17 (2002): 174102.
"""
function entropy end

These entropy methods are used under the hood by the higher-level functions such as mutualinformation.

For example, if you call mutualinformation(x, y, SymbolicPermutation(m = 3, b = 2)), that will compute mutual information using the permutation entropy of order 3 using the base-2 logarithm.
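For anyone who wants the gist without reading the implementation, here is a minimal sketch of what the time series method boils down to (this is just the plain Bandt-Pompe recipe spelled out, not the optimized implementation in CausalityToolsBase):

function permutation_entropy_sketch(x::AbstractVector; m::Int = 2, b::Real = 2)
    nwindows = length(x) - m + 1
    counts = Dict{Vector{Int}, Int}()       # ordinal pattern => occurrence count
    for i in 1:nwindows
        motif = sortperm(@view x[i:i+m-1])  # ordinal pattern of window i
        counts[motif] = get(counts, motif, 0) + 1
    end
    # Shannon entropy of the motif distribution, to base b
    return -sum(c / nwindows * log(b, c / nwindows) for c in values(counts))
end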

Questions/thoughts

Where should permutation entropies live?

  1. There are several variants of permutation entropy besides the original by Bandt and Pompe. Examples are weighted permutation entropy, amplitude-aware permutation entropy, fine-grained permutation entropy, and more. I think they all deserve a place in ChaosTools, because they can capture dynamical information that the basic permutation entropy cannot.
  2. I would like to be able to choose between different permutation entropy methods when calling permentropy.
  3. I already define entropy(x, method::SymbolicPermutation), entropy(x, method::SymbolicAmplitudeAware), etc. in CausalityToolsBase. It doesn't hurt to have a separate permentropy implementation in ChaosTools too, but it seems a bit redundant. Could ChaosTools adopt this entropy approach and the SymbolicPermutation/SymbolicAmplitudeAware/etc. estimators? Alternatively, could it import those and use them under the hood in permentropy (so that the syntax doesn't break)?

I'm not sure which methods belong in which package, but I have to use these entropy estimators in CausalityToolsBase, which is supposed to stay light on dependencies. Importing ChaosTools introduces dependencies on, for example, DSP and LombScargle, which I don't want. So I'm thinking the options are:

  1. Keep permentropy as is here and go with the separate entropy(x, method::EntropyEstimator) interface in CausalityTools.
  2. Let ChaosTools use the new interface by importing it from CausalityToolsBase.
  3. Create an entirely new, small package called EntropyEstimators (or something along those lines) that both CausalityToolsBase and ChaosTools can import from.

Or some better suggestion, if you have ideas!

Speed-ups for permutation entropies

Comparing ChaosTools.permentropy with my new approach, I also get significant speed-ups in the estimation of permutation entropy (more than an order of magnitude).

using CausalityToolsBase, ChaosTools, BenchmarkTools

function run(n::Int = 100; order::Int = 5, b::Real = 2, method::Symbol)
    x = rand(n)
    if method == :permentropy
        permentropy(x, order, base = b)
    elseif method == :entropy
        entropy(x, SymbolicPermutation(m = order, b = b))
    end
end 

order = 5
b = MathConstants.e
npts = 10000
# Initial run
eold = run(100, order = order, b = b, method = :permentropy)
enew = run(100, order = order, b = b, method = :entropy)
@show eold, enew

# Time it
@btime run(npts, order = order, b = b, method = :permentropy)
@btime run(npts, order = order, b = b, method = :entropy)

which gives

(eold, enew) = (4.781810515831264, 4.779657182929963)
  6.195 ms (40609 allocations: 2.26 MiB)
  252.858 μs (25 allocations: 706.09 KiB)

For the analysis of multiple time series (e.g. between columns of a Dataset), this can be made even more efficient by creating an EntropyGenerator struct/functor that pre-allocates the necessary arrays and re-uses them when the method is applied to multiple time series (analogous to what we do with SurrogateGenerator in TimeseriesSurrogates).
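Something along these lines (names and fields are hypothetical; this is a sketch of the pattern, not a proposed final API):

struct EntropyGenerator{M}
    method::M          # e.g. a SymbolicPermutation instance
    perm::Vector{Int}  # pre-allocated buffer for one ordinal pattern
end

EntropyGenerator(method) = EntropyGenerator(method, zeros(Int, method.m))

function (eg::EntropyGenerator)(x::AbstractVector)
    m, b = eg.method.m, eg.method.b
    nwindows = length(x) - m + 1
    counts = Dict{Vector{Int}, Int}()
    for i in 1:nwindows
        sortperm!(eg.perm, @view x[i:i+m-1])  # re-use the buffer across windows
        key = copy(eg.perm)                   # copy only to key the Dict
        counts[key] = get(counts, key, 0) + 1
    end
    return -sum(c / nwindows * log(b, c / nwindows) for c in values(counts))
end

The generator would be constructed once and applied to each time series, e.g. eg = EntropyGenerator(SymbolicPermutation(m = 3)) followed by [eg(x) for x in timeseries_collection].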

Let me know what you think, @Datseris.



@kahaaga (Member, Author) commented Jun 11, 2020

Ah, I just noticed that there is already an issue about the performance of permentropy (#22). So this implementation addresses that too.

@Datseris (Member) commented

I will read this in more detail later and answer properly, but as a quick reply: why not make a package Entropies.jl, which would also include the histogram stuff I have in ChaosTools (which is also the fastest on the market), the permentropy stuff, and the kernel estimation stuff, and then we include this package in DynamicalSystems.jl and CausalityTools.jl?

@Datseris (Member) commented

cf. JuliaDynamics/DelayEmbeddings.jl#33; a unified Entropy package will happen soon!
