
Is it possible to predict next time series vector? #13

Open
HMaker opened this issue Jun 17, 2021 · 6 comments

Comments

HMaker commented Jun 17, 2021

I have a time series vector with several features; I call that vector the "state at time t". I thought of using the HyperGrid Transformer to encode the state vector and pass it to the Pattern Pooler. Given the states from time 0 to time t, can the Pattern Pooler predict the state at time t + 1? Or maybe it can be done with a Sequence Learner taking the output of the Pattern Pooler? This would be similar to the multivariate anomaly detection example, but with the blocks HyperGridTransformer > PatternPooler > SequenceLearner, predicting the next output like Long Short-Term Memory models can do.

I got this when changing the multivariate anomaly detection example to use HyperGridTransformer > PatternPooler > SequenceLearner (left is the original, right the new one):
[Image: new multivariate example]

@HMaker HMaker changed the title Is it possible to predict next state with Pattern Pooler? Is it possible to predict next state of a time series vector with Pattern Pooler and Sequence Learner? Jun 17, 2021
@HMaker HMaker changed the title Is it possible to predict next state of a time series vector with Pattern Pooler and Sequence Learner? Is it possible to predict next time series vector with Pattern Pooler and Sequence Learner? Jun 17, 2021
jacobeverist (Collaborator) commented Jun 22, 2021

Yes. Although I'm not sure I would put the PatternPooler in there, since I can't quite say what the benefits would be yet.

You would want your SequenceLearner to have a punishment decrement greater than 0 so that it reduces false predictions. If it were a pure anomaly detection task, you would generally want your SequenceLearner to have a punishment of 0.

Finally, you would put a PatternClassifier on the context output of the SequenceLearner to use this as the source for the t+1 prediction. The PatternClassifier is a native binary pattern classifier which works much better than any other ML classifier you might use.

Each state of the PatternClassifier would be an interval of the scalar output you want to predict. The states for your PatternClassifier would be determined by discretizing your scalar range, similar to https://scikit-learn.org/stable/modules/preprocessing.html#discretization
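A minimal sketch of the kind of discretization described above, using equal-width bins over the scalar range (like scikit-learn's KBinsDiscretizer with strategy='uniform'); the function name and bin layout here are illustrative, not part of BrainBlocks:

```python
import numpy as np

def discretize(values, lo, hi, n_bins):
    """Map scalar values to integer bin indices over [lo, hi].

    Each bin corresponds to one PatternClassifier state, so
    predicting a state predicts an interval of the scalar range.
    """
    edges = np.linspace(lo, hi, n_bins + 1)
    # np.digitize against the interior edges gives 0..n_bins-1;
    # clip so out-of-range values fall into the edge bins.
    idx = np.digitize(values, edges[1:-1])
    return np.clip(idx, 0, n_bins - 1)

# 10 equal-width states over the range [0.0, 1.0]
states = discretize(np.array([0.05, 0.42, 0.99]), 0.0, 1.0, 10)
```

At inference time, the classified state index maps back to an interval of the scalar range (e.g. state 4 above covers [0.4, 0.5)), which becomes the t+1 value prediction.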

We don't yet have an example for this, but we should.

You might also consider the BBClassifier class as a way to get started. It does straight inference from scalar feature vectors and uses the Hypergrid Transform and PatternClassifier under the hood. You would need to insert the SequenceLearner between these things.

The example script is here:
https://github.com/the-aerospace-corporation/brainblocks/blob/master/examples/python/classification/sklearn_style_classifier.py

The source code is here:
https://github.com/the-aerospace-corporation/brainblocks/blob/master/src/python/tools/bbclassifier.py

HMaker (Author) commented Jul 2, 2021

Hey @jacobeverist,

So if I have 5 features, for example, do I need 5 PatternClassifiers to predict the next values? I was playing with htm.core, which is a fork of nupic.core; I tried using 5 classifiers, but it consumed so much RAM that my 10 GB was not enough (TemporalMemory's output was 16 x 1024). Is brainblocks's classifier better at resource handling?

From what I understood of htm.core's classifiers, they take the output of the TemporalMemory, which is the pattern to be classified, and keep a matrix of weights relating each active pattern bit (I guess these are active neurons/cells) to each possible label/category; the weights are increased every time that bit is active during the learning phase for that label. Is brainblocks's classifier similar to it?

Thanks.

HMaker (Author) commented Jul 2, 2021

Also, from what I understood, the PatternClassifier will classify the current output of the SequenceLearner. What about the future, i.e. step-ahead classifications? That is actually my goal.

jacobeverist (Collaborator) commented

> So if I have 5 features, for example, do I need 5 PatternClassifiers to predict the next values? I was playing with htm.core, which is a fork of nupic.core; I tried using 5 classifiers, but it consumed so much RAM that my 10 GB was not enough (TemporalMemory's output was 16 x 1024). Is brainblocks's classifier better at resource handling?

Yes. nupic/htm.core is a real resource hog, and BrainBlocks is designed to be very efficient. You can accomplish the same tasks in a very scalable way.

> From what I understood of htm.core's classifiers, they take the output of the TemporalMemory, which is the pattern to be classified, and keep a matrix of weights relating each active pattern bit (I guess these are active neurons/cells) to each possible label/category; the weights are increased every time that bit is active during the learning phase for that label. Is brainblocks's classifier similar to it?

No. HTM uses a linear classifier, which is just a matrix of linear weights on the predicted output states that can be interpreted as probabilities.

The PatternClassifier is a "native" classifier for binary patterns. It is essentially the same as a PatternPooler (or SpatialPooler), but with a supervised component: you designate clusters of output bits as the predicted states and train on inputs with expected state outputs. The resulting output for arbitrary data is a set of bits that "vote" for the final inferred state. The state with the most bits is your classified state.
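The voting step described above can be sketched in a few lines of plain Python; the bit-to-state layout and function name here are hypothetical, just to show the counting mechanism:

```python
from collections import Counter

def classify_by_votes(active_bits, bit_to_state):
    """Tally which state each active output bit 'votes' for and
    return the state with the most votes."""
    votes = Counter(bit_to_state[b] for b in active_bits)
    return votes.most_common(1)[0][0]

# 8 output bits: the first half assigned to state "a" during
# training, the second half to state "b" (hypothetical layout).
layout = {i: ("a" if i < 4 else "b") for i in range(8)}

# Three active bits vote for "a", one for "b".
winner = classify_by_votes({0, 1, 2, 5}, layout)
```

The real PatternClassifier learns which bits activate for which inputs; this sketch only shows how the final vote is resolved once the active bits are known.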

jacobeverist (Collaborator) commented

> Also, from what I understood, the PatternClassifier will classify the current output of the SequenceLearner. What about the future, i.e. step-ahead classifications? That is actually my goal.

The PatternClassifier works with arbitrary binary pattern inputs. For the t+1 prediction step of the SequenceLearner, you need to take the sl.context binary patterns, which indicate the prediction states of the SequenceLearner. For inference at time t, you would use sl.output instead.

https://github.com/the-aerospace-corporation/brainblocks/blob/master/examples/python/classification/classif.py

This example shows the simple use case of converting an arbitrary binary pattern to an inferred state. It trains to convert a scalar value to one of two discrete states: {a, b}.
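The wiring described above (one classifier on sl.output for time t, another on sl.context for t+1) can be sketched with stand-in stubs; these classes are toy placeholders, not the BrainBlocks API, and the state names are made up:

```python
class StubSequenceLearner:
    """Placeholder exposing the two pattern sources mentioned above."""
    def __init__(self):
        self.output = set()   # bits active for the current input (time t)
        self.context = set()  # bits predicted for the next step (time t+1)

class StubClassifier:
    """Toy classifier: remembers which state each pattern was trained on."""
    def __init__(self):
        self.memory = {}
    def fit(self, bits, state):
        self.memory[frozenset(bits)] = state
    def predict(self, bits):
        return self.memory.get(frozenset(bits))

sl = StubSequenceLearner()
clf_now, clf_next = StubClassifier(), StubClassifier()

# After stepping the learner on the input at time t:
sl.output, sl.context = {1, 2}, {3, 4}
clf_now.fit(sl.output, "state_t")
clf_next.fit(sl.context, "state_t_plus_1")

current = clf_now.predict({1, 2})     # inference at time t
predicted = clf_next.predict({3, 4})  # step-ahead prediction for t+1
```

The point is only the data flow: the same classifier mechanism reads two different taps of the SequenceLearner, and the context tap is what gives you the step-ahead classification.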

@HMaker HMaker changed the title Is it possible to predict next time series vector with Pattern Pooler and Sequence Learner? Is it possible to predict next time series vector? Jul 5, 2021
HMaker (Author) commented Jul 5, 2021

Thanks for the detailed answer, I will try it and come back later with results.
