
Conversation

@dajes dajes commented Jul 14, 2023

  1. Added the ability to choose the length of the video and the size of the temporal attention module's context separately. By using a sliding attention window it is now possible to generate infinitely long GIFs.
    Sliding-window parameters (a scheduling sketch follows this list):
    --L - the length of the generated animation.
    --context_length - the length of the sliding window (limited by the motion module's capacity); defaults to L.
    --context_overlap - how much neighbouring contexts overlap; defaults to context_length / 2.
    --context_stride - the maximum stride between two neighbouring frames is 2^context_stride; defaults to 0.
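
For readers who want the mechanics, here is a minimal sketch of how overlapping contexts could be enumerated from these parameters. This is not the PR's actual code; the function name and the exact scheduling order are my assumptions.

```python
def context_windows(L, context_length, context_overlap=None, context_stride=0):
    """Yield lists of frame indices; each list is one temporal-attention context."""
    context_length = min(context_length, L)
    if context_overlap is None:
        context_overlap = context_length // 2
    # For each power-of-two stride up to 2**context_stride, slide a window of
    # `context_length` strided frames across the L-frame animation.
    for power in range(context_stride + 1):
        stride = 2 ** power
        step = max((context_length - context_overlap) * stride, 1)
        last_start = max(L - (context_length - 1) * stride, 1)
        for start in range(0, last_start, step):
            yield [start + i * stride for i in range(context_length)]

# Example: a 32-frame animation attended in 16-frame contexts overlapping by 8
# yields the windows [0..15], [8..23], [16..31].
for window in context_windows(L=32, context_length=16, context_overlap=8):
    print(window)
```

A real implementation would additionally blend the denoising predictions where windows overlap and cover the tail when (L - context_length) is not a multiple of the step.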

  2. Added support for .pt textual inversions from civit.ai; they should be placed in the models/embeddings directory. I'm not entirely sure this implementation is fully correct, but it works fine for me. A loading sketch follows below.
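
For anyone curious how such a loader can work, here is a hedged sketch. It assumes the common A1111-style .pt layout (a "string_to_param" dict) and a transformers-style text encoder; the PR's own loader may differ.

```python
import torch
from pathlib import Path

def load_pt_embedding(path, tokenizer, text_encoder):
    """Register a textual-inversion embedding as new tokens on the text encoder."""
    data = torch.load(path, map_location="cpu")
    # A1111-style .pt files store {"string_to_param": {"*": tensor[n_vectors, dim]}}.
    embeds = data["string_to_param"]["*"]
    base = Path(path).stem  # trigger word defaults to the file name
    tokens = [base] if len(embeds) == 1 else [f"{base}_{i}" for i in range(len(embeds))]
    tokenizer.add_tokens(tokens)
    text_encoder.resize_token_embeddings(len(tokenizer))
    token_ids = tokenizer.convert_tokens_to_ids(tokens)
    with torch.no_grad():
        # Copy each learned vector into the freshly added embedding rows.
        for token_id, vector in zip(token_ids, embeds):
            text_encoder.get_input_embeddings().weight[token_id] = vector
    return tokens
```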

  3. Inference now automatically uses torch.autocast to fp16 unless --fp32 is specified. It sped things up by about 100% (2x) in my tests; see the sketch below.
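
The described behavior boils down to something like the following sketch; the flag and pipeline names are assumptions, not the PR's exact code.

```python
import torch

def run_inference(pipeline, prompt, fp32=False):
    # torch.autocast is a no-op when enabled=False, so passing --fp32
    # simply restores full-precision inference.
    with torch.autocast("cuda", dtype=torch.float16, enabled=not fp32):
        return pipeline(prompt)
```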

xdomiall commented Jul 17, 2023

Thank you dajes for your contribution! I've tested the fp16 autocast on my 4090 and the speedup is almost 4x: from 55 s per GIF I went down to 15 s per GIF.

Cubey42 commented Jul 26, 2023

Any chance you would be interested in figuring out how to add embeddings or the context stride to this repo? https://github.com/neggles/animatediff-cli

Cubey42 commented Jul 26, 2023

> Any chance you would be interested in figuring out how to add embeddings or the context stride to this repo? https://github.com/neggles/animatediff-cli

Actually, I was able to get it to work. Never mind!

@Bendito999

Just a reminder for users with an old Maxwell card like the Tesla M40: fp16 mode actually causes a 3x slowdown instead of a 3x speed boost, so use fp32 on Maxwell cards. Maxwell doesn't have dedicated fp16 hardware. Found this out the hard way, haha. A sketch for picking the dtype automatically follows below.
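
If you want to automate that choice, here is a hedged sketch of a dtype heuristic based on CUDA compute capability. Maxwell is 5.x; the 7.0+ cutoff is my assumption of where fp16 reliably pays off.

```python
import torch

def pick_dtype():
    """Use fp16 only on compute capability 7.0+ (Volta and newer); else fp32."""
    if torch.cuda.is_available():
        major, _minor = torch.cuda.get_device_capability()
        if major >= 7:
            return torch.float16
    return torch.float32
```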

@JACKHAHA363

Is this PR merged anywhere?

dajes (Author) commented Feb 8, 2024

> Is this PR merged anywhere?

AFAIK this technique is used in https://github.com/neggles/animatediff-cli and https://github.com/magic-research/magic-animate.
