
SLA on wan2.2 is faster than FA2 but consumes more VRAM #5

@dz1iang

Description


Hello, thank you for the open-source SLA project. My experience with using it on wan2.2 is as follows:

  1. Training speed improved by over 30%;
  2. In configurations where FA2 did not run out of memory (OOM), SLA now hits OOM.

Finally, I would like to ask two questions:

  1. Why does this OOM occur?
  2. Are there any optimization techniques for SLA, particularly regarding compute efficiency and VRAM usage?
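For what it's worth, a back-of-envelope estimate makes the gap between FA2's working set and anything that materializes attention scores concrete. This is a rough sketch, not the SLA implementation: the shapes (1 batch, 40 heads, head dim 128, ~75k video tokens) are illustrative guesses for a wan2.2-scale DiT, and the 10% "kept score blocks" fraction is a made-up example of what a sparse kernel might buffer.

```python
GIB = 2**30

def qkv_bytes(batch, heads, seq, head_dim, dtype_bytes=2):
    """Q, K, V activations in fp16/bf16 -- roughly FA2's extra working set,
    since FA2 streams over K/V tiles and never stores the S x S scores."""
    return 3 * batch * heads * seq * head_dim * dtype_bytes

def score_bytes(batch, heads, seq, dtype_bytes=2, keep_fraction=1.0):
    """Bytes needed if a kernel materializes `keep_fraction` of the
    S x S attention-score matrix (e.g. a subset of sparse blocks)."""
    return int(batch * heads * seq * seq * dtype_bytes * keep_fraction)

if __name__ == "__main__":
    b, h, s, d = 1, 40, 75_600, 128  # illustrative wan2.2-scale shapes
    print(f"QKV (~ FA2 peak extra): {qkv_bytes(b, h, s, d) / GIB:7.1f} GiB")
    print(f"full S x S scores:      {score_bytes(b, h, s) / GIB:7.1f} GiB")
    print(f"10% of score blocks:    {score_bytes(b, h, s, keep_fraction=0.1) / GIB:7.1f} GiB")
```

Even keeping 10% of the score matrix resident (tens of GiB at these sequence lengths) dwarfs the few GiB FA2 needs for Q/K/V alone, so any extra per-block buffers, masks, or saved statistics on the sparse path can plausibly tip a borderline configuration into OOM.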
