Missing External Data File #467

Open
brian-at-pieces opened this issue Feb 28, 2024 · 1 comment
@brian-at-pieces

Hey there! I just stumbled across your repo as I was debugging my own LoRA onnxruntime code - it looks like we independently took a really similar approach (identify reused inits, externalize them, delete data from .onnx, load shared inits at runtime) so kudos!

The only difference is that I had to adjust the onnx source code to disable the check for the external data file because onnx was throwing an error saying that it didn't exist. It was my understanding that onnx shouldn't even be looking for this file if the initializer was already added via session_options.add_initializer(...).
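
For reference, the pattern I mean is roughly this (a minimal sketch, with a placeholder initializer name and model path):

    import numpy as np
    import onnxruntime as ort

    # a shared weight that has already been computed/merged in memory
    shared = np.zeros((768, 768), dtype=np.float32)
    ort_value = ort.OrtValue.ortvalue_from_numpy(shared)

    sess_options = ort.SessionOptions()
    # the name must match the initializer name inside the graph
    sess_options.add_initializer("shared_weight", ort_value)

    session = ort.InferenceSession("model.onnx", sess_options=sess_options)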

If you wouldn't mind sharing, did you encounter this problem, or do you think I'm just not handling the initializers properly? Thanks!

@ssube
Owner

ssube commented Mar 7, 2024

I haven't encountered that problem, but I think that's because I am almost always loading from a model that already has an external data file, at least for the UNet. That is not the case for the text encoder, but it seems to work anyway.

Looking at my implementation, I think https://onnx.ai/onnx/api/external_data_helper.html#set-external-data is doing a lot of the heavy lifting. If you look at convert_model_to_external_data, it calls set_external_data in a loop. For simplicity, I assume that any tensor with raw data needs to be externalized, which leads to:

        if tensor.HasField("raw_data"):
            # copy the embedded weights into an OrtValue that will be handed
            # to the session at load time
            npt = numpy_helper.to_array(tensor)
            orv = OrtValue.ortvalue_from_numpy(npt)
            external_data.append((name, orv))
            # mimic set_external_data: mark the tensor as external, pointing
            # at a location that does not have to exist
            set_external_data(tensor, location="foo.bin")
            tensor.name = name
            # drop the embedded weights so they are not loaded from the proto
            tensor.ClearField("raw_data")

As you can see there, the only part of the metadata that I set is the location, and that's not a real file on my system (I checked, just in case 😆 ).
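
For completeness, here is a self-contained sketch of that pass (roughly the shape of the buffer_external_data_tensors helper used in the loading snippet below, not copied verbatim from the repo):

    from typing import List, Tuple

    from onnx import ModelProto, numpy_helper
    from onnx.external_data_helper import set_external_data
    from onnxruntime import OrtValue

    def buffer_external_data_tensors(
        model: ModelProto,
    ) -> Tuple[ModelProto, List[Tuple[str, OrtValue]]]:
        """Move every raw-data initializer out of the proto into an OrtValue."""
        external_data = []
        for tensor in model.graph.initializer:
            name = tensor.name
            if tensor.HasField("raw_data"):
                npt = numpy_helper.to_array(tensor)
                orv = OrtValue.ortvalue_from_numpy(npt)
                external_data.append((name, orv))
                # the location only needs to be non-empty; it is never opened
                set_external_data(tensor, location="weights.bin")
                tensor.name = name
                tensor.ClearField("raw_data")
        return (model, external_data)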

Whether a TensorProto is considered external comes down to its data_location field, and within the external_data entries, location is the only required key:

def uses_external_data(tensor: TensorProto) -> bool:
    """Returns true if the tensor stores data in an external location."""
    return (  # type: ignore[no-any-return]
        tensor.HasField("data_location")
        and tensor.data_location == TensorProto.EXTERNAL
    )

set_external_data already sets the data_location to EXTERNAL; I think clearing raw_data is also important, so the loader doesn't fall back to the embedded copy and the .onnx file stays small. That's also explained here: https://github.com/onnx/onnx/blob/main/docs/ExternalData.md#external_data-field
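
As a quick sanity check (just a sketch of the onnx helpers, not code from this repo):

    import numpy as np
    from onnx import numpy_helper
    from onnx.external_data_helper import set_external_data, uses_external_data

    tensor = numpy_helper.from_array(np.ones((2, 2), dtype=np.float32), name="weight")
    print(uses_external_data(tensor))  # False: data_location has not been set

    # set_external_data flips data_location to TensorProto.EXTERNAL
    set_external_data(tensor, location="foo.bin")
    tensor.ClearField("raw_data")
    print(uses_external_data(tensor))  # True, even though foo.bin never exists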

On the loading side, I believe that the loader does not look for the external data file if the initializers have already been provided, but I can't find the relevant code for that in the ORT source. My loading code is not complicated:

    # split the model into a stripped proto and a list of (name, OrtValue) pairs
    (unet_model, unet_data) = buffer_external_data_tensors(unet)
    unet_names, unet_values = zip(*unet_data)
    unet_opts = device.sess_options(cache=False)
    # hand the buffered initializers to ORT before creating the session
    unet_opts.add_external_initializers(list(unet_names), list(unet_values))
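
Outside of this repo's helpers (device.sess_options is project-specific), the same pattern with plain onnxruntime would look roughly like this sketch; the CPU provider and loading from SerializeToString are just placeholders:

    import onnxruntime as ort

    (unet_model, unet_data) = buffer_external_data_tensors(unet)
    unet_names, unet_values = zip(*unet_data)

    sess_options = ort.SessionOptions()
    # register the buffered tensors so ORT never looks for the external file on disk
    sess_options.add_external_initializers(list(unet_names), list(unet_values))

    session = ort.InferenceSession(
        unet_model.SerializeToString(),
        sess_options=sess_options,
        providers=["CPUExecutionProvider"],
    )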

I have thought about turning this ONNX LoRA code into a library that can be used on its own, or contributing it upstream to one of the Huggingface projects. I have some tests, not as many as I'd like, but enough to cover the core features of a small library, if that would be useful.
