Error with Pytorch metrics during PTQ Static #747

Open
DGP1607 opened this issue Dec 13, 2024 · 2 comments

@DGP1607

DGP1607 commented Dec 13, 2024

Describe the issue

Encountering an issue while running PTQ static quantization on a PyTorch model. The process uses PyTorch metrics for benchmarking, such as ['Accuracy', 'F1']. The workflow is executed within the VS Code Jupyter extension.

The issue arises as follows:

During the first execution of the entire notebook, everything runs smoothly without any issues.
However, upon executing the notebook a second time, I encounter an error. After restarting the kernel in VS Code it runs fine again, but a kernel restart should not be required every time just to register the metric.
The traceback for the error is as follows:
AssertionError Traceback (most recent call last)
Cell In[91], line 10
7 metrics = METRICS('pytorch')
8 top2 = metrics['Accuracy']()
---> 10 q_model = quantization.fit(
11 model=model,
12 conf=conf,
13 calib_dataloader=dataloader,
14 eval_dataloader=dataloader,
15 eval_metric=top2,
16 )
in fit(model, conf, calib_dataloader, calib_func, eval_func, eval_dataloader, eval_metric, **kwargs)
151 wrapped_model = Model(model, conf=conf)
153 if eval_metric is not None:
--> 154 metric = register_customer_metric(eval_metric, conf.framework)
155 else:
156 metric = None
in register_customer_metric(user_metric, framework)
1681 metric_cfg = {name: id(user_metric)}
1682 metrics = METRICS(framework)
-> 1683 metrics.register(name, metric_cls)
1684 return metric_cfg
...
221 """
--> 222 assert name not in self.metrics.keys(), "registered metric name already exists."
223 self.metrics.update({name: metric_cls})
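
For reference, the failing cell is essentially the following (a minimal sketch of my setup; model, conf, and dataloader are placeholders for the PyTorch model, PTQ static config, and DataLoader built earlier in the notebook):

```python
from neural_compressor import quantization
from neural_compressor.metric import METRICS

metrics = METRICS('pytorch')
top2 = metrics['Accuracy']()   # built-in PyTorch Accuracy metric

# model, conf and dataloader are placeholders for the PyTorch model,
# the PTQ static config and the calibration/evaluation DataLoader
# created earlier in the notebook.
q_model = quantization.fit(
    model=model,
    conf=conf,
    calib_dataloader=dataloader,
    eval_dataloader=dataloader,
    eval_metric=top2,
)

# First run of the notebook: works fine.
# Re-running this cell in the same kernel raises
# AssertionError: registered metric name already exists.
```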

@feng-intel feng-intel self-assigned this Dec 13, 2024
@feng-intel

feng-intel commented Dec 13, 2024

assert name not in self.metrics.keys(), "registered metric name already exists."

What's the registered metric name? Can you print it? Where is it registered the first time?
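
If it helps, something like this should show what is currently in the PyTorch metric registry (a sketch; it assumes the metrics dict checked by the assert in the traceback is reachable from a METRICS instance):

```python
from neural_compressor.metric import METRICS

# The assert in the traceback checks self.metrics.keys(), so listing the keys
# of a METRICS('pytorch') instance should reveal whether the custom metric
# name is still registered from a previous run.
print(sorted(METRICS('pytorch').metrics.keys()))
```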

@DGP1607
Author

DGP1607 commented Dec 23, 2024

Registered metric name: <neural_compressor.metric.metric.Accuracy object at 0x7fb8a291d340>
It looks like the registry is managed internally by the INC library, so it may not be straightforward to determine exactly where a particular metric was first registered. Is there any way to check where it is registered?

FYI: using Ubuntu 20.04.6 LTS & Python 3.8.10.
error trace
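
As a possible workaround until the root cause is clear, clearing whatever the previous fit() call registered might avoid the assert without a kernel restart (a sketch only; it assumes the per-framework registry behind METRICS('pytorch').metrics is a module-level dict that persists for the lifetime of the kernel, which is what the traceback suggests):

```python
from neural_compressor.metric import METRICS

# Assumption: METRICS('pytorch').metrics refers to the same dict that
# register_customer_metric() updates inside quantization.fit().
registry = METRICS('pytorch').metrics

# Snapshot the built-in metric names once, before the first fit() call.
baseline = set(registry.keys())

# ... first quantization.fit(...) run ...

# Before re-running the cell, drop anything registered by the previous run so
# metrics.register() does not hit "registered metric name already exists."
for name in set(registry.keys()) - baseline:
    registry.pop(name)
```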
