Question about BitLinear Implementation #5

thwannbe opened this issue Mar 12, 2024 · 0 comments
Hi,
I have a question about your BitLinear.forward() implementation.
The BitNet paper says the output should take the form y = binarized_weight(W) @ AbsMaxQuant(LN(x)) * (beta * gamma / Q_b)
(LN is layer normalization, as the paper describes).
In your implementation, however, the output appears to be computed as y = AbsMaxQuant(binarized_weight(W) @ x).
Why do you drop LN(x) and reverse the order from the paper's formulation? There also seems to be no dequantization step rescaling by beta * gamma / Q_b in your implementation.
Could you share the reasoning behind your implementation? If I have misunderstood it, please correct me.
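
For reference, here is a minimal sketch of the paper's forward pass as I read it (assuming PyTorch; the class name `BitLinearPaper`, the bit width `b`, and `eps` are my own illustrative choices, not from this repo, and the straight-through estimator for gradients is omitted for brevity):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BitLinearPaper(nn.Module):
    """Sketch of BitLinear per the BitNet paper: binarize weights,
    absmax-quantize LayerNorm'd activations, then dequantize the
    matmul output by beta * gamma / Q_b."""

    def __init__(self, in_features, out_features, b=8):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_features, in_features))
        self.ln = nn.LayerNorm(in_features)
        self.Q_b = 2 ** (b - 1)  # e.g. b=8 -> Q_b = 128

    def forward(self, x):
        eps = 1e-5

        # 1. Binarize weights: sign of zero-centered weights,
        #    with scaling factor beta = mean(|W|)
        w = self.weight
        beta = w.abs().mean()
        w_bin = torch.sign(w - w.mean())

        # 2. AbsMax-quantize the LayerNorm'd activations to b bits,
        #    with scaling factor gamma = max(|LN(x)|)
        x_ln = self.ln(x)
        gamma = x_ln.abs().max()
        x_q = torch.clamp(x_ln * self.Q_b / gamma,
                          -self.Q_b + eps, self.Q_b - eps)

        # 3. Matmul with binarized weights, then dequantize
        #    by beta * gamma / Q_b as in the paper's y formula
        y = F.linear(x_q, w_bin) * beta * gamma / self.Q_b
        return y
```

In contrast, this repo's forward seems to quantize the matmul output itself, i.e. AbsMaxQuant(binarized_weight(W) @ x), with no LN or rescaling, so I'd like to understand whether that difference is intentional.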
