EmMark

[DAC'24] EmMark: Robust Watermarks for IP Protection of Embedded Quantized Large Language Models

Paper

Experiment

INT4 Quantization WM

For environment setup, please refer to AWQ.

Step 1: Preprocess to obtain the quantized model along with its activations

$ cd int4_wm
$ bash scripts/opt_watermark.sh opt-2.7b ours /path/save/llm
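For intuition, below is a minimal, hypothetical sketch of how per-channel activation statistics could be gathered on a calibration set. The function name `collect_activation_scales`, the use of forward hooks, and the max-absolute-value statistic are illustrative assumptions; the actual preprocessing is performed inside `scripts/opt_watermark.sh` and the code it calls.

```python
import torch
from torch import nn

@torch.no_grad()
def collect_activation_scales(model: nn.Module, calib_loader):
    """Record the max absolute input activation per channel for every Linear layer."""
    scales = {}
    hooks = []

    def make_hook(name):
        def hook(module, inputs, output):
            # Flatten all tokens and track the running per-channel maximum.
            x = inputs[0].detach().reshape(-1, inputs[0].shape[-1])
            per_channel = x.abs().amax(dim=0)
            scales[name] = torch.maximum(scales.get(name, per_channel), per_channel)
        return hook

    for name, module in model.named_modules():
        if isinstance(module, nn.Linear):
            hooks.append(module.register_forward_hook(make_hook(name)))

    for batch in calib_loader:
        model(batch)

    for h in hooks:
        h.remove()
    return scales
```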

Step 2: Watermark the model

Change the `status` variable in the script from `save` to `watermark`.

$ bash scripts/opt_watermark.sh opt-2.7b ours /path/save/llm
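As a rough, hypothetical illustration of the embedding idea (signature bits encoded as small perturbations on a set of selected quantized weights), one might write something like the sketch below. The index-selection strategy, the ±1 encoding, and the function name are illustrative assumptions, not the repository's actual implementation; in EmMark the insertion positions are chosen with both weight and activation information to preserve quality and robustness.

```python
import numpy as np

def embed_signature(q_weight: np.ndarray, indices: np.ndarray, signature: np.ndarray,
                    qmin: int = -8, qmax: int = 7) -> np.ndarray:
    """Encode each signature bit as a +1/-1 perturbation at a chosen weight index."""
    wm_weight = q_weight.copy()
    flat = wm_weight.reshape(-1)  # view into the copy
    for idx, bit in zip(indices, signature):
        delta = 1 if bit == 1 else -1
        # Keep the perturbed value inside the INT4 range.
        flat[idx] = np.clip(flat[idx] + delta, qmin, qmax)
    return wm_weight
```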

INT8 Quantization WM

For environment setup, please refer to LLM.int8() and SmoothQuant.

Step 1: Preprocess to obtain the quantized model along with its activations

$ cd int8_wm
$ bash scripts/opt_watermark.sh opt-2.7b ours /path/save/llm

Step 2: Watermark the model

Change the `status` variable in the script from `save` to `watermark`.

$ bash scripts/opt_watermark.sh opt-2.7b ours /path/save/llm
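To check ownership afterwards, the signature can be recovered by comparing the watermarked weights against the original ones at the secret indices. The sketch below is again a hypothetical illustration; the sign-based extraction, the match-rate threshold, and the function names are assumptions rather than the repository's API.

```python
import numpy as np

def extract_signature(q_weight_wm: np.ndarray, q_weight_orig: np.ndarray,
                      indices: np.ndarray) -> np.ndarray:
    """Recover signature bits from the sign of the perturbation at each secret index."""
    diff = q_weight_wm.reshape(-1)[indices].astype(np.int32) \
         - q_weight_orig.reshape(-1)[indices].astype(np.int32)
    return (diff > 0).astype(np.int8)

def verify(extracted: np.ndarray, signature: np.ndarray, threshold: float = 0.9) -> bool:
    """Claim ownership when the bit match rate exceeds the threshold."""
    match_rate = float((extracted == signature).mean())
    return match_rate >= threshold
```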

Acknowledgement

Our code builds heavily upon AWQ, LLM.int8(), and SmoothQuant. We thank the authors for open-sourcing their code.

Citation

If you find our code/paper helpful, please cite:

@inproceedings{zhang2024emmark,
  title={EmMark: Robust watermarks for IP protection of embedded quantized large language models},
  author={Zhang, Ruisi and Koushanfar, Farinaz},
  booktitle={Proceedings of the 61st ACM/IEEE Design Automation Conference},
  pages={1--6},
  year={2024}
}
