Long-Sequence Recommendation Models Need Decoupled Embeddings (ICLR 2025)

This repository provides the official implementation of the DARE (Decoupled Attention and Representation Embedding) model from our paper (https://arxiv.org/abs/2410.02604).
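At a high level, DARE decouples the embeddings used for attention from those used for representation: each item has two embedding tables, one that drives the attention scores over the behavior sequence and one that provides the values aggregated into the user interest. The snippet below is a minimal, hypothetical sketch of this idea (the module name `DecoupledTargetAttention` and its arguments are illustrative, not the actual code in this repository):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DecoupledTargetAttention(nn.Module):
    """Minimal sketch of decoupled attention/representation embeddings (illustrative only)."""

    def __init__(self, num_items: int, dim: int):
        super().__init__()
        # Two independent embedding tables: one is used only to compute attention
        # scores, the other only to form the representation that gets aggregated.
        self.attn_emb = nn.Embedding(num_items, dim)
        self.repr_emb = nn.Embedding(num_items, dim)

    def forward(self, target_item: torch.Tensor, behavior_seq: torch.Tensor) -> torch.Tensor:
        # target_item: (batch,), behavior_seq: (batch, seq_len) of item IDs
        q = self.attn_emb(target_item)    # (batch, dim) query from the attention table
        k = self.attn_emb(behavior_seq)   # (batch, seq_len, dim) keys from the attention table
        v = self.repr_emb(behavior_seq)   # (batch, seq_len, dim) values from the representation table

        scores = torch.einsum("bd,bld->bl", q, k) / q.size(-1) ** 0.5
        weights = F.softmax(scores, dim=-1)                 # attention over the behavior sequence
        interest = torch.einsum("bl,bld->bd", weights, v)   # aggregated user interest
        return interest
```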

πŸ”₯ News

Our paper has been accepted to the International Conference on Learning Representations (ICLR 2025).

πŸ› οΈ Environment

There are no strict package version requirements. You can use an existing environment with PyTorch and install any additional dependencies as needed. Alternatively, set up a new Conda environment with the following commands:

```bash
conda create --name DARE python=3.7
conda activate DARE
pip install -r requirements.txt
```

πŸ“¦ Dataset

Our experiments use the Taobao and Tmall datasets. To download and preprocess the data, follow the instructions in preprocess/README.md.

πŸš€ Train

We support training DARE, TWIN, DIN, and their variants on the Taobao and Tmall datasets. Example training scripts are provided in scripts/Taobao.sh and scripts/tmall.sh (explanations of the input parameters can be found in train_pytorch.py).

The code for ETA, SDIM, and TWIN-V2 is not yet open-sourced. If you need their implementations, please contact the respective authors.

Note that various small implementation tricks may affect the results; adding or removing them can slightly change performance. In addition, candidate sampling introduces randomness, leading to slight variations in absolute numbers. The relative performance trends, however, should remain consistent with our paper.
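If you want to reduce run-to-run variance from such randomness, fixing the random seeds can help. The following is a generic PyTorch recipe (the helper name `set_seed` is illustrative and not part of this repository):

```python
import random

import numpy as np
import torch

def set_seed(seed: int = 42) -> None:
    """Fix the common sources of randomness (generic recipe, not specific to this repo)."""
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    torch.cuda.manual_seed_all(seed)
```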

πŸŽ‡ Analysis

The analysis scripts are located in the ./analysis directory, covering attention, gradient, representation, and training. Each folder contains a README.md with instructions; running the provided commands reproduces the corresponding analysis figures.

πŸ“œ Citation

If you find this project useful, please cite our paper:

@inproceedings{feng2025DARE,
    title={Long-Sequence Recommendation Models Need Decoupled Embeddings}, 
    author={Ningya Feng and Junwei Pan and Jialong Wu and Baixu Chen and Ximei Wang and Qian Li and Xian Hu and Jie Jiang and Mingsheng Long},
    booktitle={International Conference on Learning Representations},
    year={2025},
}

🀝 Contact

If you have any questions, please contact fny21@mails.tsinghua.edu.cn.

πŸ’‘ Acknowledgement

Our code is based on the SIM Official Code and UBR4CTR.
