CG-GAN (CVPR 2022)

This is the official PyTorch implementation of the CVPR 2022 paper: "Look Closer to Supervise Better: One-Shot Font Generation via Component-Based Discriminator".

Requirements

(You are welcome to help develop CG-GAN.)

We recommend using Anaconda to manage your environment and libraries.

Data Preparation

Please convert your own dataset to LMDB format using the tool lmdb_maker.py (the script targets Python 2.7; you can port it to Python 3 if needed).

The character (text) label, the radical list, and the corresponding writer ID are required for every text image.
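
For reference, below is a minimal sketch of writing such samples into LMDB with the lmdb Python package. The key names (image-%09d, label-%09d, radical-%09d, writer-%09d) and the num-samples counter are illustrative assumptions, not necessarily the scheme used by lmdb_maker.py; treat that script as the authoritative format.

  import lmdb

  def write_lmdb(output_path, samples):
      # samples: iterable of dicts with 'image' (raw image bytes), 'label'
      # (character/text string), 'radicals' (radical list as one string) and
      # 'writer' (writer ID string). These field and key names are assumptions
      # made for illustration; lmdb_maker.py defines the real key scheme.
      env = lmdb.open(output_path, map_size=1 << 32)  # ~4 GB map size
      with env.begin(write=True) as txn:
          for i, s in enumerate(samples):
              idx = '%09d' % i
              txn.put(('image-' + idx).encode(), s['image'])
              txn.put(('label-' + idx).encode(), s['label'].encode('utf-8'))
              txn.put(('radical-' + idx).encode(), s['radicals'].encode('utf-8'))
              txn.put(('writer-' + idx).encode(), s['writer'].encode('utf-8'))
          txn.put(b'num-samples', str(len(samples)).encode())
      env.close()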

Please prepare a TTF font and a corpus for rendering the printed-style (source) images.

For the Chinese font generation task, we recommend 思源宋体 (Source Han Serif) as the source font; download it and put it into the data/font folder. You can download target fonts from 方正字库 (Founder Type) to build your own dataset.
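
As an illustration, a printed-style glyph can be rendered from a font file with Pillow roughly as follows; the font path, canvas size and centering logic are placeholder choices, not settings taken from this repository.

  from PIL import Image, ImageDraw, ImageFont

  def render_char(char, font_path, size=128):
      # Render one character, centered on a white square canvas.
      # font_path and size are hypothetical; point them at your data/font files.
      font = ImageFont.truetype(font_path, int(size * 0.8))
      img = Image.new('L', (size, size), color=255)
      draw = ImageDraw.Draw(img)
      left, top, right, bottom = draw.textbbox((0, 0), char, font=font)
      x = (size - (right - left)) // 2 - left
      y = (size - (bottom - top)) // 2 - top
      draw.text((x, y), char, font=font, fill=0)
      return img

  # Example usage (the font path is hypothetical):
  # render_char(u'福', 'data/font/SourceHanSerif-Regular.otf').save('sample.png')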

For the handwritten word synthesis task, please download the IAM dataset and then convert it to LMDB format. Alternatively, you can download the training and testing datasets prepared by us.

Training

Chinese font generation

Modify dataroot, ttfRoot and corpusRoot in scripts/train_character.sh to match your setup:

  --dataroot data/path to dataset \
  --ttfRoot data/font \
  --corpusRoot data/path to corpus \

Then train your model by running:

 sh scripts/train_character.sh

Handwritten word synthesis

Modify dataroot, ttfRoot and corpusRoot in scripts/train_handwritten.sh to match your setup:

  --dataroot data_iam/train_IAM \
  --ttfRoot data_iam/fonts_iam \
  --corpusRoot data_iam/seen_char.txt \

Then train your model by running:

 sh scripts/train_handwritten.sh

Testing

Chinese font generation

To test your model, run:

 sh scripts/test_character.sh

Handwritten word synthesis

To test your model, run:

 sh scripts/test_handwritten.sh

Citation

If our paper helps your research, please cite it in your publication(s):

@inproceedings{kong2022look,
  author    = {Kong, Yuxin and Luo, Canjie and Ma, Weihong and Zhu, Qiyuan and Zhu, Shenggao and Yuan, Nicholas and Jin, Lianwen},
  title     = {Look Closer to Supervise Better: One-Shot Font Generation via Component-Based Discriminator},
  booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  year      = {2022},
  publisher = {IEEE}
}
