[ICCV 2021] WaveFill: A Wavelet-based Generation Network for Image Inpainting
Install the dependencies with:

```shell
pip install -r requirements.txt
```

This code also requires the pytorch_wavelets package:
```shell
$ git clone https://github.com/fbcotter/pytorch_wavelets
$ cd pytorch_wavelets
$ pip install .
```
Given the dataset, prepare the image paths in a folder named after the dataset, with the following folder structure.
```
flist/dataset_name
├── train.flist    # Relative or absolute paths of training images
├── valid.flist    # Relative or absolute paths of validation images
└── test.flist     # Relative or absolute paths of testing images
```
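Each `.flist` file is a plain text file with one image path per line. A minimal sketch for generating them is shown below; the `make_flist` helper and the 90/5/5 split are illustrative assumptions, not part of the released code:

```python
import os
import random

def make_flist(image_dir, out_dir, exts=(".jpg", ".png")):
    """Write train/valid/test .flist files listing images under image_dir.

    The 90/5/5 split below is an illustrative choice, not the official one.
    """
    # Collect image paths recursively, sorted for determinism.
    paths = sorted(
        os.path.join(root, f)
        for root, _, files in os.walk(image_dir)
        for f in files
        if f.lower().endswith(exts)
    )
    random.seed(0)
    random.shuffle(paths)
    n = len(paths)
    splits = {
        "train.flist": paths[: int(0.9 * n)],
        "valid.flist": paths[int(0.9 * n): int(0.95 * n)],
        "test.flist": paths[int(0.95 * n):],
    }
    os.makedirs(out_dir, exist_ok=True)
    for name, subset in splits.items():
        with open(os.path.join(out_dir, name), "w") as fh:
            fh.write("\n".join(subset))
```

Point `out_dir` at `./flist/dataset_name` so the result matches the structure above.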
In this work, we use CelebA-HQ (download available here), Places2 (download available here), and Paris StreetView (requires the authors' permission to download).
```
checkpoints
└── celebahq_wavefill
    └── latest_net_G.pth
```
```shell
# Specify the dataset, the name of the pretrained model, and the mask type in the bash file.
bash test_wavelet.sh
```
Download the pretrained VGG model from here and move it to `models/`. This model is used to calculate the training loss.
New models can be trained with the following commands.
Prepare the dataset. Use the `--dataroot` option to locate the directory of file lists, e.g. `./flist`, and specify the dataset name to train with the `--dataset_name` option. Set the mask type and mask ratio using the `--mask_type` and `--pconv_level` options.
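Putting these options together, a direct training invocation might look like the following. The experiment name and option values here are illustrative assumptions; run `python train_wavelet.py --help` to see the valid choices:

```shell
# Illustrative values only; see `python train_wavelet.py --help` for valid options.
python train_wavelet.py \
    --name celebahq_wavefill \
    --dataset_name celebahq \
    --dataroot ./flist \
    --mask_type 4 \
    --pconv_level 0 \
    --gpu_ids 0
```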
Train.
```shell
# Specify your own dataset or settings in the bash file.
bash train_wavelet.sh
```
There are many options you can specify; please run `python train_wavelet.py --help` to list them. The specified options are printed to the console. To choose which GPUs to utilize, use `--gpu_ids`; for example, to use the second and third GPUs, use `--gpu_ids 1,2`.
Testing is similar to training new models.

```shell
python test_wavelet.py --name [name_of_experiment] --dataset_name [dataset_name] --dataroot [path_to_flist]
```

Use `--results_dir` to specify the output directory, and `--how_many` to set the maximum number of images to generate. By default, the latest checkpoint is loaded; this can be changed using `--which_epoch`.
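A filled-in test invocation might look like the following; the experiment name, dataset name, and option values are placeholder assumptions for illustration:

```shell
# Placeholder values; substitute your own experiment and dataset names.
python test_wavelet.py \
    --name celebahq_wavefill \
    --dataset_name celebahq \
    --dataroot ./flist \
    --results_dir ./results \
    --how_many 50 \
    --which_epoch latest
```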
If you find this code helpful for your research, please cite our paper.
```
@inproceedings{yu2021wavefill,
  title={WaveFill: A Wavelet-based Generation Network for Image Inpainting},
  author={Yu, Yingchen and Zhan, Fangneng and Lu, Shijian and Pan, Jianxiong and Ma, Feiying and Xie, Xuansong and Miao, Chunyan},
  booktitle={Proceedings of the IEEE/CVF International Conference on Computer Vision},
  year={2021}
}
```
This code borrows heavily from SPADE, CoCosNet, PEN-Net and Edge-Connect; we appreciate the authors for sharing their code. We also thank Cotter for sharing the pytorch_wavelets code.