Projects
Improving the Fairness of Deep Generative Models
without Retraining
Shuhan Tan, Yujun Shen, Bolei Zhou
arXiv preprint arXiv:2012.04842
[Paper] [Project Page] [Code] [Colab]
GenForce Lib
Yujun Shen, Yinghao Xu, Ceyuan Yang, Jiapeng Zhu, Bolei Zhou
This is an efficient PyTorch library for deep generative modeling.
[Code] [Colab]
Generative Hierarchical Features from Synthesizing Images
Yinghao Xu*, Yujun Shen*, Jiapeng Zhu, Ceyuan Yang, Bolei Zhou
arXiv preprint arXiv:2007.10379
[Paper] [Project Page] [Code]
InterFaceGAN: Interpreting the Disentangled Face Representation Learned by GANs
Yujun Shen, Ceyuan Yang, Xiaoou Tang, Bolei Zhou
IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI), Oct 2020
[Paper] [Project Page] [Code] [Demo Video]
In-Domain GAN Inversion for Real Image Editing
Jiapeng Zhu*, Yujun Shen*, Deli Zhao, Bolei Zhou
European Conference on Computer Vision (ECCV), 2020
[Paper] [Project Page] [Code] [Demo Video] [Colab]
Disentangled Inference for GANs with Latently Invertible Autoencoder
Jiapeng Zhu*, Deli Zhao*, Bo Zhang, Bolei Zhou
arXiv preprint arXiv:1906.08090
[Paper] [Code]
Semantic Hierarchy Emerges in Deep Generative Representations for Scene Synthesis
Ceyuan Yang*, Yujun Shen*, Bolei Zhou
International Journal of Computer Vision (IJCV), Dec 2020
[Paper] [Project Page] [Code] [Demo Video]
Image Processing Using Multi-Code GAN Prior
Jinjin Gu, Yujun Shen, Bolei Zhou
IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2020
[Paper] [Project Page] [Code]
Interpreting the Latent Space of GANs for Semantic Face Editing
Yujun Shen, Jinjin Gu, Xiaoou Tang, Bolei Zhou
IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2020
[Paper] [Project Page] [Code] [Demo Video]
Tutorials
Exploring and Exploiting Interpretable Semantics in GANs
Bolei Zhou
Interpretable Machine Learning for Computer Vision (CVPR 2020 Tutorial)
[Video (YouTube)] [Video (bilibili)] [Slides]
Interpreting and Exploiting the Latent Space of GANs
Yujun Shen
Invited Talk hosted by 机器之心 (Synced)
[Video (in Chinese)]
Team
Acknowledgement
We gratefully acknowledge support from the Early Career Scheme (ECS) of the Research Grants Council (RGC) of Hong Kong under Grant No. 24206219, and from the CUHK FoE RSFS Grant.