Spatial Steerability of GANs via
Self-Supervision from Discriminator
Jianyuan Wang1,2,*  Lalit Bhagat3,*  Ceyuan Yang1  Yinghao Xu1  Yujun Shen1
Hongdong Li2  Bolei Zhou3
1 The Chinese University of Hong Kong, 2 The Australian National University,
3 University of California, Los Angeles
* Equal contribution

Figure: Interactive editing.
Generative models have made huge progress in photorealistic image synthesis in recent years. To enable humans to steer the image generation process and customize the output, many works explore the interpretable dimensions of the latent space in GANs. Existing methods edit attributes of the output image, such as orientation or color scheme, by varying the latent code along certain directions. However, these methods usually require additional human annotations for each pre-trained model, and they mostly focus on editing global attributes.

We design randomly sampled Gaussian heatmaps that are encoded into the intermediate layers of generative models as a spatial inductive bias. While training the GAN model from scratch, these heatmaps are aligned with the emerging attention of the GAN's discriminator in a self-supervised manner. During inference, users can interact with the spatial heatmaps intuitively, editing the output image by adjusting the scene layout and moving or removing objects.
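The heatmap idea above can be sketched in a few lines: sample Gaussian blobs at random centers, combine them into one spatial heatmap, and feed it to the generator alongside an intermediate feature map. This is an illustrative sketch only; the function names (`sample_heatmaps`, `inject`) and the channel-concatenation injection scheme are assumptions, not the paper's exact architecture.

```python
import numpy as np

def gaussian_heatmap(size, center, sigma):
    """Render a 2D Gaussian blob on a size x size grid."""
    ys, xs = np.mgrid[0:size, 0:size]
    cy, cx = center
    return np.exp(-((xs - cx) ** 2 + (ys - cy) ** 2) / (2.0 * sigma ** 2))

def sample_heatmaps(size, n, sigma, seed=None):
    """Sample n sub-heatmaps with random centers; returns centers and maps."""
    rng = np.random.default_rng(seed)
    centers = rng.uniform(0, size, size=(n, 2))
    maps = np.stack([gaussian_heatmap(size, c, sigma) for c in centers])
    return centers, maps

def inject(features, heatmap):
    """Attach the heatmap as an extra channel (assumed injection scheme)."""
    return np.concatenate([features, heatmap[None]], axis=0)

centers, maps = sample_heatmaps(size=16, n=5, sigma=2.0, seed=0)
combined = maps.max(axis=0)       # merge sub-heatmaps into one spatial map
feat = np.zeros((64, 16, 16))     # stand-in generator feature map
out = inject(feat, combined)      # feature map with spatial inductive bias
```

During training, the discriminator's attention would be compared against `combined` to form the self-supervised alignment signal; at inference time, users edit `centers` directly.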

We build an interactive interface to visualize these heatmaps; SpatialGAN enables interactive spatial editing of the output image.
Qualitative Results
Below, we present samples generated by the SpatialGAN model, showcasing manipulations of multi-object indoor scenes from the LSUN Bedroom dataset.

In this example, we showcase the rearrangement of objects via sub-heatmap manipulation. The yellow arrows indicate the movement of objects such as windows and beds, improving overall scene coherence.
Then, we demonstrate the gradual removal of objects by eliminating their associated sub-heatmaps. Elements such as windows and lamps are removed one by one, while the background and other objects remain largely unchanged.
Next, we explore the alteration of local regions by applying unique style codes to individual sub-heatmaps. This process enables a variety of changes, including variations in paintings, windows, and light types, as denoted by the blue and green boxes.
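The three manipulations above all reduce to simple operations on the set of sub-heatmap centers: translating a center moves the object, deleting a center removes it, and (under the same assumption) restyling would swap the style code weighted by one sub-heatmap's mask. The sketch below illustrates the first two; the helper names are hypothetical, not the released API.

```python
import numpy as np

def gaussian_heatmap(size, center, sigma=2.0):
    """Render a 2D Gaussian blob on a size x size grid."""
    ys, xs = np.mgrid[0:size, 0:size]
    cy, cx = center
    return np.exp(-((xs - cx) ** 2 + (ys - cy) ** 2) / (2.0 * sigma ** 2))

def render(centers, size=16):
    """Combine sub-heatmaps into one spatial heatmap (max over objects)."""
    if len(centers) == 0:
        return np.zeros((size, size))
    return np.max([gaussian_heatmap(size, c) for c in centers], axis=0)

def move_object(centers, idx, delta):
    """Move one object by shifting its sub-heatmap center."""
    centers = np.array(centers, dtype=float)
    centers[idx] += delta
    return centers

def remove_object(centers, idx):
    """Remove an object by dropping its sub-heatmap."""
    return np.delete(np.asarray(centers, dtype=float), idx, axis=0)

centers = np.array([[4.0, 4.0], [12.0, 12.0]])
moved = move_object(centers, 0, [0.0, 3.0])  # shift first object along x
fewer = remove_object(centers, 1)            # drop the second object
```

Re-rendering the edited heatmap and passing it through the generator would then produce the corresponding edited image.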
Synergy between DragGAN and SpatialGAN
We also integrate the recent point-based manipulation technique, DragGAN, into our method. This integration combines the strengths of both approaches to achieve high-quality, fine-grained manipulation within a reasonable time.
BibTeX
      title={Spatial Steerability of GANs via Self-Supervision from Discriminator},
      author={Jianyuan Wang and Lalit Bhagat and Ceyuan Yang and Yinghao Xu and Yujun Shen and Hongdong Li and Bolei Zhou},
Related Work