QuiltGAN: An Adversarially Trained, Procedural Algorithm for Texture Generation

Renato Barros Arantes, George Vogiatzis, Diego Faria

Research output: Chapter in Book / Published conference output › Chapter


We investigate a generative method that synthesises high-resolution images based on a single constraint source image. Our approach consists of three types of conditional deep convolutional generative adversarial networks (cDCGAN) that are trained to generate samples of an image patch conditioned on the surrounding image regions. The cDCGAN discriminator evaluates the realism of the generated sample concatenated with the surrounding pixels it was conditioned on. This encourages the cDCGAN generator to create image patches that blend seamlessly with their surroundings while retaining the randomisation of the standard GAN process. After training, the cDCGANs recursively generate a sequence of samples, which are then stitched together to synthesise a larger image. Our algorithm can produce a nearly unlimited collection of variations of a single input image that exhibit sufficient variability while preserving the essential large-scale structure. We test our system on several types of images, including urban landscapes, building facades and textures, and compare very favourably against standard image quilting approaches.
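The recursive synthesis loop described above can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: `fake_generator` is a hypothetical stand-in for a trained cDCGAN generator, and the patch size and overlap width are assumed values chosen for the example. The key idea shown is that each new patch is conditioned on the already-synthesised overlap region, and only the non-overlapping interior is written back to the canvas.

```python
import numpy as np

PATCH = 16    # generated patch side length (assumed for illustration)
OVERLAP = 4   # width of the conditioning border (assumed for illustration)

def fake_generator(context, rng):
    """Stand-in for a trained cDCGAN generator.

    A real generator would condition on `context` (the surrounding
    pixels); here we simply return a random patch in [0, 1).
    """
    return rng.random((PATCH, PATCH))

def synthesise(rows, cols, rng=None):
    """Recursively tile generated patches, each conditioned on the
    previously synthesised overlap to its left and above, then stitched
    into one larger canvas."""
    rng = np.random.default_rng(0) if rng is None else rng
    step = PATCH - OVERLAP
    canvas = np.zeros((OVERLAP + rows * step, OVERLAP + cols * step))
    for r in range(rows):
        for c in range(cols):
            y, x = r * step, c * step
            # Overlap pixels already on the canvas act as the condition.
            context = canvas[y:y + PATCH, x:x + PATCH].copy()
            patch = fake_generator(context, rng)
            # Keep the existing overlap strips; write the new interior.
            canvas[y + OVERLAP:y + PATCH, x:x + PATCH] = patch[OVERLAP:, :]
            canvas[y:y + OVERLAP, x + OVERLAP:x + PATCH] = patch[:OVERLAP, OVERLAP:]
    return canvas
```

A discriminator in this setting would score the channel-wise concatenation of the generated patch and its conditioning context, which is what pushes the generator toward seams-free output.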

Original language: English
Title of host publication: Computer Vision Systems - 12th International Conference, ICVS 2019, Proceedings
Editors: Dimitrios Tzovaras, Dimitrios Giakoumis, Markus Vincze, Antonis Argyros
Number of pages: 10
Volume: 11754 LNCS
Publication status: Published - 23 Nov 2019

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 11754 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349


Keywords
  • GAN
  • Inpainting
  • Procedural generation


