
Extensions to Neural Texture Synthesis

Pulles, Lonneke (2024) Extensions to Neural Texture Synthesis. Master's Internship Report, Computing Science.

Files:
mCS_2024_PullesLC.pdf (16 MB), full text, available for download
toestemming.pdf (132 kB), consent form, restricted to registered users only
Abstract

Texture synthesis with convolutional neural networks was first proposed by Gatys et al. in 2015. Their approach demonstrates that natural repetitive patterns, such as berries and pebbles, can be recreated remarkably well using only the activation layers of the pyramidal neural network architecture VGG-19 and its Gram matrix representations. However, there are two shortcomings: the method only accepts rectangular inputs and produces rectangular outputs, which limits its practicality in the real world, and large-scale textural patterns are neither captured nor reproduced. We explore two extensions to Gatys et al.'s method: mask-aware texture synthesis and structure-aware texture synthesis. In conjunction with constrained optimization, mask-aware texture synthesis paves the way for inpainting. Regarding structure-aware texture synthesis, we focus on two approaches. First, we briefly explore the suitability of applying the concept of approximating Gram matrix representations to the Vision Transformer architecture, which is known for focusing on global dependencies with its attention layers. Second, we combine the neural network-based loss with a gradient-based loss, allowing us to synthesize, for example, the fur of a cat while retaining the cat's outlines.
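The core of Gatys et al.'s method is a texture loss that matches Gram matrices of VGG-19 activations between the exemplar and the synthesized image. Below is a minimal PyTorch sketch of that objective; the layer selection, the Adam optimizer, and the omission of ImageNet normalization are assumptions made for illustration, not the implementation used in the report.

    import torch
    import torch.nn.functional as F
    from torchvision.models import vgg19, VGG19_Weights

    def gram_matrix(feat):
        # feat: (1, C, H, W) activations from one VGG-19 layer
        _, c, h, w = feat.shape
        f = feat.view(c, h * w)
        return (f @ f.t()) / (c * h * w)  # normalized channel-correlation matrix

    # Frozen, pretrained VGG-19 feature stack
    vgg = vgg19(weights=VGG19_Weights.DEFAULT).features.eval()
    for p in vgg.parameters():
        p.requires_grad_(False)

    # ReLU layers commonly used for texture statistics (assumed choice)
    LAYERS = {1, 6, 11, 20, 29}

    def gram_features(img):
        grams, x = [], img
        for i, layer in enumerate(vgg):
            x = layer(x)
            if i in LAYERS:
                grams.append(gram_matrix(x))
        return grams

    def synthesize(exemplar, steps=500, lr=0.05):
        # exemplar: (1, 3, H, W) tensor; ImageNet normalization omitted for brevity
        target = [g.detach() for g in gram_features(exemplar)]
        result = torch.rand_like(exemplar, requires_grad=True)  # start from noise
        # Gatys et al. use L-BFGS; Adam keeps this sketch short (assumed substitution)
        opt = torch.optim.Adam([result], lr=lr)
        for _ in range(steps):
            opt.zero_grad()
            loss = sum(F.mse_loss(g, t) for g, t in zip(gram_features(result), target))
            loss.backward()
            opt.step()
        return result.detach().clamp(0, 1)

The mask-aware and structure-aware extensions described in the abstract can be layered on this objective, for instance by restricting which spatial positions contribute to each Gram matrix or by adding an image-gradient loss term; the specifics are left to the report itself.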

Item Type: Thesis (Master's Internship Report)
Supervisor name: Tursun, O.T. and Kosinka, J.
Degree programme: Computing Science
Thesis type: Master's Internship Report
Language: English
Date Deposited: 14 Feb 2024 12:28
Last Modified: 14 Feb 2024 12:28
URI: https://fse.studenttheses.ub.rug.nl/id/eprint/31950
