September 20, 2022 Computer Vision
Neural Scene Decoration from a Single Photograph
Hong-Wing Pang, Yingshu Chen, Phuoc-Hieu Le, Binh-Son Hua, Duc Thanh Nguyen, Sai-Kit Yeung
ECCV 2022
Abstract
Furnishing and rendering indoor scenes has been a long-standing task in interior design, where artists create a conceptual design for the space, build a 3D model of the space, decorate, and then perform rendering. Although the task is important, it is tedious and requires tremendous effort. In this paper, we introduce a new problem of domain-specific indoor scene image synthesis, namely neural scene decoration. Given a photograph of an empty indoor space and a list of decorations with a layout determined by the user, we aim to synthesize a new image of the same space with the desired furnishing and decorations. Neural scene decoration can be applied to create conceptual interior designs in a simple yet effective manner. Our approach to this problem is a novel scene generation architecture that transforms an empty scene and an object layout into a realistic photograph of the furnished scene. We demonstrate the performance of the proposed method by comparing it, both qualitatively and quantitatively, with conditional image synthesis baselines built upon prevailing image translation approaches. We conduct extensive experiments to further validate the plausibility and aesthetics of our generated scenes. Our implementation is available at https://github.com/hkust-vgd/neural_scene_decoration.
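The core idea is a generator conditioned jointly on a photograph of the empty room and a user-specified object layout. The snippet below is not the authors' implementation (see the linked repository for that); it is only a minimal sketch, under assumed names and shapes, of how per-object bounding boxes might be rasterized into per-class layout maps and concatenated with the empty-room image as the input to a toy conditional generator in PyTorch.

```python
# Illustrative sketch only: all class labels, shapes, and the toy generator
# below are assumptions, not the method from the paper or its repository.
import torch
import torch.nn as nn

NUM_CLASSES = 4          # hypothetical label set, e.g. bed, sofa, table, lamp
IMAGE_SIZE = 256

def rasterize_layout(boxes, size=IMAGE_SIZE, num_classes=NUM_CLASSES):
    """Turn (class_id, x0, y0, x1, y1) boxes in [0, 1] coordinates into a
    one-channel-per-class occupancy map of shape (num_classes, size, size)."""
    layout = torch.zeros(num_classes, size, size)
    for cls, x0, y0, x1, y1 in boxes:
        c0, r0 = int(x0 * size), int(y0 * size)
        c1, r1 = int(x1 * size), int(y1 * size)
        layout[cls, r0:r1, c0:c1] = 1.0
    return layout

class ToyDecorator(nn.Module):
    """Stand-in conditional generator: empty photo + layout maps -> furnished photo."""
    def __init__(self, num_classes=NUM_CLASSES):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3 + num_classes, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 3, 3, padding=1), nn.Tanh(),
        )

    def forward(self, empty_photo, layout):
        # Condition on both inputs by channel-wise concatenation.
        return self.net(torch.cat([empty_photo, layout], dim=1))

if __name__ == "__main__":
    empty_photo = torch.rand(1, 3, IMAGE_SIZE, IMAGE_SIZE) * 2 - 1  # placeholder image
    boxes = [(0, 0.1, 0.5, 0.5, 0.9),   # a "bed" in the lower-left region
             (3, 0.7, 0.2, 0.8, 0.4)]   # a "lamp" on the right wall
    layout = rasterize_layout(boxes).unsqueeze(0)
    furnished = ToyDecorator()(empty_photo, layout)
    print(furnished.shape)  # torch.Size([1, 3, 256, 256])
```

In the paper's setting, such a generator is trained adversarially so that the synthesized furnishings look realistic and respect the user-provided layout; the sketch above only illustrates how the two conditioning inputs could be combined.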
Bibtex
@InProceedings{10.1007/978-3-031-20050-2_9,
author="Pang, Hong-Wing
and Chen, Yingshu
and Le, Phuoc-Hieu
and Hua, Binh-Son
and Nguyen, Duc Thanh
and Yeung, Sai-Kit",
editor="Avidan, Shai
and Brostow, Gabriel
and Ciss{\'e}, Moustapha
and Farinella, Giovanni Maria
and Hassner, Tal",
title="Neural Scene Decoration from a Single Photograph",
booktitle="Computer Vision -- ECCV 2022",
year="2022",
publisher="Springer Nature Switzerland",
address="Cham",
pages="136--152",
abstract="Furnishing and rendering indoor scenes has been a long-standing task for interior design, where artists create a conceptual design for the space, build a 3D model of the space, decorate, and then perform rendering. Although the task is important, it is tedious and requires tremendous effort. In this paper, we introduce a new problem of domain-specific indoor scene image synthesis, namely neural scene decoration. Given a photograph of an empty indoor space and a list of decorations with layout determined by user, we aim to synthesize a new image of the same space with desired furnishing and decorations. Neural scene decoration can be applied to create conceptual interior designs in a simple yet effective manner. Our attempt to this research problem is a novel scene generation architecture that transforms an empty scene and an object layout into a realistic furnished scene photograph. We demonstrate the performance of our proposed method by comparing it with conditional image synthesis baselines built upon prevailing image translation approaches both qualitatively and quantitatively. We conduct extensive experiments to further validate the plausibility and aesthetics of our generated scenes. Our implementation is available at https://github.com/hkust-vgd/neural{\_}scene{\_}decoration.",
isbn="978-3-031-20050-2"
}