Spatially-Varying Diffuse Reflectance Capture Using Irradiance Map Rendering for Image-Based Modeling Applications

Publication: Contribution to book/anthology/report/conference proceeding › Conference article in proceeding › Research › peer review


Abstract

Image-based 3D modelling using Structure-from-Motion (SfM) has matured significantly over the last decade. Standard SfM methods create the object's texture from the appearance of the physical object at the time of acquisition. We propose a method for acquiring the diffuse per-point reflectance of the modelled object as part of the image acquisition workflow, adding only one extra captured image and an irradiance rendering step, making it easy for anyone to digitize physical objects and create 3D content for AR/VR using only consumer-grade hardware. The current state of the art in spatially varying reflectance capture requires either large, expensive, purpose-built setups or optimization-based approaches, whereas the proposed approach is model-based.
This paper proposes adding a render of irradiance with a modelled camera and light source, using off-the-shelf hardware for image capture. The key element is taking two images at each imaging location: one under only the ambient illumination, and one that also includes the light from an on-camera flash. It is demonstrated how to cancel out the ambient illumination and, by assuming Lambertian materials, render the irradiance corresponding to the flash-only image, enabling computation of spatially varying diffuse reflectance rather than appearance. Qualitative results demonstrate the added realism of the modelled objects when used as assets in renders under varying illumination conditions, including limited outdoor scenarios. Quantitative tests demonstrate that the reflectance can be estimated correctly to within a few percent, even in cases with severely uneven ambient illumination.
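The two-image capture described above can be sketched per pixel: subtracting the ambient-only image from the flash-plus-ambient image cancels the ambient term, and dividing the flash-only result by the rendered irradiance yields the Lambertian albedo. The following is a minimal illustration of that idea, not the paper's implementation; the array names, the π factor convention, and the clipping to [0, 1] are assumptions:

```python
import numpy as np

def estimate_diffuse_reflectance(img_ambient, img_flash_plus_ambient,
                                 irradiance_render, eps=1e-6):
    """Per-pixel diffuse (Lambertian) reflectance estimate.

    img_ambient:            linear image under ambient illumination only
    img_flash_plus_ambient: same viewpoint, ambient plus on-camera flash
    irradiance_render:      rendered irradiance of the modelled flash
                            at each visible surface point
    All arrays are float and share the same shape.
    """
    # Subtracting the ambient-only image cancels the ambient term,
    # leaving only the flash contribution.
    flash_only = np.clip(img_flash_plus_ambient - img_ambient, 0.0, None)
    # For a Lambertian surface, radiance = (albedo / pi) * irradiance,
    # so albedo = pi * radiance / irradiance.
    albedo = np.pi * flash_only / np.maximum(irradiance_render, eps)
    return np.clip(albedo, 0.0, 1.0)
```

Note that this assumes all images are in linear radiometric units (i.e. camera response and exposure already calibrated out), which the subtraction step requires.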
Original language: English
Title: 2019 IEEE International Symposium on Mixed and Augmented Reality
Publisher: IEEE Computer Society Press
Publication date: 2020
ISBN (electronic): 978-1-7281-0987-9
Status: Published - 2020
Event: 2019 IEEE International Symposium on Mixed and Augmented Reality (ISMAR 2019) - Beijing, China
Duration: 14 Oct 2019 – 15 Nov 2019
http://www.ismar19.org

Conference

Conference: 2019 IEEE International Symposium on Mixed and Augmented Reality (ISMAR 2019)
Country: China
City: Beijing
Period: 14/10/2019 – 15/11/2019
Internet address: http://www.ismar19.org


Cite this

Ladefoged, K. S., & Madsen, C. B. (2020). Spatially-Varying Diffuse Reflectance Capture Using Irradiance Map Rendering for Image-Based Modeling Applications. In 2019 IEEE International Symposium on Mixed and Augmented Reality. IEEE Computer Society Press.