Handling photographic imperfections and aliasing in augmented reality


dc.contributor.author Fischer, Jan de_DE
dc.contributor.author Bartz, Dirk de_DE
dc.date.accessioned 2006-07-24 de_DE
dc.date.accessioned 2014-03-18T10:15:53Z
dc.date.available 2006-07-24 de_DE
dc.date.available 2014-03-18T10:15:53Z
dc.date.issued 2006 de_DE
dc.identifier.other 28696063X de_DE
dc.identifier.uri http://nbn-resolving.de/urn:nbn:de:bsz:21-opus-23844 de_DE
dc.identifier.uri http://hdl.handle.net/10900/48939
dc.description.abstract In video see-through augmented reality, virtual objects are overlaid onto images delivered by a digital video camera. One particular problem of this image mixing process is that the visual appearance of the computer-generated graphics differs markedly from the real background image. In typical augmented reality systems, standard real-time rendering techniques are used for displaying virtual objects. These fast but relatively simplistic methods create an artificial, almost "plastic-like" look for the graphical elements. In this paper, methods for incorporating two particular camera image effects into virtual overlays are described. The first effect is camera image noise, which is contained in the data delivered by the CCD chip used for capturing the real scene. The second effect is motion blur, which is caused by the temporal integration of color intensities on the CCD chip during fast movements of the camera or observed objects, resulting in a blurred camera image. Graphical objects rendered with standard methods contain neither image noise nor motion blur. This is one of the factors that make the virtual objects stand out from the camera image and contributes to the perceptual difference between real and virtual scene elements. Here, approaches for mimicking both camera image noise and motion blur in the graphical representation of virtual objects are proposed. An algorithm for generating a realistic imitation of image noise based on a camera calibration step is described. A rendering method which produces motion blur according to the current camera movement is presented. As a by-product of the described rendering pipeline, it becomes possible to perform a smooth blending between virtual objects and the camera image at their boundary. An implementation of the new rendering methods for virtual objects is described, which utilizes the programmability of modern graphics processing units (GPUs) and is capable of delivering real-time frame rates. en
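The abstract's core idea of imitating calibrated camera noise in the virtual overlay can be illustrated with a minimal sketch. This is not the paper's GPU implementation; it is a simplified CPU version in NumPy, and all function names (`estimate_noise_std`, `composite_with_noise`) and the simple Gaussian noise model are illustrative assumptions. The calibration step is approximated by measuring the temporal standard deviation over repeated frames of a static scene, and the measured noise is then injected into the rendered overlay before alpha blending.

```python
import numpy as np

def estimate_noise_std(calibration_frames):
    """Approximate a camera-noise calibration step (illustrative).

    calibration_frames: float array of shape (N, H, W, 3) holding N captures
    of the same static scene; the per-pixel temporal std isolates sensor
    noise, which is then averaged into one std value per color channel.
    """
    return calibration_frames.std(axis=0).mean(axis=(0, 1))  # shape (3,)

def composite_with_noise(camera_img, overlay_rgb, overlay_alpha, noise_std, rng):
    """Alpha-blend a rendered overlay onto the camera image, first adding
    synthetic Gaussian noise so the overlay matches the camera's measured
    noise statistics (a simplification of the paper's approach)."""
    noise = rng.normal(0.0, noise_std, size=overlay_rgb.shape)
    noisy_overlay = np.clip(overlay_rgb + noise, 0.0, 1.0)
    a = overlay_alpha[..., None]          # broadcast alpha over RGB channels
    return a * noisy_overlay + (1.0 - a) * camera_img
```

A soft (non-binary) `overlay_alpha` mask also yields the smooth blending between virtual objects and the camera image at their boundary that the abstract mentions as a by-product.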
dc.language.iso en de_DE
dc.publisher Universität Tübingen de_DE
dc.rights ubt-podok de_DE
dc.rights.uri http://tobias-lib.uni-tuebingen.de/doku/lic_mit_pod.php?la=de de_DE
dc.rights.uri http://tobias-lib.uni-tuebingen.de/doku/lic_mit_pod.php?la=en en
dc.subject.classification Erweiterte Realität <Informatik> , Mixed Reality , Computergraphik , Dreidimensionale Computergraphik , Rendering de_DE
dc.subject.ddc 004 de_DE
dc.subject.other Photographische Imperfektionen , Aliasing , Bewegungsunschärfe , Bildrauschen de_DE
dc.subject.other Photographic imperfections , aliasing , motion blur , image noise en
dc.title Handling photographic imperfections and aliasing in augmented reality en
dc.type Report de_DE
dc.date.updated 2012-10-11 de_DE
utue.publikation.fachbereich Sonstige - Informations- und Kognitionswissenschaften de_DE
utue.publikation.fakultaet 7 Mathematisch-Naturwissenschaftliche Fakultät de_DE
dcterms.DCMIType Text de_DE
utue.publikation.typ report de_DE
utue.opus.id 2384 de_DE
utue.opus.portal wsi de_DE
utue.opus.portalzaehlung 2006.03000 de_DE
utue.publikation.source WSI ; 2006 ; 3 de_DE
utue.publikation.reihenname WSI-Reports - Schriftenreihe des Wilhelm-Schickard-Instituts für Informatik de_DE
utue.publikation.zsausgabe 2006, 3
utue.publikation.erstkatid 2919855-0

