Transcription of KinectFusion: Real-Time Dense Surface Mapping and Tracking
KinectFusion: Real-Time Dense Surface Mapping and Tracking

Richard A. Newcombe (Imperial College London), Shahram Izadi (Microsoft Research), Otmar Hilliges (Microsoft Research), David Molyneaux (Microsoft Research and Lancaster University), David Kim (Microsoft Research and Newcastle University), Andrew J. Davison (Imperial College London), Pushmeet Kohli (Microsoft Research), Jamie Shotton (Microsoft Research), Steve Hodges (Microsoft Research), Andrew Fitzgibbon (Microsoft Research)

Figure 1: Example output from our system, generated in real-time with a handheld Kinect depth camera and no other sensing infrastructure. Normal maps (colour) and Phong-shaded renderings (greyscale) from our dense reconstruction system are shown. On the left for comparison is an example of the live, incomplete, and noisy data from the Kinect sensor (used as input to our system).

ABSTRACT

We present a system for accurate real-time mapping of complex and arbitrary indoor scenes in variable lighting conditions, using only a moving low-cost depth camera and commodity graphics hardware. We fuse all of the depth data streamed from a Kinect sensor into a single global implicit surface model of the observed scene in real-time.
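The fusion step described in the abstract, merging every depth frame into one global implicit surface, is a weighted running average of truncated signed distances per voxel. The sketch below illustrates that update for a single voxel; the function names, the 1-D setup, and the truncation distance are illustrative assumptions, not code from the paper.

```python
# Sketch of per-voxel TSDF fusion: each new truncated signed-distance
# measurement is folded into the voxel's running weighted average, so
# noise in individual depth frames averages out over time.
# All names and constants here are illustrative assumptions.

TRUNC = 0.1  # assumed truncation distance in metres


def truncate(sdf):
    """Clamp a signed distance into the truncated band [-TRUNC, TRUNC]."""
    return max(-TRUNC, min(TRUNC, sdf))


def fuse(tsdf, weight, measurement, w_new=1.0):
    """Fold one new measurement into a voxel.

    Returns the updated (tsdf, weight) pair as a weighted running average.
    """
    d = truncate(measurement)
    fused = (weight * tsdf + w_new * d) / (weight + w_new)
    return fused, weight + w_new


# Example: one voxel near the surface observed over several noisy frames.
tsdf, w = 0.0, 0.0
for m in [0.02, 0.03, 0.025, 0.028]:
    tsdf, w = fuse(tsdf, w, m)
print(tsdf, w)  # the fused value converges toward the mean measurement
```

Because the weight accumulates, later frames perturb a well-observed voxel less and less, which is what makes the global model stable under a moving, noisy sensor.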
Figure 2: A larger scale reconstruction obtained in real-time.

… with such sensors are obvious, but algorithms to date have not fully leveraged the fidelity …