Real-Time 3D Hand Tracking for 3D Modelling Applications

dc.contributor.advisor Gimel'farb, G en
dc.contributor.advisor Lutteroth, C en
dc.contributor.advisor Delmas, P en
dc.contributor.author Lau, Anthony en
dc.date.accessioned 2011-07-18T21:05:48Z en
dc.date.issued 2011 en
dc.identifier.uri http://hdl.handle.net/2292/6949 en
dc.description Full text is available to authenticated members of The University of Auckland only. en
dc.description.abstract Computer mice and graphics tablets serve as the main input devices for many 3D modelling applications. Much human-computer interaction (HCI) research attempts to develop direct-control interfaces that exploit natural hand motion, postures and gestures through haptic-based and force-based approaches, with computer vision as assistance [1, 2, 3, 4, 5, 6, 7, 8]. Nonetheless, work relying purely on computer vision as an HCI for 3D modelling remains scarce [9, 10]. In this thesis, we focus on the development of a real-time stereo hand-tracking system, aided by a stereo webcam, to perform 3D view navigation in the 3D program Blender. The prototype also serves as a trial for developing a foundation framework for longer-term study. Our prototype is a hand-tracking system built around a low-budget commercial stereo webcam. The current tracker concentrates solely on the motion of a single colour marker mounted at the fingertip of a glove. Keyboard presses, rather than hand postures and gestures, serve as the primary means of selecting 3D operations. At present, the system supports basic 3D view navigation in Blender (i.e. view translation, rotation and zooming), which is essential groundwork for further 3D mesh manipulation, including digital sculpting. The colour marker is registered as the tracking target when the system starts. Once tracking begins, the marker is followed by a pair of Continuously Adaptive Mean Shift (CAMShift) trackers, one running on each camera channel. The tracked results are then converted into real-world coordinates by triangulation based on the calibrated webcam. After smoothing by a fast moving-average filter, the stream of 3D real-world coordinates is sent to Blender over a local TCP connection (this pipeline is sketched in code below the record). With appropriate scaling, and while the corresponding keyboard button is held, the user can freely navigate the Blender 3D viewport by moving the marker. Our research shows that stereo vision has the potential to serve as a new human-computer interaction interface for controlling operations in 3D space. We have successfully implemented the fundamental framework. In future work, we would like to explore hand gestures and poses for command selection, e.g. view translation and vertex selection, and to extend the system to support digital sculpting in Blender. en
dc.relation.ispartof Masters Thesis - University of Auckland en
dc.rights Restricted Item. Available to authenticated members of The University of Auckland. en
dc.rights.uri https://researchspace.auckland.ac.nz/docs/uoa-docs/rights.htm en
dc.rights.uri http://creativecommons.org/licenses/by-nc-sa/3.0/nz/ en
dc.title Real-Time 3D Hand Tracking for 3D Modelling Applications en
dc.type Thesis en
thesis.degree.grantor The University of Auckland en
thesis.degree.level Masters en
dc.rights.holder Copyright: The author en
pubs.elements-id 214946 en
pubs.record-created-at-source-date 2011-07-19 en
dc.identifier.wikidata Q112886849
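
The pipeline described in the abstract (per-channel CAMShift tracking, stereo triangulation, moving-average smoothing, and TCP streaming) maps closely onto standard OpenCV primitives. The following is a minimal Python sketch of that pipeline under stated assumptions, not the thesis's own code: the projection-matrix files, the marker histogram, the initial search windows and the port number are hypothetical stand-ins for the calibration and marker-registration steps the abstract mentions.

    # Minimal sketch of the marker-tracking pipeline described in the abstract:
    # CAMShift on each camera channel, stereo triangulation, moving-average
    # smoothing, and streaming of 3D coordinates over a local TCP socket.
    # File names, ports and the initial windows are hypothetical; camera
    # calibration and marker registration are assumed to have been done already.

    import socket
    import struct
    from collections import deque

    import cv2
    import numpy as np

    # Hypothetical calibration results: 3x4 projection matrices per camera,
    # e.g. obtained via cv2.stereoCalibrate on the stereo webcam.
    P_LEFT = np.load("P_left.npy")
    P_RIGHT = np.load("P_right.npy")

    TERM = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)
    WINDOW = deque(maxlen=5)  # moving-average window over recent 3D points


    def track_marker(frame_bgr, marker_hist, search_window):
        """One CAMShift step; returns the marker centre and updated window."""
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        prob = cv2.calcBackProject([hsv], [0], marker_hist, [0, 180], 1)
        rot_rect, search_window = cv2.CamShift(prob, search_window, TERM)
        (cx, cy), _, _ = rot_rect
        return (cx, cy), search_window


    def triangulate(pt_left, pt_right):
        """Triangulate one stereo correspondence into real-world coordinates."""
        pl = np.array(pt_left, dtype=np.float64).reshape(2, 1)
        pr = np.array(pt_right, dtype=np.float64).reshape(2, 1)
        xh = cv2.triangulatePoints(P_LEFT, P_RIGHT, pl, pr)  # homogeneous 4x1
        return (xh[:3] / xh[3]).ravel()  # de-homogenise to (X, Y, Z)


    def smoothed(point3d):
        """Fast moving-average filter over the last few 3D samples."""
        WINDOW.append(point3d)
        return np.mean(WINDOW, axis=0)


    # Stream smoothed coordinates to a local listener (e.g. a Blender-side
    # script) on a hypothetical port.
    sock = socket.create_connection(("127.0.0.1", 5005))

    cap_l, cap_r = cv2.VideoCapture(0), cv2.VideoCapture(1)
    hist = np.load("marker_hist.npy")   # assumed registered at start-up
    win_l = win_r = (300, 220, 40, 40)  # hypothetical initial search windows

    while True:
        ok_l, frame_l = cap_l.read()
        ok_r, frame_r = cap_r.read()
        if not (ok_l and ok_r):
            break
        pt_l, win_l = track_marker(frame_l, hist, win_l)
        pt_r, win_r = track_marker(frame_r, hist, win_r)
        x, y, z = smoothed(triangulate(pt_l, pt_r))
        sock.sendall(struct.pack("<3d", x, y, z))  # 3 little-endian doubles

Fixed-size struct-packed packets keep the receiving side trivial; a real implementation would also handle lost tracking (a degenerate CAMShift window) and partial TCP reads.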
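
On the Blender side, the abstract's local TCP connection implies a script inside Blender that consumes the coordinate stream and drives the viewport. Below is a hedged sketch using Blender's Python API, a modal operator polling a non-blocking socket; the port, scale factor and key handling are again hypothetical, and only view translation is shown.

    # Hypothetical Blender-side receiver: reads 3-double packets from the
    # tracker and moves the 3D view accordingly. Must be invoked from a
    # 3D View so that context.region_data is a RegionView3D.
    import socket
    import struct

    import bpy
    from mathutils import Vector


    class VIEW3D_OT_marker_nav(bpy.types.Operator):
        """Translate the 3D view to follow streamed marker coordinates."""
        bl_idname = "view3d.marker_nav"
        bl_label = "Marker View Navigation"

        def invoke(self, context, event):
            # Hypothetical local port matching the tracker side.
            self.sock = socket.create_connection(("127.0.0.1", 5005))
            self.sock.setblocking(False)
            wm = context.window_manager
            self._timer = wm.event_timer_add(0.02, window=context.window)
            wm.modal_handler_add(self)
            return {'RUNNING_MODAL'}

        def modal(self, context, event):
            if event.type == 'ESC':
                context.window_manager.event_timer_remove(self._timer)
                self.sock.close()
                return {'FINISHED'}
            if event.type == 'TIMER':
                try:
                    data = self.sock.recv(24)  # one packet: 3 LE doubles
                except BlockingIOError:
                    return {'PASS_THROUGH'}
                if len(data) == 24:
                    x, y, z = struct.unpack("<3d", data)
                    # Hypothetical scale factor from marker space to scene units.
                    context.region_data.view_location = Vector((x, y, z)) * 0.01
            return {'PASS_THROUGH'}


    bpy.utils.register_class(VIEW3D_OT_marker_nav)

View rotation and zooming would map the same stream onto RegionView3D.view_rotation and view_distance, selected by whichever keyboard button the abstract describes as choosing the active 3D operation.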

