Cardiff University | Prifysgol Caerdydd ORCA
Online Research @ Cardiff 

WristSketcher: Creating 2D dynamic sketches in AR with a sensing wristband

Ying, Enting, Xiong, Tianyang, Zhu, Gaoxiang, Qiu, Ming, Qin, Yipeng and Guo, Shihui 2024. WristSketcher: Creating 2D dynamic sketches in AR with a sensing wristband. International Journal of Human-Computer Interaction. 10.1080/10447318.2024.2301857
Item availability restricted.

PDF - Accepted Post-Print Version
Restricted to Repository staff only until 29 December 2024 due to copyright restrictions.
Available under License Creative Commons Attribution Non-commercial.

Download (2MB)


The limited interaction area of native AR glasses makes sketch creation on them a challenge. Existing solutions use mobile devices (e.g., tablets) or mid-air hand gestures to expand the interaction space and serve as 2D/3D sketching input interfaces for AR glasses. Of these, mobile devices allow accurate sketching but are often cumbersome to carry. Sketching with bare hands imposes no carrying burden but can be inaccurate due to arm instability; in addition, mid-air sketching can easily lead to social misunderstandings, and its prolonged use causes arm fatigue. In this work, we present WristSketcher, a new AR system based on a flexible sensing wristband that enables users to place multiple virtual planar canvases in the real environment and create 2D dynamic sketches on them, featuring an almost zero-burden authoring model for accurate and comfortable sketch creation in real-world scenarios. Specifically, we moved the interaction space from mid-air to the surface of a lightweight sensing wristband, and implemented AR sketching and the associated interaction commands by developing a gesture recognition method based on the sensed pressure points. We designed a set of interactive gestures consisting of Long Press, Tap, and Double Tap based on a heuristic study involving 26 participants. These gestures are mapped to the corresponding command interactions using a combination of multi-touch and hotspots. Moreover, we endow WristSketcher with animation creation capabilities, allowing it to produce dynamic and expressive sketches. Experimental results demonstrate that WristSketcher (i) recognizes users' gesture interactions with a high accuracy of 95.9%; (ii) achieves higher sketching accuracy than freehand sketching; (iii) achieves high user satisfaction in ease of use, usability, and functionality; and (iv) shows innovation potential in art creation, memory aids, and entertainment applications.
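The abstract describes mapping gestures (Long Press, Tap, Double Tap) to commands via a combination of multi-touch and hotspots. The minimal Python sketch below illustrates one way such a dispatch table could look; the thresholds, hotspot names, and command names are all hypothetical assumptions for illustration, not details taken from the paper.

```python
from enum import Enum, auto

class Gesture(Enum):
    LONG_PRESS = auto()
    TAP = auto()
    DOUBLE_TAP = auto()

def classify_gesture(press_duration_s: float, tap_count: int) -> Gesture:
    """Classify a touch from its pressure trace.

    The 0.5 s long-press threshold is an illustrative assumption,
    not a value reported by the paper.
    """
    LONG_PRESS_THRESHOLD_S = 0.5
    if press_duration_s >= LONG_PRESS_THRESHOLD_S:
        return Gesture.LONG_PRESS
    return Gesture.DOUBLE_TAP if tap_count >= 2 else Gesture.TAP

# Hypothetical (hotspot, gesture) -> command table, mirroring the idea of
# combining hotspots on the wristband surface with the recognized gesture.
COMMAND_MAP = {
    ("canvas", Gesture.TAP): "draw_point",
    ("canvas", Gesture.LONG_PRESS): "draw_stroke",
    ("toolbar", Gesture.TAP): "select_tool",
    ("toolbar", Gesture.DOUBLE_TAP): "undo",
}

def dispatch(hotspot: str, press_duration_s: float, tap_count: int) -> str:
    """Map a raw touch event on a hotspot to a sketching command."""
    gesture = classify_gesture(press_duration_s, tap_count)
    return COMMAND_MAP.get((hotspot, gesture), "ignore")
```

For example, a quick single tap on the canvas hotspot would dispatch to "draw_point", while a double tap on the toolbar hotspot would dispatch to "undo".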

Item Type: Article
Date Type: Published Online
Status: In Press
Schools: Computer Science & Informatics
Publisher: Taylor and Francis Group
ISSN: 1044-7318
Date of First Compliant Deposit: 8 February 2024
Date of Acceptance: 29 December 2023
Last Modified: 22 Mar 2024 15:52
