đź‘‹ Exploring hand gestural control for mobile devices
Eliciting User-defined Touch and Mid-air Gestures for Co-located Mobile Gaming
Nature: Academic research
Type: Individual project
Methodologies: Questionnaires, interviews, think-aloud, survey, thematic analysis, data visualisation, descriptive statistics, t-tests
Duration: 4 months
My role: I managed the entire project, starting with a literature review of past gesture elicitation studies and gestural control. I then designed the study, including the survey, artifacts, and interview script, recruited participants, and ran the lab sessions. After collecting video recordings of the elicitation sessions and the survey responses, I coded the quantitative and qualitative data for analysis and wrote up the report.
Background
In recent years, mobile games have become increasingly popular and their interaction techniques have improved considerably. This improvement is enabled by the growing capability of modern mobile devices, which feature sophisticated sensors such as accelerometers, gyroscopes, and motion sensors that allow for a vast range of input methods.
To make use of this growing capability in the domain of mobile games, researchers have explored alternative input methods beyond the capacitive touchscreen.
Research Approach
In this research, I aim to explore gestural control in co-located mobile gaming, an area that has received little attention from industry and the research community. I focus on traditional multiplayer tabletop games such as board and card games because of their clearly defined game tasks, their communicative nature, and the materiality of game pieces that players cherish in a co-located setting.
Methodology
I draw upon the widely adopted gesture elicitation methodology to understand user mental models and help develop user-defined gestures.
User elicitation is a form of participatory design that incorporates users' mental models and proposals into the design of new interaction techniques. Elicitation studies aim to produce easy-to-learn and memorable user-defined gestures rather than gestures optimised for machine recognition. In an elicitation, participants are asked to propose gestures to accomplish tasks (known as referents) in a specified modality.
I conducted a gesture elicitation study for tasks common in multiplayer card and board games moderated by mobile devices. I recruited 24 participants in pairs. Twelve were working professionals from backgrounds such as analytics, marketing, engineering, clinical settings, and sports; the other twelve were university students across various disciplines.
Research question:
How can the results and observations made in a gesture elicitation for game tasks inform gesture design for co-located multiplayer mobile games?
Data Collection and Analysis
Data collected include observational notes jotted down by the experimenter and video recordings of the entire sessions, covering both the elicitation and the post-study interviews. The pre- and post-study questionnaires contained demographic questions, technology usage habits, and Likert-scale and free-form items. Descriptive statistics were generated from the Likert-scale questions.
All gestures were coded and fed into the AGATe 2.0 tool to calculate agreement rates.
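AGATe 2.0 handles the calculation in the study itself, but the underlying agreement rate formula it implements (Vatavu and Wobbrock's AR) is compact enough to sketch in Python. The gesture labels and counts below are hypothetical, purely for illustration:

```python
from collections import Counter

def agreement_rate(proposals):
    """Agreement rate AR(r) for one referent.

    `proposals` is a list of gesture labels, one per participant.
    AR = (|P|/(|P|-1)) * sum((|P_i|/|P|)^2) - 1/(|P|-1),
    where P_i are groups of identical proposals. AR is 1.0 when all
    participants agree and 0.0 when every proposal is distinct.
    """
    n = len(proposals)
    if n < 2:
        raise ValueError("AR is defined for two or more proposals")
    s = sum((k / n) ** 2 for k in Counter(proposals).values())
    return (n / (n - 1)) * s - 1 / (n - 1)

# Hypothetical proposals for a single referent from 24 participants
props = ["swipe-down"] * 12 + ["tap"] * 8 + ["flick"] * 4
print(round(agreement_rate(props), 3))  # → 0.362
```

With this formulation, a higher AR for a referent means participants converged on fewer distinct gestures, which is how the Touch/Mid-air comparison below should be read.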
Results
The final consensus gesture set:
Quantitative findings
A total of 662 gesture proposals were collected, comprising 286 distinct gestures across referents. t-tests were performed to compare gesture proposals between the Mid-air and Touch modalities.
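The exact t-test configuration is described in the full paper; purely as an illustration of the kind of comparison involved, a paired t statistic over per-referent values (e.g. distinct-gesture counts per referent in each modality) can be computed from the standard definition. All numbers below are made up:

```python
from statistics import mean, stdev

def paired_t(touch, midair):
    """Paired-samples t statistic over per-referent values.

    t = mean(d) / (stdev(d) / sqrt(n)), where d are the per-referent
    differences between the two modalities (df = n - 1).
    """
    diffs = [t - m for t, m in zip(touch, midair)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / n ** 0.5)

# Hypothetical distinct-gesture counts for four referents
touch_counts = [5, 6, 4, 7]
midair_counts = [9, 11, 8, 12]
print(round(paired_t(touch_counts, midair_counts), 2))  # → -15.59
```

The resulting t value would then be checked against the t distribution with n−1 degrees of freedom for significance (e.g. via `scipy.stats`).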
Agreement rates
- As expected, the average agreement rate for Touch gestures (AR = 0.215) was higher than that for Mid-air gestures (AR = 0.101), i.e. participants proposed a wider variety of gestures in the Mid-air modality than in the Touch modality.
- Two dichotomous pairs in the Touch modality, Give card/Take card and Give chip/Take chip, showed a significant difference in agreement rates. However, we did not observe a specific trend in these differences.
Subjective rating
- In general, participants found it fun to propose gestures by themselves (mean=6.25, SD=0.77) and with a partner (mean=6.55, SD=0.81).
- However, 7 participants found it difficult to suggest gestures (mean=3.41, SD=1.73).
- Regarding the final gesture set voted as favourites, participants agreed that it was a good fit for its purpose (mean=6.46, SD=0.58) and, to a lesser extent, learnable (mean=5.67, SD=1.40) and easy to perform (mean=5.92, SD=0.95).
- Responses were mixed on whether the gestures were tiring to perform (mean=3.5, SD=1.80).
Taxonomy
Findings from the taxonomy are omitted in this post.
Qualitative findings
Six themes emerged from the thematic analysis of interviews, think-aloud data during the elicitation and written comments in our post-study questionnaires.
- All participants welcomed the idea of using gestures to control mobile devices in a board or card game, though to varying extents.
- They welcomed gestural control for the novelty of alternative gestures and the potential of gestures to streamline the game flow.
- Besides the inclination for “something other than touch” in games, participants preferred gestures that give them a sense of achievement and foster interaction between players.
- Participants recognised that realistic gestures can assist the game by forming shared situation awareness among all players.
- Participants were enthusiastic about proposing artefact-manipulative gestures and frequently described them as “fun”.
- Participants considered social etiquette when using a gesture for a referent that involves a partner.
- As in prior studies, legacy bias affected how participants proposed gestures.
Implications
Here are some brief design implications from the study.
On Input Modality
- The agreement rate of each referent/modality combination can give insights into the suitable modality for a referent.
On Gesture Design
- Consider using gestures to enhance situation awareness.
- Take advantage of modalities beyond touch to enrich the co-located gaming experience for all stakeholders, and to be inclusive of novice players through the educational opportunities provided by gestural input.
- Design a cohesive gesture set that is available across different form factors.
- Explore explicit visual feedback to guide users in their actions, as feedback helps users build their understanding of a gesture-based system.
On Sensing Technology
- Use multiple inputs concurrently or in sequence to recognise complete compound gestures.
- Be resilient to differences in the number of fingers used for touch gestures.
This post is an overview of the main points in the paper; some details are omitted here and there. Detailed descriptions of the participants, procedure, artifacts, agreement rates, taxonomy, and gestures are presented in the full paper. Feel free to drop me a message if you are interested!