Virtual Reality Pegboard Test Reveals Discrepancy Between Performance and User Preference

Virtual reality (VR) applications built around virtual hand interactions are widely used and highly beneficial. However, a recent study led by Concordia University shows that users' personal preferences about how those hands are displayed matter, even when a particular visualization measurably improves performance.


Presented at the IEEE International Symposium on Mixed and Augmented Reality (ISMAR) in October 2020, the research paper discusses experiments conducted on participants using a VR-based Purdue Pegboard Test (PPT). The PPT is commonly employed as a therapeutic tool for patients who have experienced neurological damage, such as stroke, with the goal of enhancing gross and fine motor skills.

The participants were equipped with a VR headset and were instructed to pick up a virtual object and insert it into a hole with speed and accuracy. Different variations of the task involved using the dominant and non-dominant hands, both hands simultaneously, and assembly tasks.

The tasks were repeated across three different modes. In the first mode, the participants’ virtual hand was opaque, obstructing their view. In the second mode, the outline of the virtual hand was visible, but the hand itself was transparent. In the third mode, the virtual hand disappeared once the peg was picked up.
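The three modes described above amount to different rendering policies for the virtual hand. As a minimal illustration, the sketch below models them as an enum plus an opacity rule; the names, the 0.3 transparency value, and the `hand_alpha` helper are all hypothetical, not taken from the study's software.

```python
from enum import Enum


class HandVisualization(Enum):
    """Hypothetical labels for the study's three hand-rendering modes."""
    OPAQUE = "opaque"            # fully rendered hand; can occlude the peg
    TRANSPARENT = "transparent"  # outline visible, body see-through
    INVISIBLE = "invisible"      # hand hidden once the peg is picked up


def hand_alpha(mode: HandVisualization, holding_peg: bool) -> float:
    """Return an illustrative opacity (1.0 = opaque, 0.0 = hidden).

    The invisible mode only hides the hand while a peg is held,
    matching the article's description of the third mode.
    """
    if mode is HandVisualization.OPAQUE:
        return 1.0
    if mode is HandVisualization.TRANSPARENT:
        return 0.3  # assumed outline transparency, not from the study
    return 0.0 if holding_peg else 1.0


print(hand_alpha(HandVisualization.INVISIBLE, holding_peg=True))   # 0.0
print(hand_alpha(HandVisualization.INVISIBLE, holding_peg=False))  # 1.0
```

Modeling the modes as data rather than branching scattered through rendering code makes it easy to swap visualizations per trial, which a within-subjects experiment like this one requires.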

Various metrics, including duration, downtime, movement time, path length, linear velocity, angle, and angular velocity, were recorded.
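Several of these metrics can be derived from timestamped hand positions logged by the tracking system. The sketch below is a minimal, assumed implementation (the study's actual pipeline is not described): it computes duration, path length, and mean linear velocity from `(t, x, y, z)` samples.

```python
import math


def movement_metrics(samples):
    """Compute simple kinematic metrics from (t, x, y, z) hand samples.

    `samples` is a chronologically ordered list of tuples: a timestamp
    in seconds followed by a 3D position in metres, as a VR tracking
    system might log them. Helper and field names are illustrative.
    """
    duration = samples[-1][0] - samples[0][0]
    # Path length: sum of straight-line distances between consecutive samples.
    path_length = 0.0
    for (_, x0, y0, z0), (_, x1, y1, z1) in zip(samples, samples[1:]):
        path_length += math.dist((x0, y0, z0), (x1, y1, z1))
    mean_velocity = path_length / duration if duration > 0 else 0.0
    return {
        "duration": duration,
        "path_length": path_length,
        "mean_linear_velocity": mean_velocity,
    }


# Example: a straight 1 m reach completed in 2 s at constant speed.
samples = [(0.0, 0.0, 0.0, 0.0), (1.0, 0.5, 0.0, 0.0), (2.0, 1.0, 0.0, 0.0)]
print(movement_metrics(samples))
# duration 2.0 s, path_length 1.0 m, mean_linear_velocity 0.5 m/s
```

A longer path length or lower mean velocity for the same peg insertion is one way the slower, narrower movements reported for the opaque-hand condition would show up in such data.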

Participants performed noticeably slower with the opaque hand. They made narrower finger movements and completed fewer tasks than when using the invisible hand visualization.

Laurent Voisard, the lead author of the study, explains that this result was expected: the invisible hand does not obstruct the object being held, giving users better visibility and control while placing the peg and improving dexterity in movements that demand precision. This finding could contribute to the development of more effective and efficient medical applications in VR.

Surprisingly, not all participants preferred the invisible hand. Of the 24 participants, 10 preferred the transparent hand, seven the opaque hand, and seven the invisible hand. Those who favored the transparent hand said it made it easier to perceive both their hands and the environment at once, and that interacting with the virtual objects felt more seamless.

Participants who preferred the opaque hand said its movements were easier to track and control, while those who favored the invisible hand found it more comfortable and said it made it easier to tell when the task was complete.

The researchers hope that this study will serve as a foundation for further research, exploring how VR and PPT can be effectively used for therapeutic purposes and in technical fields such as surgical planning.

Anil Ufuk Batmaz, co-author and assistant professor in the Department of Computer Science and Software Engineering at the Gina Cody School of Engineering and Computer Science, suggests that user preference should be taken into account when visualizing VR experiences. Although a particular visualization may yield better results, it may not be well-received by users, leading to their potential disengagement from the system.

The PPT is widely employed by neurologists as a diagnostic tool for individuals who have experienced brain injuries or strokes. However, this study reveals its potential for rehabilitation purposes, especially in the context of virtual environments for home-based rehabilitation.

Contributing to this study were Amal Hatira and Mine Sarac from Kadir Has University in Istanbul, Turkey.
