Figure 1: A conventional VR desktop that mirrors traditional 2D user interfaces.
Managing and switching between multiple application windows is a ubiquitous feature of modern desktop operating systems. Prior research has established important principles for 2D desktop interfaces and tested various ways of presenting window-switching interfaces on both regular screens [1] and large displays [2].
Current VR systems often mirror traditional 2D desktop displays into virtual space (Figure 1), which capitalizes on users' familiarity with conventional user interfaces (e.g., mobile phones [3]). Yet this approach potentially underutilizes VR's unique affordances for spatial interface organization [4] and embodied interaction [5].
This raises important questions about how established window management principles translate to VR environments and whether novel 3D approaches could enhance user performance.
This research aims to explore two key dimensions: First, we seek to validate whether desktop window management findings (e.g., spatial consistency, grid-based layouts, and scaling behaviors) remain applicable in VR contexts where windows exist as floating 2D planes in 3D space. Second, we investigate how VR's unique capabilities for depth perception, peripheral awareness, and volumetric organization can be leveraged to create window management systems that go beyond simple 2D analogues.
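To make the first dimension concrete, here is a minimal Unity C# sketch of one layout we might test: windows as floating 2D planes arranged on a grid curved around the user. The prefab, row/column counts, spacing, and radius values are illustrative assumptions rather than parameters taken from prior work.

```csharp
using UnityEngine;

// Minimal sketch: arrange window quads on a grid curved around the user.
// windowPrefab, rows, columns, radius, and spacing are illustrative placeholders.
public class CurvedWindowGrid : MonoBehaviour
{
    public GameObject windowPrefab;      // a Quad or world-space Canvas representing one 2D window
    public int rows = 2;
    public int columns = 4;
    public float radius = 2.0f;          // distance from the user's head
    public float horizontalSpacingDeg = 25f;
    public float verticalSpacing = 0.6f;

    void Start()
    {
        for (int r = 0; r < rows; r++)
        {
            for (int c = 0; c < columns; c++)
            {
                // Spread columns as yaw angles so every window stays at the same depth.
                float yaw = (c - (columns - 1) / 2f) * horizontalSpacingDeg;
                Quaternion rot = Quaternion.Euler(0f, yaw, 0f);
                Vector3 pos = rot * (Vector3.forward * radius)
                              + Vector3.up * (r - (rows - 1) / 2f) * verticalSpacing;

                GameObject w = Instantiate(windowPrefab, transform);
                w.transform.localPosition = pos;
                // Orient the window along its outward direction; flip 180 degrees
                // if the prefab's visible side points the other way.
                w.transform.localRotation = Quaternion.LookRotation(pos, Vector3.up);
            }
        }
    }
}
```

Keeping every window at the same radius is one way to preserve the spatial consistency and grid structure suggested by the desktop findings while still using the 3D space around the user.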
The results will inform the design of future VR workspaces, helping determine whether VR can move beyond simply replicating desktop metaphors to create truly spatial computing environments that enhance productivity through improved window management. This research addresses the fundamental question of whether VR workspaces can provide compelling advantages over traditional desktop computing for common window management tasks.
[1] Andrew Warr, Ed H. Chi, Helen Harris, Alexander Kuscher, Jenn Chen, Robert Flack, and Nicholas Jitkoff. 2016. Window Shopping: A Study of Desktop Window Switching. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (CHI '16). Association for Computing Machinery, New York, NY, USA, 3335–3338. https://doi.org/10.1145/2858036.2858526
[2] Lars Lischke, Sven Mayer, Jan Hoffmann, Philipp Kratzer, Stephan Roth, Katrin Wolf, and Paweł Woźniak. 2017. Interaction techniques for window management on large high-resolution displays. In Proceedings of the 16th International Conference on Mobile and Ubiquitous Multimedia (MUM '17). Association for Computing Machinery, New York, NY, USA, 241–247. https://doi.org/10.1145/3152832.3152852
[3] Verena Biener, Daniel Schneider, Travis Gesslein, Alexander Otte, Bastian Kuth, Per Ola Kristensson, Eyal Ofek, Michel Pahud, and Jens Grubert. 2020. Breaking the Screen: Interaction Across Touchscreen Boundaries in Virtual Reality for Mobile Knowledge Workers. IEEE Transactions on Visualization and Computer Graphics 26, 12 (2020), 3490–3502. https://doi.org/10.1109/TVCG.2020.3023567
[4] Steven Feiner, Blair MacIntyre, Marcus Haupt, and Eliot Solomon. 1993. Windows on the world: 2D windows for 3D augmented reality. In Proceedings of the 6th annual ACM symposium on User interface software and technology (UIST '93). Association for Computing Machinery, New York, NY, USA, 145–155. https://doi.org/10.1145/168642.168657
[5] Di Laura Chen, Marcello Giordano, Hrvoje Benko, Tovi Grossman, and Stephanie Santosa. 2023. GazeRayCursor: Facilitating Virtual Reality Target Selection by Blending Gaze and Controller Raycasting. In Proceedings of the 29th ACM Symposium on Virtual Reality Software and Technology (VRST '23). Association for Computing Machinery, New York, NY, USA, Article 19, 1–11. https://doi.org/10.1145/3611659.3615693
I am planning to explore building the eye-tracking hardware from scratch, starting from EyeTrackVR, an open-source project that can turn any mainstream VR headset into an eye-tracking one.
If this one doesn't work, I will use devices provided by Dr. Andrew Duchowski. For experiment design, I am planning to adopt the interface design and user performance evaluation metrics from [1]. The design space for VR-native multi-window display is still under exploration.
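Since the plan is to adopt the user performance evaluation metrics from [1], a rough per-trial logging sketch for the Unity study is shown below. It assumes the core measures are switch time and selection errors; the class name, CSV columns, and output file are placeholders rather than the final design.

```csharp
using System.Collections.Generic;
using System.IO;
using UnityEngine;

// Rough sketch of a per-trial logger for window-switching performance.
// Measures (switch time, selection errors) follow the spirit of the desktop
// study being adapted; names and the CSV format are placeholders.
public class SwitchTrialLogger : MonoBehaviour
{
    private readonly List<string> rows =
        new List<string> { "trial,targetWindow,selectedWindow,switchTimeMs,error" };
    private float trialStartTime;
    private int trialIndex;

    // Call when the switch cue is shown.
    public void BeginTrial()
    {
        trialStartTime = Time.realtimeSinceStartup;
    }

    // Call when the participant selects a window.
    public void EndTrial(string targetWindow, string selectedWindow)
    {
        float ms = (Time.realtimeSinceStartup - trialStartTime) * 1000f;
        bool error = targetWindow != selectedWindow;
        rows.Add($"{trialIndex++},{targetWindow},{selectedWindow},{ms:F1},{(error ? 1 : 0)}");
    }

    void OnApplicationQuit()
    {
        File.WriteAllLines(Path.Combine(Application.persistentDataPath, "switch_trials.csv"), rows);
    }
}
```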
I have started a systematic literature review.
I have ordered all necessary parts for EyeTrackVR. They are expected to arrive by Feb 25 (mostly shipped from China).
Finish the literature review. Design the experiment by March 10. Develop the Unity project with simulated eye-tracking movements (a rough sketch of the simulated gaze is included after this plan).
Conduct experiments and data collection
Write the term paper
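For the simulated eye-tracking movements mentioned in the plan above, a minimal Unity C# sketch is given here. The IGazeProvider interface, the saccade interval, and the offset range are assumptions meant to let the real EyeTrackVR signal be swapped in later.

```csharp
using UnityEngine;

// Minimal sketch of simulated eye movements for development before the
// EyeTrackVR hardware arrives. IGazeProvider and the saccade parameters are
// assumptions so the real tracker can be dropped in behind the same interface.
public interface IGazeProvider
{
    Ray GetGazeRay();
}

public class SimulatedGazeProvider : MonoBehaviour, IGazeProvider
{
    public Transform head;                 // the VR camera transform
    public float saccadeIntervalSec = 1.5f;
    public float maxOffsetDeg = 15f;

    private Quaternion currentOffset = Quaternion.identity;
    private float nextSaccadeTime;

    void Update()
    {
        // Jump to a new random gaze offset at roughly fixation-length intervals.
        if (Time.time >= nextSaccadeTime)
        {
            currentOffset = Quaternion.Euler(
                Random.Range(-maxOffsetDeg, maxOffsetDeg),
                Random.Range(-maxOffsetDeg, maxOffsetDeg),
                0f);
            nextSaccadeTime = Time.time + saccadeIntervalSec;
        }
    }

    public Ray GetGazeRay()
    {
        // Gaze direction = head forward rotated by the current simulated offset.
        return new Ray(head.position, head.rotation * currentOffset * Vector3.forward);
    }
}
```

Hiding the simulation behind a small interface is intended to keep the experiment and logging code unchanged once the hardware arrives.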