Figure 1: The hardware modification on a non-eye-tracking VR headset.
This project presents a cost-effective eye-tracking solution (hardware + software) designed to upgrade a standard non-eye-tracking VR headset (e.g., Meta Quest 3) into an eye-tracking-capable research instrument at a fraction of the cost of commercial solutions. The system achieves an average tracking accuracy of 4.8° of visual angle at a refresh rate of 60 Hz, making it suitable for research applications such as attention studies, interaction technique development, and gaze analytics.
Figure 2: A simplified illustration of the hardware setup and overall data flow.
Eye tracking is a critical component in VR research, enabling analysis of user attention patterns and interaction behaviors, as well as techniques such as foveated rendering. However, commercial eye-tracking solutions present significant barriers to entry:
This project aims to demonstrate that an affordable eye-tracking solution ($200-300 total build cost) can serve as a practical alternative for educational environments and exploratory research, while providing students with valuable hands-on experience in computer vision and hardware integration.
Figure 8: The final build of the VR eye tracking hardware system (image placeholder).
Note: These specifications were measured in April 2025 using the wired configuration.
The 4.8° accuracy corresponds to different real-world discrimination abilities depending on viewing distance.
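As a rough illustration, the physical footprint of a 4.8° error at viewing distance *d* is 2·d·tan(4.8°/2). The distances below are assumed for the example, not taken from the project's measurements:

```python
import math

def visual_angle_to_size(angle_deg: float, distance_m: float) -> float:
    """Physical extent (metres) subtended by a visual angle at a viewing distance."""
    return 2 * distance_m * math.tan(math.radians(angle_deg) / 2)

# Error footprint of the 4.8 deg average accuracy at a few assumed distances.
for d in (0.5, 1.0, 2.0):
    print(f"at {d} m: ~{visual_angle_to_size(4.8, d) * 100:.1f} cm")
```

At arm's length (~0.5 m) the error circle is roughly 4 cm across, which is why the system suits coarse attention analysis rather than fine target selection.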
This project builds upon the open-source EyeTrackVR project. Comprehensive assembly guides and technical documentation can be found in their official documentation.
A non-eye-tracking VR headset (e.g., Meta Quest 3, ~$499)
A minimum of two IR-sensitive mini cameras (purchasing four is recommended, as removing the IR filter carries a risk of damaging a camera; ~$10-15 each)
Two microcontroller boards for image capture and data transmission (options include the ESP32-CAM and Xiao ESP32S3 Sense; ~$15-25 each)
IR illumination system using 850nm wavelength IR LEDs (recommended: official LED set from EyeTrackVR with XL-3216HIRC-850; ~$20)
Total estimated cost: $200-300 (excluding the VR headset)
Note: Most components are available through Amazon with rapid shipping. However, AliExpress offers significantly lower prices (see EyeTrackVR documentation for specific links). Be aware that AliExpress shipping typically takes 2-4 weeks, so plan accordingly if pursuing the more economical option.
When following the EyeTrackVR documentation, pay special attention to these critical points:
⚠️ Notes: Minimize camera connection/disconnection cycles. After initial testing, it is highly recommended to immediately apply protective measures to the ribbon cable as outlined in the "Protecting a Camera Ribbon Cable" section. These ribbon connections are extremely delicate and prone to failure under repeated stress.
Figure 3: Apply electrical tape reinforcement to protect the camera ribbon cable.
My final version positions the cameras at approximately 45° angles relative to the eye plane, which:
However, it is important to note that optimal camera angles may vary based on individual facial anatomy and headset fit. Experimentation with different angles is recommended to identify the best configuration for your specific use case.
Figure 3: Camera positioning at 45° angles relative to the eye plane.
Carefully plan your cable routing before permanent installation. The 3D printed camera mounts typically align the ribbon cable outlet with the USB-C port orientation. In this implementation, the camera ribbon was intentionally routed in the opposite direction of the USB-C cable to minimize cable congestion around the headset.
If adopting this approach, ensure thorough ribbon cable protection with electrical tape on both sides, and avoid any sharp bends in the ribbon. Gradual curves are essential for maintaining long-term reliability.
Figure 4: Camera mounting and cable routing example.
When assembling the EyeTrackVR official LED arrays (V4 module design with 4 LEDs per eye), note that the circuits for left and right eyes are inverted relative to each other. This design consideration ensures proper IR illumination across the full visual field.
Figure 5: LED circuit orientation for left and right eyes.
⚠️ Notes: This prototype implementation relied on temporary mounting solutions using adhesives and tape to attach components to the headset. This rough build approach was chosen to facilitate rapid iteration and testing but likely impacts the overall accuracy and stability of the system. Future iterations should explore more permanent mounting solutions.
The microcontroller boards require appropriate firmware to function correctly. This implementation uses the wired configuration for its higher refresh rate, lower latency, and better stability.
Algorithm Selection: This implementation uses the Adaptive Starburst Hybrid Sample Feature (ASHSFRAC) algorithm for pupil detection and gaze estimation, which proved most effective in testing environments.
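ASHSFRAC itself ships with the EyeTrackVR software. As a much simpler illustration of the underlying task, the sketch below estimates a pupil center as the centroid of dark pixels in a grayscale IR frame; the threshold and the synthetic frame are invented for the example, and this is not the project's actual algorithm:

```python
import numpy as np

def pupil_center(frame: np.ndarray, dark_thresh: int = 40):
    """Estimate the pupil center as the centroid of dark pixels in a grayscale IR frame.
    A crude stand-in for ASHSFRAC, for illustration only."""
    ys, xs = np.nonzero(frame < dark_thresh)
    if xs.size == 0:
        return None  # no dark blob found
    return float(xs.mean()), float(ys.mean())

# Synthetic 64x64 frame: bright background with a dark "pupil" centered at (40, 24).
frame = np.full((64, 64), 200, dtype=np.uint8)
frame[20:29, 36:45] = 10
print(pupil_center(frame))  # → (40.0, 24.0)
```

Real pupil detectors add robustness steps (glint rejection, ellipse fitting, outlier filtering) that a plain centroid lacks, which is why algorithm choice matters for accuracy.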
Troubleshooting tip: If using the wireless configuration and experiencing connection issues, ensure your computer and the microcontroller boards are on the same WiFi network. Some institutional networks use client isolation, which prevents devices from communicating with each other. Consider using a dedicated WiFi router for the eye-tracking system if necessary.
A custom Unity program was developed for this project to handle calibration, validation, real-time visualization, and data collection.
Figure 6: Unity calibration interface showing 9-point grid display (image placeholder).
The calibration algorithm creates a mapping between the pupil positions detected in the camera frames and the known positions of the calibration targets on the display.
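One common way to implement such a mapping is a least-squares polynomial regression from pupil coordinates to screen coordinates, fitted on the samples collected at the 9 calibration points. This is a sketch under that assumption; the project's actual Unity implementation may differ:

```python
import numpy as np

def _features(pupil_xy: np.ndarray) -> np.ndarray:
    """2nd-order polynomial feature expansion of normalized pupil coordinates."""
    x, y = pupil_xy[:, 0], pupil_xy[:, 1]
    return np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])

def fit_calibration(pupil_xy: np.ndarray, target_xy: np.ndarray) -> np.ndarray:
    """Least-squares fit mapping pupil coords to on-screen gaze targets."""
    coeffs, *_ = np.linalg.lstsq(_features(pupil_xy), target_xy, rcond=None)
    return coeffs  # shape (6, 2): one column per screen axis

def apply_calibration(coeffs: np.ndarray, pupil_xy: np.ndarray) -> np.ndarray:
    """Map new pupil samples to estimated screen positions."""
    return _features(pupil_xy) @ coeffs

# Demo on a synthetic 3x3 grid with a made-up linear pupil-to-screen relation.
pupils = np.array([[x, y] for x in (-1, 0, 1) for y in (-1, 0, 1)], dtype=float)
targets = 0.5 * pupils + 0.1
coeffs = fit_calibration(pupils, targets)
```

With 9 calibration points and 6 polynomial terms per axis, the fit is overdetermined, which gives some robustness to noisy pupil samples.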
After calibration, the system measures the visual angle traversed when the user shifts gaze between validation points and compares it with the actual angular separation of those points. The difference gives the error margin, or "wiggle room," in tracking accuracy.
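The per-point validation error can be computed as the angle between the measured gaze ray and the ray from the eye to the validation point. A minimal sketch (the example vectors are invented):

```python
import numpy as np

def angular_error_deg(gaze_dir: np.ndarray, target_dir: np.ndarray) -> float:
    """Angle in degrees between the measured gaze ray and the ray to a validation point."""
    g = gaze_dir / np.linalg.norm(gaze_dir)
    t = target_dir / np.linalg.norm(target_dir)
    # Clip guards against floating-point values just outside [-1, 1].
    return float(np.degrees(np.arccos(np.clip(np.dot(g, t), -1.0, 1.0))))

# Gaze straight ahead vs. a target 5 degrees to the right.
err = angular_error_deg(np.array([0.0, 0.0, 1.0]),
                        np.array([np.sin(np.radians(5)), 0.0, np.cos(np.radians(5))]))
print(round(err, 2))  # → 5.0
```

Averaging this quantity across all validation points yields a single accuracy figure comparable to the reported 4.8°.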
Recalibration Requirements: Testing showed that recalibration is typically needed after approximately 10 minutes of use, especially after significant head movements, headset adjustments, or changes in user posture.
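One way to operationalize a recalibration policy is to monitor a running average of validation error and flag when it drifts past a limit. The threshold and window size below are invented for illustration, not measured values from the project:

```python
from collections import deque

class DriftMonitor:
    """Flag when a running average of validation error suggests recalibration.
    Threshold and window size are assumptions for this sketch."""
    def __init__(self, threshold_deg: float = 6.0, window: int = 5):
        self.errors = deque(maxlen=window)
        self.threshold_deg = threshold_deg

    def add(self, error_deg: float) -> bool:
        """Record one validation error; return True if recalibration is advised."""
        self.errors.append(error_deg)
        return sum(self.errors) / len(self.errors) > self.threshold_deg

monitor = DriftMonitor()
for err in (4.5, 5.0, 7.5, 8.0, 9.0):  # simulated drift over a session
    if monitor.add(err):
        print("recalibration recommended")
        break
```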
The system logs comprehensive data for post-processing analysis including timestamps, gaze positions (2D screen and 3D world coordinates), head position/rotation, and eye openness. Data is exported in CSV format for easy analysis.
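A minimal sketch of such a logger using Python's csv module; the column names are illustrative placeholders for the categories listed above, not the project's actual schema:

```python
import csv
import time

# Illustrative column schema covering the logged categories; names are assumptions.
FIELDS = ["timestamp", "gaze_screen_x", "gaze_screen_y",
          "gaze_world_x", "gaze_world_y", "gaze_world_z",
          "head_pos_x", "head_pos_y", "head_pos_z",
          "head_rot_x", "head_rot_y", "head_rot_z",
          "left_eye_openness", "right_eye_openness"]

with open("gaze_log.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    # One row per tracker frame (~60 Hz); unset fields are left blank.
    writer.writerow({"timestamp": time.time(),
                     "gaze_screen_x": 960, "gaze_screen_y": 540})
```

Keeping one flat row per frame makes the export trivially loadable into pandas, R, or a spreadsheet for post-processing.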
While this DIY solution cannot match the specifications of high-end commercial systems, it provides sufficient performance for many research applications at a fraction of the cost:
| System | Sampling Rate | Accuracy | Cost |
|---|---|---|---|
| EyeTrackVR (This Project) | ~60 Hz | 4.8° visual angle (SD = 1.8°) | ~$200-300 |
| Varjo XR-3 | 200 Hz | <0.5° visual angle | ~$6,500 |
| HTC Vive Pro Eye | 120 Hz | ~0.5-1.0° visual angle | ~$1,500 |
| Meta Quest Pro | 90 Hz | ~1° visual angle | ~$1,000 |
Several challenges and limitations were identified during implementation:
These maintenance requirements indicate that the current implementation is better suited for exploratory research and early-stage demonstrations rather than extensive participant studies or rigorous data collection for publication. The system serves as an excellent educational tool and proof-of-concept platform.
Several promising directions for future development were identified:
This project demonstrates that affordable, open-source eye tracking solutions can be successfully implemented for VR research applications. While commercial systems offer higher specifications, this approach dramatically reduces the barrier to entry for eye-tracking research, making it accessible to a wider range of institutions and independent researchers.