A Fitts' Law Evaluation of Two Face-Tracking Input Types on Electronic Devices

Research comparing face-tracking and eye-tracking input on mobile devices. This study evaluates the usability and efficiency of position-based and orientation-based head-tracking input methods for hands-free device control.


Overview

This study evaluates the usability and efficiency of two head-tracking input methods—position-based and orientation-based—for controlling electronic devices in a hands-free context. With the growing demand for intuitive, non-contact interactions, such as navigating smart devices while cooking, the research investigates how face-tracking can enable pointer control without external devices.

Fitts' Law Study

Inspired by Fitts' Law, the experiment focused on one-dimensional target-acquisition tasks, comparing performance metrics such as throughput, movement time, and error rate.
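
For reference, below is a minimal sketch of how the standard Fitts' Law metrics can be computed from logged trials. It assumes the Shannon formulation of the index of difficulty and omits effective-width adjustments; the function and variable names are illustrative and not taken from the study's actual analysis scripts.

```python
import math
from statistics import mean

def index_of_difficulty(distance: float, width: float) -> float:
    """Shannon formulation of Fitts' index of difficulty, in bits."""
    return math.log2(distance / width + 1)

def throughput(distance: float, width: float, movement_times_s: list[float]) -> float:
    """Throughput (bits/s) for one distance/width condition:
    the condition's ID divided by the mean movement time."""
    return index_of_difficulty(distance, width) / mean(movement_times_s)

# Example: 400 px target distance, 50 px target width,
# movement times (in seconds) for a handful of trials.
times = [0.92, 1.05, 0.88, 1.10]
print(f"ID = {index_of_difficulty(400, 50):.2f} bits")
print(f"TP = {throughput(400, 50, times):.2f} bits/s")
```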

Research Methodology

Participants used a laptop with a built-in webcam to execute tasks under two conditions (a simplified pointer-mapping sketch follows the list):

  1. Head Position Tracking - Movement based on head position in space
  2. Head Orientation Tracking - Movement based on head rotation/angle
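
The two conditions differ only in which head signal drives the pointer. A minimal sketch of how each mapping might look for the one-dimensional task is shown below; the gain, input ranges, and function names are illustrative assumptions rather than the study's actual implementation.

```python
def position_to_pointer_x(face_center_x: float, frame_width: int, screen_width: int) -> float:
    """Condition 1: map the horizontal face position in the camera frame
    (pixels) linearly onto the horizontal screen coordinate."""
    normalized = face_center_x / frame_width            # 0.0 .. 1.0
    return normalized * screen_width

def orientation_to_pointer_x(yaw_deg: float, screen_width: int, max_yaw_deg: float = 30.0) -> float:
    """Condition 2: map head yaw (degrees, 0 = facing the camera) onto the
    horizontal screen coordinate, clamped to a +/- max_yaw_deg range."""
    yaw = max(-max_yaw_deg, min(max_yaw_deg, yaw_deg))
    normalized = (yaw + max_yaw_deg) / (2 * max_yaw_deg)  # 0.0 .. 1.0
    return normalized * screen_width
```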

Experimental Design

The study employed a controlled experimental setup to measure:

  • Throughput - Overall efficiency of the input method
  • Movement Time - Time taken to reach targets
  • Error Rate - Frequency of missed or incorrect selections
  • User Comfort - Subjective feedback on ease of use

Key Findings

Position-Based Input Superiority

Results indicate that position-based input yielded:

  • ✅ Higher throughput
  • ✅ Lower movement time
  • ✅ Better accuracy and efficiency
  • ✅ More intuitive user experience

Challenges Identified

However, several challenges impacted user performance:

  • Tracking Reliability - Variations in tracking accuracy
  • Indoor Lighting Variability - Environmental factors affecting detection
  • User Fatigue - Extended use causing discomfort
  • Calibration Requirements - Need for individual user calibration

Technical Implementation

Technologies Used

  • Computer Vision: OpenCV for face detection and tracking (see the detection sketch after this list)
  • Machine Learning: Face landmark detection algorithms
  • Web Technologies: JavaScript for real-time processing
  • Data Analysis: Python for statistical analysis of results
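
As a rough illustration of the computer-vision side, the snippet below runs OpenCV's bundled Haar-cascade face detector on webcam frames and reports the face centre that would drive the pointer. The study's actual detector and landmark pipeline are not specified here, so the cascade choice and loop structure are assumptions.

```python
import cv2

# Haar cascade shipped with OpenCV; a stand-in for whatever detector the study used.
cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)  # built-in webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        center = (x + w // 2, y + h // 2)   # face centre used for pointer control
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.circle(frame, center, 3, (0, 0, 255), -1)
    cv2.imshow("face tracking", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```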

Tracking Algorithms

The study implemented custom algorithms for:

  • Real-time face detection
  • Position and orientation calculation
  • Smoothing and filtering of tracking data (a minimal example follows this list)
  • Adaptive calibration based on user behavior
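
A minimal example of the kind of smoothing such a pipeline needs: an exponential moving average applied to the raw tracked coordinate. The class name and smoothing factor are illustrative; the study's own filter may differ (e.g. a One Euro or Kalman filter).

```python
class ExponentialSmoother:
    """Exponentially weighted moving average for a single tracked value.

    alpha close to 1.0 follows the raw signal closely (responsive, jittery);
    alpha close to 0.0 smooths heavily (stable, laggy).
    """

    def __init__(self, alpha: float = 0.3):
        self.alpha = alpha
        self._value = None

    def update(self, raw: float) -> float:
        if self._value is None:
            self._value = raw
        else:
            self._value = self.alpha * raw + (1 - self.alpha) * self._value
        return self._value

# Usage: feed in the raw face-centre x-coordinate frame by frame.
smoother = ExponentialSmoother(alpha=0.3)
for raw_x in [310, 340, 290, 335, 320]:
    print(round(smoother.update(raw_x), 1))
```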

Applications and Impact

Practical Applications

This research contributes to the design of accessible and practical hands-free interfaces for:

  1. Smart Home Devices - Control without touching surfaces
  2. Cooking Assistance - Navigate recipes while hands are busy
  3. Accessibility Tools - Support for users with limited mobility
  4. Healthcare Settings - Hygienic, contactless interactions
  5. Industrial Applications - Hands-free operation in manufacturing

Design Implications

The findings provide valuable insights for:

  • Interface designers creating hands-free systems
  • Developers implementing face-tracking features
  • Researchers studying alternative input methods
  • Product managers planning accessible technology

Future Research Directions

Potential Improvements

  1. Enhanced Tracking Algorithms - More robust detection in various lighting conditions
  2. Hybrid Approaches - Combining position and orientation for optimal performance (a blending sketch follows this list)
  3. Adaptive Systems - Learning user preferences and adjusting accordingly
  4. Multi-Modal Input - Integrating voice and gesture controls
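
One way such a hybrid scheme could be prototyped is a simple weighted blend of the two pointer estimates, sketched below; the weighting approach and parameter values are purely hypothetical.

```python
def hybrid_pointer_x(position_x: float, orientation_x: float, weight: float = 0.7) -> float:
    """Blend the position-based and orientation-based pointer estimates.

    weight = 1.0 uses only the position signal (which performed best in this
    study); lower values mix in the orientation signal.
    """
    weight = max(0.0, min(1.0, weight))
    return weight * position_x + (1 - weight) * orientation_x
```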

Extended Studies

Future research could explore:

  • Long-term usability and user adaptation
  • Performance in real-world scenarios
  • Integration with other assistive technologies
  • Cross-cultural differences in interaction preferences

Conclusion

This study demonstrates that position-based head tracking offers superior performance compared to orientation-based methods for hands-free device control. While challenges remain in tracking reliability and environmental sensitivity, the findings support the design of more usable hands-free interfaces across diverse contexts.

The research contributes valuable insights to the field of Human-Computer Interaction, particularly in designing accessible, intuitive, and practical hands-free interfaces for smart devices.

Research Team

  • Temirlan Dzhoroev - Software Engineer & Researcher
  • Joe (B.H.) Kim - Software Engineer & Researcher

Publication

This research was presented at HCI Korea 2022:

Dzhoroev, T., Kim, B.H., & Lee, H.S. (2022). Comparison of Face Tracking and Eye Tracking for Scrolling a Web Browser on Mobile Devices. HCI Korea, pp. 227–231.