WebCam Eye Tracker
This demo uses your webcam to estimate where you're looking on the screen in real time. It's a browser-based implementation inspired by my research at Czech Technical University under Prof. Jiri Matas.
Privacy First
This demo processes everything locally in your browser. Your camera feed is never sent to any server. Anonymous gaze logging is off by default and only activates if you explicitly enable it.
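For the curious, the local-only flow amounts to wiring the camera stream into an in-page video element and a canvas. The sketch below is illustrative (the element names and helper functions are assumptions), not the demo's exact code:

```ts
// Minimal sketch of the local-only capture path, assuming an in-page <video>
// element and a processing <canvas> (names are illustrative). The MediaStream
// stays inside the browser; no frame is ever uploaded.
async function startLocalCapture(video: HTMLVideoElement): Promise<void> {
  const stream = await navigator.mediaDevices.getUserMedia({
    video: { facingMode: "user" },
    audio: false,
  });
  video.srcObject = stream;
  await video.play();
}

function grabFrame(video: HTMLVideoElement, canvas: HTMLCanvasElement): ImageData | null {
  // Copy the current frame into a canvas for local analysis; no network
  // request is involved at any point.
  const ctx = canvas.getContext("2d");
  if (!ctx) return null;
  ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
  return ctx.getImageData(0, 0, canvas.width, canvas.height);
}
```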
Browser Requirements
This demo requires a modern browser with camera access (Chrome, Firefox, Safari, Edge). For best results, ensure good lighting and position yourself directly in front of the camera.
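A quick capability check before enabling the demo might look like the following (a sketch, not the demo's actual start-up code):

```ts
// getUserMedia requires a secure context (HTTPS or localhost) and a browser
// that implements navigator.mediaDevices.
function cameraIsAvailable(): boolean {
  return (
    window.isSecureContext &&
    typeof navigator.mediaDevices?.getUserMedia === "function"
  );
}
```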
Anonymous Gaze Logging
When enabled, gaze coordinates are sent to help improve the model. No personal data is collected. Logging is off by default.
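As a rough sketch of how such an opt-in path can work (the /log-gaze endpoint and the sample fields below are hypothetical, not the demo's real API):

```ts
// Nothing is sent unless the toggle is on, and even then only gaze
// coordinates and a timestamp -- never image data.
interface GazeSample {
  x: number; // estimated gaze x, in CSS pixels
  y: number; // estimated gaze y, in CSS pixels
  t: number; // milliseconds since page load
}

let loggingEnabled = false; // off by default

function logGazeSample(sample: GazeSample): void {
  if (!loggingEnabled) return; // default path: no network traffic at all
  navigator.sendBeacon("/log-gaze", JSON.stringify(sample)); // hypothetical endpoint
}
```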
How It Works
1. Face Detection
Locates your face and eyes in the video stream using facial landmark detection.
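One way to implement this step in the browser is with the TensorFlow.js face-landmarks-detection package; the sketch below is an assumption about a possible implementation, not the demo's actual detector:

```ts
import * as faceLandmarksDetection from "@tensorflow-models/face-landmarks-detection";
import "@tensorflow/tfjs-backend-webgl";

// Create a MediaPipe FaceMesh detector running on the TensorFlow.js backend.
async function createFaceDetector() {
  return faceLandmarksDetection.createDetector(
    faceLandmarksDetection.SupportedModels.MediaPipeFaceMesh,
    { runtime: "tfjs", refineLandmarks: true, maxFaces: 1 }
  );
}

async function detectEyes(
  detector: faceLandmarksDetection.FaceLandmarksDetector,
  video: HTMLVideoElement
) {
  const faces = await detector.estimateFaces(video);
  if (faces.length === 0) return null;
  // In this model's keypoint set, eye-contour keypoints carry the names
  // "leftEye" / "rightEye".
  const keypoints = faces[0].keypoints;
  return {
    leftEye: keypoints.filter((k) => k.name === "leftEye"),
    rightEye: keypoints.filter((k) => k.name === "rightEye"),
  };
}
```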
2. Calibration
Maps eye positions to screen coordinates by having you look at known points.
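As an illustration of the idea, a deliberately simple calibration fits independent linear maps from a 2-D eye feature to screen x and y by least squares over the calibration points (the real model can be more elaborate):

```ts
// Simplified calibration sketch (not the demo's actual model): fit
// screenX ~ a*fx + b and screenY ~ c*fy + d by least squares.
type Point = { x: number; y: number };
type LinearMap = { slope: number; offset: number };

function linearFit(inputs: number[], outputs: number[]): LinearMap {
  const n = inputs.length;
  const mi = inputs.reduce((s, v) => s + v, 0) / n;
  const mo = outputs.reduce((s, v) => s + v, 0) / n;
  let cov = 0;
  let varI = 0;
  for (let k = 0; k < n; k++) {
    cov += (inputs[k] - mi) * (outputs[k] - mo);
    varI += (inputs[k] - mi) ** 2;
  }
  const slope = cov / varI;
  return { slope, offset: mo - slope * mi };
}

// features[i] is the eye feature observed while the user looked at targets[i]
// (a known on-screen calibration dot).
function calibrate(features: Point[], targets: Point[]) {
  return {
    xMap: linearFit(features.map((f) => f.x), targets.map((t) => t.x)),
    yMap: linearFit(features.map((f) => f.y), targets.map((t) => t.y)),
  };
}
```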
3. Gaze Estimation
Predicts where you're looking using the calibrated model in real time.
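Continuing the simplified sketch above (and reusing its Point and LinearMap types), applying the calibrated maps once per video frame is a one-liner per axis, clamped to the viewport:

```ts
// Map the latest eye feature to a screen position and keep it on screen.
function estimateGaze(feature: Point, xMap: LinearMap, yMap: LinearMap): Point {
  const x = xMap.slope * feature.x + xMap.offset;
  const y = yMap.slope * feature.y + yMap.offset;
  return {
    x: Math.min(Math.max(x, 0), window.innerWidth),
    y: Math.min(Math.max(y, 0), window.innerHeight),
  };
}
```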
Note: The current demo uses simulated gaze tracking; full model integration is in progress. The architecture is designed so that a TensorFlow.js or ONNX Runtime model can be dropped in for real inference.
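One way such an extension point could look, sketched under the assumption of a TensorFlow.js graph model that outputs normalized screen coordinates (the interface, model URL, and output layout are placeholders, not a published model):

```ts
import * as tf from "@tensorflow/tfjs";

// Any gaze model that can turn an eye-region image into screen coordinates
// fits behind one interface.
interface GazeEstimator {
  estimate(eyePatch: ImageData): Promise<{ x: number; y: number }>;
}

// Approximation of the current demo behaviour: a synthetic gaze point.
class SimulatedEstimator implements GazeEstimator {
  async estimate(_eyePatch: ImageData): Promise<{ x: number; y: number }> {
    return { x: Math.random() * window.innerWidth, y: Math.random() * window.innerHeight };
  }
}

// Drop-in replacement backed by a TensorFlow.js graph model that maps an eye
// patch to normalized screen coordinates (an assumed model contract).
class TfjsEstimator implements GazeEstimator {
  private constructor(private model: tf.GraphModel) {}

  static async load(url: string): Promise<TfjsEstimator> {
    return new TfjsEstimator(await tf.loadGraphModel(url));
  }

  async estimate(eyePatch: ImageData): Promise<{ x: number; y: number }> {
    const out = tf.tidy(() => {
      const input = tf.browser.fromPixels(eyePatch).toFloat().div(255).expandDims(0);
      return this.model.predict(input) as tf.Tensor;
    });
    const [nx, ny] = (await out.data()) as Float32Array;
    out.dispose();
    return { x: nx * window.innerWidth, y: ny * window.innerHeight };
  }
}
```

Under this sketch, swapping SimulatedEstimator for TfjsEstimator.load(modelUrl) would be the only change needed in the demo loop.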