NativeSensors


Dev Update #4 EyeGestures V2


Hey all,

New Engine Introduction

https://github.com/NativeSensors/EyeGestures

We are pleased to announce the release of our latest Engine, along with significant API changes. This iteration introduces two distinct objects:

EyeGestures_v1()

and

EyeGestures_v2()

The former follows a traditional model-based approach, while the latter uses machine learning. EyeGestures_v2 offers higher precision and per-user personalization, though it may run slightly slower than its predecessor.

With these changes, both objects now yield two distinct events, and the 'estimate' function has been replaced by 'step':

event, calibration_event = gestures.step(frame)

The 'event' carries the gaze data: points, fixations, and blink occurrences. The 'calibration_event' reports calibration point placement and the associated accuracy metric (acceptance radius).
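To make the new flow concrete, here is a minimal sketch of what a step() loop might look like. Note that the import path, the EyeGestures_v2 constructor arguments, and the event field names (point, fixation, blink; calibration point and acceptance_radius) are assumptions for illustration based on the description above, not the documented API:

```python
def describe(event, calibration_event):
    """Summarize one step() result; all field names are assumed."""
    parts = [f"gaze at {event.point}", f"fixation={event.fixation}"]
    if event.blink:
        parts.append("blink detected")
    if calibration_event is not None:
        parts.append(
            f"calibration point {calibration_event.point} "
            f"(radius {calibration_event.acceptance_radius})"
        )
    return ", ".join(parts)


def run_demo():
    # Hypothetical usage: grab webcam frames and feed them to the engine.
    import cv2
    from eyegestures import EyeGestures_v2  # import path is an assumption

    gestures = EyeGestures_v2()
    cap = cv2.VideoCapture(0)
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # One call now returns both the gaze event and the calibration event.
        event, calibration_event = gestures.step(frame)
        print(describe(event, calibration_event))
    cap.release()
```

The key difference from the old 'estimate' API is that a single call returns both streams, so calibration feedback can be drawn in the same loop as gaze tracking.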

For those keen on gaining deeper insights into the mechanics of our Engine V2, we invite you to explore further below.

V2 Engine Internals:

This section is for premium subscribers only. Subscribe to NativeSensors to get access to it.
