
How It Works

We collect TAG (touch, accelerometer, gyroscope) data non-intrusively from the sensors built into your phone, with no user attention required. Our machine learning models use this data to estimate affective states, building a real-time profile of your cognitive and emotional wellbeing.

Data Capture

We record short (10-second) windows of every touchscreen gesture and device movement: taps, swipes, and holds, alongside inertial readings from the accelerometer and gyroscope. These raw signals reflect tiny shifts in your motor patterns that neuroscience links to changes in mood and attention.
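
For the technically curious, here is a minimal sketch, in Python, of how raw touch and inertial events could be grouped into 10-second windows. The event fields and class names are illustrative assumptions, not our production capture code.

```python
from dataclasses import dataclass, field

WINDOW_SECONDS = 10.0  # matches the 10-second capture window described above

@dataclass
class SensorEvent:
    # One raw reading: a touch sample or an inertial (accelerometer/gyroscope) sample.
    timestamp: float  # seconds since capture started
    kind: str         # "touch", "accel", or "gyro" (hypothetical labels)
    values: tuple     # e.g. (x, y) for touch, (x, y, z) for inertial axes

@dataclass
class WindowBuffer:
    # Accumulates events and emits a complete window every WINDOW_SECONDS.
    start: float = 0.0
    events: list = field(default_factory=list)

    def add(self, event: SensorEvent):
        """Append an event; return a finished window once 10 seconds have elapsed, else None."""
        if not self.events:
            self.start = event.timestamp
        if event.timestamp - self.start >= WINDOW_SECONDS:
            window, self.events = self.events, [event]
            self.start = event.timestamp
            return window
        self.events.append(event)
        return None
```

Each completed window then flows into the feature-extraction step described next.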

Feature Extraction

From each window we compute 50+ precise measurements, for example:

  • Touch latency & duration (how quickly you press and how long you hold)
  • Swipe speed & pressure proxy (how fast and forceful your gestures are)
  • Tremor frequency (micro-shakes in your grip)
  • Motion variance (subtle shifts in device tilt)

Psychology research shows these metrics correlate with arousal (alert vs. fatigued), valence (positive vs. negative mood), and dominance (feeling in control vs. overwhelmed).
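
To make a few of these measurements concrete, here is a minimal NumPy sketch of how touch latency, swipe speed, tremor frequency and motion variance could be computed from one window. The exact feature definitions used in the product may differ; the argument names and sampling rate are assumptions.

```python
import numpy as np

def extract_features(touch_down, touch_up, swipe_xy, accel_xyz, sample_rate_hz=100.0):
    """Compute a handful of the 50+ window features described above.

    touch_down / touch_up: arrays of press and release timestamps (seconds)
    swipe_xy: (N, 2) array of swipe coordinates sampled during the window
    accel_xyz: (M, 3) array of accelerometer readings
    """
    features = {}

    # Touch latency & duration: how quickly you press and how long you hold.
    hold_durations = touch_up - touch_down
    features["mean_hold_duration"] = float(np.mean(hold_durations))
    features["mean_inter_tap_gap"] = float(np.mean(np.diff(touch_down)))

    # Swipe speed proxy: average point-to-point displacement per second.
    step_lengths = np.linalg.norm(np.diff(swipe_xy, axis=0), axis=1)
    features["mean_swipe_speed"] = float(np.mean(step_lengths) * sample_rate_hz)

    # Tremor frequency: dominant frequency of the accelerometer magnitude signal.
    magnitude = np.linalg.norm(accel_xyz, axis=1)
    spectrum = np.abs(np.fft.rfft(magnitude - magnitude.mean()))
    freqs = np.fft.rfftfreq(len(magnitude), d=1.0 / sample_rate_hz)
    features["tremor_frequency_hz"] = float(freqs[np.argmax(spectrum)])

    # Motion variance: subtle shifts in device tilt, summarised across axes.
    features["motion_variance"] = float(np.var(accel_xyz, axis=0).sum())

    return features
```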

On-Device Inference

Our AI, trained on thousands of labelled events, runs entirely on your device in real time. It maps those 50+ features onto three core affect scores:

  • Valence: Sad ⇔ Happy
  • Arousal: Angry ⇔ Calm
  • Dominance: Afraid ⇔ Relaxed
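
As a simplified illustration of this mapping (not the actual on-device model), the sketch below scores a feature vector against a set of placeholder weights and squashes the result into valence, arousal and dominance values between -1 and 1.

```python
import numpy as np

AFFECT_AXES = ("valence", "arousal", "dominance")

def score_affect(feature_vector, weights, bias):
    """Map a window's feature vector onto the three affect axes.

    feature_vector: shape (n_features,), the 50+ measurements from feature extraction
    weights:        shape (3, n_features), hypothetical learned parameters
    bias:           shape (3,)
    Returns a dict of scores in [-1, 1], e.g. valence from sad (-1) to happy (+1).
    """
    raw = weights @ feature_vector + bias
    scores = np.tanh(raw)  # squash into [-1, 1]
    return dict(zip(AFFECT_AXES, scores.tolist()))

# Toy usage with random placeholder weights (the real model is trained on labelled events).
rng = np.random.default_rng(0)
n_features = 50
example = score_affect(rng.normal(size=n_features),
                       rng.normal(size=(3, n_features)) * 0.1,
                       np.zeros(3))
print(example)  # {'valence': ..., 'arousal': ..., 'dominance': ...}
```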

We never see what you're looking at or what you type; the raw signals and scores never leave your device, safeguarding your privacy while delivering instant insight.