Accessibility

For users with communication difficulties caused by cognitive, neurological, or physical conditions, biometric signals can convey what text and speech cannot. Nefesh enables AI systems to understand user state even when verbal expression is limited.

The Problem

People with autism spectrum conditions, speech impairments, PTSD, or anxiety disorders may struggle to express their emotional state verbally. Traditional AI systems that rely purely on text input miss critical context about how the user is actually feeling.

How Nefesh Helps

  1. Non-verbal state detection — heart rate, HRV, and facial expression provide emotional context without requiring the user to describe their feelings
  2. Sensory overload detection — rising stress signals can trigger calming UI changes (reduced stimulation, simpler layouts, quieter audio)
  3. Communication adaptation — AI adjusts language complexity, sentence length, and response speed based on real-time cognitive load indicators
  4. Caregiver awareness — share anonymized state summaries with caregivers or therapists (with consent) for better support
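Points 2 and 3 above can be sketched as a simple mapping from biometric signals to UI and communication adjustments. This is a minimal illustration only: the class, field names, and thresholds are hypothetical and are not part of the Nefesh API.

```python
from dataclasses import dataclass

# Hypothetical biometric snapshot; field names and thresholds are
# illustrative, not the actual Nefesh API.
@dataclass
class BiometricState:
    heart_rate: float      # beats per minute
    hrv_rmssd: float       # heart-rate variability (RMSSD, ms); lower suggests more stress
    cognitive_load: float  # normalized 0.0-1.0 estimate

def adapt_response(state: BiometricState) -> dict:
    """Map biometric signals to calming-UI and language adjustments."""
    # Sensory overload heuristic: elevated heart rate plus suppressed HRV
    overloaded = state.heart_rate > 100 and state.hrv_rmssd < 20
    high_load = state.cognitive_load > 0.7
    return {
        "reduce_stimulation": overloaded,               # simpler layout, quieter audio
        "max_sentence_words": 8 if high_load else 20,   # shorter sentences under load
        "response_delay_s": 1.5 if high_load else 0.0,  # slower pacing under load
    }

calm = adapt_response(BiometricState(heart_rate=72, hrv_rmssd=55, cognitive_load=0.2))
stressed = adapt_response(BiometricState(heart_rate=110, hrv_rmssd=15, cognitive_load=0.9))
```

In practice the thresholds would be calibrated per user against their own baseline rather than hard-coded, since resting heart rate and HRV vary widely between individuals.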

Ethical Considerations

  • Always obtain informed consent from the user (and their guardian if applicable)
  • Biometric data is never used for diagnosis — Nefesh is not a medical device
  • Users must be able to opt out at any time with immediate data deletion via the GDPR deletion API
  • All processing follows GDPR principles of data minimization and purpose limitation
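The consent and data-minimization rules above lend themselves to enforcement in code rather than policy alone. The sketch below shows one way an application might gate caregiver sharing; the function name and field allowlist are hypothetical, not part of any Nefesh API.

```python
from typing import Optional

# Coarse, non-identifying summary fields a caregiver may see.
# The allowlist is illustrative; real deployments would define it per purpose.
ALLOWED_FIELDS = {"stress_level", "period"}

def share_summary(summary: dict, consent: bool) -> Optional[dict]:
    """Return an anonymized caregiver summary, or None without informed consent."""
    if not consent:
        return None  # no consent, nothing leaves the device
    # Data minimization / purpose limitation: drop everything not allowlisted
    return {k: v for k, v in summary.items() if k in ALLOWED_FIELDS}
```

Structuring sharing as an allowlist (rather than a blocklist) means newly added biometric fields are private by default, which matches the data-minimization principle.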