Telehealth
In telehealth sessions, clinicians lose the non-verbal cues they rely on in person — body language, skin color changes, fidgeting. Nefesh restores this awareness by providing real-time patient state to the AI assistant.
The Problem
Video calls flatten emotional expression. A patient may say "I'm fine" while their heart rate is elevated and their voice is tense. Without physiological context, AI assistants in telehealth platforms miss these signals entirely.
How Nefesh Helps
- Real-time patient state — display stress level alongside the video feed (with patient consent)
- AI note-taking — the AI scribe can annotate moments of elevated stress during the session
- Session comparison — track stress trends across appointments to measure treatment progress
- Alert on acute stress — notify the clinician if the patient's stress score spikes above a threshold
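The alerting behavior described above can be sketched as a simple threshold check with a cooldown so the clinician is not re-notified for every sample during a sustained spike. This is an illustrative sketch only: the `StressSample` type, the 0–1 score scale, and the function names are assumptions, not the actual Nefesh API.

```python
# Hypothetical sketch of an acute-stress alert. Assumes a Nefesh-style
# client streams stress scores normalized to [0, 1]; all names here are
# illustrative, not the real Nefesh data model.
from dataclasses import dataclass

@dataclass
class StressSample:
    timestamp: float  # seconds since session start
    score: float      # 0.0 (calm) .. 1.0 (acute stress)

def acute_stress_alerts(samples, threshold=0.8, cooldown=60.0):
    """Yield samples whose score crosses the threshold, suppressing
    repeats within a cooldown window to avoid alert fatigue."""
    last_alert = None
    for s in samples:
        if s.score >= threshold and (
            last_alert is None or s.timestamp - last_alert >= cooldown
        ):
            last_alert = s.timestamp
            yield s

# Example session: a spike at t=30 that persists through t=40, then
# another elevated reading after the cooldown has elapsed.
session = [
    StressSample(10, 0.4),
    StressSample(30, 0.85),
    StressSample(40, 0.9),   # suppressed: within 60 s of the t=30 alert
    StressSample(120, 0.82),
]
alerts = list(acute_stress_alerts(session))
# → alerts at t=30 and t=120
```

The cooldown is a design choice, not a Nefesh feature claim: without it, a patient who stays above the threshold would trigger an alert on every sample.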
Signal Sources
- Camera rPPG — heart rate from the existing video feed (no additional hardware)
- Wearable — patient's Apple Watch or Polar H10 for clinical-grade HRV
- Voice — tone classification from the audio stream
- Text — sentiment analysis of patient messages in chat
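One way to picture how these sources combine is a single patient-state snapshot in which each signal is optional (a session may have camera and voice but no wearable, for example). The field names, the normalization ranges, and the naive averaging below are assumptions for illustration, not the Nefesh data model or fusion algorithm.

```python
# Illustrative fusion of the signal sources listed above into one
# snapshot. Field names, ranges, and the averaging are assumptions,
# not the actual Nefesh schema.
from dataclasses import dataclass
from typing import Optional

@dataclass
class PatientState:
    heart_rate_bpm: Optional[float] = None   # camera rPPG or wearable
    hrv_rmssd_ms: Optional[float] = None     # wearable only (e.g. Polar H10)
    voice_tension: Optional[float] = None    # 0..1 from the audio stream
    text_sentiment: Optional[float] = None   # -1 (negative) .. 1 (positive)

def stress_estimate(state: PatientState) -> Optional[float]:
    """Naive 0..1 stress estimate: average whichever normalized
    signals are present; return None if no source is available."""
    parts = []
    if state.heart_rate_bpm is not None:
        # Map 60-120 bpm onto 0..1, clamped at both ends.
        parts.append(min(max((state.heart_rate_bpm - 60) / 60, 0.0), 1.0))
    if state.voice_tension is not None:
        parts.append(state.voice_tension)
    if state.text_sentiment is not None:
        # Map sentiment -1..1 onto stress 1..0.
        parts.append((1.0 - state.text_sentiment) / 2)
    return sum(parts) / len(parts) if parts else None

# A session with camera rPPG and voice, but no wearable or chat:
snapshot = PatientState(heart_rate_bpm=90, voice_tension=0.6)
# → stress_estimate(snapshot) == 0.55
```

Treating every signal as optional mirrors the list above: camera rPPG needs no extra hardware, while the wearable and chat channels only contribute when the patient has them connected.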
Important
Nefesh is not a medical device. It provides body signals as AI context only. It does not provide clinical interpretation, diagnosis, or medical recommendations. See Regulatory for details.