While SensorAble research commenced with my MPhil/PhD enrollment on 3 June 2019, this marks the first official blog post for my project. In case you missed the primary reasoning and rationale behind SensorAble, here’s a quick primer on what the research is all about.
SensorAble (\ˈsen-sȯr-əbəl or, alternatively, \sen-ˈsȯr-əbəl) is a multi-disciplinary MPhil/PhD research project. It fills a gap in scholarly knowledge around the exploration, design, application and testing of purpose-built wearable technologies that employ artificial intelligence for cognitive enhancement in those diagnosed with autism: increasing their attentional focus and quality of life by de-emphasizing anxiety-inducing environmental distractions and over-stimulation.
SensorAble aims to distinguish itself from prior cognate studies in three respects. Prior research uses sensory technology that “interprets socioaffective cues such as tone of voice or facial expression to systematize and understand social interaction” (el Kaliouby & Picard, 2006, p. 243). In contrast, SensorAble supports users through tuned awareness and customized intervention, issuing alerts prior to the onset of anxiety, distraction or loss of focus by (a rough sketch of this loop follows the list):
- monitoring physiological responses and environmental disruptions and comparing them to a known and growing catalogue of individually learned distractions (e.g., those visual and aural sensations that create anxiety unique to each user);
- adjusting the user experience by diminishing or eliminating visual disturbances, head sway (detected through pupillary observation or inertial monitoring) and offending sounds (through noise-cancelling, spatial focusing and frequency/amplitude corrections); and,
- providing notification or anticipatory feedback through vibration and alerts (haptics) that aim to reduce or eliminate anxiety before onset.
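To make that loop concrete, here is a minimal Python sketch of the sense-compare-intervene cycle the list describes. Everything in it (the feature names, the toy catalogue, the distance threshold, the helper functions) is a placeholder I have invented for illustration, not SensorAble's eventual implementation.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Sample:
    """One time-step of fused sensor readings (illustrative features only)."""
    heart_rate: float      # physiological response, beats per minute
    sound_level_db: float  # ambient loudness
    head_sway: float       # inertial estimate of head movement, 0..1

# Hypothetical per-user catalogue of learned distraction signatures,
# each paired with the intervention that previously helped this user.
CATALOGUE = [
    ((95.0, 80.0, 0.6), "noise_cancelling"),  # loud aural distraction
    ((100.0, 55.0, 0.9), "haptic_alert"),     # visual/vestibular distraction
]

def distance(a: tuple, b: tuple) -> float:
    """Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def choose_intervention(sample: Sample, threshold: float = 10.0) -> Optional[str]:
    """Compare the current sample to every learned signature and return
    the intervention for the closest match, or None if nothing is near."""
    features = (sample.heart_rate, sample.sound_level_db, sample.head_sway)
    signature, intervention = min(CATALOGUE, key=lambda e: distance(features, e[0]))
    return intervention if distance(features, signature) < threshold else None

# A reading resembling the first catalogue entry triggers noise-cancelling
# before the anxiety response fully develops.
print(choose_intervention(Sample(heart_rate=97.0, sound_level_db=78.0, head_sway=0.55)))
```

In practice, of course, the catalogue would grow as the system learns each user's individual triggers, and the matching would need to be far more sophisticated than a simple distance threshold.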
el Kaliouby, R., & Picard, R. W. (2006). Affective Computing and Autism. Annals of the New York Academy of Sciences, 1093, 228–248.