Desktop software applications are built with traditional user interfaces in mind. Unlike modern smartphones and tablets, where touch and other haptic interaction modalities are primary, desktop user interfaces rely heavily on keystrokes and mouse clicks. Workplace data indicate that a leading cause of repetitive strain injuries, and the lost workdays that follow, is carpal tunnel syndrome arising from keyboard and mouse usage. The focus of this study is to demonstrate a high-level user interface layer that coordinates a multiplicity of modalities and can drastically reduce the number of mouse clicks and keystrokes required. The modalities are implemented bottom-up, and the algorithms involved are detailed. The user interface layer can be connected to any application through a simple API, so it can extend the accessibility of any existing software.
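To make the idea of a modality-coordinating layer behind a simple API concrete, the following is a minimal sketch, assuming a publish/subscribe design in which modality back-ends (speech recognizer, blink detector, nose-tip tracker) emit named events and the application registers callbacks. All class, method, and event names here are illustrative assumptions, not the actual API presented in this paper.

```python
from typing import Callable, Dict, List


class MultimodalUILayer:
    """Hypothetical coordinating layer: receives events from several
    input modalities and forwards them to registered application handlers."""

    def __init__(self) -> None:
        # Maps an event name to the list of application callbacks for it.
        self._handlers: Dict[str, List[Callable[[str], None]]] = {}

    def register(self, event: str, handler: Callable[[str], None]) -> None:
        # The application wires its own actions to modality events,
        # e.g. register("voice:open", on_open).
        self._handlers.setdefault(event, []).append(handler)

    def dispatch(self, event: str, payload: str = "") -> int:
        # Called by a modality back-end when it detects an event;
        # returns how many application handlers were invoked.
        handlers = self._handlers.get(event, [])
        for handler in handlers:
            handler(payload)
        return len(handlers)


# Usage: voice and eye-blink events drive application actions
# without any keystrokes or mouse clicks.
log: List[str] = []
ui = MultimodalUILayer()
ui.register("voice:open", lambda text: log.append(f"open {text}"))
ui.register("blink:double", lambda _: log.append("click"))

ui.dispatch("voice:open", "report.txt")
ui.dispatch("blink:double")
```

Because the layer only exposes `register` and `dispatch` in this sketch, an existing application could adopt it by mapping events onto handlers it already has, which is one plausible reading of how such a layer extends accessibility without modifying the host software.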
Keywords
Voice-actuated events, phonemes, speech synthesis, classifier, bright-spot detection, nose-tip tracking, eye-blink detection.