Face Direction-based Human-Computer Interface using Image Observation and EMG Signal for The Disabled
This paper proposes a method to estimate face direction by combining image observation with EMG signals from the neck (sternocleidomastoid) muscles.
The EMG signal amplitude naturally increases with the head rotation angle. In this study, when the estimated rotation angle was large (greater than 90 degrees), only the EMG values were used to determine face direction; when it was small (less than 30 degrees), only the image observation was used. In the middle range (30-90 degrees), where the relationship between EMG amplitude and angle is approximately linear, a weighted hybrid function combined the two estimates.
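The three-band fusion rule above can be sketched as follows. The 30- and 90-degree thresholds come from the notes; the linear weighting inside the middle band is an assumed form, since the paper's exact hybrid function is not given here.

```python
def fuse_direction(theta_img, theta_emg):
    """Fuse image-based and EMG-based face-direction estimates (degrees).

    Bands follow the paper's thresholds; the linear blend in the
    30-90 degree band is an assumption, not the paper's exact formula.
    """
    a = abs(theta_emg)
    if a < 30:
        return theta_img              # small rotation: trust the camera only
    if a > 90:
        return theta_emg              # large rotation: trust the EMG only
    w = (a - 30) / 60.0               # 0 at 30 deg, rising to 1 at 90 deg
    return (1 - w) * theta_img + w * theta_emg
```

For example, at an EMG estimate of 60 degrees the weight is 0.5, so the two estimates are averaged.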
In order to detect the face and pupils, the image was first converted from the RGB to the HSI color space, separating the brightness (intensity) component from the chromatic components.
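A minimal sketch of the standard RGB-to-HSI conversion (the textbook geometric formula; the paper does not specify which HSI variant it uses):

```python
import math

def rgb_to_hsi(r, g, b):
    """Convert 8-bit RGB to (hue in radians, saturation, intensity),
    using the common geometric HSI formulation."""
    r, g, b = r / 255.0, g / 255.0, b / 255.0
    i = (r + g + b) / 3.0                          # intensity (brightness)
    s = 0.0 if i == 0 else 1.0 - min(r, g, b) / i  # saturation
    num = 0.5 * ((r - g) + (r - b))
    den = math.sqrt((r - g) ** 2 + (r - b) * (g - b))
    h = math.acos(max(-1.0, min(1.0, num / den))) if den else 0.0
    if b > g:                                      # hue lies in the lower half-circle
        h = 2 * math.pi - h
    return h, s, i
```

White maps to intensity 1 with zero saturation, and pure red maps to hue 0, which is a quick sanity check on the formula.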
Using a Gaussian distribution of skin color, the user's skin region can be separated from the background.
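One common way to apply such a model is to threshold the Mahalanobis distance of each pixel's hue and saturation under the fitted Gaussian. The mean, inverse covariance, and threshold below are placeholder values for illustration; a real system would train them from labeled skin samples.

```python
# Placeholder hue/saturation statistics for skin pixels (assumed values,
# not from the paper); real values come from training data.
SKIN_MEAN = (0.35, 0.45)
SKIN_COV_INV = ((25.0, 0.0), (0.0, 16.0))

def is_skin(h, s, thresh=6.0):
    """Classify a pixel as skin if its squared Mahalanobis distance
    under the 2-D Gaussian skin-color model is below a threshold."""
    dh, ds = h - SKIN_MEAN[0], s - SKIN_MEAN[1]
    d2 = (SKIN_COV_INV[0][0] * dh * dh
          + 2 * SKIN_COV_INV[0][1] * dh * ds
          + SKIN_COV_INV[1][1] * ds * ds)
    return d2 < thresh
```

Applying this test per pixel yields a binary skin mask for the subsequent morphological processing.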
Labeling, erosion (shrinking), and dilation are then applied to the skin region to locate features such as the eyes, nose, and eyebrows.
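The erosion and dilation steps are standard 3x3 binary morphology; a minimal pure-Python sketch (the paper does not state its structuring element, so the 3x3 square is an assumption):

```python
def erode(mask):
    """3x3 binary erosion: a pixel survives only if its whole
    neighborhood is set. Removes small noise specks in the skin mask."""
    h, w = len(mask), len(mask[0])
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y][x] = int(all(mask[y + dy][x + dx]
                                for dy in (-1, 0, 1) for dx in (-1, 0, 1)))
    return out

def dilate(mask):
    """3x3 binary dilation: a pixel is set if any neighbor is set.
    Restores region size and fills small holes after erosion."""
    h, w = len(mask), len(mask[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            out[y][x] = int(any(mask[y + dy][x + dx]
                                for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                                if 0 <= y + dy < h and 0 <= x + dx < w))
    return out
```

Running erosion then dilation (an opening) suppresses isolated noise while roughly preserving the large skin blob.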
The labeled regions are then searched to identify the predefined facial features (nose, mouth, eyes, eyebrows).
A separability filter is used to find high contrast between an inner region (region 1) and a surrounding region (region 2) in order to locate the pupils.
Figure: pupil candidates before and after applying the separability filter.
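The separability filter (in the style of Fukui's formulation, which this paper's two-region description suggests) scores how well two regions' intensities separate: the ratio of between-class variance to total variance, close to 1 when a dark pupil sits inside a bright surround. A sketch over flat lists of pixel intensities:

```python
def separability(region1, region2):
    """Separability eta = between-class variance / total variance of
    pixel intensities in two regions; near 1 means high contrast
    (e.g. dark pupil interior vs. bright surrounding ring)."""
    n1, n2 = len(region1), len(region2)
    all_px = region1 + region2
    n = n1 + n2
    m = sum(all_px) / n                 # overall mean
    m1 = sum(region1) / n1              # region means
    m2 = sum(region2) / n2
    sb = n1 * (m1 - m) ** 2 + n2 * (m2 - m) ** 2   # between-class variance
    st = sum((p - m) ** 2 for p in all_px)         # total variance
    return sb / st if st else 0.0
```

Candidate pupil locations are then ranked by this score: perfectly separated regions give 1.0, identical regions give 0.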
Reference: I. Moon, K. Kim, J. Ryu, and M. Mun, "Face Direction-based Human-Computer Interface using Image Observation and EMG Signal for The Disabled," in Proc. IEEE International Conference on Robotics and Automation (ICRA '03), 14-19 Sept. 2003, pp. 1515-1520 vol. 1.
Author affiliation: Korea Orthopedics & Rehabilitation Engineering Center, Incheon, South Korea.