Accurate eye-state detection is a fundamental stage in many activities that require human-machine interaction (HMI). This chapter presents an analysis of the implementation of a system for navigating an automated wheelchair (CRA) based on facial expressions, particularly closed eyes, using a Haar cascade classifier (HCC). Aimed at people with locomotor disabilities of the upper and lower limbs, the detection pipeline is organized into two stages: image capture, which handles eye-state detection and image optimization, and chair actuation, which interprets the captured data and sends the corresponding command to the chair. The results show that the model identifies closed eyes with excellent accuracy and robust performance, handling occlusion and lighting variation well, with approximately 98% accuracy. Applying the model in simulation opens the opportunity to integrate it with the chair's onboard sensors, aiming at safe and efficient navigation for the user.
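
To illustrate the capture-and-actuation pipeline described above, the following minimal sketch uses OpenCV's stock Haar cascades (haarcascade_frontalface_default.xml and haarcascade_eye.xml) to flag sustained eye closure from a webcam feed. The cascade files, the frame-count threshold, and the send_chair_command() placeholder are illustrative assumptions, not the chapter's actual implementation.

# Minimal sketch: eye-closure detection with OpenCV Haar cascades.
# The cascades, threshold, and chair-command interface below are assumptions.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

CLOSED_FRAMES_THRESHOLD = 15  # roughly 0.5 s at 30 fps before a command fires

def send_chair_command(command: str) -> None:
    """Hypothetical interface to the wheelchair controller."""
    print(f"chair command: {command}")

def eyes_closed(gray_frame) -> bool:
    """Return True when a face is detected but no open eyes are found in it."""
    faces = face_cascade.detectMultiScale(gray_frame, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        # Search for eyes only in the upper half of the detected face region.
        roi = gray_frame[y:y + h // 2, x:x + w]
        eyes = eye_cascade.detectMultiScale(roi, scaleFactor=1.1, minNeighbors=5)
        return len(eyes) == 0
    return False  # no face found: do not treat this as a deliberate closure

cap = cv2.VideoCapture(0)
closed_count = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.equalizeHist(gray)  # simple image-optimization step
    closed_count = closed_count + 1 if eyes_closed(gray) else 0
    if closed_count >= CLOSED_FRAMES_THRESHOLD:
        send_chair_command("stop")  # example mapping: sustained closure -> stop
        closed_count = 0
cap.release()

In practice, requiring several consecutive closed-eye frames before issuing a command is one way to distinguish a deliberate expression from an ordinary blink; the exact mapping of eye states to chair actions would depend on the chapter's control scheme.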
Part of the book: Robot Control