Driver Alertness Monitoring Using Fusion of Facial Features and Bio-Signals
By Boon-Giin Lee and Wan-Young Chung
NOTE: This is an overview of the entire article, which appeared in the July 2012 issue of the IEEE Sensors Journal.
This article begins with an extensive review of the literature on incorporating sensors into transport systems to monitor driver alertness. The fatigue monitoring system developed by the authors, which incorporates elements of prior work, is implemented on an Android-based smartphone. A webcam placed on the dashboard in front of the driver captures images of the driver's face, and a finger oximeter sensor installed on the steering wheel collects information about blood flow in the finger. The smartphone receives both signals wirelessly from the sensors. A dynamic Bayesian network (DBN) programmed in the smartphone performs a statistical analysis of the driver's fatigue level based on the information extracted from these two sensor sources.
The article describes the characteristics of the sensors and the processing performed on the sensor data. Considerable initial processing is needed to extract the signal parameters used for assessing driver fatigue. For instance, to determine eyelid closure from the webcam images, the location of the eyes in each image must first be found. Several kinds of analysis were employed to extract parameters from the oximeter signals. The parameterized information from the separate sensors was then combined in the downstream processing.
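As a rough illustration (not the authors' implementation), the two eye-based measures the paper derives from the webcam can be computed once each frame has been classified as eyes-open or eyes-closed; the frame rate and window length below are illustrative assumptions.

```python
# Illustrative sketch (not the authors' code): compute percent eye closure
# and average eye-closure duration from a per-frame eye-closed flag.
# Assumption: a 10 fps webcam; the window length is whatever span of
# flags is passed in.

FPS = 10  # assumed webcam frame rate

def percent_eye_closure(closed_flags):
    """Fraction of frames in the window during which the eyes are closed."""
    return sum(closed_flags) / len(closed_flags)

def average_eye_closure(closed_flags, fps=FPS):
    """Mean duration, in seconds, of each contiguous eye-closure episode."""
    durations, run = [], 0
    for closed in closed_flags:
        if closed:
            run += 1
        elif run:
            durations.append(run / fps)
            run = 0
    if run:  # window ended mid-closure
        durations.append(run / fps)
    return sum(durations) / len(durations) if durations else 0.0

# Example: a 1-second window at 10 fps containing two closure episodes
flags = [0, 1, 1, 0, 0, 0, 1, 1, 1, 0]
print(percent_eye_closure(flags))   # 0.5
print(average_eye_closure(flags))   # (0.2 s + 0.3 s) / 2 = 0.25
```

In practice the per-frame flag would come from the eye-localization step described above; here it is simply given as input.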
Various combinations of derived parameters were investigated to find the optimum indicator of driver fatigue. The four parameters finally selected for the study were percent eye closure, average eye closure, heart rate variability, and power spectrum density; the first two were derived from analysis of the webcam signal and the last two from the oximeter signal. The article describes these parameters and gives literature references. As mentioned earlier, a dynamic Bayesian network algorithm performs a statistical analysis of these parameters to determine the driver's fatigue level.
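The paper's DBN itself is not reproduced here, but the flavor of the inference can be sketched as a two-state recursive Bayesian (forward) filter: a hidden fatigue state evolves over time, and each time step is updated with the likelihood of a discretized sensor observation. All probabilities below are made-up placeholders, not values from the paper, and a single binned feature stands in for the four-parameter observation vector.

```python
# Simplified stand-in for the paper's dynamic Bayesian network: a two-state
# (alert/fatigued) forward filter. Transition and observation probabilities
# are illustrative placeholders, not values from the paper.

STATES = ("alert", "fatigued")

# P(state_t | state_{t-1}) -- assumed transition model
TRANS = {
    "alert":    {"alert": 0.9, "fatigued": 0.1},
    "fatigued": {"alert": 0.2, "fatigued": 0.8},
}

# P(observation | state) for one discretized feature
# (e.g. percent eye closure binned as "low"/"high") -- assumed likelihoods
OBS = {
    "alert":    {"low": 0.8, "high": 0.2},
    "fatigued": {"low": 0.3, "high": 0.7},
}

def forward_step(belief, observation):
    """One filtering step: predict with TRANS, then correct with OBS."""
    predicted = {
        s: sum(belief[p] * TRANS[p][s] for p in STATES) for s in STATES
    }
    unnorm = {s: predicted[s] * OBS[s][observation] for s in STATES}
    z = sum(unnorm.values())
    return {s: unnorm[s] / z for s in STATES}

# Start fully alert, then observe a run of high eye-closure readings;
# sustained evidence shifts the belief toward "fatigued".
belief = {"alert": 1.0, "fatigued": 0.0}
for obs in ["high", "high", "high"]:
    belief = forward_step(belief, obs)
print(belief)
```

The actual system fuses all four parameters and may use richer state and observation models; this sketch only shows how a DBN turns a stream of noisy per-window features into a running fatigue estimate.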
To test the effectiveness of the system, 10 volunteers were recruited to complete multiple 40-50 minute test drives at various times of day and night. An experienced driver accompanied each test driver as a backup and recorded the test driver's physical condition. Using the four-parameter method, the paper reports a true fatigue detection rate of 96% and a false detection rate of 8%. The authors plan to extend the system to include external factors in the fatigue detection algorithm.
ABOUT THE AUTHORS
Boon-Giin Lee received the B.I.T. degree in information technology from Multimedia University, Melaka, Malaysia, in 2007, and the Master's degree in engineering from Dongseo University, Busan, South Korea, in 2009. He is currently pursuing the Ph.D. degree in engineering at Pukyong National University, Busan, where he is with the Department of Electronic Engineering. He has published several papers in SCI and SCI(E) journals. His current research interests include wireless sensor networks, computer networks, graphics modeling, ubiquitous healthcare signal processing, location tracking, smartphone programming, and application development.
Wan-Young Chung received the B.Eng. and Master's degrees in electronic engineering from Kyungpook National University, Daegu, South Korea, in 1987 and 1989, respectively, and the Ph.D. degree in sensor engineering from Kyushu University, Fukuoka, Japan, in 1998. He was an Associate Professor with Dongseo University, Busan, South Korea, from 1999 to 2008. He is currently a Professor with the Department of Electronic Engineering, Pukyong National University, Busan. His current research interests include wireless sensor networks, ubiquitous healthcare and automobile applications, smart LED lighting with visible light communication, and embedded systems.