
HMM-Based Human Fall Detection and Prediction Method Using Tri-Axial Accelerometer

By Lina Tong, Quanjun Song, Yunjian Ge, and Ming Liu

NOTE: This is an overview of the entire article, which appeared in the May 2013 issue of the IEEE Sensors Journal.

Falls among the elderly have long been a serious medical and social problem. To detect and predict falls, the authors propose a hidden Markov model (HMM)-based method using tri-axial acceleration data of the human body. Previous studies have relied on detection devices including accelerometers, position detectors, and gyroscopes; here, the authors chose a wearable upper-trunk motion detection device built around a tri-axial accelerometer. Acceleration time series (ATS) recorded by this device during deliberate fall events performed by healthy volunteers were used as training examples for the HMM, so that falls could be detected and predicted.
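The resultant acceleration plotted in the article's data examples is the magnitude of the tri-axial acceleration vector. A minimal per-sample sketch (illustrative code, not the authors' implementation):

```python
import math

def resultant_acceleration(ax, ay, az):
    """Magnitude of the tri-axial acceleration vector, in g."""
    return math.sqrt(ax ** 2 + ay ** 2 + az ** 2)

# When the wearer is stationary only gravity acts on the sensor, so the
# resultant stays near 1 g regardless of orientation; during free fall it
# drops toward 0 g, and the impact with the floor produces a sharp spike.
print(resultant_acceleration(0.0, 0.0, 1.0))  # stationary: 1.0
```

Working with the resultant rather than the raw axes makes the signal largely insensitive to how the upper-trunk device happens to be oriented.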

The article reports on the design of the wearable accelerometers, their placement on the volunteers, and the resulting information obtained. The feature extraction, HMM recognition, and the fall prediction and detection algorithm are described.
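The HMM recognition step can be illustrated with a generic sketch (not the authors' implementation; the two states, the quantized observation symbols, and all probabilities below are hypothetical). A discrete HMM scores an observation sequence with the standard forward algorithm:

```python
import math

def forward_log_likelihood(obs, start, trans, emit):
    """Log-likelihood of a discrete observation sequence under an HMM,
    computed with the standard forward algorithm (linear-space alphas
    are fine for the short sequences used here)."""
    n = len(start)
    alpha = [start[s] * emit[s][obs[0]] for s in range(n)]
    for o in obs[1:]:
        alpha = [sum(alpha[p] * trans[p][s] for p in range(n)) * emit[s][o]
                 for s in range(n)]
    return math.log(sum(alpha))

# Hypothetical 2-state model (0 = normal activity, 1 = falling) over
# quantized resultant-acceleration symbols:
#   0 = near 1 g, 1 = near 0 g (weightless phase), 2 = impact spike.
start = [0.9, 0.1]
trans = [[0.8, 0.2],
         [0.3, 0.7]]
emit  = [[0.8, 0.1, 0.1],   # normal activity mostly emits "near 1 g"
         [0.1, 0.5, 0.4]]   # falling emits weightlessness and impact

fall_like_sequence = [0, 1, 1, 2]
print(forward_log_likelihood(fall_like_sequence, start, trans, emit))
```

In practice one model would be trained per motion class on the recorded ATS features, and an incoming window would be assigned to the class whose model yields the highest log-likelihood; a fall model scoring highly before the impact symbol arrives is what would allow prediction rather than mere detection.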

Data example (resultant acceleration) of a frontward fall

Using this hardware and software, the authors acquired data from 80 fall events (such as the one shown in the figure above) and a number of non-fall motions (sitting, standing, walking, and squatting) in order to train the HMM and to test the effectiveness of the resulting models.

The authors report that their experimental results show that the proposed algorithm can predict falls 200-400 ms before the person collides with the floor, and can distinguish fall events from other activities of daily life with 100% sensitivity and 100% specificity. They indicate that their next step is to test the method on a large real-world sample of elderly people.
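The reported figures follow the standard definitions of sensitivity and specificity. A minimal sketch (the non-fall trial count of 100 below is hypothetical, since the overview does not give it):

```python
def sensitivity(tp, fn):
    """Fraction of actual falls that were detected: TP / (TP + FN)."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Fraction of non-fall activities correctly ignored: TN / (TN + FP)."""
    return tn / (tn + fp)

# The reported result corresponds to all 80 falls being detected with no
# false alarms on the non-fall motions (non-fall count hypothetical):
print(sensitivity(80, 0), specificity(100, 0))  # 1.0 1.0
```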

 

ABOUT THE AUTHORS

Lina Tong received the B.S. degree in automation and the Ph.D. degree in control science and engineering from the University of Science and Technology of China, Hefei, China, in 2005 and 2011, respectively. She is currently a Postdoctoral Research Associate with the State Key Laboratory of Management and Control for Complex Systems, Institute of Automation, Chinese Academy of Sciences, Hefei. Her major field is robotics and human-machine interaction.

Quanjun Song received the B.S. degree from Anhui University, China, in 1994, the M.S. degree in electrical engineering from the Chinese Academy of Sciences in 2004, and the Ph.D. degree from the University of Science and Technology of China, Hefei, China, in 2007. He is currently an Associate Professor with the Robot Sensor Laboratory, Institute of Intelligent Machines, Chinese Academy of Sciences, Hefei. His current research interests include intelligent robots, biomechanics, and physical human-robot interaction.

Yunjian Ge received the Ph.D. degree from the Institut National des Sciences Appliquées de Lyon, Lyon, France, in 1989. He is currently a Professor with the Institute of Intelligent Machines, Chinese Academy of Sciences, Hefei, China. His current research interests include functional materials, robotics, and bionic perception.

Ming Liu (M’93) received the M.Sc. degree in control systems from Northeastern University, Shenyang, China, in 1982, and the Ph.D. degree in robotics and control from the University of Wollongong, Australia, in 1991. He was a Research Associate with the Department of Mechanical and Manufacturing Engineering, Melbourne University, Melbourne, Australia, in 1990, and has been with the Department of Electrical and Computer Systems Engineering, Monash University, Melbourne, as a Lecturer and Senior Lecturer, since 1992. He is currently with the Institute of Intelligent Machines, Chinese Academy of Sciences, Hefei, China, as a Research Scientist. His current research interests include nonlinear systems, control, robotics, UAVs, motion and attitude estimation, and real-time systems.