Detection of Human Fall Using Floor Vibration and Artificial Neural Network

  • Muhamad Asrafie bin Alias
  • NH Abdul Ghafar
Keywords: Elderly, Falling Detection, Floor Vibration, Neural Network


Rising wellbeing and improved life expectancy have led to rapid growth of the world's older population. With advancing age, older individuals become more likely to suffer falls, with adverse consequences including chronic illnesses that threaten health and life. Many systems are therefore used to detect fall events on the floor, but they produce high false-alarm rates and raise privacy concerns, since users feel uncomfortable being monitored by camera devices. Accurate fall detection would minimize the effects of falls experienced by older people, ultimately improving health outcomes and decreasing post-fall healthcare costs. To accomplish this goal, this study proposes a new approach that applies an artificial neural network to acceleration signals measured from floor vibration to detect human falls. Four types of event were tested at the Jamilus Research Centre, FKAAB: dummy falling (26 kg), free jumping (80 kg), sitting (80 kg) and walking (80 kg). A feature vector of variance, peak value and mean value was extracted from the average of all eight sensors and used to train the dataset with a neural network algorithm in MATLAB. The results showed that the test dataset could accurately distinguish actual human falls from other human activities, with an accuracy of 96.6%, specificity of 96.5% and sensitivity of 97.1%. The proposed artificial neural network therefore demonstrates high accuracy and confidence in its predictions.
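The feature-extraction step described above (averaging the eight sensor channels, then taking variance, peak value and mean value as the feature vector) can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the function name `extract_features`, the window shape, and the synthetic signal are all assumptions introduced here.

```python
import numpy as np

def extract_features(window: np.ndarray) -> np.ndarray:
    """Hypothetical sketch of the paper's feature extraction.

    window: array of shape (8, n_samples) — one event recorded by
    eight floor-vibration accelerometers.
    Returns the 3-element feature vector [variance, peak, mean].
    """
    avg = window.mean(axis=0)          # average the eight sensor channels
    return np.array([
        avg.var(),                     # variance of the averaged signal
        np.abs(avg).max(),             # peak (maximum absolute) value
        avg.mean(),                    # mean value
    ])

# Example: a synthetic decaying 40 Hz burst standing in for a fall impact,
# copied across eight channels with a little sensor noise added.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 500)
burst = np.exp(-10.0 * t) * np.sin(2.0 * np.pi * 40.0 * t)
window = burst + 0.01 * rng.standard_normal((8, 500))

features = extract_features(window)    # [variance, peak, mean]
```

In the study, feature vectors of this form for the four event classes were then used to train and test a neural network classifier in MATLAB.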

How to Cite
Alias, M. A., & Abdul Ghafar, N. (2021). Detection of Human Fall Using Floor Vibration and Artificial Neural Network. Recent Trends in Civil Engineering and Built Environment, 2(1), 372-380. Retrieved from