Human Activity Recognition (HAR) using smartphone inertial measurement unit (IMU) sensors has emerged as a transformative technology for health monitoring, fitness tracking, and context-aware computing. However, existing HAR research is constrained by limited data availability, short recording durations, and single-limb sensing configurations. This study addresses these limitations through three principal contributions: (1) introduction of a novel open-access multi-limb HAR dataset featuring synchronized 60-second recordings from hand and ankle positions using tri-axial accelerometer, gyroscope, and magnetometer sensors, publicly available via Mendeley Data repository; (2) systematic benchmarking of classical machine learning classifiers including Random Forest, XGBoost, and Linear Support Vector Classifier under realistic multi-sensor fusion conditions; and (3) comprehensive analysis of model robustness across varying windowing configurations. The dataset comprises recordings from six participants performing six daily activities (walking, stair ascent, stair descent, standing, sitting, lying), totaling over 72 minutes of synchronized multi-sensor data. Experimental evaluation demonstrates that Random Forest achieves superior classification accuracy of 96.13%, significantly outperforming XGBoost (85.22%) and LinearSVC (58.54%). The publicly released dataset and benchmarking results provide a valuable resource for the HAR research community, enabling reproducible experimentation and facilitating advancement in multi-limb activity recognition systems.
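The benchmarking pipeline summarized above (window segmentation of multi-channel IMU streams, feature extraction, then classifier comparison) can be sketched as below. This is a minimal illustration on synthetic data, not the paper's implementation: the sampling rate, window length, feature set, and classifier hyperparameters are all assumptions for demonstration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import LinearSVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Synthetic stand-in for the 9-channel stream (tri-axial accel + gyro + mag).
FS = 50            # sampling rate in Hz (assumed, not from the paper)
WIN = 2 * FS       # 2-second windows (assumed)
N_CHANNELS = 9

def make_activity_windows(label, n_windows):
    """Generate windows whose per-channel statistics depend on the activity label."""
    return rng.normal(loc=label, scale=0.5 + 0.1 * label,
                      size=(n_windows, WIN, N_CHANNELS))

def extract_features(windows):
    """Per-window, per-channel mean and std: a common HAR baseline feature set."""
    return np.concatenate([windows.mean(axis=1), windows.std(axis=1)], axis=1)

# Six activity classes, as in the dataset described above.
X_parts, y_parts = [], []
for label in range(6):
    w = make_activity_windows(label, 80)
    X_parts.append(extract_features(w))
    y_parts.append(np.full(80, label))
X, y = np.vstack(X_parts), np.concatenate(y_parts)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)

# Compare two of the benchmarked classifier families.
scores = {}
for name, clf in [("RandomForest", RandomForestClassifier(random_state=0)),
                  ("LinearSVC", LinearSVC(max_iter=5000))]:
    clf.fit(X_tr, y_tr)
    scores[name] = accuracy_score(y_te, clf.predict(X_te))
print(scores)
```

On real data the feature set is typically much richer (time- and frequency-domain statistics per window), and window length/overlap are tuned, which is the robustness analysis the abstract refers to.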
SUBMITTED: 18 January 2026
ACCEPTED: 19 February 2026
PUBLISHED: 20 April 2026
SUBMITTED to ACCEPTED: 32 days
DOI: https://doi.org/10.53623/amms.v2i2.999