This whole project was built in 12 hours by my friend Yuzhe Yang and me at our university's 2020 Hackathon. It consists of three parts:
1. A snippet of MATLAB code for mobile devices that recognises the user's motion in real time at a fraction of the usual resource cost (see the prediction sketch after this list).
2. A homemade dataset of human actions recorded by ourselves, covering walking, running, sitting, standing, and dancing, divided into a training set and a validation set (a minimal split sketch also follows the list).
3. A .mat file containing the weights of our model, a pre-trained gated bag of decision trees, which is used to make predictions.
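To show how the pieces above fit together, here is a minimal sketch of the real-time loop, assuming the saved model answers MATLAB's standard `predict` call. The file name `model.mat`, the `readSensorSample` and `extractFeatures` helpers, the 50 Hz sampling rate, and the 2-second window are all illustrative assumptions, not the repository's exact identifiers.

```matlab
% Minimal real-time recognition sketch. Assumptions: the weights file is
% named model.mat, the model supports MATLAB's standard predict() call,
% and readSensorSample/extractFeatures are hypothetical helpers.
load('model.mat', 'model');        % pre-trained gated bag of decision trees

fs  = 50;                          % assumed sampling rate (Hz)
win = 2 * fs;                      % 2-second sliding window
buf = zeros(win, 9);               % accel (3) + gyro (3) + magnetometer (3)

while true
    sample = readSensorSample();          % hypothetical device reader, 1x9
    buf    = [buf(2:end, :); sample];     % slide the window by one sample
    feats  = extractFeatures(buf);        % hypothetical feature extractor
    label  = predict(model, feats);       % 'walking', 'running', ...
    fprintf('current activity: %s\n', string(label));
end
```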
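And here is a minimal sketch of the training/validation split mentioned in item 2; the 80/20 ratio and the `X`, `y` variable names are assumptions, not the repository's actual settings.

```matlab
% Minimal train/validation split sketch. The 80/20 ratio and the X, y
% names are illustrative assumptions.
rng(0);                          % make the shuffle reproducible
n   = size(X, 1);                % one feature row per recorded window
idx = randperm(n);               % random permutation of the rows
cut = round(0.8 * n);            % assumed 80/20 split point
Xtrain = X(idx(1:cut), :);      ytrain = y(idx(1:cut));
Xval   = X(idx(cut+1:end), :);  yval   = y(idx(cut+1:end));
```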
We found the features from the magnetic sensors to be the most salient, and our model achieves a respectable accuracy of 92.7%.
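For a concrete sense of what magnetometer features can look like, here is a minimal sketch of per-window statistics; the function name `magFeatures` and this exact feature set are illustrative assumptions, not necessarily what our extractor computes.

```matlab
% Minimal magnetometer-feature sketch; magWin is a window-by-3 matrix of
% (x, y, z) samples. This feature set is an illustrative assumption.
function feats = magFeatures(magWin)
    m     = vecnorm(magWin, 2, 2);              % per-sample field magnitude
    feats = [mean(magWin), std(magWin), ...     % per-axis mean and std (1x6)
             mean(m), std(m), max(m) - min(m)]; % magnitude statistics (1x3)
end
```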