Abstract—Human Activity Recognition (HAR) is one of the main research fields in pattern recognition. In recent years, machine learning and deep learning have played important roles in Artificial Intelligence (AI), and have proven very successful in HAR classification tasks. However, mainstream frameworks have two drawbacks: 1) all inputs are processed with the same parameters, which can cause the framework to assign an unrealistic label to an object; 2) these frameworks lack generality across different application scenarios. In this paper, an adaptive multi-state pipe framework based on Set Pair Analysis (SPA) is presented, in which pipes are divided into three types: the main pipe, sub-pipes, and the fusion pipe. In the main pipe, the input of the classification task is preprocessed by SPA to obtain the Membership Belief Matrix (MBM). Sub-pipe shunt processing is then performed according to the membership beliefs, and the results are finally merged through the fusion pipe. To test the performance of the proposed framework, we search for the configuration set that yields the optimal performance and evaluate the effectiveness of the new approach on the popular benchmark dataset WISDM. Experimental results demonstrate that the proposed framework achieves good performance, with a test error of 1.4%.
Index Terms—Human Activity Recognition (HAR), Membership Belief Matrix (MBM), multi-state pipe framework, Set Pair Analysis (SPA).
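The dataflow described in the abstract (SPA preprocessing → MBM → belief-weighted sub-pipe shunting → fusion) can be sketched as a minimal skeleton. This is an illustrative sketch only, not the authors' implementation: the function names are hypothetical, and the `membership_belief_matrix` body is a placeholder, since the actual SPA computation is defined in the paper itself.

```python
import numpy as np

def membership_belief_matrix(x, n_classes):
    # Hypothetical stand-in for the SPA preprocessing step: produce one
    # membership-belief score per sub-pipe from the input features.
    # (The real SPA computation is specified in the paper, not here.)
    scores = np.abs(np.sin(x.sum() * np.arange(1, n_classes + 1))) + 1e-9
    return scores / scores.sum()  # beliefs normalized to sum to 1

def main_pipe(x, sub_pipes):
    # Main pipe: SPA preprocessing yields the Membership Belief Matrix (MBM).
    mbm = membership_belief_matrix(x, n_classes=len(sub_pipes))
    # Sub-pipes: shunt processing, here modeled as belief-weighted outputs.
    outputs = [belief * pipe(x) for belief, pipe in zip(mbm, sub_pipes)]
    # Fusion pipe: merge the sub-pipe results into a single prediction vector.
    return np.sum(outputs, axis=0)
```

A usage example: with three sub-pipes each returning a class-score vector, `main_pipe(x, sub_pipes)` returns one fused score vector whose entries reflect both the sub-pipe outputs and their membership beliefs.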
The authors are with the School of Information Science and Engineering, Shandong University, Qingdao, China (e-mail: email@example.com, firstname.lastname@example.org).
Cite: Leixin Shi, Hongji Xu, Beibei Zhang, Xiaojie Sun, Juan Li, and Shidi Fan, "Adaptive Multi-state Pipe Framework Based on Set Pair Analysis," International Journal of Machine Learning and Computing, vol. 10, no. 6, pp. 759-764, 2020. Copyright © 2020 by the authors. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited (CC BY 4.0).