Sow-activity classification from acceleration patterns: a machine learning approach

Research output: Contribution to journal › Journal article › peer-review

This paper describes a supervised learning approach to sow-activity classification from accelerometer measurements. In the proposed methodology, pairs of accelerometer measurements and activity types are treated as labeled instances of a standard supervised classification task. Under this formulation, sow-activity classification can be approached with standard machine learning methods for pattern classification. Individual predictions for the elements of a time series of arbitrary length are combined to classify the series as a whole. An extensive comparison of representative learning algorithms, including neural networks, support vector machines, and ensemble methods, is presented. Experimental results are reported on a data set for sow-activity classification collected in a real production herd. The data set, which has been widely used in related work, includes measurements from active (Feeding, Rooting, Walking) and passive (Lying Laterally, Lying Sternally) activities. When classifying 1-s observations, the best method achieved an average recognition rate of 74.64% over the five activities. When classifying 2-min time series, the performance of the best model increased to 80%, an important improvement over the 64% average recognition rate obtained for the same five activities in previous work. The pattern classification approach was also evaluated in alternative scenarios, including distinguishing between active and passive categories, and a multiclass setting. In general, the best results were obtained with a tree-based LogitBoost classifier, which also proved very robust to noise in the observations. Besides its higher performance, the suggested method is more flexible than previous approaches, since time series of any length can be analyzed.
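
To make the described pipeline concrete, the minimal sketch below shows the two steps the abstract outlines: classifying short fixed-length windows of accelerometer data, then aggregating the per-window predictions by majority vote to label a series of arbitrary length. The sampling rate, the per-axis mean/std features, and all function names are illustrative assumptions, not the paper's code; scikit-learn's GradientBoostingClassifier is used here as a stand-in for the tree-based LogitBoost classifier the paper reports, since scikit-learn does not ship LogitBoost directly.

```python
# Hypothetical sketch of per-window classification plus majority-vote
# aggregation. Feature set and sampling rate are assumptions.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

def window_features(acc, fs=10, win_s=1):
    """Split a (n_samples, 3) accelerometer trace into 1-s windows and
    compute simple per-axis statistics (mean and standard deviation).
    fs is an assumed sampling rate; the paper's features may differ."""
    win = fs * win_s
    n = len(acc) // win
    feats = []
    for i in range(n):
        seg = acc[i * win:(i + 1) * win]
        feats.append(np.concatenate([seg.mean(axis=0), seg.std(axis=0)]))
    return np.array(feats)

def train_window_classifier(windows, labels):
    """Fit a boosted-tree classifier on labeled 1-s windows
    (stand-in for the paper's tree-based LogitBoost)."""
    clf = GradientBoostingClassifier()
    clf.fit(windows, labels)
    return clf

def classify_series(clf, acc, fs=10):
    """Classify a time series of arbitrary length: predict each 1-s
    window, then assign the majority label to the whole series."""
    preds = clf.predict(window_features(acc, fs=fs))
    values, counts = np.unique(preds, return_counts=True)
    return values[np.argmax(counts)]
```

Because voting averages out individual window errors, longer series (e.g., 2 min of 1-s windows) tend to be labeled more reliably than single windows, which is consistent with the reported improvement from 74.64% on 1-s observations to 80% on 2-min series.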
Original language: English
Journal: Computers and Electronics in Agriculture
Volume: 93
Pages (from-to): 17-26
Number of pages: 10
ISSN: 0168-1699
DOIs
Publication status: Published - 2013
