Please use this identifier to cite or link to this item:
https://open.uns.ac.rs/handle/123456789/32704
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Pavlović, Dejan | en_US |
dc.contributor.author | Tachtatzis, Christos | en_US |
dc.contributor.author | Hamilton, Andrew | en_US |
dc.contributor.author | Marko, Oskar | en_US |
dc.contributor.author | Atkinson, Robert | en_US |
dc.contributor.author | Davison, Christopher | en_US |
dc.contributor.author | Michie, Craig | en_US |
dc.contributor.author | Crnojević, Vladimir | en_US |
dc.contributor.author | Andonović, Ivan | en_US |
dc.date.accessioned | 2024-04-25T07:47:01Z | - |
dc.date.available | 2024-04-25T07:47:01Z | - |
dc.date.issued | 2020-12 | - |
dc.identifier.isbn | 978-90-8686-349-5 | en_US |
dc.identifier.uri | https://open.uns.ac.rs/handle/123456789/32704 | - |
dc.description.abstract | The monitoring of cattle behaviour through sensor systems is gaining importance in the improvement of animal health, fertility and management of large herds. Commercial farms commonly implement accelerometer-based systems to monitor the time an animal spends ruminating, eating and its overall activity, which informs farmers on the health and fertility status of individual cattle. Ill or injured cattle feed and ruminate less, so tracking the duration and frequency of these states provides key indicators of animal health. Activity is used as a metric for the detection of oestrus (heat), which promotes more efficient fertilisation of dairy and beef cattle, reducing operating costs and increasing profits for farmers. The aim of the study was to determine the feasibility of enhancing the accuracy of classifying multiple behaviours from acceleration-based activity collar data through Convolutional Neural Networks (CNNs). CNN models are typically used to classify objects within images, but have been demonstrated to be effective at classifying time-series data across different domains. To evaluate their effectiveness for cattle behaviour classification, acceleration data were collected from 18 cows across 3 farms using neck-mounted collars which provided 3-axis acceleration values at a 10 Hz sampling frequency. Each cow was equipped with a pressure sensor halter which provided ground truth data on the animal's behavioural state, also at a 10 Hz sampling frequency. The ground truth from the halter allowed the CNN model to be trained to predict a number of key cattle behaviours. The model was then tested on separate data to assess performance. The CNN was able to classify the 3 activity states (rumination, eating and other) with an overall F1 score of 82%, compared to the reported collar classifications with an overall F1 score of 72%. | en_US |
dc.subject | monitoring, sensor systems, animal health | en_US |
dc.title | Classification of cattle behaviour using convolutional neural networks | en_US |
dc.type | Conference Paper | en_US |
dc.relation.conference | 71st Annual Meeting of the European Federation of Animal Science, Virtual, 1-4 Dec. | en_US |
dc.identifier.doi | 10.5281/zenodo.6393511 | - |
dc.description.version | Published | en_US |
item.fulltext | With Fulltext | - |
item.grantfulltext | open | - |
crisitem.author.dept | Institut BioSense | - |
crisitem.author.dept | Institut BioSense | - |
crisitem.author.dept | Institut BioSense | - |
crisitem.author.orcid | 0000-0002-9811-9485 | - |
crisitem.author.orcid | 0000-0001-6683-7178 | - |
crisitem.author.orcid | 0000-0001-7144-378X | - |
crisitem.author.parentorg | Univerzitet u Novom Sadu | - |
crisitem.author.parentorg | Univerzitet u Novom Sadu | - |
crisitem.author.parentorg | Univerzitet u Novom Sadu | - |
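The pipeline the abstract describes (3-axis collar acceleration sampled at 10 Hz, segmented into windows and classified by a CNN into rumination/eating/other) could be sketched roughly as below. This is a minimal illustrative sketch with untrained random weights: the window length, stride, filter count and kernel width are assumptions for demonstration, not values reported in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated 60 s of 3-axis collar acceleration at 10 Hz (600 samples, 3 channels).
FS = 10
signal = rng.normal(size=(60 * FS, 3))

def make_windows(x, win_len, stride):
    """Segment a (T, C) signal into overlapping (N, win_len, C) windows."""
    starts = range(0, x.shape[0] - win_len + 1, stride)
    return np.stack([x[s:s + win_len] for s in starts])

def conv1d_relu(windows, kernels):
    """Valid-mode 1-D convolution over the time axis, followed by ReLU.

    windows: (N, L, C_in), kernels: (C_out, K, C_in) -> (N, L-K+1, C_out)
    """
    n, L, _ = windows.shape
    c_out, k, _ = kernels.shape
    out = np.empty((n, L - k + 1, c_out))
    for t in range(L - k + 1):
        patch = windows[:, t:t + k, :]  # (N, K, C_in)
        out[:, t, :] = np.tensordot(patch, kernels, axes=([1, 2], [1, 2]))
    return np.maximum(out, 0.0)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Hypothetical model: one conv layer, global average pooling, linear head.
windows = make_windows(signal, win_len=100, stride=50)    # 10 s windows, 50% overlap
kernels = rng.normal(size=(8, 5, 3)) * 0.1                # 8 filters, width 5
features = conv1d_relu(windows, kernels).mean(axis=1)     # (N, 8) after pooling
head_w = rng.normal(size=(8, 3)) * 0.1                    # 3 classes: rumination/eating/other
probs = softmax(features @ head_w)                        # per-window class probabilities

print(windows.shape, probs.shape)  # (11, 100, 3) (11, 3)
```

In the study itself the halter-derived labels would supervise training of the convolutional filters and the classification head; here the weights are random purely to show the data flow and tensor shapes.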
Appears in Collections: | IBS Publikacije/Publications |
Files in This Item:
File | Size | Format | |
---|---|---|---|
M34-2020-Classification of cattle behaviour using convolutional neural networks.pdf | 144.57 kB | Adobe PDF | View/Open |