GeoVet 2023 International Conference
R07.2 Predicting spatially explicit swine premises using deep learning and aerial imagery to improve disease monitoring and surveillance

Keywords

convolutional neural networks
deep learning
disease transmission
livestock
surveillance

Category

Abstract

Accurate farm geolocations are essential for disease outbreak surveillance and monitoring, and for developing spatial epidemiological models that evaluate distance-dependent transmission processes across different farm types. Farm geolocation data are often unavailable, which forces assumptions about the spatial distribution of livestock premises, potentially resulting in over- or underestimation of transmission rates, misleading model predictions, and limited interpretation of epidemic dissemination dynamics (Sellman et al., 2020). Here, we developed a convolutional neural network (CNN) model to i) distinguish individual swine barns from non-barn features (e.g., residential homes, commercial buildings), ii) develop a rule-based filtering method that calculates the area and aspect ratio of swine barns and combines this information with farm metadata (pig capacity, farm type), and iii) predict farm types and population sizes.
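As an illustration of the rule-based filtering step, the sketch below measures the area and aspect ratio of barn-like regions in a predicted binary mask so they can later be joined with farm metadata. This is a minimal sketch of an assumed workflow, not the study's code; the 1 m per pixel resolution and the bounding-box-based aspect ratio are illustrative assumptions.

import numpy as np
from skimage.measure import label, regionprops

def summarize_barn_candidates(barn_mask: np.ndarray, pixel_size_m: float = 1.0):
    """Return area (m^2) and aspect ratio for each connected barn-like region."""
    summaries = []
    for region in regionprops(label(barn_mask.astype(np.uint8))):
        # Convert the pixel count of the region to square metres.
        area_m2 = region.area * pixel_size_m ** 2
        # Aspect ratio of the bounding box; swine barns tend to be long and narrow.
        minr, minc, maxr, maxc = region.bbox
        height, width = maxr - minr, maxc - minc
        aspect_ratio = max(height, width) / max(min(height, width), 1)
        summaries.append({"area_m2": area_m2, "aspect_ratio": aspect_ratio})
    return summaries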

High-resolution aerial imagery covering a 22,369 x 13,457 pixel extent was obtained from the National Agriculture Imagery Program (NAIP) (USGS, 2022) and used to generate 1,148 raster tiles of 512 x 512 pixels. We constructed a binary mask of 118 known swine barns and incorporated it into the raster tiles as an additional raster band. Raster tiles were split into 60% for training, 20% for validation, and 20% for testing of a ResUNet CNN model (Diakogiannis et al., 2020). Model performance was evaluated by the accuracy of the classification of barn and non-barn features.
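The following sketch shows one way the tiling and splitting described above could be implemented: cutting a large NAIP scene into 512 x 512 tiles, stacking the known-barn binary mask as an extra band, and drawing a 60/20/20 split. File names and the random seed are hypothetical, and this is an assumed workflow rather than the authors' implementation.

import numpy as np
import rasterio
from rasterio.windows import Window

TILE = 512

with rasterio.open("naip_scene.tif") as src, rasterio.open("barn_mask.tif") as msk:
    tiles = []
    for row in range(0, src.height - TILE + 1, TILE):
        for col in range(0, src.width - TILE + 1, TILE):
            window = Window(col, row, TILE, TILE)
            image = src.read(window=window)    # (bands, 512, 512) image tile
            mask = msk.read(1, window=window)  # matching binary barn mask
            # Append the mask as an additional band of the tile.
            tiles.append(np.concatenate([image, mask[None]], axis=0))

# Example 60/20/20 split into training, validation, and test sets.
rng = np.random.default_rng(42)
order = rng.permutation(len(tiles))
n_train, n_val = int(0.6 * len(tiles)), int(0.2 * len(tiles))
train_idx, val_idx, test_idx = np.split(order, [n_train, n_train + n_val])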

The CNN model accurately distinguished swine barns from non-barn features with an accuracy of 80% in a swine production region of the United States. Additionally, the fractional predicted probability map for the “barns” class assigned high probabilities to open space between barns and to nearby waste lagoons, recognizing these features as likely belonging to swine barns.
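For completeness, a simple sketch of the pixel-wise accuracy metric used to compare predicted barn/non-barn labels against the reference mask is shown below; the 0.5 threshold on the predicted "barn" probability is an assumption, not a value reported in the study.

import numpy as np

def pixel_accuracy(prob_barn: np.ndarray, true_mask: np.ndarray, threshold: float = 0.5) -> float:
    """Fraction of pixels whose thresholded prediction matches the reference mask."""
    predicted = (prob_barn >= threshold).astype(np.uint8)
    return float((predicted == true_mask).mean())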

We successfully predicted swine barn geolocations using aerial images and deep learning with an accuracy of 80%. We also obtained valuable contextual information regarding environmental, demographic, and structural properties of swine barns and their surrounding areas, which we will incorporate as we continue to refine our model. Once the best model is identified, we aim to use it to predict different swine farm types and their corresponding population sizes. Results from this work will enable researchers and policymakers to better understand the distribution and spatial arrangement of swine farms and will aid in the development of accurate spatial epidemiological models. This information is crucial for identifying high-risk areas and potential sources of disease transmission, tailoring disease control strategies, and evaluating their effectiveness.

References

Diakogiannis, F. I., Waldner, F., Caccetta, P., & Wu, C. (2020). ResUNet-a: A deep learning framework for semantic segmentation of remotely sensed data. ISPRS Journal of Photogrammetry and Remote Sensing, 162, 94–114. doi:10.1016/j.isprsjprs.2020.01.013.

Sellman, S., Tildesley, M. J., Burdett, C. L., Miller, R. S., Hallman, C., Webb, C. T., Wennergren, U., Portacci, K., & Lindström, T. (2020). Realistic assumptions about spatial locations and clustering of premises matter for models of foot-and-mouth disease spread in the United States. PLOS Computational Biology, 16, e1007641. doi:10.1371/journal.pcbi.1007641.

USGS. (2022). USGS EROS Archive - Aerial Photography - National Agriculture Imagery Program (NAIP). https://www.usgs.gov/centers/eros/science/usgs-eros-archive-aerial-photography-national-agriculture-imagery-program-naip.