Detection of lameness and mastitis pathogens in milk using visual and olfactory sensing
Abstract
The objective of this project is to investigate the feasibility of combined visual and olfactory sensing with multimodal collaborative intelligence for the perception of diseases, especially contagious ones, within a population of dairy cattle. The idea is to develop artificially intelligent systems that learn low-dimensional representations of the presence of disease from visual and olfactory sensory inputs, which are high-dimensional and noisy. Incorporating olfactory alongside visual intelligence is particularly promising because the olfactory intelligence of animals and insects predominates over their visual intelligence, and yet olfactory intelligence has so far barely been decoded computationally: no computational model matches the widely studied olfactory perceptual capability of moths. One reason is that, in contrast to high-resolution camera sensors reaching many megapixels, state-of-the-art arrays for sensing volatile organic compounds, so-called electronic noses, achieve only tens of pixels and can only sense ppm, or at best ppb, concentration levels (1 000 000 to 1 000 times less sensitive than insects). The invention of large, multi-layer artificial neural networks, which attempt to encode human visual perceptual intelligence computationally, has produced breakthroughs in high-performance artificial intelligence systems. Newer models often contain one or more architectural modules that encode findings from cognitive science, such as memory, contrast, analogy, anticipation of consequences, reasoning, and knowledge of physics. We target the derivation of a heterogeneous deep architecture that combines visual and olfactory branches and enables their collaborative intelligence. The scope of this project is to evaluate the potential of the proposed approach in a real-world setup and to clarify the technical challenges.
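To make the notion of a heterogeneous two-branch architecture more concrete, the following minimal sketch (assuming PyTorch; the class name, layer sizes, and the number of electronic-nose channels are illustrative assumptions, not the project's actual design) encodes a camera image and a low-channel-count electronic-nose reading into compact embeddings and fuses them into a low-dimensional disease representation.

# Minimal sketch of a two-branch visual/olfactory fusion network.
# Assumptions: PyTorch; 32 hypothetical e-nose channels; illustrative layer sizes.
import torch
import torch.nn as nn

class VisualOlfactoryNet(nn.Module):
    def __init__(self, n_gas_channels: int = 32, embed_dim: int = 64, n_classes: int = 2):
        super().__init__()
        # Visual branch: high-dimensional image -> compact embedding.
        self.visual = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, embed_dim),
        )
        # Olfactory branch: tens of e-nose channels -> compact embedding.
        self.olfactory = nn.Sequential(
            nn.Linear(n_gas_channels, 64), nn.ReLU(),
            nn.Linear(64, embed_dim),
        )
        # Fusion head: joint low-dimensional representation -> disease logits.
        self.head = nn.Sequential(
            nn.Linear(2 * embed_dim, embed_dim), nn.ReLU(),
            nn.Linear(embed_dim, n_classes),
        )

    def forward(self, image: torch.Tensor, gas: torch.Tensor) -> torch.Tensor:
        z = torch.cat([self.visual(image), self.olfactory(gas)], dim=-1)
        return self.head(z)

# Example forward pass with dummy data (batch of 4).
model = VisualOlfactoryNet()
logits = model(torch.randn(4, 3, 128, 128), torch.randn(4, 32))
print(logits.shape)  # torch.Size([4, 2])

Late fusion by concatenation is only one possible design choice here; the project's "collaborative intelligence" could equally be realized with attention-based or other cross-modal fusion mechanisms.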