An AI-Powered Tool to Aid the Orthopaedic Examination
We have all dealt with animals where the examination had to be minimal, leaving concerns and doubts about the diagnosis. Observing an animal's behaviour cannot confirm a diagnosis on its own, but observation combined with data analysis can be valuable in deciding our approach. Things are more complicated when we need to examine the oral cavity, which is normally challenging or impossible without sedation and imaging. Filming the dog and comparing its current status with previous presentations might save us from unnecessary sedation and procedures. Furthermore, many owners find it difficult to understand what is wrong from verbal explanations and imaging material alone.
DeepLabCut is a markerless motion-capture application designed to study mobility and behaviour. Its developers' goal is to support animal models in neuroscience research and to improve the welfare of lab animals. As a vet, I believe it can become an invaluable tool in veterinary research and clinical practice.
Work by Richard Warren: the 3D movements of a head-fixed mouse running on a treadmill, captured by a single camera (plus a mirror).
Videography is an easy way to record an animal's mobility and behaviour. It gives us a visual history of the patient's condition, so we can compare and evaluate the progression of the disease and the results of treatment. As a visual tool, it also makes it easier for owners to understand their pet's problem and proceed with the recommended steps.
Work by Dr Daniel Leventhal's group, during an automated pellet-reaching task.
The application gives us a tracking system for the features of our choice, which we can later analyse and compare. The code and documentation are available on GitHub.
The Vet Futurist team used the tool to track the mobility of a cat's mandible while it was eating, and here are our results:
First we recorded several videos of a cat eating. We chose the best videos and then annotated the anatomical features we wanted to track. In our example we tracked the movement of the mouth while the cat was eating. We had to annotate 60 frames, marking the chosen anatomical features in each one.
We tracked five anatomical features, including:
- lower canines
- tip of the tongue
After the annotation, we let the tool train on these frames and label the rest of the video for us. In the end we could download and view the final result.
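To give a sense of what you can do with the result: DeepLabCut exports per-frame (x, y, likelihood) coordinates for each tracked feature, which you can then analyse with standard tools. The sketch below uses synthetic coordinates, standing in for two hypothetical jaw landmarks rather than our real exported data, to compute a per-frame "gape" distance and filter out frames where the tracker was unsure.

```python
import numpy as np

# Synthetic stand-ins for DeepLabCut per-frame output: one roughly static
# upper-jaw landmark and one lower-jaw landmark whose y-coordinate
# oscillates as the mouth opens and closes.
frames = np.arange(6)
upper_jaw = np.tile([100.0, 50.0], (6, 1))
lower_jaw = np.column_stack([
    np.full(6, 100.0),                    # x stays aligned with the upper jaw
    50.0 + 10.0 * np.abs(np.sin(frames))  # y moves away as the mouth opens
])
likelihood = np.array([0.99, 0.98, 0.40, 0.97, 0.99, 0.95])

# Per-frame jaw gape: Euclidean distance between the two landmarks.
gape = np.linalg.norm(upper_jaw - lower_jaw, axis=1)

# Keep only confidently tracked frames (thresholding on the likelihood
# column is common practice with DeepLabCut output).
reliable = likelihood > 0.9
print(gape[reliable].round(1))  # gape in pixels for the reliable frames
```

With real data you would read the coordinates from the CSV or HDF5 file that the analysis step produces instead of constructing arrays by hand; the distance and thresholding logic stays the same.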
The code is open source and available on GitHub. However, the system currently targets researchers, so you need specific hardware and knowledge of machine-learning techniques to use it. I hope that in the future it will be simplified and made available to everyone online. Imagine a website where you could upload your video, select the tracking you want, and download the processed video along with the accompanying data analysis.
AI tools will soon be part of the health plan and daily life of animals. They will enable us to spot the first changes in behaviour and mobility before a disease progresses. An early diagnosis makes treatment more affordable, safer and more effective. Furthermore, early warnings and notifications will be very useful in managing the ageing pet population.
We are very satisfied with the tool and will be experimenting with it in the upcoming weeks.