Building on earlier work that teaches computers to recognise emotions and expressions in human faces, the system detects the distinct parts of a sheep's face and compares them with a standardised measurement tool developed by veterinarians for diagnosing pain. The researchers' results will be presented today (1 June) at the 12th IEEE International Conference on Automatic Face and Gesture Recognition in Washington, DC. Severe pain in sheep is associated with conditions such as foot rot, an extremely painful and contagious infection that causes the foot to rot away, and mastitis, an inflammation of the udder in ewes caused by injury or bacterial infection. Both conditions are common in large flocks, and reliable, efficient pain assessment would support earlier diagnosis, faster treatment and quicker pain relief.
One of today's more popular artificially intelligent (AI) androids comes from the TV series "MARVEL's Agents of S.H.I.E.L.D." Those of you who followed the latest season's story will know the one (no spoilers here). One of the most interesting things about this fictional AI character is that it can read people's emotions, and thanks to researchers from the University of Cambridge, that ability might soon make the jump from sci-fi to reality. The first step in creating such a system is training an algorithm on simpler facial expressions and just one specific emotion or feeling. To that end, the Cambridge team focused on using a machine learning algorithm to figure out whether a sheep is in pain, and this week they presented their research at the IEEE International Conference on Automatic Face and Gesture Recognition in Washington, D.C.
The Gromov-Hausdorff distance provides a metric on the set of isometry classes of compact metric spaces. Unfortunately, computing this metric directly is believed to be computationally intractable. Motivated by applications in shape matching and point-cloud comparison, we study a semidefinite programming relaxation of the Gromov-Hausdorff metric. This relaxation can be computed in polynomial time, and somewhat surprisingly is itself a pseudometric. We describe the induced topology on the set of compact metric spaces. Finally, we demonstrate the numerical performance of various algorithms for computing the relaxed distance and apply these algorithms to several relevant data sets. In particular we propose a greedy algorithm for finding the best correspondence between finite metric spaces that can handle hundreds of points.
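The greedy correspondence search mentioned above can be sketched in code. The abstract does not specify the authors' algorithm, so the following is a hypothetical illustration under standard definitions: the Gromov-Hausdorff distance between finite metric spaces is half the minimal distortion over all correspondences, and a greedy heuristic assigns each point of X, in turn, to the point of Y that least increases the worst-case distortion. The function name and the exact greedy rule are assumptions, not the paper's method.

```python
import math

def greedy_correspondence(DX, DY):
    """Greedy heuristic for matching two finite metric spaces.

    DX and DY are square distance matrices (lists of lists). This is an
    illustrative sketch, not the authors' exact algorithm: each point of
    X is assigned, in order, to the point of Y that least increases the
    worst-case additive metric distortion accumulated so far.
    """
    n, m = len(DX), len(DY)
    match = [0]  # match[i] = index in Y paired with point i of X
    for i in range(1, n):
        best_j, best_cost = 0, math.inf
        for j in range(m):
            # extra distortion incurred by pairing (i, j) with earlier pairs
            cost = max(abs(DX[i][k] - DY[j][match[k]]) for k in range(i))
            if cost < best_cost:
                best_j, best_cost = j, cost
        match.append(best_j)
    # distortion of the resulting pairing; half of it bounds the GH distance
    # from above only once the pairing is extended to a true (surjective)
    # correspondence, so this is a heuristic score, not the exact metric
    dis = max(abs(DX[i][k] - DY[match[i]][match[k]])
              for i in range(n) for k in range(n))
    return match, dis / 2.0
```

Running it on two copies of the same space recovers a zero-distortion matching, and the cost is roughly O(n^2 m) distance comparisons, which is in the regime where hundreds of points remain tractable.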
About Us Faceware Tech develops innovative facial motion capture products for the professional animation and videogame industries. We make the tools used by your favorite videogame developers (EA, Bungie, Rockstar, 2K, WB, Sony, Capcom, and more) and VFX studios (Dneg, Atomic Fiction, Oats, CamD) to create high-end facial animation. At Siggraph 2016, we announced FTI Interactive -- a new division within the company focused on real-time content creation. The world's demand for content is higher than ever, with real-time, interactive content creation on the cusp of a revolutionary breakthrough. This position will be part of the team leading the cutting edge of entertainment, innovating new techniques for the exciting, up-and-coming industries of VR, MR, and AR.
Most sketch recognition systems are accurate in recognizing either text or shape (graphic) ink strokes, but not both. Distinguishing between shape and text strokes is therefore a critical task in recognizing hand-drawn digital ink diagrams, which commonly contain many text labels and annotations. We have found the ‘entropy rate’ to be an accurate classification criterion: it is significantly higher for text strokes than for shape strokes and can serve as a distinguishing factor between the two. Using entropy values, our system produced a correct classification rate of 92.06% on test data from the diagram domain on which the threshold was trained. It also performed favorably on data for which no training examples were supplied.
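The entropy-rate idea can be illustrated with a short sketch. The abstract does not give the exact formulation, so the following is a hypothetical variant: quantize the turning angle at each vertex of a stroke into a small alphabet of symbols and take the Shannon entropy of the symbol distribution. Wiggly text strokes spread mass over many symbols and score high; smooth shape strokes concentrate on a few and score low. The function name and the angle-binning scheme are assumptions, not the paper's method.

```python
import math
from collections import Counter

def stroke_entropy_rate(points, n_bins=8):
    """Entropy-based feature for an ink stroke.

    Illustrative sketch, not the paper's exact formulation: quantize the
    turning angle between consecutive segments into n_bins symbols and
    return the Shannon entropy (bits per symbol) of the empirical symbol
    distribution along the stroke.
    """
    symbols = []
    for (x0, y0), (x1, y1), (x2, y2) in zip(points, points[1:], points[2:]):
        a1 = math.atan2(y1 - y0, x1 - x0)
        a2 = math.atan2(y2 - y1, x2 - x1)
        # signed turning angle, wrapped into [-pi, pi)
        turn = (a2 - a1 + math.pi) % (2 * math.pi) - math.pi
        symbols.append(int((turn + math.pi) / (2 * math.pi) * n_bins) % n_bins)
    if not symbols:
        return 0.0
    total = len(symbols)
    return -sum(c / total * math.log2(c / total)
                for c in Counter(symbols).values())
```

A stroke would then be labelled text when its entropy exceeds a learned threshold, e.g. `is_text = stroke_entropy_rate(pts) > t`, with `t` fit on labelled training strokes (the threshold and variable names here are hypothetical).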