CMU, Apple Team Improves iOS App Accessibility
A team at Apple analyzed nearly 78,000 screenshots from more than 4,000 apps to improve the screen reader on its mobile devices. The result was Screen Recognition, a tool that uses machine learning and computer vision to automatically detect user interface elements and make their content readable by VoiceOver in apps that would otherwise be inaccessible. Apple's VoiceOver normally relies on metadata supplied by app developers that describes user interface components; Screen Recognition generates that metadata directly from an app's pixels.

Jason Wu, a Ph.D. student in Carnegie Mellon University's Human-Computer Interaction Institute (HCII), was part of the team, whose paper, "Screen Recognition: Creating Accessibility Metadata for Mobile Applications From Pixels," won a Best Paper award at the recent Association for Computing Machinery (ACM) Computer-Human Interaction (CHI) conference. His advisor, Jeffrey Bigham, an associate professor in the HCII and the Language Technologies Institute and head of the Human-Centered Machine Learning Group at Apple, was also among the paper's authors.
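To give a sense of the metadata involved: in UIKit, developers describe a control to VoiceOver with properties such as `accessibilityLabel` and `accessibilityTraits`. The sketch below is a minimal, hypothetical example of such an annotation (the view and label names are invented for illustration); Screen Recognition's contribution is inferring equivalent information from pixels when developers leave these properties unset.

```swift
import UIKit

// A custom control drawn purely with pixels is invisible to VoiceOver
// until the developer attaches accessibility metadata like this:
let playButton = UIView()
playButton.isAccessibilityElement = true    // expose this view to VoiceOver
playButton.accessibilityLabel = "Play"      // the name VoiceOver speaks
playButton.accessibilityTraits = .button    // the role VoiceOver announces
playButton.accessibilityHint = "Plays the selected song."  // optional usage hint
```

When these properties are missing, VoiceOver has nothing to read aloud, which is the gap Screen Recognition is designed to fill.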