People With Disabilities Can Now Control Their Phones With Facial Gestures

Google released two new tools that use machine learning and smartphone front-facing cameras to detect eye and face movements. Users can scan their phone screen and select a task by smiling, raising their eyebrows, opening their mouth, or looking left, right, or up. “To make Android more accessible to everyone, we are launching new tools that make it easier to control your phone and communicate using facial gestures,” Google said.

The Centers for Disease Control and Prevention estimates that 61 million adults in the United States live with disabilities, prompting Google and its rivals Apple and Microsoft to make their products and services more accessible. “Every day, people use voice commands, like ‘hey, Google,’ or their hands to navigate their phones,” the tech giant explained in a blog post. “However, that is not always possible for people with severe speech and motor disabilities.”

The changes are the result of two new features. The first, called “Camera Switches,” allows people to use their faces instead of swiping and tapping to interact with their smartphones. The second, Project Activate, is a new Android app that lets people use those gestures to trigger an action, such as having the phone play a recorded phrase, send a text message, or make a call. “It is now possible for anyone to use eye movements and facial gestures customized for their range of motion to navigate their phone, without hands or voice,” the company said.

Apple, Google, and Microsoft have consistently introduced innovations that make Internet technology more accessible to people with disabilities, or to those who find that age has made some tasks, such as reading, more difficult. Voice-controlled digital assistants built into speakers and smartphones allow people with vision or movement impairments to tell computers what to do.
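The idea behind these features — recognizing a facial gesture and mapping it to a phone action — can be illustrated with a minimal sketch. The gesture names and actions below are hypothetical stand-ins chosen for this example; they are not Google's actual API or the internals of Camera Switches or Project Activate.

```python
# Hypothetical sketch: map a detected facial gesture to a phone action.
# Gesture and action names are illustrative assumptions, not Google's API.

ACTIONS = {
    "smile": "select",
    "raise_eyebrows": "next_item",
    "open_mouth": "back",
    "look_left": "scroll_left",
    "look_right": "scroll_right",
    "look_up": "scroll_up",
}

def handle_gesture(gesture: str) -> str:
    """Return the action triggered by a detected gesture, or 'none'."""
    return ACTIONS.get(gesture, "none")

print(handle_gesture("smile"))  # select
print(handle_gesture("blink"))  # none (unmapped gestures are ignored)
```

In the real features, the detection step is done by an on-device machine-learning model watching the front camera, and users can customize which gestures are active to match their own range of motion.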
There is software that identifies text on web pages or in images and reads it aloud, as well as automatic subtitle generation that shows what is said in videos. An AssistiveTouch feature that Apple built into its smartwatch software allows the touchscreen to be controlled by detecting movements such as finger or hand pinches. Apple said in a post that “this feature also works with VoiceOver so you can navigate the Apple Watch with one hand while using a cane or carrying a service animal.”

Microsoft, for its part, describes accessibility as essential to empowering everyone with technological tools. “To enable transformative change, accessibility must be a priority,” the company wrote. “Our goal is to integrate it into what we design for each team, organization, classroom, and home.”

* With information from AFP.