Apple announced new accessibility features coming later this year, including Accessibility Nutrition Labels, which will provide more detailed information for apps and games on the App Store®. Users who are blind or have low vision can explore, learn, and interact using the new Magnifier app for Mac®; take notes and perform calculations with the new Braille Access feature; and leverage the powerful camera system of Apple Vision Pro with new updates to visionOS.
Additional announcements include Accessibility Reader, a new systemwide reading mode designed with accessibility in mind, along with updates to Live Listen®, Background Sounds, Personal Voice, Vehicle Motion Cues, and more. Leveraging the power of Apple silicon, along with advances in on-device machine learning and artificial intelligence, users will experience a new level of accessibility across the Apple ecosystem.

Accessibility Nutrition Labels bring a new section to App Store product pages that will highlight accessibility features within apps and games.
These labels give users a new way to learn if an app will be accessible to them before they download it, and give developers the opportunity to better inform and educate their users on features their app supports. This includes VoiceOver, Voice Control, Larger Text, Sufficient Contrast, Reduced Motion, captions, and more. Accessibility Nutrition Labels will be available on the App Store worldwide, and developers can access more guidance on the criteria apps should meet before displaying accessibility information on their product pages.
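The labels describe support an app already implements in its own code rather than anything bolted on at submission time. As a hedged illustration (the view and the names below are hypothetical, not part of Apple's guidance), this SwiftUI sketch shows the kind of VoiceOver, Larger Text, and Reduced Motion support such a label would surface to users:

```swift
import SwiftUI

// Hypothetical view for illustration only; PlaybackCard and its contents are
// not from Apple's announcement. It shows the kind of in-app support
// (VoiceOver labels, Dynamic Type, Reduced Motion) that an Accessibility
// Nutrition Label would describe to users browsing the App Store.
struct PlaybackCard: View {
    // Respect the user's systemwide Reduce Motion setting.
    @Environment(\.accessibilityReduceMotion) private var reduceMotion
    @State private var isSpinning = false

    private var spin: Animation {
        .linear(duration: 2).repeatForever(autoreverses: false)
    }

    var body: some View {
        VStack(spacing: 12) {
            Image(systemName: "music.note")
                .rotationEffect(.degrees(isSpinning ? 360 : 0))
                // Drop the decorative animation when Reduce Motion is on.
                .animation(reduceMotion ? nil : spin, value: isSpinning)
                // Describe the image for VoiceOver instead of exposing the symbol name.
                .accessibilityLabel("Now playing")

            // Text styles like .headline scale with the user's Larger Text setting.
            Text("Now Playing")
                .font(.headline)

            Button("Play") {
                isSpinning = true
            }
            // A hint tells VoiceOver users what activating the button does.
            .accessibilityHint("Starts playback")
        }
        .padding()
    }
}
```

The Reduce Motion check and the VoiceOver label and hint are the pieces a user reading the label would actually feel in the app.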
Apple Watch serves as a remote control to start or stop Live Listen sessions, or jump back within a session to capture something that may have been missed. With Apple Watch, Live Listen sessions can be controlled from across the room, so there's no need to get up in the middle of a meeting or during class. Live Listen can be used along with the hearing health features available on AirPods Pro® 2, including the first-of-its-kind clinical-grade Hearing Aid feature.

An Enhanced View with Apple Vision Pro: For users who are blind or have low vision, visionOS will expand vision accessibility features using the advanced camera system on Apple Vision Pro. With powerful updates to Zoom, users can magnify everything in view, including their surroundings, using the main camera.
For VoiceOver users, Live Recognition in visionOS uses on-device machine learning to describe surroundings, find objects, read documents, and more. For accessibility developers, a new API will enable approved apps to access the main camera to provide live, person-to-person assistance for visual interpretation in apps like Be My Eyes, giving users more ways to understand their surroundings hands-free.

Sound Recognition adds Name Recognition, a new way for users who are deaf or hard of hearing to know when their name is being called.
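The announcement does not detail the new main-camera API for approved apps. As a hedged sketch only, the Swift code below follows the shape of the existing visionOS ARKit CameraFrameProvider flow used by entitled apps, on the assumption that the accessibility API works similarly; the type names, authorization steps, and entitlement requirements here are assumptions rather than the confirmed interface.

```swift
import ARKit
import CoreVideo

// Hedged sketch: mirrors the existing visionOS CameraFrameProvider flow used
// by entitled apps. The new accessibility API announced for approved apps
// (e.g., visual interpretation in Be My Eyes) may differ; names and
// authorization steps here are assumptions, not the confirmed interface.
final class MainCameraStreamer {
    private let session = ARKitSession()
    private let frameProvider = CameraFrameProvider()

    // Streams pixel buffers from Apple Vision Pro's main (left) camera.
    func start(onFrame: @escaping (CVPixelBuffer) -> Void) async throws {
        // Requires explicit user authorization and a camera-access entitlement.
        _ = await session.requestAuthorization(for: [.cameraAccess])

        // Pick a supported video format for the main camera.
        let formats = CameraVideoFormat.supportedVideoFormats(for: .main,
                                                              cameraPositions: [.left])
        guard let format = formats.first else { return }

        try await session.run([frameProvider])

        // Asynchronously receive camera frames and hand each pixel buffer
        // to the caller.
        guard let updates = frameProvider.cameraFrameUpdates(for: format) else { return }
        for await frame in updates {
            if let sample = frame.sample(for: .left) {
                onFrame(sample.pixelBuffer)
            }
        }
    }
}
```

In a visual-interpretation scenario, the pixel buffers handed to onFrame would be encoded and streamed to a remote sighted assistant.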
Voice Control introduces a new programming mode in Xcode for software developers with limited mobility. Voice Control also adds vocabulary syncing across devices, and will expand language support to include Korean, Arabic (Saudi Arabia), Turkish, Italian, Spanish (Latin America), Mandarin Chinese (Taiwan), English (Singapore), and Russian.

Sessions can be scheduled at all Apple Store locations worldwide through Group Booking or by visiting a nearby store.
Apple Music® shares the story of artist Kiddo K and the power of Music Haptics for users who are deaf or hard of hearing, unveils updates to its Haptics playlists, and launches a brand-new playlist featuring ASL interpretations of music videos alongside Saylists playlists.

Apple Fitness+ welcomes Chelsie Hill as a guest in a Dance workout with Fitness+ trainer Ben Allen.
Hill is a professional dancer and founder of the Rollettes, an L.A.-based wheelchair dance team that advocates for disability representation and women's empowerment. The workout is available now in the Fitness app.