According to the CDC, one in four adults in the United States has a disability, whether it involves vision, cognitive function, hearing, motor skills, or something else. That is why the iPhone has accessibility features: so that everyone can use one, not just people without disabilities. And iOS 14 makes the iPhone more accessible than ever, with tools that even ordinary users will love.
Without accessibility options, a large portion of the population would not be able to use an iPhone to explore apps, play games, learn something new, or connect with friends. Studies show that 71 percent of users with disabilities will immediately leave a site that does not accommodate them, so these improvements matter. There's something for everyone in this update.
1. VoiceOver is smarter overall
In iOS 13 and earlier, VoiceOver, the technology that reads the screen aloud for blind and low-vision users, is fairly basic: it follows whatever labels mobile and web apps provide. But not all apps include this information, and that is where iOS 14 saves the day.
Now, on-device intelligence determines what on-screen elements are called and how to say them whenever developers have not provided that information. So people who cannot see the screen can use everything on their iPhone, not just parts of it.
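For developers, the fix is still to label controls explicitly rather than rely on iOS's guesswork. Here is a minimal UIKit sketch (the button and its label text are illustrative, not from any particular app) of the kind of information VoiceOver reads when it is provided:

```swift
import UIKit

// A minimal sketch: giving a control an explicit VoiceOver label in UIKit.
// When a developer sets these properties, VoiceOver speaks them directly;
// when they are missing, iOS 14's on-device recognition tries to fill the gap.
let shareButton = UIButton(type: .system)
shareButton.setImage(UIImage(systemName: "square.and.arrow.up"), for: .normal)
shareButton.isAccessibilityElement = true
shareButton.accessibilityLabel = "Share"                      // what VoiceOver speaks
shareButton.accessibilityHint = "Shares the current photo."   // optional extra context
```

An icon-only button with no `accessibilityLabel` is exactly the case where iOS 14's new recognition has to guess.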
2. It recognizes text in images and photos
An important part of VoiceOver's new capabilities is that it can read text embedded in pictures and photos. Even when the text is baked into the image itself, VoiceOver will recognize it and read it aloud.
3. And descriptions for pictures and photos
VoiceOver can also read the descriptions that accompany pictures and photos in apps. That did not happen before, so now you can know what is going on in all the media you touch, as long as the app provides that information.
4. And interface controls in apps
The last of VoiceOver's new tricks is that it can intelligently describe on-screen interface controls when it detects them, which makes apps easier to navigate and more accessible overall.
5. The magnifier now has an app icon
The Magnifier tool, which turns your camera into a magnifying glass, has been around since iOS 10, but most people have missed it because it is relatively well hidden. That changes in iOS 14: there is now an option to add it to your Home Screen, App Library, and Search.
To find it, you must first enable Magnifier via Settings -> Accessibility -> Magnifier. Then look in the "Utilities" or "Recently Added" folders in the App Library. From there, you can drag it onto the Home Screen to create a shortcut, so you no longer have to worry about triple-clicking anything. You can also find it via the search tool. But if you ever disable Magnifier in Settings, the app icon disappears everywhere.
6. And its controls can be hidden
Previously, when you opened Magnifier, the on-screen controls stayed visible at all times unless you captured a picture. In iOS 14, you can swipe the control panel down to see and use just the zoom slider, and you can double-tap the screen to hide the whole thing (and double-tap again to bring it back).
7. Magnifier has a new interface
The controls used to be pretty basic: a slider controlled zoom, a flash button turned on the flash, a lock icon locked the exposure, a filter button gave access to color filters and contrast/brightness, and a shutter button captured a still image.
Now the zoom control looks a little nicer, the contrast/brightness levels sit up front instead of being hidden behind the filters, and the flash button works as a true flashlight.
8. And they can be customized
Also new to the controls is a gear button, which takes you to Magnifier's control options. Here you can choose which control stays visible at the top as the primary control when the rest are minimized. The default is zoom, but it can be brightness, contrast, filters, or the flashlight. You can also rearrange the secondary controls or remove any of them.
9. The Magnifier filters can be slimmed down
Previously, after tapping the filter button, you had to swipe through every filter until you found the one you needed. That is still the case in iOS 14, but if you tap the gear icon, then "Filter Customization," you can deselect any filters you do not use. That way, there is less to swipe through.
10. And you can take more pictures
In previous versions, pressing the shutter button created a still image of the view that you could interact with further. Now you can press the multi-image button in the controls to temporarily save an image so you can come back to it later, then keep using the shutter to capture and save more stills. (These still images are not saved to Photos.)
To view saved stills, tap "View," then select the image you want. Magnifier remembers these images until you tap "End" while viewing them.
11. Back Tap gives you two extra shortcuts for the actions you use most
In earlier versions of iOS, you could use the Side or Home button to trigger accessibility options: triple-click the button and any features you had enabled would launch. That shortcut is still available in iOS 14, but now there is a new method that can do even more.
In the "Touch" accessibility menu, there is a new option called "Back Tap." Select it and you will see options for "Double Tap" and "Triple Tap." These refer to tapping the back of your iPhone itself, with no buttons or on-screen controls involved. In each of these menus, you choose which function to tie to each gesture, and the list of possibilities is long. Then double- or triple-tap the middle of your iPhone's back to trigger the action.
Unlike the accessibility shortcut, these gestures can trigger more than just accessibility options. You can control the volume, lock your screen, open Notification Center, scroll up or down, and much more. There is even support for shortcuts created in the Shortcuts app. Our favorite, however, is "Shake," so you no longer have to physically shake your iPhone to undo typing or delete something, which does not always work.
12. Voice control has more languages
The new and improved Voice Control, introduced in iOS 13, lets you operate your iPhone with just your voice. In iOS 14 there is one significant improvement: support for British English and Indian English.
13. And it works with VoiceOver
What is even better than new languages for Voice Control? The fact that it now works with VoiceOver. If you ever wanted to use both features at the same time, now you can.
14. Headphones can help you hear better
In the Audio/Visual accessibility settings, there is a new option called "Headphone Accommodations." Turn it on and you can fine-tune how things sound in your headphones. And if you have a set of AirPods Pro, it even works with Transparency mode.
You can shift the audio from a balanced tone to one optimized for higher frequencies or for vocals in the midrange. There is also a slider to adjust how much soft sounds are boosted, from slight to moderate to strong. And you can hear these changes in real time as you make them with the "Play Sample" button. Best of all, you can apply these settings to media (music, movies, podcasts, etc.), to phone calls, or to both.
15. And can be further customized on AirPods & Beats
If you own a pair of AirPods, AirPods Pro, or compatible Beats headphones, there is another option in the headphone settings that lets you create a "Custom Audio Setup." It walks you through a series of listening comparisons to build a sound profile that best suits your hearing. On AirPods Pro, it also supports Transparency mode, which can boost the audibility of quieter voices around you.
16. Sound Recognition helps you hear important things
Using the Neural Engine in your iPhone, iOS 14 can detect background noises that may be trying to get your attention. When enabled from the accessibility menu, "Sound Recognition" continuously listens to the ambient sounds around you, watching for the specific sounds you select, all without draining the battery.
Once you switch the feature on, tap "Sounds" and toggle on anything under Alarms (fire, siren, smoke), Animals (cat, dog), Household (appliances, car horn, door knock, doorbell, running water), or People (baby crying, shouting).
These sounds are detected using on-device intelligence, so nothing is recorded or sent anywhere. When a sound is recognized, you will see a notification and may feel a vibration, depending on your "Sounds & Haptics" settings.
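Developers can tap similar on-device machinery through Apple's SoundAnalysis framework, available since iOS 13. The sketch below is illustrative rather than Apple's own Sound Recognition code, and "SirenClassifier" stands in for a hypothetical Core ML sound-classifier model you would supply yourself:

```swift
import AVFoundation
import CoreML
import SoundAnalysis

// Receives classification results as the analyzer processes audio.
final class SoundObserver: NSObject, SNResultsObserving {
    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first else { return }
        // e.g. post a local notification here instead of printing
        print("Heard \(top.identifier) (confidence \(top.confidence))")
    }
}

let engine = AVAudioEngine()
let input = engine.inputNode
let format = input.outputFormat(forBus: 0)
let analyzer = SNAudioStreamAnalyzer(format: format)
let observer = SoundObserver()

// SirenClassifier is an assumed model name; any MLModel trained as a
// sound classifier works with SNClassifySoundRequest.
if let model = try? SirenClassifier(configuration: MLModelConfiguration()).model,
   let request = try? SNClassifySoundRequest(mlModel: model) {
    try? analyzer.add(request, withObserver: observer)
    // Stream microphone buffers into the analyzer.
    input.installTap(onBus: 0, bufferSize: 8192, format: format) { buffer, time in
        analyzer.analyze(buffer, atAudioFramePosition: time.sampleTime)
    }
    try? engine.start()
}
```

The system feature does all of this for you; the sketch just shows the general shape of on-device, streaming sound classification, where no audio ever leaves the phone.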
17. Real-Time Text lets you multitask
Since 2017, people with certain disabilities have been able to use RTT (real-time text) software to communicate during a phone call. Unlike messaging, there is no need to hit send: text appears on the recipient's screen as you type it, emulating the flow of a voice call. In iOS 14, you can finally multitask with this feature. When you are outside the Phone app, and therefore away from the conversation view, you can now receive RTT notifications.
18. FaceTime detects sign language
In Group FaceTime, the active speaker's tile is usually enlarged, unless you have disabled that feature. But not everyone speaks with their voice, as is the case for people who sign. Starting with iOS 14, when FaceTime detects someone using sign language, it makes that person prominent in the video call, just like an active speaker.