Apple has long invested in features for users with disabilities.
On iOS, VoiceOver is a vital tool for blind and low-vision users, but it works well only when every UI element has been manually labeled by the app's developers.
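In UIKit, that manual labeling is done through the standard accessibility properties. A minimal sketch (the button and label text here are illustrative, but `accessibilityLabel` and `accessibilityTraits` are the real UIKit APIs):

```swift
import UIKit

// A settings button whose gear icon alone means nothing to VoiceOver.
let gearButton = UIButton(type: .system)
gearButton.setImage(UIImage(systemName: "gearshape"), for: .normal)

// Without this line, VoiceOver has nothing meaningful to announce.
gearButton.accessibilityLabel = "Settings"
gearButton.accessibilityTraits = .button
```

Screen Recognition exists precisely because many apps ship without these labels.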
Screen Recognition, introduced in iOS 14, is a computer vision system trained on thousands of images of apps in use, so that it learns what buttons look like and what common icons mean.
Such systems are highly flexible: depending on the data you feed them, they can become expert at detecting cats, facial expressions, or the different parts of a user interface.
iPhone users can invoke the feature in any application, and within a fraction of a second every element on the screen is named.
Screen Recognition must be aware of everything a sighted user can see and interact with: images, common icons, and the menu icons that appear almost everywhere.
Apple said: "We looked for areas where we could make progress on accessibility, such as image descriptions. In iOS 13 we labeled icons automatically, but the new feature is a big step forward."
The spokesperson added: "We can look at the pixels on the screen and identify a hierarchy of objects you can interact with, and all of this happens on the device within tenths of a second."
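Apple has not published Screen Recognition's internals, but the idea of turning a flat list of detections into a hierarchy of interactable objects can be sketched. The toy below (all names and the containment heuristic are assumptions for illustration) nests detected elements by rectangle containment:

```swift
import Foundation

// Illustrative only: one way a flat list of detected elements
// (label + frame) could be nested into a navigable hierarchy.
struct Element {
    let label: String
    let frame: CGRect
    var children: [Element] = []
}

func nest(_ flat: [Element]) -> [Element] {
    // Place larger elements first, so containers exist before their contents.
    let sorted = flat.sorted {
        $0.frame.width * $0.frame.height > $1.frame.width * $1.frame.height
    }
    var roots: [Element] = []
    for element in sorted {
        insert(element, into: &roots)
    }
    return roots
}

private func insert(_ element: Element, into siblings: inout [Element]) {
    // Descend into the first sibling whose frame fully contains this one.
    for index in siblings.indices {
        if siblings[index].frame.contains(element.frame) {
            insert(element, into: &siblings[index].children)
            return
        }
    }
    siblings.append(element)
}
```

A real system would also have to infer element roles and reading order, not just nesting.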
Screen Recognition is flexible and robust thanks to the inherent fuzziness of machine learning systems and the speed of the AI accelerators built into iPhones.
The new feature should make millions of apps more accessible to users with low vision.
Apple may eventually bring Screen Recognition to other platforms, such as the Mac, but the model itself does not generalize to desktop applications, whose interfaces differ substantially from those of mobile apps.