The Live Text feature of iOS 15, one of the most interesting innovations of the new operating system, will only work on iPhones with an A12 Bionic chip or later.
This means that Live Text can be used on the iPhone XR, iPhone XS, and later models. The Camera app in iOS 15 now includes the "Live Text" OCR feature which, using artificial intelligence, allows users to highlight and select text in images, translate it, or perform other operations. For example, users can search for and find a photo of a handwritten family recipe, or identify the phone number on a shop sign and call it directly. Thanks to the Neural Engine, the Camera app can also quickly recognize and copy text on the spot, for example a Wi-Fi password displayed in a café, or translate whatever is framed by the camera.
With Visual Look Up, users can get information on famous artworks, iconic landmarks, plants and flowers found in nature, and pet breeds, and they can even identify books. By reading the text in photos, Live Text can also be used to sort pictures by location, scene, people, objects, and more.
Although iOS 15 is available starting with the iPhone 6S and iPhone SE, this feature will not work on all supported models. The new, more detailed city experience in Maps will likewise be limited to iPhones with an A12 Bionic chip or later.