Looking for a new iPhone with Apple Intelligence? Until recently, that would have cost around $800. Following Apple's latest iPhone launch, however, the cheapest way to get Apple Intelligence on an iPhone costs about 25% less. There are just two minor caveats to note.
Apple Intelligence is a suite of features powered by artificial intelligence. Apple emphasizes privacy in its AI offerings, using a stateless server architecture called Private Cloud Compute. That privacy-first approach sets it apart from many contemporary AI systems, which store user data on servers for model training.
Notable Apple Intelligence features include Writing Tools for refining or restyling your writing, Clean Up for removing unwanted elements from photos, and Genmoji, which creates custom emoji characters from a description.
Visual Intelligence
Visual Intelligence is another part of Apple Intelligence. Most Apple Intelligence features are available on the iPhone, iPad, and Mac, but Visual Intelligence has so far been limited to the iPhone 16 and iPhone 16 Pro. It isn't supported on the iPhone 15 Pro or any iPad, even though those devices also run Apple Intelligence.
This exclusivity is largely due to branding and product differentiation.
The iPhone 16 and iPhone 16 Pro come equipped with a new feature called Camera Control. Activating Visual Intelligence is as simple as pressing and holding the Camera Control button.
Once activated, Visual Intelligence lets you point the rear camera at something to ask ChatGPT about it or run a Google visual search. Apple recently added the ability to create calendar events by recognizing things like concert posters and bulletin board notices.
This leads to the first minor condition concerning Apple Intelligence on the new iPhone 16e.
The iPhone 16e lacks the Camera Control button found on the other iPhone 16 models, yet it still offers Visual Intelligence. You can invoke Visual Intelligence on the iPhone 16e by assigning it to the Action button. Doing so, however, means the Action button can't be used for anything else, like launching the Camera or muting the device.
Fortunately, there is an alternative method to access Visual Intelligence on the iPhone 16e without relying on the Action button, which involves a few additional steps. Instead of pressing a physical button, users can bring up Visual Intelligence through the Control Center. Currently, the iPhone 16e is the only model featuring Visual Intelligence in the Control Center, and there’s potential for this feature to come to the iPad down the line. Don’t expect it to arrive on the iPhone 15 Pro anytime soon.
Glowing Edge Light
Beyond Visual Intelligence, another very minor distinction exists between Apple Intelligence on the iPhone 15 Pro, iPhone 16, and iPhone 16 Pro compared to the iPhone 16e.
Up until recently, every iPhone equipped with Apple Intelligence included Dynamic Island, Apple’s adaptable interface that surrounds the camera sensor at the top of the screen. In contrast, the iPhone 16e uses the older notch design.
Because of this difference, the Apple Intelligence effect for Siri, referred to as the Glowing Edge Light, looks slightly different on the iPhone 16e than on other models. Rather than tracing a uniform rainbow glow around the full edge of the display, the Glowing Edge Light wraps around the notch on the iPhone 16e.
Functionally, the Glowing Edge Light does nothing more than indicate that your iPhone is using the Apple Intelligence version of Siri rather than the classic version. The visual difference is negligible, unless you have a taste for perfect symmetry, in which case an extra $200 to $400 may be in order.
Aside from these two minor distinctions, Apple Intelligence works the same across all iPhone models, whichever one you choose to buy.