To put it mildly, Apple’s AI efforts haven’t been as fruitful as those of its rivals. While other companies churn out advanced chatbots and bake artificial smarts deeply into their platforms, the iPhone maker has failed to deliver on promises it made way back in 2024. The context-aware Siri upgrade is still coming soon, and Apple Intelligence features that have actually shipped have generally been underwhelming.
All is not lost, however. Not only is the new Siri expected to arrive in March, but there are lots of iPhone AI tools that don’t fall under the Apple Intelligence umbrella. Apple has been embedding a Neural Engine that powers its on-device machine learning features since the iPhone 8 and X debuted in 2017. While Writing Tools and Clean Up in Photos first come to mind when thinking of Apple’s AI offerings, there are plenty of other smart perks that don’t require Apple Intelligence or an internet connection to work on iOS.
Photos
As a mobile photography enthusiast, I especially appreciate my iPhone’s understanding of my photo library. The moment I take or save a photo, I can instantly tap and hold on the subject to lift it from the background and paste it elsewhere, change its backdrop, or turn it into a sticker. This simplifies photo editing in both professional and casual workflows.
Similarly, iOS 26’s Spatial Scene feature analyzes your photos to understand their depth. This lets you generate an animated, 3D variant of the shot that reacts to your iPhone’s movement.
The Photos app on iOS isn’t merely detecting borders to highlight subjects; it also understands what’s in your shots. This enables you to search your entire media library using relevant keywords, such as dog, table, pasta, or recipe.
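For the curious, Apple exposes this kind of on-device image classification to developers through the Vision framework. Here’s a minimal Swift sketch (an illustration, not Photos’ actual implementation) that pulls searchable keywords out of an image with `VNClassifyImageRequest`:

```swift
import UIKit
import Vision

// Return classification labels the on-device model is reasonably sure about.
// `threshold` is an arbitrary cutoff chosen for this example.
func keywords(for image: UIImage, threshold: Float = 0.5) -> [String] {
    guard let cgImage = image.cgImage else { return [] }
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
    return (request.results ?? [])
        .filter { $0.confidence > threshold }   // keep confident labels only
        .map { $0.identifier }                  // e.g. "dog", "pasta"
}
```

Everything runs locally; no image data leaves the device.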
Dictation
Another smart feature I frequently rely on is the offline dictation tool. Instead of typing long walls of text manually, I simply hit the microphone button on the built-in software keyboard and proceed with my rant. My iPhone starts listening and converts my speech to text in real time. It is highly accurate and properly punctuates my sentences.
The on-device speech-to-text engine also works in the Voice Memos and Phone apps, letting you transcribe recordings and audio calls live. While summarizing a transcript requires Apple Intelligence, the transcription process itself works without enabling Apple’s AI suite.
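Developers can tap the same offline engine via Apple’s Speech framework. This sketch assumes speech-recognition authorization has already been granted and that `recordingURL` (a hypothetical name) points at an audio file:

```swift
import Speech

// Transcribe an audio file entirely on-device.
let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))!
let request = SFSpeechURLRecognitionRequest(url: recordingURL)
request.requiresOnDeviceRecognition = true  // never send audio to a server
request.addsPunctuation = true              // automatic punctuation (iOS 16+)

recognizer.recognitionTask(with: request) { result, error in
    if let result, result.isFinal {
        print(result.bestTranscription.formattedString)
    }
}
```

Setting `requiresOnDeviceRecognition` is what guarantees the transcription works without an internet connection.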

Apple’s Personal Voice feature lets you mimic your own voice without using ChatGPT.
Foundry
Personal Voice
Apple takes accessibility seriously, helping those with different needs make the most of its devices. Personal Voice is an AI-powered accessibility feature that lets your iPhone convert text to speech using your own voice. While the initial version of the feature required a tedious 15-minute setup, the latest iteration only prompts you to read 10 sentences out loud. Once set up, your iPhone can speak in your cloned voice whenever you type text into the designated field.
Siri Suggestions
While Siri itself isn’t always so great at handling user requests, the suggestions it makes can be quite handy. For those unfamiliar, iOS learns from your habits and surfaces relevant actions and information in different parts of the system, such as Spotlight Search and the Lock Screen. For example, if you tend to launch Apple Maps every day at 5 p.m. to navigate home after work, the shortcut will be waiting for you the moment you unlock your iPhone around that time.
Live Text
Live Text is a system-wide optical character recognition (OCR) tool that lets you interact with supported characters in a wide range of apps. In Photos, for example, you can quickly copy phrases that appear in images or call a photographed number. The feature also works in compatible third-party apps, in addition to online images when browsing with Safari. It’s a solid way to copy, translate, or share text directly from a photo.
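Under the hood, this kind of OCR is available to any app through Apple’s Vision framework. A minimal Swift sketch (again an illustration, not Live Text’s exact internals) that extracts text from an image:

```swift
import UIKit
import Vision

// Recognize printed or handwritten text in an image, on-device.
func recognizeText(in image: UIImage, completion: @escaping ([String]) -> Void) {
    guard let cgImage = image.cgImage else { completion([]); return }
    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        // Take the top candidate string for each detected line of text.
        completion(observations.compactMap { $0.topCandidates(1).first?.string })
    }
    request.recognitionLevel = .accurate     // favor accuracy over speed
    request.usesLanguageCorrection = true    // fix common OCR misreads
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```

Once you have the strings, copying, translating, or dialing a recognized phone number is ordinary app logic.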

You can transcribe any spoken word using your iPhone’s keyboard.
Foundry
Mail
While email summaries were recently launched as an Apple Intelligence exclusive, the Mail app has gained other smart features that don’t require enabling the AI suite. Notably, it can now analyze and sort emails by topic into dedicated inboxes: Primary, Transactions, Promotions, and Updates. It’s a neat way to filter out the noise, and you can toggle it on or off at any point.
Live Recognition
Personal Voice isn’t the only AI-powered accessibility tool on iOS. Live Recognition is a built-in smart camera mode that can identify and describe the objects and people in front of you. Those with limited vision can rely on their iPhones to detect what’s around them in a hassle-free manner.

Mail will automatically sort your purchase, promotion, and primary messages.
Foundry
Battery
The iPhone’s smart tools also extend to power management. iOS learns from your charging habits to power the Optimized Battery Charging feature. If you leave your iPhone plugged in overnight, it pauses charging at 80 percent and tops off shortly before you typically unplug it in the morning.
Adaptive Power mode is another smart feature that monitors your usage and optimizes performance to maximize your iPhone’s battery life.
Camera
While the built-in Camera app doesn’t offer as many AI tools as Pixels and other Android phones, it still packs a lot of smarts. For example, it can recognize faces and pets in the viewfinder and shift its focus dynamically depending on what it detects. The app can similarly suggest shooting modes based on the environment, such as Night mode in low light and Macro mode for closeups.

