The dazzling new iPhone lock screen designs in iOS 16 may have made all the headlines at WWDC 2022, but behind them sits a new feature that's highly unusual for Apple: Photoshop-style editing abilities.
Apple’s AI tools have traditionally focused on helping you take great iPhone photos rather than on editing them. But a new ‘Visual Look Up’ feature, which you’ll find in the Photos app in iOS 16, lets you tap the subject of a photo (a dog, say) and lift it off the snap to be pasted elsewhere, such as in Messages.
That might not sound too spectacular, but the unnamed feature, which has echoes of Google’s ‘Magic Eraser’ for Pixel phones, will be a significant addition to iPhones when the software update lands later this year. Apple usually leaves these kinds of tricks to the best photo editing apps, but now it’s dabbling in Photoshop-style automation.
Just a few years ago, cutting a complex subject out of a photo was the preserve of Photoshop nerds. But Apple says its Visual Look Up feature, which also automatically serves up information about the subject you touch, is built on advanced machine learning models.
Simply lifting a French Bulldog from the background of a photo is, according to Apple, powered by a machine learning model, accelerated by the Neural Engine, that performs 40 billion operations in milliseconds. That’s why it will only be supported on the iPhone XS and later models.
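Apple hasn’t published an API for the subject-lifting model itself, but its public Vision framework gives a feel for how this kind of on-device segmentation works. The sketch below is a hypothetical stand-in, assuming you only need to lift people: it uses VNGeneratePersonSegmentationRequest (available since iOS 15), which produces a mask for people specifically rather than for arbitrary subjects like dogs.

```swift
import Vision
import CoreImage
import CoreImage.CIFilterBuiltins

// A minimal sketch of on-device subject masking with Apple's public Vision
// framework. This is NOT the private model behind Visual Look Up;
// VNGeneratePersonSegmentationRequest is the closest shipping analogue,
// and it only segments people, not arbitrary subjects.
func liftPerson(from image: CIImage) throws -> CIImage? {
    let request = VNGeneratePersonSegmentationRequest()
    request.qualityLevel = .accurate          // favor mask quality over speed

    let handler = VNImageRequestHandler(ciImage: image, options: [:])
    try handler.perform([request])

    // The result is a low-resolution grayscale mask of the person.
    guard let mask = request.results?.first?.pixelBuffer else { return nil }

    // Scale the mask up to match the source image's dimensions.
    var maskImage = CIImage(cvPixelBuffer: mask)
    let scaleX = image.extent.width / maskImage.extent.width
    let scaleY = image.extent.height / maskImage.extent.height
    maskImage = maskImage.transformed(by: .init(scaleX: scaleX, y: scaleY))

    // Keep only the masked subject; everything else becomes transparent,
    // ready to be pasted into another context.
    let blend = CIFilter.blendWithMask()
    blend.inputImage = image
    blend.backgroundImage = CIImage.empty()
    blend.maskImage = maskImage
    return blend.outputImage
}
```

Apple’s private model presumably generalizes this idea to arbitrary subjects and runs fast enough for an interactive tap-and-hold gesture, which is where those 40 billion Neural Engine operations come in.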
In addition to the Photos app, the feature will apparently also work in Quick Look, which lets you quickly preview images inside apps. There are also echoes of it in iOS 16’s new customizable lock screens, which can automatically place elements of a photo in front of your iPhone’s clock for a multi-layered, more modern look.
Right now, the feature is limited to letting you quickly cut subjects out of photos and paste them elsewhere, but Apple clearly has an appetite for building Photoshop-style tools into its iPhones. And iOS 16 might be just the beginning of its battle with Adobe and Google over letting you quickly tweak and edit your photos.
Analysis: The AI editing race heats up
Photoshop and Lightroom will always be popular with professional photographers and enthusiasts alike, but we’re starting to see tech giants build automated equivalents of Adobe’s most popular tools into their operating systems.
Last month, Google announced that its Magic Eraser tool, available on Pixel phones, now lets you change the color of objects in your photos with just one tap. This new feature joined the tool’s existing ability to remove unwanted objects or people from your photos.
Apple hasn’t gone that far with the new Visual Look Up feature, which is more akin to Photoshop’s ‘Select Subject’ tool than to a Healing Brush-style fix like Google’s Magic Eraser. But the iOS 16 update is significant in the context of the broader race to give point-and-shoot photographers better mobile editing tools.
There’s no reason Apple couldn’t extend the concept to let you, for example, select a dull sky and replace it with a more dramatic one. That kind of ‘Sky Replacement’ feature has recently come to Photoshop and other AI-powered desktop photo editors, and today’s smartphones certainly have the processing power to pull it off.
Of course, Adobe won’t sit back and let Apple and Google eat its editing lunch, even if Apple appears to be attacking from the flank. By building these technologies into essential features like the new iOS 16 lock screen, Apple makes them part not just of a stock iPhone app, but of the core operating system. That’s a problem for Adobe, but good news for anyone who doesn’t want to learn, or pay for, Photoshop.