Apple Intelligence expands in iOS 18.2 developer beta, adding Genmoji, Visual Intelligence and ChatGPT

The Apple Intelligence rollout has been slow, staggered and steady since the company first unveiled its take on AI at WWDC this year. It continues today with the release of the latest developer betas for iOS 18, iPadOS 18 and macOS Sequoia. The updates in iOS 18.2, iPadOS 18.2 and macOS Sequoia 15.2 bring long-awaited features like Genmoji, Image Playground, Visual Intelligence and ChatGPT integration for those running the preview software, as well as Image Wand for iPads and expanded Writing Tools.

This follows the announcement that iOS 18.1 will be available as a stable release to the public next week, bringing things like Writing Tools, notification summaries and Apple’s hearing test to the masses.

That release will be the first chance for people who haven’t opted into beta software to check out Apple Intelligence, which the company has widely touted as the headline feature for the devices it launched this year. The iPhone 16 series, for example, was billed as a line of phones designed for Apple Intelligence, though it launched without those features.

Now that the next set of tools is ready for developers to test, it seems like we’re weeks away from them reaching the public. For those already on the developer beta, the update will land automatically. As always, a word of caution: If you’re not already familiar, beta software is meant for testing new features and checking for compatibility problems. It can be buggy, so always back up your data before installing previews. In this case, you’ll also need an Apple developer account to get access.

Today’s update brings Genmoji, which lets you create custom emoji from your keyboard. You’ll go to the emoji keyboard, tap the Genmoji button next to the description or search input field, then enter what you want to create. Apple Intelligence will generate a few options, which you can swipe through and select one to send. You’ll be able to use them as tapback reactions to other people’s messages too. Plus, you can make Genmoji based on pictures of your friends, creating more-accurate Memoji of them. Since these are all presented in emoji style, there won’t be the risk of mistaking them for real pictures.

Apple is also releasing a Genmoji API today so third-party messaging apps can read and render Genmoji, and folks you text on WhatsApp or Telegram can see your hot new gym rat emoji.
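For developers curious what adopting that looks like, Genmoji travel as adaptive image glyphs embedded in attributed text, and apps opt their text fields in to accept them. Below is a minimal Swift sketch of that opt-in and of reading back whatever Genmoji a user inserted; the view controller and method names are illustrative, while `supportsAdaptiveImageGlyph` and `NSAdaptiveImageGlyph` are the iOS 18 pieces Apple has pointed developers to.

```swift
import UIKit

// Rough sketch (not from the article): a messaging app's compose field
// accepting Genmoji, which iOS 18 delivers as adaptive image glyphs
// embedded in attributed text.
final class ComposeViewController: UIViewController {
    private let textView = UITextView()

    override func viewDidLoad() {
        super.viewDidLoad()
        textView.frame = view.bounds
        // Opt in so the emoji keyboard's Genmoji option can insert glyphs here.
        textView.supportsAdaptiveImageGlyph = true
        view.addSubview(textView)
    }

    // Collect any Genmoji the user typed, e.g. to send their image data
    // alongside the plain text of the message.
    func insertedGenmoji() -> [NSAdaptiveImageGlyph] {
        var glyphs: [NSAdaptiveImageGlyph] = []
        let storage = textView.textStorage
        storage.enumerateAttribute(.adaptiveImageGlyph,
                                   in: NSRange(location: 0, length: storage.length),
                                   options: []) { value, _, _ in
            if let glyph = value as? NSAdaptiveImageGlyph {
                glyphs.append(glyph)
            }
        }
        return glyphs
    }
}
```

On the receiving side, the same attribute can be rendered by any text view that has opted in, which is how a Genmoji sent from Messages could show up intact in a third-party chat app.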

Other previously announced features like Image Playground and Image Wand are also available today. The former is both a standalone app and something you can access from the Messages app via the Plus button. If you go through Messages, the system will quickly generate some suggestions based on your conversations. You can also type descriptions or select photos from your gallery as a reference, and the system will serve up an image, which you can then tweak. To prevent confusion, only two art styles are available: Animation and Illustration. You won’t be able to render photorealistic pictures of people.

Image Wand will also be arriving today as an update to the Apple Pencil tool palette, helping to turn your cruddy sketches into more-polished works of art.

As announced at WWDC, Apple is bringing ChatGPT to Siri and Writing Tools, and whenever a request might be better served by OpenAI’s tools, the system will suggest handing it off. For example, if you ask Siri to generate an itinerary, a workout routine or even a meal plan, the assistant might say it needs to use ChatGPT to do so and ask for your permission. You can choose to have the system ask you each time it hands a request to ChatGPT, or to surface these requests less often.

It’s worth reiterating that you don’t need a ChatGPT account to use these tools, and Apple has its own agreement with OpenAI so that when you use the latter’s services, data such as your IP address won’t be stored or used to train models. However, if you do connect your ChatGPT account, your content will be covered by OpenAI’s policies.

Elsewhere, Apple Intelligence will also let you compose with ChatGPT within Writing Tools, which is where you’ll find things like Rewrite, Summarize and Proofread. Writing Tools is also getting a new option in this developer beta called “Describe your change.” This is basically a command bar that lets you tell Apple exactly what you want done to your writing. “Make it sound more enthusiastic,” for example, or “Check this for grammar errors.” Basically, it’ll make getting the AI to edit your work a bit easier, since you won’t have to go to the individual sections for Proofread or Summarize. You can also get it to do things like “Turn this into a poem.”

Finally, if you have an iPhone 16 or iPhone 16 Pro and are running the developer beta, you’ll be able to try out Visual Intelligence. That lets you point your camera at things around you and get information about what you see, whether that’s a math problem in your textbook or the menu of a restaurant you pass on your commute. It can tap third-party services like Google and ChatGPT, too.

Outside of the iPhone 16 series, you’ll need a compatible device to check out any Apple Intelligence features. That means an iPhone 15 Pro or newer, or an M-series iPad or Mac.

By John Routledge

Founder and owner of Technoshia.com - I'm an avid tech junkie, a lover of new gadgets and home automation. You will often find me reading, writing, and learning about new technologies. I've been featured in many leading technology magazines where I've written about my favorite topics.