Apple Will Reportedly Introduce AI-Generated Emojis, LLM-Powered Siri with iOS 18
Apple Inc. is reportedly gearing up to integrate new artificial intelligence models into various components of iOS, including the built-in emoji library and Siri.

According to Bloomberg, which cited sources familiar with the matter, these enhancements are expected to be unveiled at Apple’s WWDC 2024 developer event next month. The upgrades will likely debut alongside other new AI capabilities that have been leaked over the past few weeks.

Traditionally, Apple has expanded the emoji catalog in iOS with several dozen additions each year. At WWDC, the company is anticipated to introduce an AI tool enabling users to create custom emojis using natural language instructions. It remains unclear which apps will support this feature.

In the future, users might also gain the ability to animate the emojis they create with the AI tool. In February, Apple researchers described a machine learning application called Keyframer, which can animate static images based on user prompts. The application uses large language models to transform user instructions into motion designs.

In addition to the emoji library upgrade, Apple is expected to improve the way iOS displays app icons. Users will reportedly be able to change icon colors and arrange them more freely on their home screens, moving away from the current fixed grid layout.

Siri is also expected to see significant improvements in this operating system update. Apple plans to make the AI assistant more useful for Apple Watch users and enhance it with internally developed large language models to generate more natural-sounding responses. Amazon.com Inc. is reportedly preparing a comparable update for its Alexa assistant, which would likewise rely on custom LLMs.

Details about the neural networks powering the AI features in iOS 18 are still limited. Bloomberg reports that the most hardware-intensive AI tasks will be offloaded to cloud-based models, while less demanding computations will be performed on the user’s device.

In April, Apple open-sourced a collection of small language models designed for devices with limited computing capacity. Alongside this, it released a tool for adapting language models to run on iPhones. Technologies from this project may help power some of the AI enhancements in iOS 18.

Some of the operating system update’s enhancements will reportedly use AI models from OpenAI. According to Bloomberg, Apple will officially announce its long-rumored partnership with the LLM developer at WWDC. A report from last month suggested that the two companies might collaborate on a chatbot service and new search tools for iOS.
