What Is Apple Intelligence, When Is It Coming And Who Will Get It?

Apple unveiled its new AI-driven platform, Apple Intelligence, at WWDC 2024.

Apple’s approach to AI is distinctly its own, aiming for seamless integration into its existing ecosystem rather than standalone AI-driven features. The tech giant revealed that its development team had been quietly working on what the company describes as “a very Apple approach to artificial intelligence.”

The foundation of Apple Intelligence is a large language model (LLM) that powers multiple AI-enhanced features. These innovations were highlighted at Apple’s iPhone 16 event on September 9, where the company showcased AI-powered functionalities, such as translation on the Apple Watch Series 10, visual search on iPhones, and an enhanced Siri experience.

The system will first launch in the U.S. in beta form this fall, with a broader international release scheduled for 2025, including additional language support.

Apple’s tagline for the platform is “AI for the rest of us.” Apple Intelligence leverages well-established AI capabilities, such as text and image generation, to improve existing services. These include Writing Tools, an AI-powered feature that lets users summarize long texts, proofread documents, and compose messages using content and tone prompts across apps like Mail, Messages, and Pages.

Additionally, image generation will play a key role, offering new tools like Genmojis—customizable emojis created using prompts within Apple’s ecosystem. Another feature, Image Playground, will enable users to generate visual content for use in Messages, Keynote, or on social media.

Apple Intelligence also signifies a long-overdue overhaul of Siri, which, despite being an early entrant in the digital assistant space, had fallen behind its competitors. With the new system, Siri is integrated more deeply into Apple’s operating systems: no longer confined to its familiar icon, it is instead signaled by a glowing light around the iPhone’s edge when in use. Siri will also be able to work across apps, for example editing a photo and then dropping it directly into a text message, without the user having to switch apps.
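Apple hasn’t detailed the plumbing behind these cross-app actions, but the natural way for an app to expose an action to the system on Apple platforms is the App Intents framework. The Swift sketch below is a minimal, hypothetical illustration of such an intent; the type name ApplyFilterIntent and its parameter are invented for this example and are not drawn from Apple’s announcement.

```swift
import AppIntents

// Hypothetical intent exposing a photo-editing action to the system.
// "ApplyFilterIntent" and "filterName" are invented names for illustration;
// the article does not describe Apple's actual intent definitions.
struct ApplyFilterIntent: AppIntent {
    static var title: LocalizedStringResource = "Apply Filter to Photo"

    @Parameter(title: "Filter Name")
    var filterName: String

    func perform() async throws -> some IntentResult & ReturnsValue<String> {
        // A real app would edit the currently selected photo here and return
        // a reference the system could drop straight into a Messages draft.
        return .result(value: "Applied \(filterName)")
    }
}
```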

This enhanced functionality is built into Apple’s latest hardware, including the iPhone 15 Pro and Pro Max, and iPads and Macs with M1 or later chips, ensuring that many of Apple’s flagship products can handle AI-driven tasks. The standard iPhone 15 models, however, will not be compatible due to hardware limitations. The June preview appeared intended, at least in part, to show that Apple isn’t lagging behind on AI ahead of its fall hardware launches.

Apple has also taken a focused approach to model training. Unlike general-purpose platforms such as OpenAI’s GPT models and Google Gemini, Apple Intelligence relies on smaller, in-house models trained for specific tasks, like composing an email. This allows many requests to be handled directly on the device, reducing the resources required. More complex tasks are handed off to Apple’s new Private Cloud Compute, which runs on Apple Silicon servers and, Apple promises, maintains user privacy whether a request is processed locally or in the cloud.
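Apple hasn’t published the logic that decides where a given request runs, so the short Swift sketch below is purely illustrative: the types, the complexity estimate, and the threshold are all invented to show the shape of an on-device-first split with a Private Cloud Compute fallback.

```swift
import Foundation

// Hypothetical sketch of the on-device vs. cloud split described above.
// Apple has not published this logic; the types, complexity estimate, and
// threshold are invented purely to illustrate the idea.
enum ExecutionTarget {
    case onDevice       // small, task-specific model running locally
    case privateCloud   // Apple's Private Cloud Compute on Apple Silicon servers
}

struct AIRequest {
    let prompt: String
    let estimatedComplexity: Int   // e.g. a rough token or reasoning-depth estimate
}

func route(_ request: AIRequest, onDeviceLimit: Int = 1_000) -> ExecutionTarget {
    // Simple tasks (drafting an email, summarizing a note) stay on the device;
    // anything heavier is handed off to Private Cloud Compute.
    request.estimatedComplexity <= onDeviceLimit ? .onDevice : .privateCloud
}

let reply = AIRequest(prompt: "Draft a short reply to this email", estimatedComplexity: 300)
print(route(reply))   // onDevice
```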

In addition, Apple is partnering with outside AI services, beginning with OpenAI’s ChatGPT and potentially extending to others such as Google Gemini, to handle requests that fall outside the scope of Apple Intelligence.

The partnership is, in effect, an acknowledgment of the limitations of a small-model system: some requests are simply better served by a larger, general-purpose model.
