Can Google Save Apple AI? Gemini will power a new, personalized Siri

Alice Batters Picaro/ZDNET


ZDNET Highlights

  • Google’s Gemini will power Apple’s Siri on the backend.
  • The goal is to drive a more advanced and personalized Siri.
  • However, Siri still needs to be more reliable and less prone to errors.

With the pressure on Apple to finally get Siri right, the company is turning to its archrival for help.

In a joint statement released on Monday, Apple and Google announced a multi-year partnership in which Google’s Gemini and cloud technology will power Apple Intelligence features, most notably a more advanced (and more personalized) Siri, which is expected to launch this spring.

In the statement, Apple described Google’s AI as the most capable platform for the Apple Foundation Model, which lets you access and run large language models (LLMs) directly on your device.

Also: How to remove Copilot AI from Windows 11 today

“The next generation of the Apple Foundation Model will be based on Google’s Gemini model and cloud technology,” the companies said. The idea is to use Gemini’s advanced AI capabilities on the backend while relying on Apple’s own on-device models and Private Cloud Compute service to ensure that your conversations remain safe and secure.

How it works

Siri will rely on Google’s advanced LLM for more natural and fluid conversations. LLMs are trained on vast amounts of text to learn how to process language and respond in a more human-like way. On the AI front, Apple has struggled to fully develop its own advanced AI and LLMs, forcing it to rely on tools from other companies, notably OpenAI’s ChatGPT.

With Gemini on the backend, Siri should be able to act like an advanced chatbot. Among the specific features in store, App Intents will enable Siri to work with both Apple’s own apps and third-party apps, while Personal Context Knowledge will allow Siri to take actions based on its awareness of the data and preferences on your device.

Also: Claude Cowork now automates complex tasks for you – at your own risk

One skill, called On-Screen Awareness, will enable Siri to “see” what’s on the screen and interact with it based on your request. Another, World Knowledge Answers, will have Siri act more like a search engine, scouring the web to answer your question or request.

When to expect the new Siri

Reports about a new and improved assistant known as LLM Siri started emerging in late 2024. At the time, Apple watcher Mark Gurman said that this upcoming version was already being tested internally on iPhone, iPad, and Mac as a standalone app.

The goal was to launch the new Siri sometime in the spring of 2026. So far, that timeline appears to be on track. The latest reports suggest that the new Siri will launch with iOS 26.4 sometime in March.

Also: How I used ChatGPT’s $20 Plus plan to fix a nightmare bug fast

The new features and skills on Siri’s to-do list all sound interesting and potentially useful. But the main question is whether the Gemini integration will help Apple’s error-prone and troubled assistant escape its problematic past.

Too often, Siri falls short of expectations: it fails to respond to requests, misunderstands what the user says, or gives incorrect answers. These problems are especially annoying when you’re trying to get driving directions in the car and Siri keeps feeding you the wrong information.

Those of us who have been using Siri for years just want a chatbot that works. Apple has made this promise before and failed to deliver. With Gemini on the backend, we’ll see if Siri finally gets that much-needed improvement.
