Google and Samsung just launched AI features that Apple couldn’t do with Siri


Google has just announced that Gemini will soon be able to perform certain multistep actions on your phone, like ordering food or hailing a car, starting first with the Pixel 10, Pixel 10 Pro, and the recently announced Samsung Galaxy S26 phones. This all sounds a bit like the features Apple announced for Siri at the 2024 Worldwide Developers Conference — before Apple delayed those planned features in March 2025, and which still haven't shipped.

On stage, Google's president of Android, Sameer Samat, showed a demo of how Gemini's new agentic features will help people juggle a pizza dinner order from a busy family group chat. Samat asked Gemini to look at the chat thread, figure out what to order, and then place the order through a delivery app. Onscreen — in a pre-recorded video, not a live demo — you can see Gemini working out what everyone wants from the group chat and showing it in a window. The user then asks by voice, naming a specific pizzeria, for Gemini to fulfill that order. Gemini taps through Grubhub to build the order, all of it visible on the screen. When the order is ready, Gemini sends an alert so the user can review it and press the submit button themselves.

Setting aside whether doing this yourself in the Grubhub app is really all that complicated (or even calling the pizzeria to talk it through with a human), this is a potentially big moment for agentic AI. Google just recently added auto-browsing abilities to Gemini in Chrome, and being able to do something similar right inside Android seems like a logical next step; Google clearly wants Gemini to be thought of as a support agent or productivity partner, rather than just a chatbot or a series of AI models.

Assuming Gemini's agentic features actually launch as "soon" as Google is promising, and that Apple doesn't pull a rabbit out of its hat, Google will beat Apple to the punch on some of its most impressive Apple Intelligence demos — also shown only in pre-recorded videos — from that WWDC 2024 show. One feature Apple showed would help Siri understand what's on your screen and take action on it, meaning you could ask Siri to add an address from a message thread to the contact card of the person you're texting with. Apple also demonstrated how Siri would be able to perform actions for you inside and across apps. The company said Siri would understand your personal context, too, meaning you could ask when your mom's flight is landing and the assistant would pull the information from an email and show it to you.

Almost two years later, none of those features is available. When Apple announced the delay, the company also pulled an ad that had shown them off. Based on reporting from Bloomberg, some of the features may not arrive until iOS 27.

Of course, there are still many questions about Gemini's new capabilities. They have to actually ship. We'll have to try them to see if they're as useful and functional as advertised — Google is calling this initial launch a "beta," so there may be some glitches. And we don't know how many developers will actually let Gemini browse their apps on behalf of users, which The Verge editor-in-chief Nilay Patel likes to call the DoorDash problem. (Google says Gemini will work in "select rideshare and food apps.")

But it seems Google has leapfrogged Apple in a big way, and Apple now has a lot more catching up to do.
