Hello Apple Developer Community,
I’m working on integrating Siri into my React Native app using native iOS code and bridging to React Native. I’ve followed the necessary steps to set up Siri support, including:
Adding the Siri capability.
Adding Siri usage descriptions in Info.plist.
Using AppIntent and AppShortcutsProvider to define shortcuts.
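For reference, here is roughly how the shortcut is defined; a minimal sketch with placeholder names (OpenMyAppIntent and the phrase text are illustrative, not my real identifiers):
import AppIntents

// Sketch only: intent name, title, and phrase are placeholders.
struct OpenMyAppIntent: AppIntent {
    static var title: LocalizedStringResource = "Open MyApp"
    static var openAppWhenRun: Bool = true

    func perform() async throws -> some IntentResult {
        return .result()
    }
}

struct MyAppShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: OpenMyAppIntent(),
            phrases: ["Open \(.applicationName)"],
            shortTitle: "Open MyApp",
            systemImageName: "app"
        )
    }
}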
However, I’m facing the following issues:
Siri Prompts for Confirmation
When a user says a phrase, Siri asks, "Turn on 'MyApp' shortcuts with Siri?" instead of directly recognizing the phrase. Is this expected behavior? If so, how can I reduce friction for users and make the experience more seamless?
Inconsistent Behavior for Existing Users
For users updating to a version with Siri support:
When the app is closed, Siri says, "MyApp hasn't added support for that with Siri."
When the app is open, Siri prompts, "Turn on shortcut for MyApp?" and everything else works fine.
Why does Siri not recognize the shortcut when the app is closed, even though the shortcut is defined in AppShortcutsProvider? How can I ensure that Siri recognizes the shortcut regardless of whether the app is open or closed? Other than using AppIntent and AppShortcutsProvider, should I try donating shortcuts (would that help for the existing-user update case)? Please help me with this.
SiriKit
Handle requests for your app's services from users using Siri or Maps.
Posts under SiriKit tag
func handle(intent: INSendPaymentIntent, completion: @escaping (INSendPaymentIntentResponse) -> Void) {
    if let _ = intent.payee, let currencyAmount = intent.currencyAmount {
        // Hand the amount to the app and ask the system to launch it.
        let userActivity = NSUserActivity(activityType: "com.rapipay.nye.test.failure")
        userActivity.userInfo = ["amount": Int(truncating: currencyAmount.amount!)]
        completion(INSendPaymentIntentResponse(code: .failureRequiringAppLaunch, userActivity: userActivity))
        return
    }
}
func application(_ application: UIApplication, continue userActivity: NSUserActivity, restorationHandler: @escaping ([any UIUserActivityRestoring]?) -> Void) -> Bool {
    print("ddddd")
    return true
}
This code works perfectly on the simulator; even if I pass userActivity as nil, application(_:continue:restorationHandler:) is called. On a real device, however, it is never called.
Siri works: it asks for the payee name and amount, the handler is called, and the app opens, but the intent/userActivity is not passed to the app.
I'm trying to add Siri support to my app for sending voice messages. I've implemented INSendMessageIntentHandling in my main app target.
It looks like it's getting as far as recording the voice message and passing my intent handler an INSendMessageIntent with an audio attachment, but I'm not able to read the attachment file.
func handle(
    intent: INSendMessageIntent,
    completion: @escaping (INSendMessageIntentResponse) -> Void
) {
    if let attachment = intent.attachments?.first,
       let audioFile = attachment.audioMessageFile,
       let fileURL = audioFile.fileURL
    {
        // This branch runs
        // fileURL is "file:///var/mobile/tmp/SiriMessages/89F738F7-6092-439A-B4FA-2DD9A99F0EED.caf"
        let result = processMessageAudio(url: fileURL)
        completion(result)
        return
    }
    // This line isn't reached
    completion(.init(code: .failure, userActivity: nil))
}
private func processMessageAudio(url: URL) -> INSendMessageIntentResponse {
    var fileRef: ExtAudioFileRef?

    if url.startAccessingSecurityScopedResource() {
        logDebug("File access allowed")
    } else {
        // This branch runs
        logDebug("File access not allowed")
    }
    defer {
        url.stopAccessingSecurityScopedResource()
    }

    let openStatus = ExtAudioFileOpenURL(url as CFURL, &fileRef)
    // openStatus is -54 (kAudio_FilePermissionError)
    return INSendMessageIntentResponse(code: .failure, userActivity: nil)
}
I'm not sure what I'm missing. It looks like there should be an audio file, and Siri shows a preview of the audio for confirmation.
Hi everyone,
I’m an AI engineer working on autonomous AI agents and exploring ways to integrate them into the Apple ecosystem, especially via Siri and Apple Intelligence.
I was impressed by Apple’s integration of ChatGPT and its privacy-first design, but I’m curious to know:
• Are there plans to support third-party LLMs?
• Could Siri or Apple Intelligence call external AI agents or allow extensions to plug in alternative models for reasoning, scheduling, or proactive suggestions?
I’m particularly interested in building event-driven, voice-triggered workflows where Apple Intelligence could act as a front-end for more complex autonomous systems (possibly local or cloud-based).
This kind of extensibility would open up incredible opportunities for personalized, privacy-friendly use cases — while aligning with Apple’s system architecture.
Is anything like this on the roadmap? Or is there a suggested way to prototype such integrations today?
Thanks in advance for any thoughts or pointers!
Topic: Machine Learning & AI
SubTopic: Apple Intelligence
Tags: SiriKit, Machine Learning, Apple Intelligence
If you add a (SiriKit / Widget) intent definition file to an Xcode project and then localize it into another language, the iOS app builds only until you close the project. As soon as you open the project again, the next build fails with the error:
Unexpected duplicate tasks
A workaround for this bug is to convert the folder containing the intent file into a group in Xcode. After that, everything works without problems.
Steps to reproduce:
Create a new iOS project
Add a localization to the project (German for example)
Add a SiriKit Intent Definition File
Localize the SiriKit Intent Definition File
Build the project (should work without errors)
Close the project
Open the project again
Build the project again
Expected result:
The project builds without problems
Current result:
The project doesn’t build and returns the error: Unexpected duplicate tasks
Is this a known problem? Is there a way to solve it without switching to Xcode groups (instead of folders)?
In my app, when invoking a Shortcut via Siri, the
application(_:continueUserActivity:restorationHandler:)
method in AppDelegate is called twice.
When I debug, both NSUserActivity objects are identical.
However, when I run the same Shortcut by tapping it manually, the method is only called once as expected.
Has anyone experienced this issue? How can I prevent Siri Shortcuts from delivering the same NSUserActivity twice?
When my AppShortcut phrase is:
"Go (.$direction) with (.applicationName)"
Then everything works correctly, the AppIntent correctly receives the parameter. But when my phrase is:
"What is my game (.$direction) with (.applicationName)"
The an alert dialog pops up saying:
"Hey siri what is my game tomorrow with {app name}
Do you want me to use ChatGPT to answer that?"
The phrase is obviously heard correctly, and it's exactly what I've specified in the AppShortcut. Why isn't it being sent to my AppIntent?
import Foundation
import AppIntents

@available(iOS 17.0, *)
enum Direction: String, CaseIterable, AppEnum {
    case today, yesterday, tomorrow, next

    static var typeDisplayRepresentation: TypeDisplayRepresentation {
        TypeDisplayRepresentation(name: "Direction")
    }

    static var caseDisplayRepresentations: [Direction: DisplayRepresentation] = [
        .today: DisplayRepresentation(title: "today", synonyms: []),
        .yesterday: DisplayRepresentation(title: "yesterday", synonyms: []),
        .tomorrow: DisplayRepresentation(title: "tomorrow", synonyms: []),
        .next: DisplayRepresentation(title: "next", synonyms: [])
    ]
}

@available(iOS 17.0, *)
struct MoveItemIntent: AppIntent {
    static var title: LocalizedStringResource = "Move Item"

    @Parameter(title: "Direction")
    var direction: Direction

    func perform() async throws -> some IntentResult {
        // Logic to move item in the specified direction
        print("Moving item \(direction)")
        return .result()
    }
}

@available(iOS 17.0, *)
final class MyShortcuts: AppShortcutsProvider {
    static let shortcutTileColor = ShortcutTileColor.navy

    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: MoveItemIntent(),
            phrases: [
                "Go \(\.$direction) with \(.applicationName)"
                // "What is my game \(\.$direction) with \(.applicationName)"
            ],
            shortTitle: "Test of direction parameter",
            systemImageName: "soccerball"
        )
    }
}
Topic: App & System Services
SubTopic: Automation & Scripting
Tags: Siri and Voice, SiriKit, App Intents
It would be good if we could talk to Siri with emojis as well. I'm pretty sure emoji is her native language.
Topic: Machine Learning & AI
SubTopic: Apple Intelligence
Tags: Siri Event Suggestions Markup, SiriKit
Hi,
I'm building an aftermarket solution to enable Apple Maps to support EV routing for any EV.
I am going through the documentation and have found some gaps. Does anyone know how the following properties work?
INGetCarPowerLevelStatusIntentResponse - consumptionFormulaArguments
INGetCarPowerLevelStatusIntentResponse - chargingFormulaArguments
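For context, here is a minimal sketch of where these properties would be set in an INGetCarPowerLevelStatusIntent handler. The dictionary keys and values below are placeholders, since the expected contents of these dictionaries are exactly what is unclear:
import Intents

// Sketch only: both formula-argument properties are [String: Any]? dictionaries,
// but the keys below are placeholders, not documented constants.
class CarPowerLevelStatusHandler: NSObject, INGetCarPowerLevelStatusIntentHandling {
    func handle(intent: INGetCarPowerLevelStatusIntent,
                completion: @escaping (INGetCarPowerLevelStatusIntentResponse) -> Void) {
        let response = INGetCarPowerLevelStatusIntentResponse(code: .success, userActivity: nil)
        response.consumptionFormulaArguments = ["placeholder_argument": 1.0]   // placeholder key
        response.chargingFormulaArguments = ["placeholder_argument": 1.0]      // placeholder key
        completion(response)
    }
}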
Is there a working example that anyone has seen?
Many thanks
Topic: App & System Services
SubTopic: Maps & Location
Tags: CarPlay, SiriKit, Maps and Location, App Intents
When I create a list of 6 options for disambiguation, option 5 is ignored by Siri. For example, I have the following code to resolve one of my parameters for a custom intent. When this code executes, Siri reads off the Disambiguation Prompt from the Intents Definition and then displays the options shown below. However, if you respond to the prompt with "No" or "No, change it", the response is completely ignored; the resolve method is not called at all. But if you respond with "Yes, continue" or "Yes, I'm done", the resolve method is called and I can process the chosen option.
Does this seem like a bug, or am I missing something? Does it have anything to do with the Intent Definition for this parameter having "Paginate every ___ items" set to 6?
NSMutableArray *options;
options = [[NSMutableArray alloc] init];
NSString *anOption = [NSString stringWithFormat:@"%@ myList", intent.partsListName]; // option 1
[options addObject:anOption];
anOption = @"option 1";
[options addObject:anOption];
anOption = @"option 2";
[options addObject:anOption];
anOption = @"option 3";
[options addObject:anOption];
anOption = [NSString stringWithFormat:@"Yes, continue"]; // option 4
[options addObject:anOption];
// Option 5
anOption = [NSString stringWithFormat:@"No, change it"];
[options addObject:anOption];
// Option 6
anOption = [NSString stringWithFormat:@"Yes, I'm done"];
[options addObject:anOption];
completion([INStringResolutionResult disambiguationWithStringsToDisambiguate:[options copy]]);
In this thread, I asked about adding parameters to App Shortcuts. The conclusion that I've drawn so far is that for App Shortcuts, there cannot be any parameters in the prompt, otherwise the system cannot find the AppShortcutsProvider. While this is fine for Shortcuts and non-voice interaction, I'd like to find a way to add parameters to the prompt. Here is the scenario:
My app controls a device that displays some content on "pages." The pages are defined in an AppEnum, which I use for Shortcuts integration via App Intents. The App Intent functions as expected, and is able to change the page based on the user selection within Shortcuts (or prompted if using the App Shortcut). What I'd like to do is allow the user to be able to say "Siri, open [page] with [app name]."
So far, the closest I've come to understanding how this works is through the .intentdefinition file you can create (and SiriKit in general). However, the part that really confused me there is a button in the file editor that says "Convert to App Intent." To me, this means I should be able to use the App Intent I've already authored and hook that into Siri, rather than writing an entirely new function/code block that does exactly the same thing. Ideally, that's what I want to do.
What's the right way to define this behavior?
p.s. If I had to pick an intent schema in the context of AssistantSchemas, I'd say it's closest to the "Open File" one, if that helps. I'd ultimately like to make the "pages" user-customizable so in the long run, that would be what I'd do.
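A minimal sketch of the pattern in question, assuming a hypothetical Page AppEnum and OpenPageIntent (all names below are placeholders, not the actual types in my app):
import AppIntents

// Sketch only: placeholder types illustrating a parameterized App Shortcut phrase.
enum Page: String, CaseIterable, AppEnum {
    case home, settings, stats

    static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "Page")
    static var caseDisplayRepresentations: [Page: DisplayRepresentation] = [
        .home: DisplayRepresentation(title: "home"),
        .settings: DisplayRepresentation(title: "settings"),
        .stats: DisplayRepresentation(title: "stats")
    ]
}

struct OpenPageIntent: AppIntent {
    static var title: LocalizedStringResource = "Open Page"

    @Parameter(title: "Page")
    var page: Page

    func perform() async throws -> some IntentResult {
        // Tell the device to display the requested page (placeholder for real logic).
        return .result()
    }
}

struct MyAppShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: OpenPageIntent(),
            phrases: ["Open \(\.$page) with \(.applicationName)"],
            shortTitle: "Open Page",
            systemImageName: "doc"
        )
    }
}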
Topic: Machine Learning & AI
SubTopic: Apple Intelligence
Tags: Siri and Voice, SiriKit, App Intents, Apple Intelligence
I have chat/search functionality in my app, and I want to integrate Siri so a user can say something like "Hey Siri! Ask myApp to get latest news" and my search functionality is invoked with "get latest news". I see iOS apps like ChatGPT and YouTube have already achieved this.
I am able to invoke the intent with a static phrase that expects a parameter, and the user can provide the value when prompted after requestValueDialog. But that is a two-step process for the end user; I want to achieve it in a single step.
struct CombinedSiriShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        return [
            AppShortcut(
                intent: ShowSpecificNewsArticleIntent(),
                phrases: [
                    "Ask \(.applicationName) to run a query:",
                ],
                shortTitle: "Specific News Article",
                systemImageName: "doc.text.fill"
            ),
            AppShortcut(
                intent: TestQuery(),
                phrases: [
                    "Ask \(.applicationName) to \(\.$query)",
                ],
                shortTitle: "Test intent",
                systemImageName: "doc.text.fill"
            ),
        ]
    }
}
struct ShowSpecificNewsArticleIntent: AppIntent {
    static var title: LocalizedStringResource = "Show Specific News Article"
    static var description = IntentDescription(
        "Provides details about a specific news article based on its title."
    )

    @Parameter(title: "Query")
    var query: String

    @MainActor
    func perform() async throws -> some IntentResult & ProvidesDialog {
        print("in show specific intent")
        print(query)
        return .result(dialog: "view more about: \(query)")
    }
}
Topic: App & System Services
SubTopic: Automation & Scripting
Tags: Siri and Voice, SiriKit, App Intents
I need to obtain the user's location in my Siri Intents extension, so I call:
override init() {
    super.init()
    self.locationManager = CLLocationManager()
    self.locationManager.delegate = self
    self.locationManager.startUpdatingLocation()
    self.locationManager.requestWhenInUseAuthorization()
}
Still, neither
public func locationManagerDidChangeAuthorization(_ manager: CLLocationManager)
nor
public func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation])
is ever called, notwithstanding the correct entry in Info.plist, the inclusion of the framework, and the delegate declaration:
class IntentHandler: INExtension, INSendMessageIntentHandling, CLLocationManagerDelegate, UISceneDelegate
Is there any problem with CLLocationManager in intents? That would be a big problem, as there is no other way to share this information with the main app!
I am building a CarPlay navigation app and I would like it to be as hands free as possible.
I need a better understanding of how Siri can work with CarPlay and if the direction I need to go is using Intents or App Shortcuts.
My goal is to be able to have the user speak to Siri and do things like "open settings" or "zoom in map" and then call a func in my app to do what the user is asking.
Does CarPlay support this? Do I need to use App Intents or App Shortcuts or something else?
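For illustration, the kind of wiring I have in mind is roughly this; a minimal sketch assuming a hypothetical MapController type in my app (the intent name and phrase are placeholders):
import AppIntents

// Sketch only: an App Intent that forwards a voice request to app code.
// MapController is a hypothetical type standing in for the real navigation logic.
struct ZoomInMapIntent: AppIntent {
    static var title: LocalizedStringResource = "Zoom In Map"

    @MainActor
    func perform() async throws -> some IntentResult {
        MapController.shared.zoomIn()   // placeholder call into the app
        return .result()
    }
}

struct NavigationShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: ZoomInMapIntent(),
            phrases: ["Zoom in map with \(.applicationName)"],
            shortTitle: "Zoom In Map",
            systemImageName: "map"
        )
    }
}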
I was going to add a shortcut to an app via INIntent. I followed the WWDC session: developer.apple.com/videos/play/wwdc2021/10232/?time=986
Steps:
Created a .intentdefinition file and created an intent.
Added the intent to .intentdefinition and compiled the app.
Imported the header file for the custom intent (MyIntentname.h) in the AppDelegate.
Have the AppDelegate conform to the protocol created in the generated code.
Implement: -application:handlerForIntent: and return self (the app delegate)
Run the app.
Open the Shortcuts app and search for the 'shortcut' (according to the WWDC video linked above it should show up in the actions list).
Doesn't show up in the list.
I tried moving the built application out of Debug into my Applications folder to see if that would help the Shortcuts app find it, but it didn't.
Am I missing a step/doing something wrong?
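For reference, step 5 corresponds roughly to this (the intent class and handling protocol names are placeholders for whatever the generated code is actually called):
// In the AppDelegate (sketch only; MyIntentNameIntent / MyIntentNameIntentHandling
// are placeholder names for the generated intent class and its handling protocol):
func application(_ application: UIApplication, handlerFor intent: INIntent) -> Any? {
    if intent is MyIntentNameIntent {
        return self   // the AppDelegate conforms to MyIntentNameIntentHandling
    }
    return nil
}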
Topic: App & System Services
SubTopic: Automation & Scripting
Tags: Shortcuts, SiriKit, Intents, App Intents
Hello everyone,
I'm currently working on an App Intent for my iOS app, and I’ve encountered a frustrating issue related to how Siri prompts for a category selection. Here’s an overview of what I’m dealing with:
extension Category: AppEntity, @unchecked Sendable {
    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(name)")
    }

    static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "Category")

    typealias DefaultQueryType = ShortcutsCategoryQuery
    static var defaultQuery: ShortcutsCategoryQuery = ShortcutsCategoryQuery()
}

struct ShortcutsCategoryQuery: EntityQuery {
    func entities(for identifiers: [String]) async throws -> [Category] {
        let context = await ModelContext(sharedModelContainer)
        let categories = try CategoryDataProvider(context: context).getItems()
        return categories.filter { identifiers.contains($0.id) }
    }

    func entities(matching string: String) async throws -> [Category] {
        return try await suggestedEntities()
    }

    func suggestedEntities() async throws -> [Category] {
        let context = await ModelContext(sharedModelContainer)
        do {
            let categories = try CategoryDataProvider(context: context).getItems()
            if categories.isEmpty {
                print("No categories found.")
            }
            return categories.map { category in
                Category(
                    id: category.id,
                    name: category.name,
                    stringSymbol: category.stringSymbol,
                    symbol: category.symbol,
                    stringColor: category.stringColor,
                    color: category.color
                )
            }
        } catch {
            print(error)
            return []
        }
    }
}
The issue arises when I use Siri to invoke the intent. Siri correctly asks me to select a category, but it does not display any options unless I say something Siri already recognizes, like "Casa" (House) or "Teste" (Test) in Portuguese. Only then does it show the list of available categories.
I would like the categories to appear immediately when Siri asks for a selection. I've already tried refining the ShortcutsCategoryQuery and debugging various parts of my code, but nothing fixes this behavior.
I have an app that's capable of playing podcasts via Siri requests, e.g. "Hey Siri, play [Podcast Name]". I’m using INPlayMediaIntentHandling, that is, the SiriKit domain intents, as opposed to the newer AppIntents framework for its ability to select my app for audio playback without the need to specify the name of the app in the user's request to Siri.
This works great overall for the many podcasts I've tested the app with, with the exception of one. There's a podcast called "The Headlines", and when I test the app with the request "Hey Siri, play The Headlines", my app is never selected. Instead, Apple Podcasts begins playback of a show called "NPR News Now".
Oddly, if the Apple Podcasts app is deleted, my app will still not be selected by the system, and instead, Siri responds with "I don’t see an app for that. You’ll need to download one" with a button to open the App Store. Additionally, if I do add the app name to the request using this style of intent, Siri responds with "[App Name] hasn’t added support for that with Siri." However, I’d still like to accomplish this without requiring the app name in the Siri request.
There's nothing complex in my setup:
The target declares one supported intent, INPlayMediaIntent, with "Podcasts" selected as a supported media category.
The Siri entitlement is enabled.
My INSiriAuthorizationStatus is .authorized.
My intent handler is specified in my AppDelegate as follows:
func application(_ application: UIApplication,
                 handlerFor intent: INIntent) -> Any? {
    return IntentHandler.shared
}
My intent handler is simple:
final class IntentHandler: NSObject, INPlayMediaIntentHandling {
    static let shared = IntentHandler()

    func handle(intent: INPlayMediaIntent) async -> INPlayMediaIntentResponse {
        print("IntentHandler: processing intent: \(intent)")
        /** code to start playback based on information found in `intent` **/
    }
}
When requesting Siri to "Play The Headlines", my handler code is not called at all. For all other supported shows, the print statement executes, and playback begins as expected.
Is there any way I can get my app to be selected instead of Apple Podcasts for this request?
I donated some INPlayMediaIntents to the system and they show up in Control Center, but when I tap one of them to start background media playback, the handler does not execute the resolve method. I want to resolve some mediaItems for the suggested playlist.
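For context, here is a minimal sketch of the kind of donation being described; the media item details below are placeholders:
import Intents

// Sketch only: donating an INPlayMediaIntent so the system can surface it
// (for example in Control Center or suggestions). Identifiers and titles are placeholders.
let mediaItem = INMediaItem(identifier: "episode-123",
                            title: "Example Episode",
                            type: .podcastEpisode,
                            artwork: nil)
let intent = INPlayMediaIntent(mediaItems: [mediaItem],
                               mediaContainer: nil,
                               playShuffled: false,
                               playbackRepeatMode: .none,
                               resumePlayback: false,
                               playbackQueueLocation: .now,
                               playbackSpeed: nil,
                               mediaSearch: nil)
let interaction = INInteraction(intent: intent, response: nil)
interaction.donate { error in
    if let error = error {
        print("Donation failed: \(error)")
    }
}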
WWDC videos suggest that existing apps should continue using the old SiriKit domains, such as INPlayMediaIntent. But what about new apps for playing audio? Should we implement Siri functionality for audio playback using the old SiriKit domains, or should we create our own AppEntities and trigger them via custom AudioPlaybackIntent implementations?
Interactive widgets require an AppIntent and don’t support the old INPlayMediaIntent. To achieve the same functionality as the Music app widgets, it seems logical to adopt the new AudioPlaybackIntent. However, I can't find any information about this in the documentation.
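For reference, a custom AudioPlaybackIntent would look roughly like this; a minimal sketch, assuming a hypothetical AudioPlayerController type in the app:
import AppIntents

// Sketch only: an intent conforming to AudioPlaybackIntent, usable from an
// interactive widget button. AudioPlayerController is a hypothetical app type.
@available(iOS 16.4, *)
struct PlayLatestEpisodeIntent: AudioPlaybackIntent {
    static var title: LocalizedStringResource = "Play Latest Episode"

    func perform() async throws -> some IntentResult {
        AudioPlayerController.shared.playLatestEpisode()   // placeholder call
        return .result()
    }
}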
I have shortcuts up and running, and I have had a custom response added to my completion handler since day one.
Recently I upgraded to iOS 18 and found that the app I develop can no longer display the custom response.
When I test the app on iOS 17.6, the custom response displays without problems.
The situation is exactly like the problem posted in 2018: https://forums.developer.apple.com/forums/thread/109324
Has anyone run into the same bug, or can anyone help me?
Thank you so much!
Happy 2025