SiriKit

Create custom type parameter in SiriKit

不羁的心 submitted on 2020-05-17 07:42:23
Question: I am working with SiriKit and trying to create my own data type in Intents.intentdefinition. I referred to the SoupChef demo project that Apple presented at WWDC 2019; in that project, the toppings field has a custom type. How does a custom type show up in the data type dropdown list? See below. How can I create my own custom type field? Thank you in advance. Answer 1: On the left in the screenshot there are OrderSoup, Soup, OrderType, and OrderDetails. If you look at the lower part on the left,
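The answer above is cut off, but the general mechanism can be sketched: once a custom type is declared in the .intentdefinition file, Xcode generates an INObject subclass for it, plus typed resolution methods on the generated handling protocol. The following is a minimal sketch only; the type name Topping, the intent name OrderSoupIntent, and the generated method names are assumptions modeled on the SoupChef sample, not code from the answer.

import Intents

// Assuming the intent definition declares a custom type named "Topping", Xcode
// generates a Topping class (an INObject subclass) and a ToppingResolutionResult.
// A value of the custom type carries an identifier and a display string:
let cheese = Topping(identifier: "cheese", display: "Cheese")

// Parameters of the custom type are resolved with the generated resolution-result class:
class OrderSoupIntentHandler: NSObject, OrderSoupIntentHandling {
    func resolveToppings(for intent: OrderSoupIntent,
                         with completion: @escaping ([ToppingResolutionResult]) -> Void) {
        let results = (intent.toppings ?? []).map { ToppingResolutionResult.success(with: $0) }
        completion(results)
    }

    func handle(intent: OrderSoupIntent,
                completion: @escaping (OrderSoupIntentResponse) -> Void) {
        completion(OrderSoupIntentResponse(code: .success, userActivity: nil))
    }
}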

My app does not find SiriKit intent definition classes in Xcode 11.4

眉间皱痕 submitted on 2020-05-13 19:18:33
Question: My app has had support for a couple of Siri shortcuts since they came out, and I haven't touched that code in a long time. My project compiles and builds in 11.3.2 but doesn't in 11.4. The errors I get say that the classes that are supposed to be generated from my .intentdefinition file, and that I reference in my intent handlers, are no longer found. For example, in my .intentdefinition file I have a custom intent called "MyStatus". I then have a MyStatusIntentHandler: #import "MyStatusIntent.h" @interface

SiriKit INPayBillIntentHandling - Siri says, “I wish I could, but <App> hasn't set that up with me yet.”

六眼飞鱼酱① submitted on 2020-01-14 13:12:51
Question: I'm integrating SiriKit bill payment using the intent INPayBillIntentHandling (which was released in iOS 10.3, 27 Mar 2017). The Apple documentation is here. Note: I'm using Objective-C, Xcode 8.3, and an iPhone 6S device with iOS 10.3; the demo project's iOS deployment target is iOS 10.3. I also enabled Siri when asked for permission the first time, and verified in Settings that Siri is enabled. When I launch the app on the device and say "Bill Payment using DemoApp", Siri says "I
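The question is cut off before any answer appears. For context, a bill-payment handler in the Intents extension looks roughly like the sketch below (Swift with the current method naming, rather than the asker's Objective-C). The "hasn't set that up with me yet" message is commonly tied to the extension's Info.plist not listing the intent under IntentsSupported; the class name and comments here are illustrative assumptions, not the asker's code.

import Intents

// A minimal INPayBillIntentHandling handler. The intent must also be listed in the
// Intents extension's Info.plist under NSExtension > NSExtensionAttributes >
// IntentsSupported (as "INPayBillIntent") for Siri to route requests to it.
class PayBillIntentHandler: NSObject, INPayBillIntentHandling {

    func handle(intent: INPayBillIntent,
                completion: @escaping (INPayBillIntentResponse) -> Void) {
        // Perform (or hand off) the actual payment here, then report the outcome to Siri.
        let response = INPayBillIntentResponse(code: .success, userActivity: nil)
        completion(response)
    }
}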

How to customise SiriKit intent extension for shortcuts app to ask allow access?

谁说我不能喝 submitted on 2019-12-24 18:50:23
Question: We are allowed to build custom intents for Siri by extending SiriKit, and Apple shows how to manage basic integrations and customizations for Siri Shortcuts and the Shortcuts app (link for the Shortcuts app). Nonetheless, some apps show their own shortcuts UI with a little customization, for instance a button to allow access to an API; how can we do that? In the documentation presented by Apple, there is no instruction on how to customize the UI inside the Shortcuts app

Is there a way to detect if the user was already in your app before launching Siri?

*爱你&永不变心* submitted on 2019-12-24 10:49:02
Question: Is it possible to know, inside my Siri intent handler, whether the user was in my app before launching Siri? Answer 1: You can use App Groups to share data between your app's main target and your Siri intent handler. You could simply store a Bool in UserDefaults indicating whether your app was already launched, then read the same value from your Siri intent handler through the shared App Group. For more information on App Groups, see the App Extension Programming Guide. Source: https://stackoverflow.com
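A minimal sketch of the App Groups approach the answer describes, in Swift. The group identifier group.com.example.myapp and the key wasRecentlyInApp are placeholder names; the real values depend on the App Group configured for both the app target and the Intents extension.

import Foundation

// Shared UserDefaults suite backed by an App Group that both the app and the
// Intents extension belong to (the identifier below is a placeholder).
let sharedDefaults = UserDefaults(suiteName: "group.com.example.myapp")

// In the main app (for example when it becomes active or resigns active):
sharedDefaults?.set(true, forKey: "wasRecentlyInApp")

// In the Siri intent handler, read the same flag:
let userWasInApp = sharedDefaults?.bool(forKey: "wasRecentlyInApp") ?? false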

Clear input of SFSpeechAudioBufferRecognitionRequest after every result (Swift3)

做~自己de王妃 submitted on 2019-12-23 13:16:01
Question: I have integrated speech-to-text following this AppCoda tutorial. The problem I am facing is that I want the user to be able to write/edit something himself, but SFSpeechAudioBufferRecognitionRequest doesn't take into account the text the user has typed. Is there a way to send the user's typed input to SFSpeechAudioBufferRecognitionRequest, or any way to clear the SFSpeechAudioBufferRecognitionRequest input before sending a new request? Thanks in advance. Answer 1: Here is what I use to create my recognition request
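The answer is truncated before its code, but the pattern it is introducing is the common one: cancel the running task and create a brand-new SFSpeechAudioBufferRecognitionRequest for each session, since a request cannot have its accumulated audio or partial results cleared in place. Below is a sketch of that pattern; the class and property names are illustrative, not from the original answer.

import Speech
import AVFoundation

final class SpeechTranscriber {
    private let recognizer = SFSpeechRecognizer()
    private var request: SFSpeechAudioBufferRecognitionRequest?
    private var task: SFSpeechRecognitionTask?

    // Start a fresh recognition session; the previous request/task is discarded,
    // which effectively "clears" the old input.
    func startNewSession(resultHandler: @escaping (String) -> Void) {
        task?.cancel()
        task = nil

        let newRequest = SFSpeechAudioBufferRecognitionRequest()
        newRequest.shouldReportPartialResults = true
        request = newRequest

        task = recognizer?.recognitionTask(with: newRequest) { result, error in
            if let result = result {
                resultHandler(result.bestTranscription.formattedString)
            }
        }
    }

    // Feed microphone buffers (e.g. from an AVAudioEngine tap) into the current request.
    func append(_ buffer: AVAudioPCMBuffer) {
        request?.append(buffer)
    }

    func finish() {
        request?.endAudio()
    }
}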

Launch app from INExtension in SiriKit

你离开我真会死。 submitted on 2019-12-21 17:26:37
Question: I want to use SiriKit to start a workout. Starting the workout requires opening the main app from the app extension. The boilerplate that Apple provides for the INStartWorkoutIntentHandling handler is: func handle(startWorkout startWorkoutIntent: INStartWorkoutIntent, completion: (INStartWorkoutIntentResponse) -> Void) { let userActivity = NSUserActivity(activityType: NSStringFromClass(INStartWorkoutIntent)) let response = INStartWorkoutIntentResponse(code: .success, userActivity: userActivity
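The snippet above is cut off, but the usual way to launch the containing app from the handler is to complete with the .continueInApp response code and an NSUserActivity, which the system then delivers to the app via application(_:continue:restorationHandler:). Below is a minimal sketch of that approach; it uses the current Swift method naming and a placeholder userInfo payload, and is an illustration rather than the asker's exact code.

import Intents

class StartWorkoutIntentHandler: NSObject, INStartWorkoutIntentHandling {

    func handle(intent: INStartWorkoutIntent,
                completion: @escaping (INStartWorkoutIntentResponse) -> Void) {
        // Attach whatever state the app needs to start the workout to the user activity.
        let userActivity = NSUserActivity(activityType: NSStringFromClass(INStartWorkoutIntent.self))
        userActivity.userInfo = ["workoutGoal": "run"]  // placeholder payload

        // .continueInApp tells SiriKit to open the main app and hand userActivity
        // to application(_:continue:restorationHandler:).
        let response = INStartWorkoutIntentResponse(code: .continueInApp,
                                                    userActivity: userActivity)
        completion(response)
    }
}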

SiriKit support for general services

邮差的信 submitted on 2019-12-19 09:59:06
Question: I have watched the SiriKit session at WWDC and read the documentation: https://developer.apple.com/library/prerelease/content/documentation/Intents/Conceptual/SiriIntegrationGuide/ It says to add SiriKit support only if your app implements one of the following types of services: audio or video calling, messaging, payments, searching photos, workouts, or ride booking. I am still wondering whether I can do this for other services (since my app will be an enterprise app). My service will be very simple, searching only, like "Find SQ212 in