Siri to speak push notification in Objective-C - ios

Can we use Siri to read push notifications in iOS 10 using Objective-C?

Probably not. I'd recommend diving deeper into the documentation to find out how it works. SiriKit supports a limited set of functions that it categorises into "domains". The idea is that Siri is prepared to do all the work (for multiple languages), but Apple has to design how we interface with it, so there are only a few things it can do at the moment.
There's more detail in Apple's SiriKit documentation.

Related

Siri full-sentence commands, like sending a message via WhatsApp

I have been searching for days about Siri integration with an iOS app.
I know about Siri shortcuts/intents etc.
How do I have Siri take a full sentence such as "Text John I'm on my way" or "Text John via WhatsApp I'm on my way"?
Is this something exclusive to Apple's apps? Is it limited to messaging only, or are there other ways to integrate with Siri?
I'm not looking to integrate a messaging app; I'm looking to handle the full sentence, with its parameters, as a command or question.
Apple provides SiriKit, which gives your application the ability to respond to requests that originate from Siri.
You can look at the Human Interface Guidelines to learn more about designing an interface that interacts with Siri.
Do some searching on SiriKit examples. There are quite a few sources that show how to do an integration with your app.
If you are looking for information about speech recognition within your app, then you may want to look at the Apple Speech framework.
This framework gives you lower-level voice recognition and parsing capabilities and may have the flexibility you need.
Hope this helps!
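Not from the answer above, but for illustration, a minimal sketch of in-app recognition with the Speech framework, assuming a hypothetical bundled audio file sample.m4a and an NSSpeechRecognitionUsageDescription entry in Info.plist:

import Speech

// Minimal sketch: transcribe a bundled audio file with SFSpeechRecognizer.
// "sample.m4a" is a hypothetical file; NSSpeechRecognitionUsageDescription
// must be present in Info.plist before requesting authorization.
func transcribeBundledSample() {
    SFSpeechRecognizer.requestAuthorization { status in
        guard status == .authorized,
              let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
              recognizer.isAvailable,
              let url = Bundle.main.url(forResource: "sample", withExtension: "m4a")
        else { return }

        let request = SFSpeechURLRecognitionRequest(url: url)
        recognizer.recognitionTask(with: request) { result, error in
            if let result = result, result.isFinal {
                print(result.bestTranscription.formattedString)
            }
        }
    }
}

The same classes are available from Objective-C; for live microphone input you would use SFSpeechAudioBufferRecognitionRequest instead of the URL-based request.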
Messaging is not specific to Apple's apps.
You can make your application behave similarly to the Messages app. You need to implement an app extension for the messaging intent and add parameter-resolution methods to the handler to handle the user's input (see the sketch after the reference links below).
Reference for Messaging with SiriKit
https://developer.apple.com/documentation/sirikit/messaging?changes=latest_minor
https://developer.apple.com/documentation/sirikit/insendmessageintent
Sample source - https://www.techotopia.com/index.php/An_iOS_10_Example_SiriKit_Messaging_Extension
https://medium.com/ios-os-x-development/extending-your-ios-app-with-sirikit-fd1a7ef12ba6
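To illustrate, here is a hedged sketch of a messaging intents extension; it is not taken from the linked samples, and the fallback logic and the "real app would..." comments are placeholders:

import Intents

// Sketch of a messaging intents extension. The extension's Info.plist must
// list INSendMessageIntent under IntentsSupported, and the host app needs
// the Siri capability.
final class IntentHandler: INExtension, INSendMessageIntentHandling {

    override func handler(for intent: INIntent) -> Any {
        // This class handles INSendMessageIntent itself.
        return self
    }

    // Resolve who the message should go to ("John").
    func resolveRecipients(for intent: INSendMessageIntent,
                           completion: @escaping ([INSendMessageRecipientResolutionResult]) -> Void) {
        guard let recipients = intent.recipients, !recipients.isEmpty else {
            completion([INSendMessageRecipientResolutionResult.needsValue()])
            return
        }
        // A real app would match these against its own contact list.
        completion(recipients.map { INSendMessageRecipientResolutionResult.success(with: $0) })
    }

    // Resolve the message body ("I'm on my way").
    func resolveContent(for intent: INSendMessageIntent,
                        with completion: @escaping (INStringResolutionResult) -> Void) {
        if let text = intent.content, !text.isEmpty {
            completion(.success(with: text))
        } else {
            completion(.needsValue())
        }
    }

    // Send the message once everything has been resolved.
    func handle(intent: INSendMessageIntent,
                completion: @escaping (INSendMessageIntentResponse) -> Void) {
        // A real app would hand the content to its messaging backend here.
        completion(INSendMessageIntentResponse(code: .success, userActivity: nil))
    }
}

Siri does the sentence parsing for you; your code only resolves and handles the pieces (recipients, content) it extracts.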

Vidyo.io Integration in Swift4

I'm new to iOS. I'm working on a video-call project in Swift, using the vidyo.io SDK for video calls and message chat. But I have some questions:
If my app has been killed or my phone is locked, how can I receive a call notification?
Some SDKs support VoIP push for call notifications in the locked state. Does vidyo.io support VoIP? If yes, how can I implement it?
The vidyo.io documentation has methods for using the camera and microphone, customising the UI, etc. Can we implement all of these methods in Swift?
If anyone has good tutorials or other helpful material, please share.
You can find a Vidyo.io sample app built with Swift here: https://github.com/Vidyo/customview-swift-ios
Vidyo.io is a CPaaS focused on video chat. You can use the service for voice only, but you probably have better options if that is what you want to achieve.
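Regarding the VoIP question: I can't confirm what the Vidyo SDK itself provides, but the standard iOS mechanism for call notifications while the app is killed or the device is locked is a VoIP push via PushKit. A rough, SDK-agnostic sketch (the server hand-off and call handling are placeholders):

import Foundation
import PushKit

// SDK-agnostic sketch of VoIP push registration with PushKit. The device
// token must be sent to your own push server, and the incoming push would
// normally be reported via CallKit and handed to the calling SDK.
final class VoIPPushManager: NSObject, PKPushRegistryDelegate {

    private let registry = PKPushRegistry(queue: DispatchQueue.main)

    func register() {
        registry.delegate = self
        registry.desiredPushTypes = [PKPushType.voIP]
    }

    func pushRegistry(_ registry: PKPushRegistry,
                      didUpdate pushCredentials: PKPushCredentials,
                      for type: PKPushType) {
        let token = pushCredentials.token.map { String(format: "%02x", $0) }.joined()
        print("Send this VoIP token to your push server: \(token)")
    }

    func pushRegistry(_ registry: PKPushRegistry,
                      didReceiveIncomingPushWith payload: PKPushPayload,
                      for type: PKPushType) {
        // This is called even when the app was killed or the device is locked;
        // report the incoming call here and start the video SDK.
        print("Incoming VoIP push: \(payload.dictionaryPayload)")
    }
}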

Can we use SiriKit for voice recognition?

I have implemented voice recognition in my application for voice-to-text conversion using the Nuance Dragon SDK. I have also tried OpenEars but couldn't get it to work properly. Once conversion is complete, I use that text as a command to trigger an action in my application.
I am wondering whether we can do this within the application using SiriKit. I was not able to work this out from the WWDC16 SiriKit introduction. Maybe my interpretation of intents is not clear, but as far as I understood, there's no custom intent to trigger some action inside the application.
Also, is SiriKit available for Objective-C as well, or just Swift?
SiriKit is for integrating with Siri outside of the context of your application. However, Apple also released a Speech Recognition API in iOS 10 that sounds more like what you want. You can learn more about it here: https://developer.apple.com/videos/play/wwdc2016/509/
All Apple Frameworks are usable by Objective-C and Swift.
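As a rough sketch of that Speech framework approach for live microphone input (assumptions: NSSpeechRecognitionUsageDescription and NSMicrophoneUsageDescription are in Info.plist, audio session configuration is omitted, and the "take photo" command check is purely illustrative); the same classes work from Objective-C:

import Speech
import AVFoundation

// Rough sketch of live speech-to-command recognition with SFSpeechRecognizer
// and AVAudioEngine. AVAudioSession setup (record category) is omitted here.
final class CommandListener {
    private let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))
    private let audioEngine = AVAudioEngine()
    private var request: SFSpeechAudioBufferRecognitionRequest?
    private var task: SFSpeechRecognitionTask?

    func start(onCommand: @escaping (String) -> Void) throws {
        let request = SFSpeechAudioBufferRecognitionRequest()
        request.shouldReportPartialResults = true
        self.request = request

        // Feed microphone buffers into the recognition request.
        let inputNode = audioEngine.inputNode
        let format = inputNode.outputFormat(forBus: 0)
        inputNode.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
            request.append(buffer)
        }
        audioEngine.prepare()
        try audioEngine.start()

        task = recognizer?.recognitionTask(with: request) { result, _ in
            guard let result = result else { return }
            let text = result.bestTranscription.formattedString.lowercased()
            if text.contains("take photo") {   // illustrative command, not an API
                onCommand(text)
            }
        }
    }

    func stop() {
        audioEngine.stop()
        audioEngine.inputNode.removeTap(onBus: 0)
        request?.endAudio()
        task?.cancel()
    }
}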

Possible to use Apple system sounds in my iOS app?

Can you use the system sounds in your iOS app? I'm looking to have the same list that is used in the default Timer app (Marimba, Alarm, Doorbell, etc.).
The reason I'm asking is that Apple's own multimedia docs say:
Note: System-supplied alert sounds and system-supplied user-interface sound effects are not available to your application. For example, using the kSystemSoundID_UserPreferredAlert constant as a parameter to the AudioServicesPlayAlertSound function will not play anything.
Then I came across this list of system sound IDs.
So can you access and use these sounds in your own apps and still pass Apple's review process? If not, are similar sounds available as open source?
Actually, if you import the AudioToolbox framework (AudioToolbox/AudioToolbox.h) in your view controller's header file, you can play Apple system sounds without jailbreaking. For example, putting
AudioServicesPlaySystemSound(0x450);
under an IBAction will play the Apple 'click' sound on the execution of the action.
Also, to hear the system sounds referenced earlier, there is a great app available on GitHub that runs on your iPhone (not the iOS Simulator) and lets you tap each sound and hear it; the documentation lists the sounds but gives you no way to hear them. The app makes it easy to listen and then find the corresponding reference number.
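In Swift the equivalent call would look like the line below; note the numeric IDs come from the community-maintained list linked in the question, not from a documented Apple API:

import AudioToolbox

// Plays the keyboard "click" (ID 1104 / 0x450 from the community-maintained list).
AudioServicesPlaySystemSound(SystemSoundID(1104))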
No, you are not able to access these sounds without a jailbreak. After jailbreaking, you can access them like this:
AudioServicesPlaySystemSound(1000);
I suspect these files are copyrighted.

Does Apple provide an API for SIRI?

Is it possible that Apple does or will provide an API for Siri? It would be great if I could be sipping my coffee and say,
User: Hey Siri, could you please open Angry Birds, Level 4, and throw the first bird for me. Make sure you at least hit one green pig or it's coming out of your paycheck.
Siri: Yes, sure, I will do that for you.
Is this possible? And do you think Apple will provide this to us?
THIS IS NO LONGER ACCURATE:
There is no API and there is no indication of it changing anytime soon. There are private headers that you can look at by decompiling the SDK. This is a great synopsis:
Quora
You can be clever like RTM though, this is as close as it gets:
http://www.rememberthemilk.com/services/siri/
In iOS 10, Apple has announced an API for Siri called SiriKit. However, you can only use it via an app extension, and only if your app implements one of the following types of services:
Audio or video calling
Messaging
Payments
Searching photos
Workouts
Ride booking
Climate and radio
SiriKit is a way for you to make your content available through Siri. It also lets you add support for your services to the Maps app. To support SiriKit, you use the Intents framework and Intents UI framework to implement one or more extensions that you then include inside your iOS app. When the user requests specific types of services through Siri or Maps, the system uses your extensions to provide those services.
This means SiriKit cannot be used for the scenario mentioned in the question and in ways that many of us would like.
Source: Apple Docs for SiriKit
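For completeness, a small hedged sketch of the host-app side of SiriKit adoption (the intents extension itself lives in a separate target); this only unlocks the predefined domains listed above, not arbitrary commands like the Angry Birds example:

import Intents

// Sketch: ask the user for Siri authorization from the host app. Requires the
// Siri capability/entitlement and an NSSiriUsageDescription entry in Info.plist.
func requestSiriAccess() {
    INPreferences.requestSiriAuthorization { status in
        switch status {
        case .authorized:
            print("Siri can route supported intents to this app's extension")
        default:
            print("Siri access not granted (status \(status.rawValue))")
        }
    }
}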
When the iPhone was first released, there was absolutely no public talk from Apple about custom app development. The delayed release of the SDK gave them plenty of time to get public feedback on the iPhone user experience and make the SDK ready for public use.
It seems likely that they're taking a similar approach with Siri.
Not yet. If you want it, file a feature request at bugreport.apple.com, and briefly describe what you want it for. The more people ask for it, the more likely it is to happen.
