Notification Service Extension errors in iOS 12.1 with AVFoundation

Our app uses a Notification Service Extension to play custom audio. It worked fine up to and including iOS 12.0.
But after upgrading to 12.1, it can no longer play custom audio in the background.

Most extensions are not allowed to activate an audio session, and so cannot play audio or use speech synthesis. This wasn't being enforced consistently prior to iOS 12.1, but now it is.
Some extensions (such as notification content extensions) present UI to the user, and those are allowed to activate an audio session.
Background modes in Info.plist are not allowed for app extensions (and will get your app rejected by App Review).
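
For contrast, here is a minimal sketch of the path that is still permitted on iOS 12.1: activating the audio session from a notification content extension, which presents UI. The class name follows Xcode's template; the sound file name is a placeholder.

import UIKit
import UserNotifications
import UserNotificationsUI
import AVFoundation

// Sketch: a notification *content* extension may still activate an audio
// session on iOS 12.1; a *service* extension may not.
class NotificationViewController: UIViewController, UNNotificationContentExtension {

    var player: AVAudioPlayer?

    func didReceive(_ notification: UNNotification) {
        // "alert.caf" is a placeholder for a sound bundled with the extension.
        guard let url = Bundle.main.url(forResource: "alert", withExtension: "caf") else { return }
        do {
            try AVAudioSession.sharedInstance().setCategory(.playback, mode: .default, options: [])
            try AVAudioSession.sharedInstance().setActive(true)
            player = try AVAudioPlayer(contentsOf: url)
            player?.play()
        } catch {
            // The same activation fails in a service extension on iOS 12.1+.
            print("Audio session error: \(error)")
        }
    }
}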

Related

Application not install on real CarPlay device

I added CarPlay integration to my music application.
When I test the application with CarPlay on the simulator, everything works fine, but on a real device nothing shows up; the application icon does not appear. Can anybody help me fix this problem, or give me an idea of how to fix it or what the reason might be?
I added com.apple.developer.playable-content to the entitlements. Also, when I try to launch the app on the device, I get an error message.
To run an app on a device (and on a CarPlay system), you need to add a capability to your provisioning profile (the CarPlay entitlement). To do that, contact Apple at https://developer.apple.com/contact/carplay/ because the CarPlay system allows only certain kinds of apps (audio is one of them).
The CarPlay framework is for use by navigation apps only. If you want to add CarPlay support to your audio app, use MPPlayableContentManager. For messaging apps, use SiriKit’s Messaging-related intents to support reading and sending messages in CarPlay through Siri. For VoIP calling apps, use CallKit with SiriKit’s VoIP Calling-related intents to make and answer audio calls on a CarPlay system.
https://developer.apple.com/design/human-interface-guidelines/carplay/overview/introduction/
P.S. The documentation says the CarPlay framework is only for navigation apps, yet the entire CarPlay interface lives in that framework. So how can we build an audio app with MPPlayableContentManager without the CarPlay framework?
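
For what it's worth, a rough sketch of the MPPlayableContentManager route mentioned above, with illustrative titles and item counts (you still need the com.apple.developer.playable-content entitlement granted by Apple):

import MediaPlayer

// Sketch: exposing a playable content tree to CarPlay without the
// (navigation-only) CarPlay framework.
final class CarPlayContentProvider: NSObject, MPPlayableContentDataSource, MPPlayableContentDelegate {

    func setUp() {
        let manager = MPPlayableContentManager.shared()
        manager.dataSource = self
        manager.delegate = self
    }

    func numberOfChildItems(at indexPath: IndexPath) -> Int {
        // Example tree: one root container holding ten tracks.
        return indexPath.isEmpty ? 1 : 10
    }

    func contentItem(at indexPath: IndexPath) -> MPContentItem? {
        let item = MPContentItem(identifier: "\(indexPath)")
        if indexPath.count == 1 {
            item.title = "Library"        // placeholder container title
            item.isContainer = true
        } else {
            item.title = "Track \(indexPath.last ?? 0)"  // placeholder track title
            item.isPlayable = true
        }
        return item
    }

    func playableContentManager(_ contentManager: MPPlayableContentManager,
                                initiatePlaybackOfContentItemAt indexPath: IndexPath,
                                completionHandler: @escaping (Error?) -> Void) {
        // Start playback of the selected track here, then report success.
        completionHandler(nil)
    }
}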

Wake up a watchOS 2 app from the parent iOS app?

I would like to give haptic feedback on the watch. That works fine as long as the watch app is active, but if the watch app goes into the background the haptic feedback is not played. Is there a way to "wake up" the watch app from the parent iOS app to play the haptic feedback? (NOT A NOTIFICATION)
Since you rule out notifications, it is not possible to do what you describe. Your watch app can act on messages from the iOS app only when it starts up, resumes, or is currently loaded and running.
WatchConnectivity gives you 5 ways of sending data between the watch and phone.
sendMessage, sendMessageData: your scenario excludes these calls since both the watch and iPhone apps need to be active for this mechanism to work.
updateApplicationContext, transferUserData, and transferFile can be invoked from the iOS app at any time, regardless of the state of your watch app. However, these messages are sent on a background thread and your watch app delegate will receive them the next time the watch app is loaded. By the design of watchOS, none of these methods can trigger the watch app to load or become active; see the sketch below.
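
As a sketch of that background path (assuming a WCSession that has already been activated with a delegate; the "wantsHaptic" key is made up for illustration):

import WatchConnectivity

// iOS app side: can be called at any time. Delivery happens the next time
// the watch app runs, so this cannot wake the watch for an immediate haptic.
func queueHapticRequest() {
    guard WCSession.isSupported() else { return }
    do {
        try WCSession.default.updateApplicationContext(["wantsHaptic": true])
    } catch {
        print("Could not queue application context: \(error)")
    }
}

// Watch app side (in your WCSessionDelegate), runs when the app next loads:
// func session(_ session: WCSession,
//              didReceiveApplicationContext applicationContext: [String: Any]) {
//     if applicationContext["wantsHaptic"] as? Bool == true {
//         WKInterfaceDevice.current().play(.notification)  // needs WatchKit
//     }
// }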

How to wake the iPhone from the Apple Watch programmatically

My scenario is controlling the iPhone music player from the Apple Watch, just like the Apple Watch Music glance does. The project will consist of an iPhone app, a watch app, and a watch extension. However, I want it to work even when my iPhone app is not active. I know that when the iPhone app is active, I can use WCSession and sendMessage to control the music on the iPhone. But when the iPhone app is not active, I don't know what I should do to get the job done.
One more thing: I don't understand how the iPhone app works when it is not active. Does it need to become active first to do those jobs, or is it never really inactive, so it can still do work?
Use this to wake up your iPhone app from your WatchKit app in the background (a minimal sketch follows the quoted documentation):
from https://developer.apple.com/library/prerelease/watchos/documentation/General/Conceptual/AppleWatch2TransitionGuide/UpdatetheAppCode.html
Interactive messaging mode
sendMessage:replyHandler:errorHandler:
sendMessageData:replyHandler:errorHandler:
Use this mode to send data immediately to a counterpart app. If you do not want a response, you may specify nil for the replyHandler parameter.
The counterpart app must be reachable before calling this method. The iOS app is always considered reachable, and calling this method from your Watch app wakes up the iOS app in the background as needed. The Watch app is considered reachable only while it is installed and running.
Data is transmitted immediately and messages are queued and delivered in the order in which they were sent.
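
A minimal watch-side sketch of that interactive mode (the "command" payload key is illustrative, and the session is assumed to be activated already):

import WatchConnectivity

// Sending from the watch wakes the iOS app in the background as needed.
func sendPlayPause() {
    WCSession.default.sendMessage(["command": "playPause"],
                                  replyHandler: { reply in
                                      print("iPhone replied: \(reply)")
                                  },
                                  errorHandler: { error in
                                      print("sendMessage failed: \(error)")
                                  })
}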
All you have to do is configure your app to respond to remote-control events.
Remote-control events
Remote-control events are any events received for the purpose of controlling media, e.g. the iTunes play/pause/next/previous buttons available from Control Center, or events from the play/pause buttons on headphones.
Here is a tutorial on the subject.
Then your watch glance will be able to control the app's audio.
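
A small sketch of one way to register for remote-control events, using MPRemoteCommandCenter rather than the older UIApplication.beginReceivingRemoteControlEvents() route (an active playback audio session is also required to receive events):

import MediaPlayer

// Handlers run when the watch, Control Center, or headphones send
// play/pause commands to the app.
func configureRemoteCommands() {
    let center = MPRemoteCommandCenter.shared()
    center.playCommand.addTarget { _ in
        // Resume your player here.
        return .success
    }
    center.pauseCommand.addTarget { _ in
        // Pause your player here.
        return .success
    }
}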
With the current API as of watchOS 2 and iOS 9, recreating a glance similar to the Now Playing glance is not possible.

Can the Apple Watch stream music as a standalone app?

We have an iPhone app which is used to stream music and video, and we are planning to develop a WatchKit extension for it.
Is it possible for an Apple Watch extension to stream music without the iPhone, that is, to stream music when you are not near your iPhone? Or do watch extensions depend on the phone to make network calls?
We are confused about this before starting development.
I don't think that is possible. You can't play a sound on the Apple Watch speaker.
When you play a sound from the extension, it is played on the iPhone as the extension lives on the iPhone.
The only way to play a custom sound through the Apple Watch is to set a custom sound file for a notification but I don't think this fits your case.
As of watchOS 6, it is possible both to build a standalone watchOS app and to stream audio from it.
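
A minimal watchOS 6 sketch of that, with a placeholder stream URL (long-form audio on the watch routes to Bluetooth headphones, and activation prompts the user to pick a route if none is connected):

import AVFoundation

let player = AVPlayer(url: URL(string: "https://example.com/stream.m3u8")!)

func startStreaming() {
    let session = AVAudioSession.sharedInstance()
    try? session.setCategory(.playback, mode: .default, policy: .longFormAudio)
    // activate(options:completionHandler:) is the watchOS-only async activation.
    session.activate(options: []) { success, _ in
        if success { player.play() }
    }
}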

(AIR iOS) NetStream.play() blocks microphone access in a Native Extension

We have an AIR Mobile application (iOS) that uses a Native Extension for capturing microphone input. We would like to be able to play a NetStream in the application and capture the mic at the same time.
Microphone capture in the Native Extension works fine until we do NetStream.play() in the host AIR application. Once that happens, we start receiving zero samples (i.e. silence) in the Native Extension.
We've tried setting AudioSession in the ANE and other tricks, but to no avail. Is there a way for AIR Mobile not to block microphone operation in our ANE?
It looks like there is a conflict where the AIR SDK's audio classes override the audio session used by the ANE. Take a look at http://forums.adobe.com/message/5660732 where a workaround is described: play the sound through an ANE as well.
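
Speculatively, the native side needs an audio session that permits simultaneous playback and recording; a Swift sketch of that configuration is below (a real ANE would typically be Objective-C, and there is no guarantee the AIR runtime won't reassert its own session):

import AVFoundation

// Play-and-record category that mixes with other audio, so host-app
// playback does not silence microphone capture.
func configureSharedAudioSession() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord,
                            mode: .default,
                            options: [.mixWithOthers, .defaultToSpeaker])
    try session.setActive(true)
}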
