I am using the new production Google Cast API and stopping the media control channel with
[mediacontrolchannel stop]
It does stop the session, but when I try to start the stopped channel again I get an INVALID_REQUEST error. What could be the reason? Pausing and playing a channel works fine, but playing after the channel has been stopped does not.
Any help would be appreciated.
I am not familiar with the iOS APIs, but drawing on similarities with the Android APIs, the "stop" command also unloads the media on the receiver side, so to start it again you need to load the media again. Use "pause" instead of "stop" if you plan to resume playback.
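For illustration, a minimal Swift sketch of the difference (assuming the GCKMediaControlChannel interface; mediaControlChannel and mediaInformation stand in for your existing channel and GCKMediaInformation):
import GoogleCast

// pause keeps the media loaded on the receiver, so play can resume it:
mediaControlChannel.pause()
mediaControlChannel.play()

// stop unloads the media, so a later play returns INVALID_REQUEST;
// reload the media instead:
mediaControlChannel.stop()
mediaControlChannel.loadMedia(mediaInformation, autoplay: true)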
I have to add live video streaming to one of my apps, and I am using Wowza Streaming Cloud for it, driving everything through the REST API.
The flow I used is:
Create a stream
Start the stream
Poll the status until it is "started"
Once the status is "started", begin broadcasting video.
It works sometimes, but sometimes when I try to broadcast, even after starting the stream, it says:
Status is idle, An error occurred when trying to connect to host: (code = 15)
I also see a green screen on the player side, and the video there is not continuous; it keeps flickering.
For the player I used the code provided in the sample app.
For broadcasting I used the GoCoder SDK, where I set all of the properties such as host address, port, app name, stream name, etc.
Do I have to set the bitrate or anything else?
Where am I going wrong?
That error occurs when the entry point itself has been open for more than 20 minutes without a connection. Once the API returns an IP for the stream, you can connect to it right away. The errors you're getting report idle due to a lack of connection, and it sounds like the timing between starting the event, checking the event, and finally connecting is hitting this restriction.
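For illustration, a rough Swift sketch of that timing (the endpoint path and the wsc-api-key/wsc-access-key headers are assumptions modeled on the Wowza Streaming Cloud REST API docs; startGoCoderBroadcast() is a hypothetical hook into your GoCoder setup). The point is to connect the broadcaster the moment the state flips to "started":
import Foundation

let streamId = "YOUR_STREAM_ID"

// Assumed endpoint and auth headers; check the Wowza Streaming Cloud REST API docs.
func fetchStreamState(completion: @escaping (String?) -> Void) {
    let url = URL(string: "https://api.cloud.wowza.com/api/v1/live_streams/\(streamId)/state")!
    var request = URLRequest(url: url)
    request.setValue("YOUR_API_KEY", forHTTPHeaderField: "wsc-api-key")
    request.setValue("YOUR_ACCESS_KEY", forHTTPHeaderField: "wsc-access-key")
    URLSession.shared.dataTask(with: request) { data, _, _ in
        guard let data = data,
              let object = try? JSONSerialization.jsonObject(with: data),
              let json = object as? [String: Any],
              let liveStream = json["live_stream"] as? [String: Any],
              let state = liveStream["state"] as? String
        else { return completion(nil) }
        completion(state)
    }.resume()
}

func startGoCoderBroadcast() {
    // Hypothetical: start your configured GoCoder broadcast here.
}

// Poll every few seconds and broadcast as soon as the stream is "started",
// so the entry point never sits open without a connection.
func waitThenBroadcast() {
    fetchStreamState { state in
        if state == "started" {
            DispatchQueue.main.async { startGoCoderBroadcast() }
        } else {
            DispatchQueue.global().asyncAfter(deadline: .now() + 5) { waitThenBroadcast() }
        }
    }
}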
I am using TokBox for video calling in my iOS application. I am running into an issue where, during a video call, locking the device disconnects the stream and ends the call.
This documentation outlines running TokBox in the background, i.e. when the phone is locked. The following is an excerpt from the limitations of running TokBox in the background:
Apps cannot do the following while in the background state:
Use the camera as a video source for a publisher.
The documentation goes on to explain that you can keep an audio-only session active when running in the background, but not a video session.
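As a rough Swift sketch of that audio-only fallback (assuming the OpenTok iOS SDK, whose OTPublisher exposes a publishVideo flag; you also need the audio background mode enabled in Info.plist):
import UIKit
import OpenTok

final class CallBackgroundHandler {
    var publisher: OTPublisher?   // your existing, already-publishing publisher

    func observeAppLifecycle() {
        let nc = NotificationCenter.default
        nc.addObserver(forName: UIApplication.didEnterBackgroundNotification,
                       object: nil, queue: .main) { [weak self] _ in
            // The camera can't run in the background, so drop to audio-only;
            // the session stays connected while the phone is locked.
            self?.publisher?.publishVideo = false
        }
        nc.addObserver(forName: UIApplication.willEnterForegroundNotification,
                       object: nil, queue: .main) { [weak self] _ in
            self?.publisher?.publishVideo = true   // restore the camera feed
        }
    }
}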
I am trying to build an iOS app that controls a music player running on a separate machine. I would like to use MPNowPlayingInfoCenter for inspecting and controlling this player. As far as I can tell, the app actually has to output audio for this to work (see also this answer).
However, for instance, the Spotify app is actually capable of doing this without playing audio on the iOS device. If you use Spotify Connect to play the audio on a different device, the MPNowPlayingInfoCenter still displays the correct song and the controls are functional.
What's the catch here? What does one (conceptually) have to do to achieve this? I can think of continuously emitting a "silent" audio stream, but that seems a bit brute-force.
Streaming silence will work, but you don't need to stream it all the time. Just long enough to send your Now Playing info. Using AVAudioPlayer, I've found approaches as short as this will send the data (assuming the player is loaded with a 1s silent audio file):
import AVFoundation
import MediaPlayer

// `player` is an AVAudioPlayer preloaded with a ~1s silent audio file.
player.play()
let nowPlayingInfoCenter = MPNowPlayingInfoCenter.default()
nowPlayingInfoCenter.nowPlayingInfo = [...]   // your metadata, e.g. MPMediaItemPropertyTitle
player.stop()
I was very surprised this worked within a single event loop. Any other approach to playing silence seems to work as well. Again, it can be very brief (in fact the above code in my tests doesn't even get as far as playing the silence).
I'm very interested in whether this works reliably for other people, so please comment if you make discoveries about it.
I've explored the Spotify app a bit. I'm not 100% certain this is the same technique they're using. They're able to mix with other audio somehow, so you can be playing local audio on the phone and also playing Spotify Connect to some other device, and the "Now Playing" info will kind of stomp on each other. For my purposes, that would actually be better, but I haven't been able to duplicate it. I still have to make the audio session non-mixable for at least a moment (so you get a ~1s audio drop if you're playing local audio at the same time).

I did find that the Spotify app was not highly reliable about playing to my Connect device when I was also playing other audio locally. Sometimes it would get confused and switch around where it wanted to play. (Admittedly this is a bizarre use case; I was intentionally trying to break it to figure out how they're doing it.)
EDIT: I don't believe Spotify is actually doing anything special here. I think they just play silent audio. I think getting two streams playing at the same time has to do with AirPlay rather than Spotify (I can't make it happen with Bluetooth, only AirPlay).
I have a problem managing AudioSession interruptions (with the category set to Play&Record) in my VoIP app on iOS 5.x.
When I have a call in progress and I go to the background and start the YouTube app, the audio session's begin-interruption callback fires and I can pause the current call.
The problem is when I return to the foreground and want to reactivate my call (WITHOUT killing the YouTube app, just having previously put it in the background): no end-interruption callback occurs, and even if I try to force my previous audio session active, it returns an error.
I checked whether the audio resources might be in use by another process (with AudioSessionGetProperty(kAudioSessionProperty_OtherAudioIsPlaying, ...)), but they aren't.
If I kill the YouTube app manually, the end-interruption callback does occur and there's no problem.
Since normally nobody kills the app manually, how can I restore the audio session of my call?
I've checked other examples: Viber works correctly and can somehow "intercept" the end-interruption.
Another strange behaviour: if I also set the "allowMixing" option on my audio session, it is completely ignored on iOS 5.x (YouTube takes control of the audio resources), while on iOS 4.x it works as it should (I hear the call audio and YouTube mixed together).
I had a similar issue. I solved it by stopping and (re-) starting my audio unit in the interruption callback.
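For what it's worth, a sketch of that approach (shown here with the modern AVAudioSession interruption notification rather than the iOS 5-era C listener; audioUnit stands in for your running RemoteIO unit):
import AVFoundation
import AudioToolbox

func observeInterruptions(audioUnit: AudioUnit) {
    NotificationCenter.default.addObserver(
        forName: AVAudioSession.interruptionNotification,
        object: nil, queue: .main) { note in
        guard let raw = note.userInfo?[AVAudioSessionInterruptionTypeKey] as? UInt,
              let type = AVAudioSession.InterruptionType(rawValue: raw) else { return }
        switch type {
        case .began:
            AudioOutputUnitStop(audioUnit)   // stop the unit when the interruption begins
        case .ended:
            // Re-activate the session and restart the unit ourselves instead of
            // relying on the end-interruption state being handled for us.
            try? AVAudioSession.sharedInstance().setActive(true)
            AudioOutputUnitStart(audioUnit)
        @unknown default:
            break
        }
    }
}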
I have a 3rd-party SDK that handles audio recording. It has a callback that fires when recording starts. In the callback I'm trying to play a sound to indicate to the user that the device is now listening (as Siri and most other speech-recognition interfaces do), but when I try I get the following error:
AURemoteIO::ChangeHardwareFormats: error -10875
I have tried playing the sound using AudioServicesPlaySystemSound as well as an AVAudioPlayer, both with the same result. The sound plays fine at other times, and given the error my assumption is that there's an incompatibility between playback and recording at the hardware level. Can anyone clarify this error, or give me a hint as to a possible workaround?
Make sure the audio session is initialized and configured for the play-and-record category before you start the RemoteIO audio unit recording.
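For example, a minimal sketch with the current AVAudioSession API (the option choices are illustrative):
import AVFoundation

// Configure the shared session for simultaneous playback and recording
// before the RemoteIO unit starts pulling samples.
let session = AVAudioSession.sharedInstance()
do {
    try session.setCategory(.playAndRecord, options: [.defaultToSpeaker])
    try session.setActive(true)
} catch {
    print("Audio session setup failed: \(error)")
}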
You shouldn't, and likely can't, start playing a sound inside a RemoteIO recording callback. Instead, only set a boolean flag in the callback to indicate that a sound should be played, and play the sound from the main UI run loop, as in the sketch below.
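A hypothetical sketch of that flag pattern (cuePlayer and the polling interval are stand-ins; a production version should use an atomic flag, since the recording callback runs on the audio thread):
import AVFoundation

final class ListeningCue {
    // Set from the recording callback; checked on the main run loop.
    var shouldPlayCue = false
    let cuePlayer: AVAudioPlayer   // preloaded with the cue sound

    init(cuePlayer: AVAudioPlayer) {
        self.cuePlayer = cuePlayer
        // Poll on the main run loop instead of playing from the audio
        // thread, which is not safe for AVAudioPlayer.
        Timer.scheduledTimer(withTimeInterval: 0.05, repeats: true) { [weak self] _ in
            guard let self = self, self.shouldPlayCue else { return }
            self.shouldPlayCue = false
            self.cuePlayer.play()
        }
    }
}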
My problem is related specifically to the external SDK and how it handles the audio interface. It overrides everything when you ask the SDK to start recording, and if you take control back you break the recording session. So within the context of that SDK there's no way to work around this unless they fix the SDK.