In an iPhone app that plays live video from a server via HTTP Live Streaming, is it possible to access decoded video frames after decoding?
As far as I can see, AVPlayer, MPMoviePlayer, and CoreVideo do not seem to provide a callback to notify the app that an individual frame has been decoded.
My question is similar to "Record HTTP Live Streaming Video To File While Watching?", except I'm not necessarily interested in full DVR functionality. The one answer there suggests a server-side solution and is vague about the possibility of a client-side solution. It's also similar to "Recording, Taking Snap Shot with HTTP Live streaming video running MPMoviePlayerController in iOS" except that I don't require a solution to work with MPMoviePlayerController.
It's possible using "AVPlayerItemVideoOutput" like this:
NSDictionary *options = @{ (__bridge NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA),
                           (__bridge NSString *)kCVPixelBufferOpenGLESCompatibilityKey : @YES };
myOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:options];
myOutput.suppressesPlayerRendering = NO;
[myOutput requestNotificationOfMediaDataChangeWithAdvanceInterval:1];
[myOutput setDelegate:self queue:dispatch_get_main_queue()];
[playerItem addOutput:myOutput];
I've done it in several of my projects. Take care and good luck!
FYI: playerItem above is declared as AVAudioPlayer-style ivar: AVPlayerItem *playerItem;
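To actually read the decoded frames back, you then pull pixel buffers from the output once it signals that media data is ready. A rough sketch of the pull side (the delegate method is from AVPlayerItemOutputPullDelegate; the CADisplayLink named displayLink and its callback are my assumptions, not part of the original answer):
// Called on the queue passed to setDelegate:queue: when new media data is about to arrive.
- (void)outputMediaDataWillChange:(AVPlayerItemOutput *)sender
{
    self.displayLink.paused = NO; // resume a CADisplayLink created earlier (assumed)
}

- (void)displayLinkDidFire:(CADisplayLink *)link
{
    CMTime itemTime = [myOutput itemTimeForHostTime:CACurrentMediaTime()];
    if ([myOutput hasNewPixelBufferForItemTime:itemTime]) {
        CVPixelBufferRef pixelBuffer =
            [myOutput copyPixelBufferForItemTime:itemTime itemTimeForDisplay:NULL];
        // ...process the decoded frame here...
        if (pixelBuffer) {
            CVBufferRelease(pixelBuffer);
        }
    }
}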
/Anders.
I am trying to play a remote video with MPMoviePlayerViewController like this:
NSURL *url = [NSURL URLWithString:@"http://km.support.apple.com/library/APPLE/APPLECARE_ALLGEOS/HT1211/sample_iTunes.mov"];
self.mp = [[MPMoviePlayerViewController alloc]
initWithContentURL:url];
[self.navigationController presentMoviePlayerViewControllerAnimated:self.mp];
I can play the above URL. However, I can't play my own video file from a URL like the one below. It is served by a local server and points to the file location.
http://127.0.0.1:8000/media/through_your_eyes/file0.mov
Is this a client-side or a server-side problem? Should I point to the file location differently on the server side, or how should I prepare it?
It is a server-side problem.
From your end you are doing fine.
When you try to play the video, you get an error like this:
_itemFailedToPlayToEnd: {
kind = 1;
new = 2;
old = 0;
}
This shows that MPMoviePlayerViewController is not able to play the video from this URL.
Reasons:
Your URL is not proper: print the NSURL object and check that it is valid and that escape characters are present where required, as in the sketch below this list.
Permission: the device may not be permitted to play this video; a firewall may be blocking access from the device. Check that you are permitted to access the URL.
Video format: sometimes a conversion changes the format of the video. The URL may show what looks like a correct format, but if a conversion was made the device may not be able to play it. Copy the URL, paste it into the Safari app in the Simulator, and check whether it plays.
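For the first point, a quick sanity check might look like this (a sketch; the URL string is the one from the question, and stringByAddingPercentEscapesUsingEncoding: is one way to add the escape characters mentioned above):
NSString *rawString = @"http://127.0.0.1:8000/media/through_your_eyes/file0.mov";
NSString *escaped = [rawString stringByAddingPercentEscapesUsingEncoding:NSUTF8StringEncoding];
NSURL *url = [NSURL URLWithString:escaped];
NSLog(@"URL: %@", url); // a nil URL means the string is not well-formed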
I am currently making an app that streams a music file. The problem is that our client wants the streamed bytes to also be saved in local storage, meaning the streamed audio file is saved on the device's storage as well. For example, if an m4a file is streamed, then when the user stops streaming, the music file is saved in the device's local storage for future use.
Is this possible? If it is, what library should I use?
Thanks.
Yes, it is possible. Use the AVFoundation framework for this to play your audio.
First add AVFoundation.framework in the Build Phases section, then import it like this: #import <AVFoundation/AVFoundation.h>
Then declare an AVAudioPlayer instance in the .h file of your view controller, like this:
AVAudioPlayer *myAudioPlayer;
In your view controller's .m file, put this code:
NSURL *fileURL = /* your URL */;
myAudioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:fileURL error:nil];
myAudioPlayer.numberOfLoops = -1; //infinite loop
[myAudioPlayer play];
This way you are able to play audio on your iOS device. Hope it helps you play your audio.
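Note that AVAudioPlayer needs the whole file rather than a true stream. One simple way to also satisfy the save-to-storage requirement is to download the data, write it to the Documents directory, and play the same bytes. This is a sketch under the assumption that the file is small enough to download in one piece; the URL and filename are hypothetical:
// Run this off the main thread; dataWithContentsOfURL: blocks until the download completes.
NSURL *remoteURL = [NSURL URLWithString:@"http://example.com/song.m4a"]; // hypothetical URL
NSData *audioData = [NSData dataWithContentsOfURL:remoteURL];

// Save a local copy for future use.
NSString *docs = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSString *localPath = [docs stringByAppendingPathComponent:@"song.m4a"];
[audioData writeToFile:localPath atomically:YES];

// Play the same bytes.
myAudioPlayer = [[AVAudioPlayer alloc] initWithData:audioData error:nil];
[myAudioPlayer play];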
Does anyone know how to stop the iPhone muting the speaker output whilst a phone call is taking place?
I'm developing an app where I need to be able to play audio through the phone speakers whilst there's a phone call in progress.
Hope someone can help!
I know there's an accepted answer but I don't think that it's completely correct. Some navigation apps (Waze for instance) are able to give audio directions during phone calls.
If you use AVAudioPlayer, you can handle interruptions (i.e. phone calls) by responding to the audioPlayerBeginInterruption: delegate method (you can read more about it here).
While I didn't have much time to look into it, I did manage to play a short sound file while initiating a phone call or playing a song, by using the following code:
- (void)audioPlayerBeginInterruption:(AVAudioPlayer *)player
{
    NSString *path = [[NSBundle mainBundle] pathForResource:@"mySound" ofType:@"mp3"];
    SystemSoundID soundFileObject;
    NSURL *pathURL = [NSURL fileURLWithPath:path];
    CFURLRef soundFileURLRef = (CFURLRef)CFBridgingRetain(pathURL);
    AudioServicesCreateSystemSoundID(soundFileURLRef, &soundFileObject);
    AudioServicesPlaySystemSound(soundFileObject);
}
Keep in mind that when an interruption comes in, your app might be backgrounded or suspended, which will also affect your ability to play sounds (or run any other code, for that matter).
I don't believe Apple allows other audio to mix with the audio during a phone call. Apple wants to ensure that poorly written apps won't interfere with a user's ability to perform important functions, such as making phone calls, on their phone. So your app is basically forced to comply.
I do not think this is possible, and I hope it's not, because personally I do not want any app recording my phone calls or interrupting important calls with prank sounds.
I also think this would be a very risky thing to allow if you think about it from a security standpoint.
I need to send and receive audio files within my app. To be more performant and keep an open connection, I am looking to stream this data.
I have looked at things like HTTP Live Streaming and AudioStreamer based on answers to other questions. However, these seem to be for continuous, one-way (read-only) streaming, whereas I am sending a finite audio file (longer than 10 seconds) and then receiving one back.
I am familiar with NSURLConnection and have reviewed this answer. But again, this uses a continuous, 1-way stream.
I would appreciate any recommendation on the architecture to accomplish the above, to help me get started.
In general, AVAudioPlayer is used for music playback. However, that class does not support streaming, so streaming can be achieved with AVPlayer instead. Developers usually know AVPlayer only as a video player, but it can also play music.
I recommend looking at the following Apple sample code, which uses an AVPlayer:
StitchedStreamPlayer
I uploaded an .mp3 to my server and tested it over both 3G and Wi-Fi; although it is only sample code, it works surprisingly well. Upload an mp3 file to your server and try it right now. It will work well.
And if you want to play in the background, don't forget the following code (you also need to add "audio" to the UIBackgroundModes key in your Info.plist):
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    // Use the shared audio session; AVAudioSession is not meant to be alloc/init'ed directly.
    AVAudioSession *audio = [AVAudioSession sharedInstance];
    [audio setCategory:AVAudioSessionCategoryPlayback error:nil];
    [audio setActive:YES error:nil];
    return YES;
}
For playback from the server, my Audjustable project (https://github.com/tumtumtum/audjustable) fixes most of the problems in the original AudioStreamer project and includes gapless playback and a decoupled audio data source, allowing for things like error recovery, encryption, etc. Completely open source.
If you want to upload audio data to the server, you can do it many ways, but NSURLConnection would seem to be the easiest.
Yes, AudioStreamer is really good, based on what I have used and researched so far.
But with AudioStreamer certain changes are needed, for example when trying to stream in the background, or when you create two instances, and more.
https://github.com/mattgallagher/AudioStreamer/
You can also find two or three questions regarding the same topic in my profile.
I think what you are looking to do is upload and download files to and from a server, not streaming. For downloading, you can use NSURLConnection/NSURLRequest. For uploading, you can use the HTTP POST method. There is also an old third-party library called ASIHTTPRequest. There are quite a few samples on these topics on the Internet.
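For illustration, a rough upload sketch using NSURLConnection (the endpoint URL, content type, and localPath variable are hypothetical):
NSData *audioData = [NSData dataWithContentsOfFile:localPath]; // localPath: your audio file
NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:
    [NSURL URLWithString:@"http://example.com/upload"]];
[request setHTTPMethod:@"POST"];
[request setValue:@"application/octet-stream" forHTTPHeaderField:@"Content-Type"];
[request setHTTPBody:audioData];
[NSURLConnection sendAsynchronousRequest:request
                                   queue:[NSOperationQueue mainQueue]
                       completionHandler:^(NSURLResponse *response, NSData *data, NSError *error) {
    // Handle the server response or error here.
}];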
We're developing an HTTP-streaming iOS app that requires us to receive playlists from a secured site. This site requires us to authenticate using a self-signed SSL certificate.
We read the credentials from a .p12 file before we use NSURLConnection with a delegate to react to the authorization challenge.
- (void)connection:(NSURLConnection *)connection didReceiveAuthenticationChallenge:(NSURLAuthenticationChallenge *)challenge
{
[[challenge sender] useCredential: self.credentials forAuthenticationChallenge:challenge];
}
- (BOOL)connection:(NSURLConnection *)connection canAuthenticateAgainstProtectionSpace:(NSURLProtectionSpace *)protectionSpace
{
return YES;
}
By making this initial connection to the URL where we're getting the .m3u8 playlist, we're able to play back the playlist using AVPlayer. The problem is that this method only works in the Simulator.
NOTE: We're able to download the playlist using the NSURLConnection on the device. This must mean that the AVPlayer somehow can't continue using the trust established during this initial connection.
We have also tried adding the credentials to the [NSURLCredentialStorage sharedCredentialStorage] without any luck.
Below follows our shotgun approach for that:
NSURLProtectionSpace *protectionSpace = [[NSURLProtectionSpace alloc]
    initWithHost:host
    port:443
    protocol:@"https"
    realm:nil
    authenticationMethod:NSURLAuthenticationMethodClientCertificate];
[[NSURLCredentialStorage sharedCredentialStorage] setDefaultCredential:creds
    forProtectionSpace:protectionSpace];
NSURLProtectionSpace *protectionSpace2 = [[NSURLProtectionSpace alloc]
    initWithHost:host
    port:443
    protocol:@"https"
    realm:nil
    authenticationMethod:NSURLAuthenticationMethodServerTrust];
[[NSURLCredentialStorage sharedCredentialStorage] setDefaultCredential:creds
    forProtectionSpace:protectionSpace2];
EDIT: According to this question, the above method doesn't work with certificates.
Any hint as to why it doesn't work on the device, or an alternate solution, is welcome!
From iOS 6 onwards, AVAssetResourceLoader can be used to retrieve an HTTPS-secured playlist or key file.
An AVAssetResourceLoader object mediates resource requests from an AVURLAsset object with a delegate object that you provide. When a request arrives, the resource loader asks your delegate if it is able to handle the request and reports the results back to the asset.
Please find the sample code below.
// AVURLAsset + Loader
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:url options:nil];
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
AVAssetResourceLoader *loader = asset.resourceLoader;
[loader setDelegate:self queue:dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)];
// AVPlayer
AVPlayer *avPlayer = [AVPlayer playerWithPlayerItem:playerItem];
You will need to implement the resourceLoader:shouldWaitForLoadingOfRequestedResource: delegate method, which will be called when there is an authentication need; there you can use NSURLConnection to request the secured resource.
- (BOOL)resourceLoader:(AVAssetResourceLoader *)resourceLoader shouldWaitForLoadingOfRequestedResource:(AVAssetResourceLoadingRequest *)loadingRequest
{
    // Handle the NSURLConnection to the SSL-secured resource here
    return YES;
}
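One caveat worth adding (my note, not part of the original answer): the resource loader delegate is generally only consulted for URL schemes AVFoundation does not handle natively, so a common trick is to hand the asset a custom scheme and map it back to https inside the delegate. A sketch with a hypothetical scheme name:
// Give the asset a custom scheme so AVFoundation consults the resource loader delegate.
NSURL *customURL = [NSURL URLWithString:@"myscheme://example.com/playlist.m3u8"];
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:customURL options:nil];

// Then, inside resourceLoader:shouldWaitForLoadingOfRequestedResource:,
// rebuild the real URL before fetching it with NSURLConnection:
NSString *realString = [[loadingRequest.request.URL absoluteString]
    stringByReplacingOccurrencesOfString:@"myscheme" withString:@"https"];
NSURL *realURL = [NSURL URLWithString:realString];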
Hope this helps!
P.S.: The proxy approach using CocoaHTTPServer works well, but using an AVAssetResourceLoader is a more elegant solution.
It seems that until Apple lets us control which NSURLConnections the AVPlayer uses, the only answer is to implement an HTTP loopback server.
To quote the Apple representative who answered our support question:
Another option is to implement a loopback
HTTP server and point client objects at that. The clients can use
HTTP (because their requests never make it off the device), while the
loopback HTTP server itself can use HTTPS to connect to the origin
server. Again, this gives you access to the underlying
NSURLConnections, allowing you to do custom server trust evaluation.
Using this technique with UIWebView is going to be tricky unless you
completely control the content at the origin server. If the origin
server can return arbitrary content, you have to grovel through the
returned HTTP and rewrite all the links, which is not much fun. A
similar restriction applies to MPMoviePlayerController/AVPlayer, but
in this case it's much more common to control all of the content and
thus be able to avoid non-relative links.
EDIT:
I managed to implement a loopback server using custom implementations of the HTTPResponse and HTTPConnection classes found in CocoaHTTPServer.
I can't disclose the source, but I used NSURLConnection together with a mix of the AsyncHTTPResponse and DataHTTPResponse demonstration responses.
EDIT:
Remember to set myHttpServerObject.interface = @"loopback";
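For context, a minimal setup sketch (this assumes CocoaHTTPServer's HTTPServer API; the port and the connection class name are hypothetical):
HTTPServer *server = [[HTTPServer alloc] init];
[server setInterface:@"loopback"];   // only reachable from the device itself
[server setPort:12345];              // hypothetical fixed port
[server setConnectionClass:[MyProxyHTTPConnection class]]; // your custom HTTPConnection subclass
NSError *error = nil;
if (![server start:&error]) {
    NSLog(@"Failed to start loopback server: %@", error);
}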
EDIT: WARNING! This approach does not seem to work with AirPlay, since the AirPlay device will ask 127.1.1.1 for the encryption keys. The correct approach seems to be defined here:
https://developer.apple.com/library/content/documentation/AudioVideo/Conceptual/AirPlayGuide/EncryptionandAuthentication/EncryptionandAuthentication.html#//apple_ref/doc/uid/TP40011045-CH5-SW1
"Specify the keys in the .m3u8 files using an application-defined URL scheme."
EDIT:
An Apple TV and iOS update has resolved the issue mentioned in the edit above!