I am trying to get the video name from the metadata of a video source file in an iOS app using Objective-C and the Cocoa framework. I am using the following code:
AVPlayerItem *playerItem = [[AVPlayerItem alloc] initWithURL:url];
NSArray *metadata = [playerItem.asset metadata];
The url is @"http://www.sample-videos.com/video/mp4/720/big_buck_bunny_720p_1mb.mp4".
I used the same AVPlayerItem class for retrieving the metadata from an audio file, as below:
NSArray *metadata = [playerItem.asset metadata];
for (AVMetadataItem *metadataItem in metadata)
{
    NSLog(@"%@", metadataItem.commonKey);
    [metadataItem loadValuesAsynchronouslyForKeys:@[@"value"] completionHandler:^{
        if ([metadataItem.commonKey isEqualToString:@"title"])
        {
            title = (NSString *)metadataItem.value;
        }
    }];
}
Here the common key 'title' helps me get the audio file name from the metadata, but in the case of video the same does not work. While debugging I get the array below:
<__NSArrayM 0x14c777b50>(
As you can see, the commonKey = software whereas it should have been 'title'. Is there any way to get the video name from the metadata? Thanks.
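Many mp4 files simply do not carry a title tag at all, so one approach (a sketch, not guaranteed for every file) is to filter the asset's common metadata for the title key and fall back to the file name from the URL when it is absent:

```objectivec
// Load the common metadata asynchronously, look for the title item,
// and fall back to the URL's file name if the mp4 has no title tag.
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
[asset loadValuesAsynchronouslyForKeys:@[@"commonMetadata"] completionHandler:^{
    NSArray *titles = [AVMetadataItem metadataItemsFromArray:asset.commonMetadata
                                                     withKey:AVMetadataCommonKeyTitle
                                                    keySpace:AVMetadataKeySpaceCommon];
    NSString *title;
    if (titles.count > 0) {
        title = (NSString *)[titles.firstObject value];
    } else {
        // No title tag in the container; use the file name instead.
        title = [[url lastPathComponent] stringByDeletingPathExtension];
    }
    NSLog(@"video name: %@", title);
}];
```

For your sample URL this fallback would yield "big_buck_bunny_720p_1mb", which may be the most reliable "name" available when the encoder only wrote a software tag.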
Related
I need to fetch album artwork for all the songs played from a live streaming URL. I am using AVPlayer to play music from the URL.
Here is my code for getting the metadata from asset:
AVURLAsset *asset = (AVURLAsset *)player.currentItem.asset;
for (AVMetadataItem *metadataItem in asset.commonMetadata) {
    if ([metadataItem.commonKey isEqualToString:@"artwork"]) {
        NSData *imageData = metadataItem.dataValue;
        UIImage *image = [UIImage imageWithData:imageData];
        imgArt.image = image;
    }
}
Note: This code was working for a local mp3 file but not for the live streaming URL.
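With a live stream the metadata is generally not present in the asset up front; it arrives over time, so a common approach (a sketch, untested against your particular stream, and only useful if the stream actually embeds artwork) is to observe the player item's timedMetadata instead of reading commonMetadata once:

```objectivec
// Register for timed metadata updates on the currently playing item.
[player.currentItem addObserver:self
                     forKeyPath:@"timedMetadata"
                        options:NSKeyValueObservingOptionNew
                        context:nil];

// Handle each batch of timed metadata as the stream delivers it.
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context
{
    if ([keyPath isEqualToString:@"timedMetadata"]) {
        AVPlayerItem *item = (AVPlayerItem *)object;
        for (AVMetadataItem *metadataItem in item.timedMetadata) {
            if ([metadataItem.commonKey isEqualToString:@"artwork"]) {
                imgArt.image = [UIImage imageWithData:metadataItem.dataValue];
            }
        }
    }
}
```

Many live streams only send title/artist strings in their timed metadata, in which case the artwork would have to be fetched separately (e.g. from an artwork lookup service) using those strings.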
I am currently trying to figure out a way to create a playlist from an array of AVPlayerItems and save the array to NSUserDefaults, but it won't work.
Here is some code:
AVAsset *asset = [AVAsset assetWithURL:_videoURL];
AAPLPlayerViewController __weak *weakSelf = self;
NSArray *oldItemsArray = [weakSelf.player items];
AVPlayerItem *newPlayerItem = [AVPlayerItem playerItemWithAsset:asset];
[weakSelf.player insertItem:newPlayerItem afterItem:nil];
[weakSelf queueDidChangeFromArray:oldItemsArray toArray:[self.player items]];
I am trying to create a playlist and store the AVPlayerItems from the queue of the AVQueuePlayer, but since they contain complex data, including the AVPlayerItem and AVAsset, it crashes when I try to save [self.player items].
I've also tried to store it in a mutable dictionary, but no method I've tried has worked. Can anyone with experience suggest a good method of creating a playlist and storing it for the AVQueuePlayer?
I am using the AVFoundationQueuePlayer Objective-C sample from the iOS developer downloads at this link:
https://developer.apple.com/library/ios/samplecode/AVFoundationQueuePlayer-iOS/Introduction/Intro.html#//apple_ref/doc/uid/TP40016104
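AVPlayerItem and AVAsset are not property-list objects, which is why NSUserDefaults rejects them. The usual pattern is to persist only the underlying URLs and rebuild the items on launch. A minimal sketch, assuming every item in your queue was created from a URL-backed asset:

```objectivec
// Save: persist the playlist as an array of URL strings (property-list safe).
NSMutableArray *urlStrings = [NSMutableArray array];
for (AVPlayerItem *item in [self.player items]) {
    if ([item.asset isKindOfClass:[AVURLAsset class]]) {
        AVURLAsset *urlAsset = (AVURLAsset *)item.asset;
        [urlStrings addObject:urlAsset.URL.absoluteString];
    }
}
[[NSUserDefaults standardUserDefaults] setObject:urlStrings forKey:@"playlist"];

// Load: rebuild fresh AVPlayerItems from the saved strings.
NSArray *saved = [[NSUserDefaults standardUserDefaults] arrayForKey:@"playlist"];
for (NSString *urlString in saved) {
    AVPlayerItem *item = [AVPlayerItem playerItemWithURL:[NSURL URLWithString:urlString]];
    [self.player insertItem:item afterItem:nil];
}
```

Rebuilding rather than archiving is also what AVQueuePlayer expects: a player item can only belong to one player at a time, so serializing live items would not be useful even if it were possible.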
Disclaimer: New to AVAudioRecorder
What I'm doing: I'm working on an app that uses the iPhone microphone to record sound. After the sound is recorded, I need to convert it (it should be an AVAsset, right?) into NSData to send to our backend.
What's the issue: I am not sure how to "get" the audio that is recorded with the AVAudioRecorder. AVAudioRecorder has a delegate method called - (void)audioRecorderDidFinishRecording:(AVAudioRecorder *)aRecorder successfully:(BOOL)flag. I would have expected the actual AVAsset that contains the audio to be passed from this delegate method, but it is not. What it does give me is the aRecorder object, which has a .url property. When I NSLog the url from the passed aRecorder, it shows up. In fact, I can NSLog the length of the file in the code below:
- (void)audioRecorderDidFinishRecording:(AVAudioRecorder *) aRecorder successfully:(BOOL)flag
{
    DLog(@"audioRecorderDidFinishRecording:successfully: %@", aRecorder);
    AVURLAsset *audioAsset = [AVURLAsset URLAssetWithURL:aRecorder.url options:nil];
    CMTime audioDuration = audioAsset.duration;
    float audioDurationSeconds = CMTimeGetSeconds(audioDuration);
    NSLog(@"asset length = %f", audioDurationSeconds); // Logs 7.051 seconds, so I know it's "there".
    self.audioURL = aRecorder.url;
}
Problem: When I pass self.audioURL to the next view controller's self.mediaURL and try to grab the file from the asset library (similarly to how I did before), the asset is not returned, even though when I po self.mediaURL it logs the correct URL:
if (self.mediaURL) {
    ALAssetsLibrary *assetLibrary = [[ALAssetsLibrary alloc] init];
    [assetLibrary assetForURL:self.mediaURL resultBlock:^(ALAsset *asset) {
        if (asset) {
            // This block does NOT get called...
            ALAssetRepresentation *rep = [asset defaultRepresentation];
            Byte *buffer = (Byte *)malloc((long)rep.size);
            NSUInteger buffered = [rep getBytes:buffer fromOffset:0 length:(long)rep.size error:nil];
            NSMutableData *body = [NSMutableData dataWithBytesNoCopy:buffer length:buffered freeWhenDone:YES];
            [dataToSendToServer setObject:body forKey:@"audioData"];
        }
    } failureBlock:^(NSError *error) {
        NSLog(@"FAILED TO ACCESS AUDIO FROM URL: %@!", self.mediaURL);
    }];
}
else {
    NSLog(@"NO AUDIO DATA!");
}
}
Because I am new to AVAudioRecorder, perhaps I am just not designing this flow correctly. Could anyone help me out in getting the actual audio data?
Thanks!
AVAudioRecorder records to a file, not to the Asset Library.
So you can simply read the data from that file.
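Concretely, the ALAssetsLibrary lookup fails because that library only knows about items in the photo library, while the recording sits in your app's sandbox at aRecorder.url. A sketch of reading it directly (dataToSendToServer is the dictionary from the question):

```objectivec
// The recording already lives at the URL the recorder was configured with,
// so load its bytes directly; no asset-library round trip is needed.
NSError *readError = nil;
NSData *audioData = [NSData dataWithContentsOfURL:self.mediaURL
                                          options:NSDataReadingMappedIfSafe
                                            error:&readError];
if (audioData) {
    [dataToSendToServer setObject:audioData forKey:@"audioData"];
} else {
    NSLog(@"Could not read recording: %@", readError);
}
```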
Right now, I'm using MNavChapters to get the chapter metadata for audio files and using MPMediaPlayerController to play the audio files.
This works great until I try to load an Audible (AA) book's chapters. The MPMediaItemPropertyAssetURL returns nil because this is a "protected" file. Is there an alternative to read the chapter metadata?
Current non-working code:
NSURL *assetURL = [self.mpmediaitem valueForProperty:MPMediaItemPropertyAssetURL]; //this is null :(
AVAsset *asset = [AVAsset assetWithURL:assetURL];
MNAVChapterReader *reader = [MNAVChapterReader new];
NSArray *chapters = [reader chaptersFromAsset:asset];
I just noticed that this works in iOS 9
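Even on versions where it works, it may be worth guarding against the nil URL instead of letting AVAsset receive it; a small defensive sketch:

```objectivec
// Protected (DRM) items can return nil for the asset URL; bail out
// gracefully rather than building an asset from nil.
NSURL *assetURL = [self.mpmediaitem valueForProperty:MPMediaItemPropertyAssetURL];
if (assetURL == nil) {
    NSLog(@"Asset URL unavailable (protected item?); cannot read chapters.");
    return;
}
AVAsset *asset = [AVAsset assetWithURL:assetURL];
MNAVChapterReader *reader = [MNAVChapterReader new];
NSArray *chapters = [reader chaptersFromAsset:asset];
```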
I am playing HLS streams using AVPlayer. And I also need to record these streams as user presses record button.
The approach I am using is to record audio and video separately, then at the end merge these files to make the final video. And it works with remote mp4 files.
But now for the HLS (.m3u8) files I am able to record the video using AVAssetWriter but having problems with audio recording.
I am using MTAudioProcessingTap to process the raw audio data and write it to a file. I followed this article. I am able to record remote mp4 audio, but it's not working with HLS streams.
Initially I wasn't able to extract the audio tracks from the stream using AVAssetTrack *audioTrack = [asset tracksWithMediaType:AVMediaTypeAudio][0];
But I was able to extract the audioTracks using KVO to initialize the MTAudioProcessingTap.
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
{
    AVPlayer *player = (AVPlayer *)object;
    if (player.status == AVPlayerStatusReadyToPlay)
    {
        NSLog(@"Ready to play");
        self.previousAudioTrackID = 0;
        __weak typeof(self) weakself = self;
        timeObserverForTrack = [player addPeriodicTimeObserverForInterval:CMTimeMakeWithSeconds(1, 100) queue:nil usingBlock:^(CMTime time)
        {
            @try {
                for (AVPlayerItemTrack *track in [weakself.avPlayer.currentItem tracks]) {
                    if ([track.assetTrack.mediaType isEqualToString:AVMediaTypeAudio])
                        weakself.currentAudioPlayerItemTrack = track;
                }
                AVAssetTrack *audioAssetTrack = weakself.currentAudioPlayerItemTrack.assetTrack;
                weakself.currentAudioTrackID = audioAssetTrack.trackID;
                if (weakself.previousAudioTrackID != weakself.currentAudioTrackID) {
                    NSLog(@":::::::::::::::::::::::::: Audio track changed : %d", weakself.currentAudioTrackID);
                    weakself.previousAudioTrackID = weakself.currentAudioTrackID;
                    weakself.audioTrack = audioAssetTrack;
                    /// Use this audio track to initialize MTAudioProcessingTap
                }
            }
            @catch (NSException *exception) {
                NSLog(@"Exception Trap ::::: Audio tracks not found!");
            }
        }];
    }
}
I am also keeping track of trackID to check if track is changed.
This is how I initialize the MTAudioProcessingTap.
- (void)beginRecordingAudioFromTrack:(AVAssetTrack *)audioTrack
{
    // Configure an MTAudioProcessingTap to handle things.
    MTAudioProcessingTapRef tap;
    MTAudioProcessingTapCallbacks callbacks;
    callbacks.version = kMTAudioProcessingTapCallbacksVersion_0;
    callbacks.clientInfo = (__bridge void *)(self);
    callbacks.init = init;
    callbacks.prepare = prepare;
    callbacks.process = process;
    callbacks.unprepare = unprepare;
    callbacks.finalize = finalize;

    OSStatus err = MTAudioProcessingTapCreate(kCFAllocatorDefault,
                                              &callbacks,
                                              kMTAudioProcessingTapCreationFlag_PostEffects,
                                              &tap);
    if (err) {
        NSLog(@"Unable to create the Audio Processing Tap %d", (int)err);
        NSError *error = [NSError errorWithDomain:NSOSStatusErrorDomain
                                             code:err
                                         userInfo:nil];
        NSLog(@"Error: %@", [error description]);
        return;
    }

    // Create an AudioMix and assign it to our currently playing "item", which
    // is just the stream itself.
    AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
    AVMutableAudioMixInputParameters *inputParams = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:audioTrack];
    inputParams.audioTapProcessor = tap;
    audioMix.inputParameters = @[inputParams];
    _audioPlayer.currentItem.audioMix = audioMix;
}
But now, with this audio track, the MTAudioProcessingTap callbacks "prepare" and "process" are never called.
Is the problem with the audioTrack I am getting through KVO?
Now I would really appreciate it if someone could help me with this, or tell me whether I am using the right approach to record HLS streams.
I found a solution for this and am using it in my app. I wanted to post it earlier but didn't get the time.
So to play with HLS you should have some knowledge of what it exactly is. For that, please see the HTTP Live Streaming documentation on Apple's website:
HLS Apple
Here are the steps I am following.
1. First get the m3u8 and parse it.
You can parse it using this helpful kit, M3U8Kit.
Using this kit you can get the M3U8MediaPlaylist, or the M3U8MasterPlaylist if it is a master playlist.
If you get the master playlist, you can also parse it to get the M3U8MediaPlaylist.
- (void)parseM3u8
{
    NSString *plainString = [self.url m3u8PlanString];
    BOOL isMasterPlaylist = [plainString isMasterPlaylist];
    NSError *error;
    NSURL *baseURL;
    if (isMasterPlaylist)
    {
        M3U8MasterPlaylist *masterList = [[M3U8MasterPlaylist alloc] initWithContentOfURL:self.url error:&error];
        self.masterPlaylist = masterList;
        M3U8ExtXStreamInfList *xStreamInfList = masterList.xStreamList;
        M3U8ExtXStreamInf *streamInfo = [xStreamInfList extXStreamInfAtIndex:0];
        NSString *URI = streamInfo.URI;
        NSRange range = [URI rangeOfString:@"dailymotion.com"];
        NSString *baseURLString = [URI substringToIndex:(range.location + range.length)];
        baseURL = [NSURL URLWithString:baseURLString];
        plainString = [[NSURL URLWithString:URI] m3u8PlanString];
    }
    M3U8MediaPlaylist *mediaPlaylist = [[M3U8MediaPlaylist alloc] initWithContent:plainString baseURL:baseURL];
    self.mediaPlaylist = mediaPlaylist;
    M3U8SegmentInfoList *segmentInfoList = mediaPlaylist.segmentList;
    NSMutableArray *segmentUrls = [[NSMutableArray alloc] init];
    for (int i = 0; i < segmentInfoList.count; i++)
    {
        M3U8SegmentInfo *segmentInfo = [segmentInfoList segmentInfoAtIndex:i];
        NSString *segmentURI = segmentInfo.URI;
        NSURL *mediaURL = [baseURL URLByAppendingPathComponent:segmentURI];
        [segmentUrls addObject:mediaURL];
        if (!self.segmentDuration)
            self.segmentDuration = segmentInfo.duration;
    }
    self.segmentFilesURLs = segmentUrls;
}
You can see that you will get the links to the .ts files from the parsed m3u8.
2. Now download all the .ts files into a local folder.
3. Merge these .ts files into one mp4 file and export.
You can do that using this wonderful C library, TS2MP4,
and then you can delete the .ts files or keep them if you need them.
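The download step can be sketched with NSURLSession; segmentFilesURLs is the array built by parseM3u8 above (a minimal sketch: real code should wait for all downloads to finish, and keep segments in playlist order, before merging):

```objectivec
// Download every .ts segment into Documents/segments.
NSURL *docs = [[[NSFileManager defaultManager] URLsForDirectory:NSDocumentDirectory
                                                      inDomains:NSUserDomainMask] firstObject];
NSURL *folder = [docs URLByAppendingPathComponent:@"segments" isDirectory:YES];
[[NSFileManager defaultManager] createDirectoryAtURL:folder
                         withIntermediateDirectories:YES
                                          attributes:nil
                                               error:nil];

for (NSURL *segmentURL in self.segmentFilesURLs) {
    NSURLSessionDownloadTask *task =
    [[NSURLSession sharedSession] downloadTaskWithURL:segmentURL
        completionHandler:^(NSURL *location, NSURLResponse *response, NSError *error) {
            if (location) {
                // Move the temp download next to the other segments.
                NSURL *dest = [folder URLByAppendingPathComponent:segmentURL.lastPathComponent];
                [[NSFileManager defaultManager] moveItemAtURL:location toURL:dest error:nil];
            }
        }];
    [task resume];
}
```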
This is not a good approach. What you can do is parse the M3U8 link, then try to download the segment files (.ts). If you can get these files, you can merge them to generate an mp4 file.