Since MP4 is a container format, it can store audio as well as video. What I am struggling to find out is its true media type (whether it is audio or video). Can this be done in iOS (Objective-C)?
AVAsset *asset = [AVAsset assetWithURL:<URL to mp4>];
BOOL hasVideo = [asset tracksWithMediaType:AVMediaTypeVideo].count > 0;
BOOL hasAudio = [asset tracksWithMediaType:AVMediaTypeAudio].count > 0;
BOOL isMP4VideoType; // global variable
BOOL isMP4AudioType; // global variable
// Create an AVAsset object with the MP4 URL.
AVAsset *asset = [AVAsset assetWithURL:YOUR_MP4_URL];
NSArray *aryVideoTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
NSArray *aryAudioTracks = [asset tracksWithMediaType:AVMediaTypeAudio];
if ([aryVideoTracks count] != 0)
{
    isMP4VideoType = YES;
}
else if ([aryAudioTracks count] != 0) // check the audio tracks here, not the video tracks again
{
    isMP4AudioType = YES;
}
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:_assetUrl options:nil];
NSArray *audioTracks = [asset tracksWithMediaType:AVMediaTypeAudio];
AVAssetTrack *audioTrack = [audioTracks firstObject];
if ([audioTrack hasMediaCharacteristic:AVMediaCharacteristicAudible]) {
    // the track carries audible media
}
I am doing the following:
AVURLAsset *audioAsset = [[AVURLAsset alloc] initWithURL:audioUrl options:nil];
NSArray<AVAssetTrack *> *audioTracks = [audioAsset tracksWithMediaType:AVMediaTypeAudio];
which works fine on a real device.
The problem only happens in the simulators. I have a statically added MP3 in the bundle, so the audioAsset is properly initialized. But the audioTracks array is empty on the simulator (even though the path in audioUrl is correct and the audioAsset exists).
Any suggestions?
I've faced the same issue on a real device as well. It is caused by the fact that the asset is not ready yet right after initialization.
Please have a look at the documentation:
You can initialize a player item with an existing asset, or you can initialize a player item directly from a URL so that you can play a resource at a particular location (AVPlayerItem will then create and configure an asset for the resource). As with AVAsset, though, simply initializing a player item doesn’t necessarily mean it’s ready for immediate playback. You can observe (using key-value observing) an item’s status property to determine if and when it’s ready to play.
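For completeness, a minimal sketch of that KVO approach (illustrative only; remember to remove the observer when you are done with the item):
AVPlayerItem *item = [AVPlayerItem playerItemWithURL:audioUrl];
[item addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:nil];

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
{
    AVPlayerItem *item = (AVPlayerItem *)object;
    if ([keyPath isEqualToString:@"status"] && item.status == AVPlayerItemStatusReadyToPlay) {
        // the item and its asset's tracks are safe to query here
        NSArray<AVAssetTrack *> *audioTracks = [item.asset tracksWithMediaType:AVMediaTypeAudio];
        NSLog(@"audio tracks: %@", audioTracks);
    }
}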
To fix the issue, you can try the following:
AVURLAsset *audioAsset = [[AVURLAsset alloc] initWithURL:audioUrl options:nil];
NSString *tracksKey = @"tracks";
[audioAsset loadValuesAsynchronouslyForKeys:@[tracksKey] completionHandler:
^{
    NSError *error;
    AVKeyValueStatus status = [audioAsset statusOfValueForKey:tracksKey error:&error];
    if (status == AVKeyValueStatusLoaded) {
        // The asset is ready at this point
        NSArray<AVAssetTrack *> *audioTracks = [audioAsset tracksWithMediaType:AVMediaTypeAudio];
    }
}];
It's also worth noting that the status can be AVKeyValueStatusFailed, for example when the audioAsset is not playable:
BOOL result = [audioAsset isPlayable];
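A minimal sketch of handling that case inside the same completion handler (status and error are the variables from the block above):
if (status == AVKeyValueStatusFailed) {
    // the error from statusOfValueForKey:error: explains why loading failed
    NSLog(@"Could not load tracks: %@", error);
}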
I am trying to use AVURLAsset to load a WebVTT file.
Below is my code.
NSString *urlAddress = @"http://somewhere/some.vtt";
NSURL *urlStream = [[NSURL alloc] initWithString:urlAddress];
AVAsset *avAsset = [AVURLAsset URLAssetWithURL:urlStream options:nil];
NSArray *requestKeys = [NSArray arrayWithObjects:@"tracks", @"playable", nil];
[avAsset loadValuesAsynchronouslyForKeys:requestKeys completionHandler:^{
dispatch_async(dispatch_get_main_queue(),^{
//complete block here
AVKeyValueStatus status = [avAsset statusOfValueForKey:@"tracks" error:nil];
if(status == AVKeyValueStatusLoaded) {
//loaded block !
//Question 1
CMTime assetTime = [avAsset duration];
Float64 duration = CMTimeGetSeconds(assetTime);
NSLog(#"%f", duration);
//Question 2
AVMediaSelectionGroup *subtitle = [avAsset mediaSelectionGroupForMediaCharacteristic: AVMediaCharacteristicLegible];
NSLog(#"%#", subtitle);
}
else {
//not loaded
}
});
}];
Question 1: It always goes into the "loaded" block, but I find that the avAsset's duration is not complete. Does that mean the data is not loaded? How should I modify it?
Question 2: I am trying to use it for my AVPlayer's subtitles, but the AVMediaSelectionGroup is always null. What should I do?
For question 1, add duration to your keys:
NSArray *requestKeys = @[@"tracks", @"playable", @"duration"];
I've posted a solution over here: https://stackoverflow.com/a/37945178/171933. Basically, you need to use an AVMutableComposition to join the video with the subtitles and then play back that composition.
About your second question: mediaSelectionGroupForMediaCharacteristic: seems to be supported only when these "characteristics" are already baked into either the media file or your m3u8 stream, according to this statement by an Apple engineer (bottom of the page).
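For reference, a condensed sketch of that composition approach (videoURL is a hypothetical URL for the movie; error handling and asynchronous key loading are omitted for brevity):
AVMutableComposition *composition = [AVMutableComposition composition];
AVURLAsset *videoAsset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
AVURLAsset *subtitleAsset = [AVURLAsset URLAssetWithURL:urlStream options:nil];

// Copy the video track into the composition
AVMutableCompositionTrack *videoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                    ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] firstObject]
                     atTime:kCMTimeZero
                      error:nil];

// Copy the subtitle track of the .vtt asset into the composition
AVMutableCompositionTrack *textTrack = [composition addMutableTrackWithMediaType:AVMediaTypeText preferredTrackID:kCMPersistentTrackID_Invalid];
[textTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                   ofTrack:[[subtitleAsset tracksWithMediaType:AVMediaTypeText] firstObject]
                    atTime:kCMTimeZero
                     error:nil];

// Play back the composition; the subtitles are now part of the item
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:composition];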
I am new to iPhone development. I want to get data from an AVURLAsset.
The following is my AVURLAsset description:
<AVURLAsset: 0x1768c800, URL = http://s9.voscast.com:7924/;.mp3>
I can get the URL with the following code:
AVPlayerItem *item = (AVPlayerItem *)object;
if(item.status != AVPlayerItemStatusReadyToPlay)
return;
if(item.status == AVPlayerItemStatusReadyToPlay)
{
AVURLAsset *asset = (AVURLAsset *)item.asset;
NSLog(#"myasset:%#",asset);
NSString *myurl = [asset valueForKey:#"URL"];
NSLog(#"myurl:%#",myurl);
}
The output of the above code is:
myasset:<AVURLAsset: 0x1768c800, URL = http://s9.voscast.com:7924/;.mp3>
2015-04-15 23:01:50.452 myurl:http://s9.voscast.com:7924/;.mp3
But I want to get 0x1768c800.
Can anybody help me get that value?
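Note that 0x1768c800 is just the in-memory address of the asset object, not media data; it changes on every run. If you really need it as a string, a minimal sketch:
NSString *address = [NSString stringWithFormat:@"%p", asset];
NSLog(@"address: %@", address); // e.g. 0x1768c800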
There's a strange behaviour I've found when trying to merge videos with AVFoundation. I'm pretty sure that I've made a mistake somewhere, but I'm too blind to see it. My goal is just to merge 4 videos (later there will be a crossfade transition between them).
Every time I try to export the video, I get this error:
Error Domain=AVFoundationErrorDomain Code=-11821 "Cannot Decode" UserInfo=0x7fd94073cc30 {NSLocalizedDescription=Cannot Decode, NSLocalizedFailureReason=The media data could not be decoded. It may be damaged.}
The funniest thing is that if I don't provide the AVAssetExportSession with an AVMutableVideoComposition, everything works fine! I can't understand what I'm doing wrong. The source videos were downloaded from YouTube and have the .mp4 extension. I can play them with MPMoviePlayerController. While checking the source code, please look carefully at the AVMutableVideoComposition.
I was testing this code in Xcode 6.0.1 on the iOS simulator.
#import "VideoStitcher.h"
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
#import <AssetsLibrary/AssetsLibrary.h>
@implementation VideoStitcher
{
VideoStitcherCompletionBlock _completionBlock;
AVMutableComposition *_composition;
AVMutableVideoComposition *_videoComposition;
}
- (instancetype)init
{
self = [super init];
if (self)
{
_composition = [AVMutableComposition composition];
_videoComposition = [AVMutableVideoComposition videoComposition];
}
return self;
}
- (void)compileVideoWithAssets:(NSArray *)assets completion:(VideoStitcherCompletionBlock)completion
{
_completionBlock = [completion copy];
if (assets == nil || assets.count < 2)
{
// We need at least two videos to make a stitch, right?
NSAssert(NO, @"VideoStitcher: assets parameter is nil or has not enough items in it");
}
else
{
[self composeAssets:assets];
if (_composition != nil) // if stitching went well and no errors were found
[self exportComposition];
}
}
- (void)composeAssets:(NSArray *)assets
{
AVMutableCompositionTrack *compositionVideoTrack = [_composition addMutableTrackWithMediaType:AVMediaTypeVideo
preferredTrackID:kCMPersistentTrackID_Invalid];
NSError *compositionError = nil;
CMTime currentTime = kCMTimeZero;
AVAsset *asset = nil;
for (int i = (int)assets.count - 1; i >= 0; i--) //For some reason videos are compiled in reverse order. Find the bug later. 06.10.14
{
asset = assets[i];
AVAssetTrack *assetVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
BOOL success = [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, assetVideoTrack.timeRange.duration)
ofTrack:assetVideoTrack
atTime:currentTime
error:&compositionError];
if (success)
{
CMTimeAdd(currentTime, asset.duration);
}
else
{
NSLog(#"VideoStitcher: something went wrong during inserting time range in composition");
if (compositionError != nil)
{
NSLog(#"%#", compositionError);
_completionBlock(nil, compositionError);
_composition = nil;
return;
}
}
}
AVMutableVideoCompositionInstruction *videoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
videoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, _composition.duration);
videoCompositionInstruction.backgroundColor = [[UIColor redColor] CGColor];
_videoComposition.instructions = @[videoCompositionInstruction];
_videoComposition.renderSize = [self calculateOptimalRenderSizeFromAssets:assets];
_videoComposition.frameDuration = CMTimeMake(1, 600);
}
- (void)exportComposition
{
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:@"testVideo.mov"];
NSURL *url = [NSURL fileURLWithPath:myPathDocs];
NSString *filePath = [url path];
NSFileManager *fileManager = [NSFileManager defaultManager];
if ([fileManager fileExistsAtPath:filePath]) {
NSError *error;
if ([fileManager removeItemAtPath:filePath error:&error] == NO) {
NSLog(#"removeItemAtPath %# error:%#", filePath, error);
}
}
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:_composition
presetName:AVAssetExportPreset1280x720];
exporter.outputURL = url;
exporter.outputFileType = AVFileTypeQuickTimeMovie;
exporter.shouldOptimizeForNetworkUse = YES;
exporter.videoComposition = _videoComposition;
[exporter exportAsynchronouslyWithCompletionHandler:^{
[self exportDidFinish:exporter];
}];
}
- (void)exportDidFinish:(AVAssetExportSession*)session
{
NSLog(#"%li", session.status);
if (session.status == AVAssetExportSessionStatusCompleted)
{
NSURL *outputURL = session.outputURL;
// time to call delegate methods, but for testing purposes we save the video in 'photos' app
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:outputURL])
{
[library writeVideoAtPathToSavedPhotosAlbum:outputURL completionBlock:^(NSURL *assetURL, NSError *error){
if (error == nil)
{
NSLog(#"successfully saved video");
}
else
{
NSLog(#"saving video failed.\n%#", error);
}
}];
}
}
else if (session.status == AVAssetExportSessionStatusFailed)
{
NSLog(#"VideoStitcher: exporting failed.\n%#", session.error);
}
}
- (CGSize)calculateOptimalRenderSizeFromAssets:(NSArray *)assets
{
AVAsset *firstAsset = assets[0];
AVAssetTrack *firstAssetVideoTrack = [[firstAsset tracksWithMediaType:AVMediaTypeVideo] firstObject];
CGFloat maxWidth = firstAssetVideoTrack.naturalSize.height;
CGFloat maxHeight = firstAssetVideoTrack.naturalSize.width;
for (AVAsset *asset in assets)
{
AVAssetTrack *assetVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
if (assetVideoTrack.naturalSize.width > maxWidth)
maxWidth = assetVideoTrack.naturalSize.width;
if (assetVideoTrack.naturalSize.height > maxHeight)
maxHeight = assetVideoTrack.naturalSize.height;
}
return CGSizeMake(maxWidth, maxHeight);
}
@end
Thank you for your attention. I am really tired, I've been trying to find the bug for four hours straight. I'll go to sleep now.
I've finally found the solution. The description of the error led me in the wrong direction: "Cannot Decode. The media data could not be decoded. It may be damaged." From this description you might think that something is wrong with your video files. I spent 5 hours experimenting with formats, debugging, etc.
Well, THE ANSWER IS COMPLETELY DIFFERENT!
My mistake was that I forgot that CMTimeAdd() returns a value. I thought it changed the value of its first argument, and in the code you can see this:
CMTime currentTime = kCMTimeZero;
for (int i = (int)assets.count - 1; i >= 0; i--)
{
CMTimeAdd(currentTime, asset.duration); //HERE!! I don't actually increment the value! currentTime is always kCMTimeZero
}
videoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, _composition.duration); // And that's where everything breaks!
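The fix is a one-line change, assigning the result back:
currentTime = CMTimeAdd(currentTime, asset.duration); // CMTimeAdd returns the sum; it does not mutate its arguments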
The lesson I've learned: when working with AVFoundation, always check your time values! It's very important; otherwise you'll get a lot of bugs.
Error:
domain: "AVFoundationErrorDomain" - code: 18446744073709539816
(That code is -11800, AVErrorUnknown, printed as an unsigned 64-bit integer.)
Solution: [Swift 5.5]
Stop running multiple AVPlayers on a background thread.
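In Objective-C terms, a minimal sketch of that fix (player is a hypothetical AVPlayer instance):
dispatch_async(dispatch_get_main_queue(), ^{
    // drive AVPlayer from the main queue instead of a background thread
    [player play];
});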
I'm developing an iOS audio app in Xcode, and I'm trying to take 2 audio files I have recorded, which play at the same time, and export them to one audio file.
All I have managed to do is merge the 2 audio files into one, but the two recordings play one after another and not in sync at the same time.
Does anyone have a clue how I can sort it out?
Thanks
You should take a look at this for AAC conversion: http://atastypixel.com/blog/easy-aac-compressed-audio-conversion-on-ios/. It's super useful.
Another thing you might want to consider... combining two audio signals is as easy as adding the samples together. So what you could do is:
Open both recordings and get an array for each of the recordings that holds the audio samples.
Make a for() loop that adds each pair of samples and puts the result in an output array:
for(int i = 0; i<numberOfSamples; i++) {
exportBuffer[i] = firstTrack[i] + secondTrack[i];
}
and then write the exportBuffer to an m4a file.
This code will only work if the two files are exactly the same length, so adjust it to your needs. You'll need a conditional that fires once you've reached the end of one of the arrays; in that case, just add 0's, as shown in the sketch below.
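A sketch of that adjustment, assuming 16-bit PCM samples and hypothetical firstLength/secondLength sample counts (exportBuffer must hold the longer of the two):
int outLength = MAX(firstLength, secondLength);
for (int i = 0; i < outLength; i++) {
    SInt32 a = (i < firstLength)  ? firstTrack[i]  : 0; // pad the shorter track with silence
    SInt32 b = (i < secondLength) ? secondTrack[i] : 0;
    SInt32 sum = a + b;
    if (sum > INT16_MAX) sum = INT16_MAX; // clamp so loud passages don't wrap around and distort
    if (sum < INT16_MIN) sum = INT16_MIN;
    exportBuffer[i] = (SInt16)sum;
}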
Try Apple's MixerHost sample app.
/* Implement this method if you have already saved your recorded audio file */
-(void)mixAudio{
AVMutableComposition *composition = [[AVMutableComposition alloc] init];
AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionAudioTrack setPreferredVolume:0.8];
NSString *soundOne = [[NSBundle mainBundle] pathForResource:@"RecordAudio1" ofType:@"wav"];
NSURL *url = [NSURL fileURLWithPath:soundOne];
AVAsset *avAsset = [AVURLAsset URLAssetWithURL:url options:nil];
NSArray *tracks = [avAsset tracksWithMediaType:AVMediaTypeAudio];
AVAssetTrack *clipAudioTrack = [tracks objectAtIndex:0];
[compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, avAsset.duration) ofTrack:clipAudioTrack atTime:kCMTimeZero error:nil];
AVMutableCompositionTrack *compositionAudioTrack1 = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionAudioTrack1 setPreferredVolume:0.8]; // set the volume on the second track, not the first one again
NSString *soundOne1 = [[NSBundle mainBundle] pathForResource:@"RecordAudio2" ofType:@"wav"];
NSURL *url1 = [NSURL fileURLWithPath:soundOne1];
AVAsset *avAsset1 = [AVURLAsset URLAssetWithURL:url1 options:nil];
NSArray *tracks1 = [avAsset1 tracksWithMediaType:AVMediaTypeAudio];
AVAssetTrack *clipAudioTrack1 = [tracks1 objectAtIndex:0];
[compositionAudioTrack1 insertTimeRange:CMTimeRangeMake(kCMTimeZero, avAsset1.duration) ofTrack:clipAudioTrack1 atTime: kCMTimeZero error:nil];
AVAssetExportSession *exportSession = [AVAssetExportSession
exportSessionWithAsset:composition
presetName:AVAssetExportPresetAppleM4A];
if (exportSession == nil) return; // the method returns void, so a bare return
NSString *documentsDirectory = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) firstObject];
NSString *soundOneNew = [documentsDirectory stringByAppendingPathComponent:@"combined10.m4a"];
//NSLog(@"Output file path - %@", soundOneNew);
// configure export session output with all our parameters
exportSession.outputURL = [NSURL fileURLWithPath:soundOneNew]; // output path
exportSession.outputFileType = AVFileTypeAppleM4A; // output file type
// perform the export
[exportSession exportAsynchronouslyWithCompletionHandler:^{
if (AVAssetExportSessionStatusCompleted == exportSession.status) {
NSLog(#"AVAssetExportSessionStatusCompleted");
} else if (AVAssetExportSessionStatusFailed == exportSession.status) {
// a failure may happen because of an event out of your control
// for example, an interruption like a phone call coming in
// make sure and handle this case appropriately
NSLog(#"AVAssetExportSessionStatusFailed");
} else {
NSLog(#"Export Session Status: %d", exportSession.status);
}
}];
}