I save the recorded file to the Documents directory and then merge it with other audio files stored in the app bundle, using m4a encoding for all files. The file path format is path/to/documents/directory/filename.m4a, with the file extension included.
I am creating AVURLAsset like this:
NSURL* audioURL = [NSURL fileURLWithPath:audioPath];
AVURLAsset* audioAsset = [AVURLAsset assetWithURL:audioURL];
Logged like this:
NSLog(#"filename: %#", audioAsset.URL.lastPathComponent); // filename: filename.m4a
But the AVURLAsset is neither readable nor playable, and when I fetch the AVAssetTrack array using tracksWithMediaType like this:
NSArray<AVAssetTrack *>* tracks = [audioAsset tracksWithMediaType:AVMediaTypeAudio];
AVAssetTrack* audioTrack = [tracks objectAtIndex:0]; //Exception occurs
Exception: *** -[__NSArrayM objectAtIndex:]: index 0 beyond bounds for empty array thrown
because the AVAssetTrack array is empty. I am testing on the simulator with iOS 10.3.1.
Tried this according to iCreative's comment:
AVMediaSelectionGroup *audioTracks = [audioAsset mediaSelectionGroupForMediaCharacteristic:AVMediaCharacteristicAudible];
NSLog(#"MediaSelectionGroup count: %lu", (unsigned long)audioTracks.options.count); //count is 0
It turned out there was no issue in this code; the problem happened earlier. I was calling the merge function before stopping the recorder, so the recorder had not yet released the file while the merger was trying to access it.
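For anyone hitting the same thing, here is a minimal sketch of that fix (the recorder property and the merge method are placeholders, not from my original code): stop the recorder first, and only merge once the delegate confirms the file has been written.
// Assumes self.recorder is an AVAudioRecorder, self conforms to
// AVAudioRecorderDelegate, and -mergeAudioFiles is your own merge routine.
- (void)finishRecordingAndMerge {
    [self.recorder stop]; // finalizes and releases the m4a file
}

- (void)audioRecorderDidFinishRecording:(AVAudioRecorder *)recorder
                           successfully:(BOOL)flag {
    if (flag) {
        // Only now is the file fully written and safe to open with AVURLAsset
        [self mergeAudioFiles];
    }
}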
Related
Essentially I am looking to concatenate AVAsset audio files. I've got a rough idea of what to do, but I'm struggling with loading the audio files.
I can play the files with an AVAudioPlayer, and I can see them in the directory via my terminal, but when I attempt to load them with AVURLAsset it always returns an empty array for tracks.
The URL I am using:
NSURL *firstAudioFileLocation = [NSURL fileURLWithPath:[NSString stringWithFormat:@"%@%@", workingDirectory, @"/temp.pcm"]];
Which results in:
file:///Users/evolve/Library/Developer/CoreSimulator/Devices/8BF465E8-321C-47E6-BF2E-049C5E900F3C/data/Containers/Data/Application/4A2D29B2-E5B4-4D07-AE6B-1DD15F5E59A3/Documents/temp.pcm
The asset being loaded:
AVAsset *test = [AVURLAsset URLAssetWithURL:firstAudioFileLocation options:nil];
However when calling this:
NSLog(#" total tracks %#", test.tracks);
My output is always total tracks ().
My subsequent calls to add them to my AVMutableCompositionTrack end up crashing the app as the AVAsset seems to not have loaded correctly.
I have played with other variations for loading the asset including:
NSURL *alternativeLocation = [[NSBundle mainBundle] URLForResource:@"temp" withExtension:@"pcm"];
As well as trying to load AVAsset with the options from the documentation:
NSDictionary *assetOptions = @{AVURLAssetPreferPreciseDurationAndTimingKey: @YES};
How do I load the tracks from a local resource, recently created by the AVAudioRecorder?
EDIT
I had a poke around and found that I can record and load a file with the .caf extension. It seems raw .pcm is unsupported by AVAsset; this page on AVFileType was also of great help: https://developer.apple.com/documentation/avfoundation/avfiletype
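In case it helps others, a minimal sketch of recording into .caf instead of raw .pcm (the sample rate, bit depth, and channel count are assumptions):
NSString *docs = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) firstObject];
NSURL *cafURL = [NSURL fileURLWithPath:[docs stringByAppendingPathComponent:@"temp.caf"]];
// Linear PCM samples in a CAF container, which AVAsset can read
NSDictionary *settings = @{ AVFormatIDKey:          @(kAudioFormatLinearPCM),
                            AVSampleRateKey:        @44100.0,
                            AVNumberOfChannelsKey:  @1,
                            AVLinearPCMBitDepthKey: @16 };
NSError *error = nil;
AVAudioRecorder *recorder = [[AVAudioRecorder alloc] initWithURL:cafURL settings:settings error:&error];
[recorder record];
// ... later: [recorder stop]; then load the file with AVURLAsset as above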
An AVAsset load is not instantaneous. You need to wait for the data to be available. Example:
AVAsset *test = [AVURLAsset URLAssetWithURL:firstAudioFileLocation options:nil];
[test loadValuesAsynchronouslyForKeys:@[@"playable", @"tracks"] completionHandler:^{
// Now tracks is available
NSLog(#" total tracks %#", test.tracks);
}];
A more detailed example can be found in the documentation.
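One detail worth adding: the completion handler can be invoked on a background queue, and the load can fail, so a slightly more defensive sketch (reusing the test asset from above) looks like this:
NSString *tracksKey = @"tracks";
[test loadValuesAsynchronouslyForKeys:@[tracksKey] completionHandler:^{
    NSError *error = nil;
    AVKeyValueStatus status = [test statusOfValueForKey:tracksKey error:&error];
    // Hop back to the main queue before touching UI or shared state
    dispatch_async(dispatch_get_main_queue(), ^{
        if (status == AVKeyValueStatusLoaded) {
            NSLog(@"total tracks %@", test.tracks);
        } else {
            NSLog(@"tracks failed to load: %@", error);
        }
    });
}];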
Before posting my question, I tried the approach from this post.
Currently I am migrating my project from MPMoviePlayer to AVPlayer.
A video file is not played if I rename it with French accented characters; AVURLAsset returns an empty tracks array.
Also, asset.playable returns NO.
Here is my code:
NSURL *videoURL = [NSURL fileURLWithPath:videoFilePath isDirectory:NO];
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
if ([asset tracksWithMediaType:AVMediaTypeVideo].count == 0) {
    NSLog(@"No Video Track");
} else if ([asset tracksWithMediaType:AVMediaTypeAudio].count == 0) {
    NSLog(@"No Audio Track");
}
If you look at my file path Users/Axio-Mac/Library/Developer/CoreSimulator/Devices..../Library/Caches/AppFiles/première journée French_accent and the file name â ê i-xiufm2.mp4, you can see French accented characters. Could that be the cause?
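EDIT
A diagnostic I plan to run (a sketch, not a confirmed fix): file systems can store accented names in decomposed Unicode form, so a precomposed path string may not match what is on disk. Comparing both normalizations shows whether the file is actually being found:
NSFileManager *fm = [NSFileManager defaultManager];
NSString *precomposed = [videoFilePath precomposedStringWithCanonicalMapping];
NSString *decomposed = [videoFilePath decomposedStringWithCanonicalMapping];
NSLog(@"as given: %d, precomposed: %d, decomposed: %d",
      [fm fileExistsAtPath:videoFilePath],
      [fm fileExistsAtPath:precomposed],
      [fm fileExistsAtPath:decomposed]);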
I am doing the following:
AVURLAsset *audioAsset = [[AVURLAsset alloc] initWithURL:audioUrl options:nil];
NSArray<AVAssetTrack *> *audioTracks = [audioAsset tracksWithMediaType:AVMediaTypeAudio];
which works fine on a real device.
The problem only happens in the simulators. I have an mp3 statically added to the bundle, so the audioAsset is properly initialized, but the audioTracks array is empty on the simulator (even though the path in audioUrl is correct and the audioAsset variable exists).
Any suggestions?
I've faced the same issue on a real device as well. It is caused by the fact that the asset isn't ready yet right after initialization.
Please have a look at the documentation:
You can initialize a player item with an existing asset, or you can initialize a player item directly from a URL so that you can play a resource at a particular location (AVPlayerItem will then create and configure an asset for the resource). As with AVAsset, though, simply initializing a player item doesn’t necessarily mean it’s ready for immediate playback. You can observe (using key-value observing) an item’s status property to determine if and when it’s ready to play.
To fix the issue you can try to do the following:
AVURLAsset *audioAsset = [[AVURLAsset alloc] initWithURL:audioUrl options:nil];
NSString *tracksKey = @"tracks";
[audioAsset loadValuesAsynchronouslyForKeys:@[tracksKey] completionHandler:^{
    NSError *error = nil;
    AVKeyValueStatus status = [audioAsset statusOfValueForKey:tracksKey error:&error];
    if (status == AVKeyValueStatusLoaded) {
        // The asset is ready at this point
        NSArray<AVAssetTrack *> *audioTracks = [audioAsset tracksWithMediaType:AVMediaTypeAudio];
    }
}];
It's also worth noting that the status can be AVKeyValueStatusFailed if the audioAsset is not playable:
BOOL result = [audioAsset isPlayable];
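Inside the same completion handler, the failure branch can surface the underlying error (a sketch; status and error come from the block above):
if (status == AVKeyValueStatusFailed) {
    // error was filled in by statusOfValueForKey:error:
    NSLog(@"'tracks' failed to load: %@, playable: %d", error, audioAsset.playable);
}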
The problem I am having is getting a variable amount of silence placed in between two wav files.
My approach thus far is as follows:
Firstly I create an AVMutableComposition and an AVMutableCompositionTrack
AVMutableComposition* composition = [AVMutableComposition composition];
AVMutableCompositionTrack* appendedAudioTrack =
[composition addMutableTrackWithMediaType:AVMediaTypeAudio
preferredTrackID:kCMPersistentTrackID_Invalid];
Then, using AVURLAsset, I load my conveniently named first.wav file.
AVURLAsset *firstComponent = [[AVURLAsset alloc]
    initWithURL:[NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"first" ofType:@"wav"]] options:nil];
I then insert this into the mutable composition track I named appendedAudioTrack earlier
NSArray *firstTrack = [firstComponent tracksWithMediaType:AVMediaTypeAudio];
CMTimeRange timeRange = CMTimeRangeMake(kCMTimeZero, firstComponent.duration);
[appendedAudioTrack insertTimeRange:timeRange
ofTrack:[firstTrack objectAtIndex:0]
atTime:kCMTimeZero
error:&error];
The second.wav file is inserted in the exact same way:
AVURLAsset *secondComponent = [[AVURLAsset alloc]
    initWithURL:[NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"second" ofType:@"wav"]] options:nil];
NSArray *secondTrack = [secondComponent tracksWithMediaType:AVMediaTypeAudio];
CMTimeRange timeRange2 = CMTimeRangeMake(kCMTimeZero, secondComponent.duration);
[appendedAudioTrack insertTimeRange:timeRange2
ofTrack:[secondTrack objectAtIndex:0]
atTime:kCMTimeZero
error:&error];
This so far successfully joins the two wav files end to end.
I then try to insert a variable amount of silence using insertEmptyTimeRange like this:
CMTimeRange timeRange3 = CMTimeRangeFromTimeToTime(firstComponent.duration,CMTimeAdd(firstComponent.duration,CMTimeMake(interval,1)));
[appendedAudioTrack insertEmptyTimeRange:timeRange3];
The silence duration is a float named interval, representing the desired silence in seconds; for testing purposes it is currently 0.49.
An assumption I've made using CMTimeRange is that the AVURLAsset's duration property can be treated as the end CMTime of the first audio track.
When I download the Documents directory from the Organizer in Xcode and look at the resulting m4a file in Audacity, the silence is in the file, but it's at the start, not between the two .wav files as desired.
Incorrectly it goes: SILENCE, first.wav, second.wav
I would like to know how to properly use insertEmptyTimeRange to produce
first.wav, SILENCE, second.wav.
Note: I have seen this question (and others) which present a very similar problem, but the approach there was to use a constant silence file; my silence is variable and determined at run time. I am also aware that another answer offers what it claims is a solution, but it has not worked for me. I have tried all the different methods I found on the internet, but it seems I'm misunderstanding something, as it never works correctly for me.
Just in case it matters, I export the file like so:
// Create a new audio file using the appendedAudioTrack
AVAssetExportSession* exportSession = [AVAssetExportSession
exportSessionWithAsset:composition
presetName:AVAssetExportPresetAppleM4A];
if (!exportSession)
{
// do something
return nil;
}
// This gives me an output URL with a file name that doesn't exist yet
NSArray *paths2 = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *path2 = [paths2 objectAtIndex:0];
// `string` holds the base file name for the export
path2 = [path2 stringByAppendingPathComponent:[string stringByAppendingString:@".m4a"]];
NSString *appendedAudioPath = path2;
exportSession.outputURL = [NSURL fileURLWithPath:appendedAudioPath];
exportSession.outputFileType = AVFileTypeAppleM4A;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    switch (exportSession.status)
    {
        case AVAssetExportSessionStatusFailed:
            NSLog(@"%@", exportSession.error);
            break;
        case AVAssetExportSessionStatusCompleted:
            break;
        case AVAssetExportSessionStatusWaiting:
            break;
        default:
            break;
    }
}];
Thanks
It turns out the interval was not of the correct duration; when checked in Audacity it was actually 0.049, not 0.49.
The problem I was having was with this:
CMTimeMake(interval, 1)
I thought this would yield interval/1 seconds, but CMTimeMake expects an int64_t value, so passing a float doesn't do what I expected.
Instead I used CMTimeMakeWithSeconds along with NSEC_PER_SEC, like so:
CMTimeRange timeRange3 = CMTimeRangeFromTimeToTime(firstComponent.duration, CMTimeAdd(firstComponent.duration, CMTimeMakeWithSeconds(interval, NSEC_PER_SEC)));
[appendedAudioTrack insertEmptyTimeRange:timeRange3];
This then provided me with the desired silence of interval seconds long, and in the right place.
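For completeness, a sketch of the whole flow with the corrected time math (file names match the question; error handling is elided):
AVMutableComposition *composition = [AVMutableComposition composition];
AVMutableCompositionTrack *track =
    [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                             preferredTrackID:kCMPersistentTrackID_Invalid];
AVURLAsset *first = [AVURLAsset assetWithURL:[[NSBundle mainBundle] URLForResource:@"first" withExtension:@"wav"]];
AVURLAsset *second = [AVURLAsset assetWithURL:[[NSBundle mainBundle] URLForResource:@"second" withExtension:@"wav"]];
NSError *error = nil;
// first.wav at the start of the track
[track insertTimeRange:CMTimeRangeMake(kCMTimeZero, first.duration)
               ofTrack:[first tracksWithMediaType:AVMediaTypeAudio][0]
                atTime:kCMTimeZero error:&error];
// second.wav immediately after it
[track insertTimeRange:CMTimeRangeMake(kCMTimeZero, second.duration)
               ofTrack:[second tracksWithMediaType:AVMediaTypeAudio][0]
                atTime:first.duration error:&error];
// open a gap of `interval` seconds between them; the empty range pushes
// everything after its start point later in the timeline
CMTime silence = CMTimeMakeWithSeconds(interval, NSEC_PER_SEC);
[track insertEmptyTimeRange:CMTimeRangeMake(first.duration, silence)];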
I've been trying to extract audio from a .mov file for a while now and I just can't seem to get it working. Specifically, I need to extract the audio and save it as an .aif or .aiff file.
I've tried using an AVMutableComposition and loading the mov file as an AVAsset, adding only the audio track to the AVMutableComposition, and finally using an AVAssetExportSession (with the output file type set to AVFileTypeAIFF, the format I need) to write the file to an aif.
I get an error saying that this output file type is invalid, and I'm unsure why:
*** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: 'Invalid output file type'
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:composition
                                                                  presetName:AVAssetExportPresetHighestQuality];
exporter.audioMix = audioMix;
exporter.outputURL = [NSURL fileURLWithPath:filePath];
exporter.outputFileType = AVFileTypeAIFF; // Error occurs on this line
I'm not sure if the above approach can work at all, but I'm open to the possibility that I'm just doing something wrong. If anyone knows another way to accomplish what I'm trying to achieve, any help would be greatly appreciated.
I can post more detailed code if needed, but at the moment I'm trying a few other approaches, so it's a bit messy right now.
Thanks for the help!
- (void)getAudioFromVideo {
    float startTime = 0;
    float endTime = 10;
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *audioPath = [documentsDirectory stringByAppendingPathComponent:@"soundOneNew.m4a"];
    AVAsset *myasset = [AVAsset assetWithURL:[[NSBundle mainBundle] URLForResource:@"VideoName" withExtension:@"mp4"]];
    AVAssetExportSession *exportSession = [AVAssetExportSession exportSessionWithAsset:myasset presetName:AVAssetExportPresetAppleM4A];
    exportSession.outputURL = [NSURL fileURLWithPath:audioPath];
    exportSession.outputFileType = AVFileTypeAppleM4A;
    CMTime vocalStartMarker = CMTimeMake((int)(floor(startTime * 100)), 100);
    CMTime vocalEndMarker = CMTimeMake((int)(ceil(endTime * 100)), 100);
    CMTimeRange exportTimeRange = CMTimeRangeFromTimeToTime(vocalStartMarker, vocalEndMarker);
    exportSession.timeRange = exportTimeRange;
    // Remove any previous export at this path before exporting
    if ([[NSFileManager defaultManager] fileExistsAtPath:audioPath]) {
        [[NSFileManager defaultManager] removeItemAtPath:audioPath error:nil];
    }
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        if (exportSession.status == AVAssetExportSessionStatusFailed) {
            NSLog(@"failed");
        } else {
            NSLog(@"AudioLocation : %@", audioPath);
        }
    }];
}
Because the outputFileType is wrong. For a .mov file it is often @"com.apple.quicktime-movie", and the audio extracted this way will be in .mov format. To be sure, you can list the supported output types like this:
NSArray *supportedTypeArray = exportSession.supportedFileTypes;
for (NSString *str in supportedTypeArray) {
    NSLog(@"%@", str);
}
You can also export the audio in an audio format (.m4a, playable by AVAudioPlayer) using something like this:
AVAssetExportSession *exportSession = [AVAssetExportSession exportSessionWithAsset:self.mAsset presetName:AVAssetExportPresetAppleM4A];
exportSession.outputURL = myUrl;
exportSession.outputFileType = AVFileTypeAppleM4A;
exportSession.timeRange = timeRange;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    if (exportSession.status == AVAssetExportSessionStatusFailed) {
        NSLog(@"failed");
    }
}];
Just a small note for others (I did not know this):
If you have a video file (in my case it was mp4), you can play just its audio with AVAudioPlayer.
You don't need to extract (or convert) the audio from the video.
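A minimal sketch of what that looks like (the resource name is illustrative):
NSURL *movieURL = [[NSBundle mainBundle] URLForResource:@"VideoName" withExtension:@"mp4"];
NSError *error = nil;
// AVAudioPlayer plays just the audio track of the movie file
AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:movieURL error:&error];
[player prepareToPlay];
[player play];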