Since Apple announced that every app has to support 64-bit starting February 1st, I can't use Dirac3LE anymore. So I found Superpowered, which seems to do the same thing. The only problem I see so far is that I can't get it to play songs from the iPod library.
I've tried importing the song via AVAssetExportSession but can't get it to work. This is the code I've tried so far:
NSURL *url = [playingSong valueForProperty:MPMediaItemPropertyAssetURL];
AVURLAsset *songAsset = [AVURLAsset URLAssetWithURL:url options:nil];
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset: songAsset presetName: AVAssetExportPresetPassthrough];
exporter.outputFileType = @"com.apple.coreaudio-format";
NSString *fname = [[NSString stringWithFormat:@"tmp"] stringByAppendingString:@".mp3"];
NSString *tempPath = NSTemporaryDirectory();
NSString *exportFile = [tempPath stringByAppendingPathComponent: fname];
exporter.outputURL = [NSURL fileURLWithPath:exportFile];
[exporter exportAsynchronouslyWithCompletionHandler:^{
self.player = [[SuperpoweredAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:exportFile]];
[self.player play];
}];
I've also tried opening the file directly with:
player->open([[url path] fileSystemRepresentation]);
Even if this worked, I'm concerned it wouldn't be fast enough for a music player, since each song would have to be imported as soon as the previous one finished.
Are there any other options?
Thanks!
If you have an MPMediaItem *item obtained from the iTunes library, you can use:
player->open([[item assetURL].absoluteString UTF8String]);
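For context, a minimal sketch of that flow, pulling the first song from the library; the query and nil-check are only illustrative, and the player pointer is assumed to be the same Superpowered player used in the question (not verified against a specific Superpowered version):
// Assumes an Objective-C++ (.mm) file so the C++ player pointer compiles.
#import <MediaPlayer/MediaPlayer.h>
MPMediaQuery *query = [MPMediaQuery songsQuery];
MPMediaItem *item = [[query items] firstObject];
NSURL *assetURL = [item valueForProperty:MPMediaItemPropertyAssetURL];
if (assetURL) {
    // No AVAssetExportSession round trip needed: pass the ipod-library:// URL directly.
    player->open([assetURL.absoluteString UTF8String]);
} else {
    NSLog(@"No asset URL (item may be DRM-protected or not downloaded locally).");
}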
Related
I have to compress a video in my app and play the compressed video in MPMoviePlayerController. I am using AVAssetExportSession to compress the video; after compressing I get an output URL, but when I play that URL I get this error: 'NSInternalInconsistencyException', reason: 'property storage cannot find expected per-thread storage data'.
Here is the code
[[NSFileManager defaultManager] removeItemAtURL:outputURL error:nil];
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:inputURL options:nil];
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetMediumQuality];
exportSession.outputURL = outputURL;
exportSession.outputFileType = AVFileTypeQuickTimeMovie;
[exportSession exportAsynchronouslyWithCompletionHandler:^(void) {
    handler(exportSession);
}];
What should I do to resolve this issue? Any help would be appreciated.
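One guess, since the playback code isn't shown: exportAsynchronouslyWithCompletionHandler: calls its block on a background thread, and driving UI or MPMoviePlayerController from there can raise exactly this kind of per-thread-storage exception. A rough sketch of hopping back to the main queue before touching the player (the handler name is the one from the snippet above):
[exportSession exportAsynchronouslyWithCompletionHandler:^(void) {
    // The completion block runs on an arbitrary background queue;
    // do all UI / MPMoviePlayerController work back on the main queue.
    dispatch_async(dispatch_get_main_queue(), ^{
        if (exportSession.status == AVAssetExportSessionStatusCompleted) {
            handler(exportSession);
        } else {
            NSLog(@"export failed: %@", exportSession.error);
        }
    });
}];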
Most of you probably use WhatsApp and know that when you send a video to one of your contacts, WhatsApp first compresses the video to a maximum size of 16 MB and only afterwards uploads it.
What I am trying to do is simply the same thing using AV Foundation, or to be more specific, AVAssetExportSession.
Here is my code:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
NSURL *videoURL = info[UIImagePickerControllerMediaURL];
NSString *documentsDirectory = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSString *sourcePath = [documentsDirectory stringByAppendingString:@"/output.mov"];
NSURL *outputURL = [NSURL fileURLWithPath:sourcePath];
[self convertVideoToLowQuailtyWithInputURL:videoURL outputURL:outputURL handler:^(AVAssetExportSession *exportSession)
{
if (exportSession.status == AVAssetExportSessionStatusCompleted)
{
NSLog(#"completed");
}
else
{
NSLog(#"error: %#",exportSession.error);
}
}];
[picker dismissViewControllerAnimated:YES completion:NULL];
}
- (void)convertVideoToLowQuailtyWithInputURL:(NSURL*)inputURL outputURL:(NSURL*)outputURL handler:(void (^)(AVAssetExportSession*))handler
{
[[NSFileManager defaultManager] removeItemAtURL:outputURL error:nil];
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:inputURL options:nil];
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetMediumQuality];
exportSession.outputURL = outputURL;
exportSession.outputFileType = AVFileTypeQuickTimeMovie;
[exportSession exportAsynchronouslyWithCompletionHandler:^(void)
{
handler(exportSession);
}];
}
This code works wonderfully; it takes the video and compresses it to a really small size.
The problem is that when the user uploads a fairly long video shot with a powerful camera, the result is still not small enough for me.
What I actually want is to compress any video to a limited size, let's say 16 MB, like WhatsApp does. How can I do that?
There doesn't seem to be an easy way, but AVAssetExportSession has an estimatedOutputFileLength property that could help.
In my code I iterate over different quality presets and check whether the estimated file size is within the size I want:
NSURL * inputURL = [NSURL fileURLWithPath:path];
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:inputURL options:nil];
AVAssetExportSession *exportSession = nil;
for (NSString *string in @[AVAssetExportPresetHighestQuality, AVAssetExportPresetMediumQuality, AVAssetExportPresetLowQuality]) {
exportSession = [[AVAssetExportSession alloc] initWithAsset:asset presetName:string];
exportSession.shouldOptimizeForNetworkUse = YES;
exportSession.outputFileType = AVFileTypeMPEG4;
exportSession.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);
unsigned long long expectedFileSize = exportSession.estimatedOutputFileLength;
if (expectedFileSize < VIDE_LIMIT) {
break;
}
}
//Temp file
NSString *fileName = [NSString stringWithFormat:@"%@_%@", [[NSProcessInfo processInfo] globallyUniqueString], @"video.mov"];
NSString * filePath = [NSTemporaryDirectory() stringByAppendingPathComponent:fileName];
NSURL *fileURL = [NSURL fileURLWithPath:filePath];
exportSession.outputURL = fileURL;
[exportSession exportAsynchronouslyWithCompletionHandler:^(void) {
    // check exportSession.status and exportSession.error here
}];
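Since estimatedOutputFileLength is only an estimate, it is worth checking the real size on disk once the export finishes. A rough sketch of such a handler, reusing the variables above (the NSURLFileSizeKey lookup is just one way to read the size):
[exportSession exportAsynchronouslyWithCompletionHandler:^(void) {
    if (exportSession.status == AVAssetExportSessionStatusCompleted) {
        NSNumber *actualSize = nil;
        [fileURL getResourceValue:&actualSize forKey:NSURLFileSizeKey error:nil];
        NSLog(@"exported %@ bytes (estimate was %lld)",
              actualSize, exportSession.estimatedOutputFileLength);
    } else {
        NSLog(@"export failed: %@", exportSession.error);
    }
}];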
I am trying to play a local video from my app using AVPlayer. I have looked everywhere but can't find any useful tutorials/demos/documentation. Here is what I am trying, but it is not playing. Any idea why? The URL is valid because I was using the same one to play a video using MPMoviePlayer successfully.
Video *currentVideo = [videoArray objectAtIndex:0];
NSString *filepath = currentVideo.videoURL;
NSURL *fileURL = [NSURL fileURLWithPath:filepath];
AVURLAsset *asset = [AVURLAsset assetWithURL: fileURL];
AVPlayerItem *item = [AVPlayerItem playerItemWithAsset: asset];
self.player = [[AVPlayer alloc] initWithPlayerItem: item];
AVPlayerLayer *layer = [AVPlayerLayer playerLayerWithPlayer:self.player];
self.player.actionAtItemEnd = AVPlayerActionAtItemEndNone;
layer.frame = CGRectMake(0, 0, 1024, 768);
[self.view.layer addSublayer: layer];
[self.player play];
I believe the problem is that you're assuming that after executing this line AVURLAsset *asset = [AVURLAsset assetWithURL: fileURL]; the asset is ready to use. This is not the case as noted in the AV Foundation Programming Guide:
You create an asset from a URL using AVURLAsset. Creating the asset, however,
does not necessarily mean that it’s ready for use. To be used, an asset must
have loaded its tracks.
After creating the asset, try loading its tracks:
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:fileURL options:nil];
NSString *tracksKey = @"tracks";
[asset loadValuesAsynchronouslyForKeys:@[tracksKey] completionHandler:
^{
NSError *error;
AVKeyValueStatus status = [asset statusOfValueForKey:tracksKey error:&error];
if (status == AVKeyValueStatusLoaded) {
// At this point you know the asset is ready
self.playerItem = [AVPlayerItem playerItemWithAsset:asset];
...
}
}];
Please refer to this link to see the complete example.
Hope this helps!
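To connect that back to the code in the question, the player and layer setup would then move inside the completion block, with the UI work dispatched back to the main queue. A rough sketch reusing the names from the question (assumed to run inside the same view controller):
[asset loadValuesAsynchronouslyForKeys:@[@"tracks"] completionHandler:^{
    NSError *error = nil;
    if ([asset statusOfValueForKey:@"tracks" error:&error] != AVKeyValueStatusLoaded) {
        NSLog(@"tracks failed to load: %@", error);
        return;
    }
    dispatch_async(dispatch_get_main_queue(), ^{
        AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:asset];
        self.player = [[AVPlayer alloc] initWithPlayerItem:item];
        self.player.actionAtItemEnd = AVPlayerActionAtItemEndNone;
        AVPlayerLayer *layer = [AVPlayerLayer playerLayerWithPlayer:self.player];
        layer.frame = self.view.bounds;
        [self.view.layer addSublayer:layer];
        [self.player play];
    });
}];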
I want to play a sound file (which I have dragged into Xcode and copied into the project) using AV Foundation with the following code, but it fails.
I think NSURL *url = [[NSURL alloc] initWithString:@"sound.caf"]; is where it goes wrong, but I don't know any other way to instantiate an AVAsset with this sound file (of course, the problem could be somewhere else). Can someone offer me some help? Thanks.
AVMutableComposition *composition = [AVMutableComposition composition];
CMPersistentTrackID trackID = kCMPersistentTrackID_Invalid;
AVMutableCompositionTrack *compositionTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:trackID];
NSURL *url = [[NSURL alloc] initWithString:@"sound.caf"];
AVAsset *songAsset = [AVURLAsset URLAssetWithURL:url options:nil];
AVAssetTrack *assetTrack = [[songAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
CMTime startTime = CMTimeMakeWithSeconds(0, 1);
CMTime endTime = songAsset.duration;
CMTimeRange tRange = CMTimeRangeMake(startTime, endTime);
NSError *error = nil;
[compositionTrack insertTimeRange:tRange ofTrack:assetTrack atTime:CMTimeMake(0, 44100) error:&error];
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:composition];
self.player = [[AVPlayer alloc] initWithPlayerItem:playerItem];
[self.player play];
This code should help you:
NSString *soundPath = [[NSBundle mainBundle] pathForResource:@"myfile" ofType:@"wav"];
NSURL *soundURL = [NSURL fileURLWithPath:soundPath];
NSError *error = nil;
AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:soundURL error:&error];
if (error) {
NSLog(#"%#",[error localizedDescription]);
}
[player play];
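If you specifically need an AVAsset for the composition approach in the question, the same idea applies: build a file URL from the app bundle rather than using initWithString:. A small sketch, assuming sound.caf was copied into the app bundle:
NSURL *url = [[NSBundle mainBundle] URLForResource:@"sound" withExtension:@"caf"];
AVURLAsset *songAsset = [AVURLAsset URLAssetWithURL:url options:nil];
// the rest of the composition code from the question can stay unchanged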
Try this
NSString *shutterplayerPath =
    [[NSBundle mainBundle] pathForResource:@"shutter" ofType:@"mp3"];
NSString *tickplayerPath =
    [[NSBundle mainBundle] pathForResource:@"tick" ofType:@"wav"];
shutterAudioPlayer =
    [[AVAudioPlayer alloc] initWithContentsOfURL:[[NSURL alloc]
        initFileURLWithPath:shutterplayerPath] error:NULL];
tickAudioPlayer =
    [[AVAudioPlayer alloc] initWithContentsOfURL:[[NSURL alloc]
        initFileURLWithPath:tickplayerPath] error:NULL];
[shutterAudioPlayer play];
[tickAudioPlayer play];
I have an AVAudioPlayer playing an MP3 file, but the audio stutters right at the start of playback.
This is my code:
NSString *geluid = [NSString stringWithFormat:@"RG_%@", nummer];
NSString *path = [[NSBundle mainBundle] pathForResource:geluid ofType:@"mp3"];
AVAudioPlayer* theAudio=[[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:path] error:NULL];
theAudio.delegate=self;
[theAudio prepareToPlay];
[theAudio play];
NSURL *url = [NSURL fileURLWithPath:path];
avplayer = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:nil];
[avplayer prepareToPlay];
[avplayer play];
For example, you hear "Ssstop here" instead of "Stop here".
Please help me fix it.
Is this code running two AVAudioPlayer instances at the same time? Why are you using both avplayer and theAudio? From the look of it, you should get rid of one of them...
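A minimal sketch of keeping a single, prepared player around (assuming avplayer is a strong property on the class), which usually removes the stutter at the start:
// Create and prepare the player once, e.g. in viewDidLoad, well before it is needed.
NSString *path = [[NSBundle mainBundle] pathForResource:geluid ofType:@"mp3"];
self.avplayer = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:path] error:nil];
self.avplayer.delegate = self;
[self.avplayer prepareToPlay]; // pre-buffers the file so playback starts cleanly

// Later, at the moment the sound should actually start:
[self.avplayer play];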