iOS 10 MSMessageTemplateLayout: Which audio format? - imessage

I am currently trying to build an iMessage app that should send a sound to another user. I set up a little user interface and call this code to send it to the other user:
MSMessage *newMessage = [[MSMessage alloc] init];
[newMessage setURL:[NSURL URLWithString:@"https://my.app/kiss.m4a"]];
MSMessageTemplateLayout *layout = [[MSMessageTemplateLayout alloc] init];
layout.mediaFileURL = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"kiss" ofType:@"m4a"]];
layout.trailingCaption = @"Kiss";
newMessage.layout = layout;
[self.activeConversation insertMessage:newMessage localizedChangeDescription:nil completionHandler:^(NSError * _Nullable error) {
}];
It will then create a message with the trailing caption, but so far no sound. I have tried several formats including MP3, M4A, AAC and AMR. None of them seem to work and I could not find any documentation. Is there any information on which sound formats are supported?
Thanks,
Philip

Use insertAttachment:withAlternateFilename:completionHandler: to send audio.
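A minimal sketch of that approach (the "kiss"/"m4a" resource name is reused from the question, and the error logging is only illustrative):
NSURL *audioURL = [[NSBundle mainBundle] URLForResource:@"kiss" withExtension:@"m4a"];
[self.activeConversation insertAttachment:audioURL
                    withAlternateFilename:@"Kiss.m4a"
                        completionHandler:^(NSError * _Nullable error) {
    if (error != nil) {
        NSLog(@"Could not insert attachment: %@", error);
    }
}];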

Related

AudioKit AKTimePitch does not work for the recorded file example.m4a

I am building an iOS app with AudioKit (version 4.5.3), and I find that the AKTimePitch class does not work for me. Here is my code (Objective-C, Xcode 10):
- (IBAction)startButton:(id)sender {
    NSURL *url = [[NSBundle mainBundle] URLForResource:@"burncalory" withExtension:@"m4a"];
    AKAudioFile *file = [[AKAudioFile alloc] initForReading:url error:nil];
    AKAudioPlayer *player = [[AKAudioPlayer alloc] initWithFile:file looping:NO lazyBuffering:YES error:nil completionHandler:^{
        NSLog(@"Finished!");
    }];
    AKTimePitch *akTimePitch = [[AKTimePitch alloc] init:player rate:2.0 pitch:1600 overlap:8];
    AudioKit.output = akTimePitch;
    [akTimePitch start];
    [AudioKit startAndReturnError:nil];
    [player playFrom:0.0];
}
I checked the playgrounds (4.5.3), and the "Time Stretching and Pitch Shifting" sample works well.
Is there something wrong with how my code uses AKTimePitch, or something wrong with my audio file example.m4a? By the way, this audio file can be loaded and played fine by AKAudioPlayer.
After some testing I found that the parameters passed to the init method have no effect, but after I add akTimePitch.pitch = 1600 before [player playFrom:0.0], the AKTimePitch effect works! I don't know why AKTimePitch *akTimePitch = [[AKTimePitch alloc] init:player rate:2.0 pitch:1600 overlap:8]; on its own just does not work...
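Based on that finding, a hedged workaround (only what the tests above suggest, not an official fix, and it assumes rate is exposed as a settable property the same way pitch is) is to re-apply the parameters as properties after construction:
AKTimePitch *akTimePitch = [[AKTimePitch alloc] init:player rate:2.0 pitch:1600 overlap:8];
// The init parameters appear to be ignored, so set them again as properties
// before starting playback.
akTimePitch.rate = 2.0;
akTimePitch.pitch = 1600;
AudioKit.output = akTimePitch;
[akTimePitch start];
[AudioKit startAndReturnError:nil];
[player playFrom:0.0];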

Audio doesn't play on a few iOS devices

I have developed an app that plays audio files with some voice recordings. When I debug it I have no problems, and when I download it from the App Store it works perfectly; my friends use it and they haven't had any problems. But a few people from all over the world have contacted me to tell me that the app produces no sound.
It is very strange because they tell me that the bell (MP3, 44100 Hz, mono, 128 kbps) that plays first does sound, but the voices (MP3, 44100 Hz, stereo, 96 kbps) don't. The people who contact me have different device models and different versions of iOS 6.
I use AVAudioPlayer to play the files and I think it works correctly.
Have you experienced the same problem?
Thank you
UPDATE
I load the file like this, using the Localizable.strings shown below:
NSString *fileLang = NSLocalizedString(aItem.fileName, nil);
//more code ...
thePlayerURL = [[NSBundle mainBundle] URLForResource:fileLang withExtension:@"mp3"];
//more code ...
AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL: aItem.urlAudio error:&error];
player.delegate = self;
[player prepareToPlay];
This is an example of the Localizable.strings with the file keys and values; by default, the name of the key is the Spanish version of the file:
//AUDIO NAMES
"1_au_es" = "1_au_en";
"2_au_es" = "2_au_en";
"3_au_es" = "3_au_en";
"4_au_es" = "4_au_en";
"5_au_es" = "5_au_en";
"6_au_es" = "6_au_en";
"7_au_es" = "7_au_en";
"8_au_es" = "8_au_en";
"9_au_es" = "9_au_en";
"10_au_es" = "10_au_en";
"11_au_es" = "11_au_en";
"12_au_es" = "12_au_en";
"13_au_es" = "13_au_en";
"14_au_es" = "14_au_en";
"15_au_es" = "15_au_en";
"16_au_es" = "16_au_en";
"17_au_es" = "17_au_en";
"18_au_es" = "18_au_en";
"19_au_es" = "19_au_en";
Have you made sure that their ring/silent switch isn't flipped to silent? I wrote a translation app where users had simply muted the device with that switch. It seems simple, but I've received almost 15 emails for that alone, saying the app had no audio.
It seems simple, but that is the easiest and quickest fix: silent mode turns off all audio, even for apps.
My translation app used AVAudioPlayer, and there was nothing programmatically wrong; it was simply user error.
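If the app should keep playing even when the ring/silent switch is flipped, a common option (a minimal sketch, not something from my app) is to opt into the Playback audio session category, which is not silenced by the switch the way the default SoloAmbient category is:
NSError *sessionError = nil;
// AVAudioSessionCategoryPlayback keeps output audible with the silent switch on.
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:&sessionError];
[[AVAudioSession sharedInstance] setActive:YES error:&sessionError];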
Maybe the audio bitrate could be the problem? I really don't know...
The bell, which works fine, is 128 kbps, and the voice files are 96 kbps and sometimes don't play. The strange thing is that the audio works perfectly for about 90% of users but sometimes fails; the bell works 100% of the time.
:/
It is very difficult to find the precise solution to this problem without checking your Xcode project configuration. However, from the information you mentioned in the comments, I suspect that the problem is related to the localization strings. Check your localization configuration, check that you are using the localization strings files correctly, and check that you wrote a valid filename for each supported localization.
Also, checking your code:
NSString *fileLang = NSLocalizedString(aItem.fileName, nil);
//more code ...
thePlayerURL = [[NSBundle mainBundle] URLForResource:fileLang withExtension:@"mp3"];
//more code ...
AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL: aItem.urlAudio error:&error];
player.delegate = self;
[player prepareToPlay];
Why are you not using thePlayerURL? Maybe that is the problem, since that is the localized URL. I would expect:
NSString *fileLang = NSLocalizedString(aItem.fileName, nil);
//more code ...
thePlayerURL = [[NSBundle mainBundle] URLForResource:fileLang withExtension:@"mp3"];
//more code ...
AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:thePlayerURL error:&error];
player.delegate = self;
[player prepareToPlay];
I recently submitted an application that was having the same problems, but it was rejected because I hadn't set up "Inter-App Audio" in Xcode and in the Certificates pane under the app info. That may help.
In case you are wondering, the application was in violation of Section 2.13 of the App Store Guidelines.

Start playing SoundCloud audio stream in iOS by using a pre-buffer

Using the following code example from the SoundCloud developers page, the AVAudioPlayer will start playing after the SCRequest response has been received. Depending on the size of the requested file, this might take some time. Does the iOS SoundCloud API offer a pre-buffering solution, so that it is possible to start playing audio before all data has been received, or do I need to implement my own solution with the help of NSURLConnection to achieve this?
- (void)tableView:(UITableView *)tableView didSelectRowAtIndexPath:(NSIndexPath *)indexPath
{
NSDictionary *track = [self.tracks objectAtIndex:indexPath.row];
NSString *streamURL = [track objectForKey:@"stream_url"];
SCAccount *account = [SCSoundCloud account];
[SCRequest performMethod:SCRequestMethodGET
onResource:[NSURL URLWithString:streamURL]
usingParameters:nil
withAccount:account
sendingProgressHandler:nil
responseHandler:^(NSURLResponse *response, NSData *data, NSError *error) {
NSError *playerError;
player = [[AVAudioPlayer alloc] initWithData:data error:&playerError];
[player prepareToPlay];
[player play];
}];
}
To stream tracks from SoundCloud, all you need to do is pass the URL to AVPlayer with the client id:
NSString *urlString = [NSString stringWithFormat:@"%@?client_id=%@", track.streamURL, self.clientId];
player = [[AVPlayer alloc] initWithURL:[NSURL URLWithString:urlString]];
[player play];
It will take a bit of extra work to make it a progressive download. Hope that helps.
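If you want to wait until enough of the stream has buffered before starting, one rough sketch (reusing the player and urlString variables above, and relying on KVO against AVPlayerItem's playbackLikelyToKeepUp property) looks like this:
AVPlayerItem *item = [AVPlayerItem playerItemWithURL:[NSURL URLWithString:urlString]];
player = [AVPlayer playerWithPlayerItem:item];
// Observe the buffering state instead of calling play immediately.
[item addObserver:self forKeyPath:@"playbackLikelyToKeepUp"
          options:NSKeyValueObservingOptionNew context:NULL];

// Elsewhere in the same class:
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context
{
    if ([keyPath isEqualToString:@"playbackLikelyToKeepUp"] &&
        [(AVPlayerItem *)object isPlaybackLikelyToKeepUp]) {
        [player play];
        [object removeObserver:self forKeyPath:@"playbackLikelyToKeepUp"];
    }
}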
afaik there is no pre-buffer solution at the moment. If you want to contribute to our SoundCloud API we'd love to review a Pull Request from you regarding this feature.
This will probably affect CocoaSoundCloudAPI and OAuth2Client.
Happy Coding!

AVAudioPlayer Stops Before End of File

I have a function that is almost working. A different audio file is played depending on what page you're on. The problem is, some of the audio files end abruptly. For example, while the audio file plays to the end on "case 1", on "case 2" it stops about 90% in.
- (void)playAudio
{
NSURL *audioURL;
[voiceAudio release];
switch (pageNumber)
{
case 1:
audioURL = [[NSURL alloc] initFileURLWithPath:[[NSBundle mainBundle] pathForResource:@"file1" ofType:@"aac"]];
voiceAudio = [[AVAudioPlayer alloc]
initWithContentsOfURL:audioURL error:nil];
[audioURL release];
voiceAudio.delegate = self;
[voiceAudio play];
break;
case 2:
audioURL = [[NSURL alloc] initFileURLWithPath:[[NSBundle mainBundle] pathForResource:@"file2" ofType:@"aac"]];
voiceAudio = [[AVAudioPlayer alloc]
initWithContentsOfURL:audioURL error:nil];
[audioURL release];
voiceAudio.delegate = self;
[voiceAudio play];
break;
// And so on...
}
}
The AACs are a few minutes long. Maybe there's a better way to go about this? AVAudioPlayer can be a bit funky. Thanks!
Edit: AVAudioPlayer is a bit unpredictable with the 64 kbps mono AACs I made in Adobe Soundbooth, but it seems to run fine after remuxing the files to M4A. I used MP4Box to convert them with the syntax: "mp4box -add source.aac:mpeg4 -sbr -ipod target.m4a" (credit)
Re-encoding to a higher bitrate (128 kbps) also seemed to fix some, but not all, of the files. I didn't test this thoroughly before I discovered remuxing.
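If anyone hits the same thing, a small diagnostic sketch (the delegate is already assigned in the code above, so these are just extra AVAudioPlayerDelegate methods) can surface decode problems instead of letting playback silently cut off:
- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player successfully:(BOOL)flag
{
    // flag == NO means playback stopped because of a decoding error.
    NSLog(@"Finished (success: %d) after %f of %f seconds", flag, player.currentTime, player.duration);
}

- (void)audioPlayerDecodeErrorDidOccur:(AVAudioPlayer *)player error:(NSError *)error
{
    NSLog(@"Decode error: %@", error);
}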

iOS - Create an NSURL from an NSFileHandle or NSPipe

I am receiving a stream of encoded audio over the network, but there is a bunch of other data mixed in, so I need to receive the packets, strip out the audio, then play it.
AVAudioPlayer, which may or may not be the best tool for this but is the path I'm currently chasing, wants its data from NSData or an NSURL. NSData won't work because the audio is a stream and I want it to start playing as soon as it arrives and continue playing. My thought was:
NSPipe *pipe = [[NSPipe alloc] init];
NSFileHandle *writeHandle = [pipe fileHandleForWriting];
NSFileHandle *readHandle = [pipe fileHandleForReading];
// in network reception thread...
NSData *audioData = [packet getAudioData];
[writeHandle writeData:audioData];
// in audio thread...
NSURL *url = [[NSURL alloc] init];
[url setResourceValue:NSURLFileResourceTypeNamedPipe
forKey:NSURLFileResourceTypeKey
error:&error];
// Connect the readHandle
AVAudioPlayer *audioPlayer = [[AVAudioPlayer alloc]
initWithContentsOfURL:url
error:&error];
However, I don't know how to pass the `readHandle` into the URL. How do I create an NSURL from an existing NSFileHandle? Is there some better approach for this entirely? Is there a way to write data into something that can become an NSURL?
The only real requirement is that I can play the audio as near real time as possible. I don't want to queue up data for even a tenth of a second before it gets played.
It is not possible to retrieve an NSFileHandle's URL because not all file handles have a corresponding URL; your pipe would appear to be one such case.
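As an alternative to the NSURL route, a rough sketch of a buffer-based approach (iOS 8+; it assumes each network packet is decoded to PCM elsewhere, for example with Audio File Stream Services or AVAudioConverter, and the 44.1 kHz stereo format is only a placeholder) schedules buffers on an AVAudioPlayerNode as they arrive, which keeps latency close to the size of a single buffer:
AVAudioEngine *engine = [[AVAudioEngine alloc] init];
AVAudioPlayerNode *playerNode = [[AVAudioPlayerNode alloc] init];
AVAudioFormat *format = [[AVAudioFormat alloc] initStandardFormatWithSampleRate:44100.0 channels:2];

[engine attachNode:playerNode];
[engine connect:playerNode to:engine.mainMixerNode format:format];

NSError *error = nil;
[engine startAndReturnError:&error];
[playerNode play];

// In the network reception thread, for each chunk decoded to PCM:
AVAudioPCMBuffer *buffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:format frameCapacity:4096];
buffer.frameLength = 4096; // fill buffer.floatChannelData with the decoded samples
[playerNode scheduleBuffer:buffer completionHandler:nil];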
