Problem playing sound on iPad device with MonoTouch - ios

I am using the following code to play a .CAF file in an iPad application via Monotouch:
public void PlayClick()
{
    PlaySound("Media/Click.caf");
}

private void PlaySound(string soundFile)
{
    //var mediaFile = NSUrl.FromFilename(soundFile);
    //var audioPlayer = new SystemSound(mediaFile);
    //audioPlayer.PlaySystemSound();
    var audioPlayer = new SystemSound(soundFile);
    if (audioPlayer != null)
    {
        audioPlayer.PlaySystemSound();
    }
}
It works fine in the simulator. In fact, all variations I've tried (SystemSound, AVAudioPlayer, etc.) appear to work in the simulator, but I haven't gotten any version to play on a real device yet. The sound files are all marked as Content, and I checked the bundle uploaded to the iPad: the files are definitely there in a subfolder named "Media". If I change the code to use SystemSound (via the constructor taking a Url), I get an InvalidOperationException with the details:
Could not create system sound ID for url file://localhost/private/var/mobile/Applications/AC24496E-12E9-4690-B154-BA1AD1123EDC/Sample.app/Media/Click.caf; error=SystemSoundUnspecified
Anyone know what am I doing wrong? Thanks for any pointers to get me past this issue!

The simulator's file system is case-insensitive (it inherits this from the Mac it runs on), but the file system on an actual device is case-sensitive. Make sure the folder and file casing in your code exactly matches the files in the bundle ("Media/Click.caf" is not the same as "media/click.caf" on the device). This has tripped up many people.
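One way to catch a casing or path mistake early is to resolve the file against the app bundle and fail loudly when it is missing, instead of getting the opaque SystemSoundUnspecified error. A minimal sketch based on the question's code (the exception choice is illustrative):

    private void PlaySound(string soundFile)
    {
        // File.Exists is case-sensitive on the device, so a casing
        // mistake surfaces here rather than deep inside AudioToolbox.
        var path = System.IO.Path.Combine(NSBundle.MainBundle.BundlePath, soundFile);
        if (!System.IO.File.Exists(path))
            throw new System.IO.FileNotFoundException("Sound file missing from bundle: " + path);

        var sound = new SystemSound(NSUrl.FromFilename(path));
        sound.PlaySystemSound();
    }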

Related

Mono audio output in iOS app when using a webRTC powered video call

The app I'm writing contains two parts:
An audio player that plays stereo MP3 files
Video conferencing using webRTC
Each part works perfectly in isolation, but the moment I try them together, one of two things happens:
The video conference audio fades out and we just hear the audio files (in stereo)
We get audio output from both, but the audio files are played in mono, coming out of both ears equally
My digging has taken me down a few routes:
https://developer.apple.com/forums/thread/90503
&
https://github.com/twilio/twilio-video-ios/issues/77
Which suggest that the issue could be with the audio session category, mode or options.
However, I've tried many of the combinations and am struggling to get anything working as intended.
Does anyone have a better understanding of the audio options to point me in the right direction?
My most recent combination:
class BBAudioClass {
    static private var audioCategory: AVAudioSession.Category = AVAudioSession.Category.playAndRecord
    static private var audioCategoryOptions: AVAudioSession.CategoryOptions = [
        AVAudioSession.CategoryOptions.mixWithOthers,
        AVAudioSession.CategoryOptions.allowBluetooth,
        AVAudioSession.CategoryOptions.allowAirPlay,
        AVAudioSession.CategoryOptions.allowBluetoothA2DP
    ]
    static private var audioMode = AVAudioSession.Mode.default

    static func setCategory() -> Void {
        do {
            let audioSession: AVAudioSession = AVAudioSession.sharedInstance()
            try audioSession.setCategory(
                BBAudioClass.audioCategory,
                mode: BBAudioClass.audioMode,
                options: BBAudioClass.audioCategoryOptions
            )
        } catch {
        }
    }
}
Update
I managed to get everything working as I wanted by:
Starting the audio session
Connecting to the video conference (at this point all audio is mono)
Forcing all output to the speaker
Forcing output back to the headphones
Obviously this is a crazy thing to have to do, but it does prove that it should work.
It would be great if anyone knew WHY this works, so that I can get things working properly the first time without going through all these hacky steps.
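For reference, the "force to speaker, then back to headphones" steps above can be expressed with AVAudioSession's output override. This sketch mirrors the workaround described, not a recommended fix:

    import AVFoundation

    func applySpeakerToggleWorkaround() {
        let session = AVAudioSession.sharedInstance()
        do {
            // Step 3: force all output to the built-in speaker...
            try session.overrideOutputAudioPort(.speaker)
            // Step 4: ...then remove the override so output returns to
            // whatever route the system would normally pick (headphones).
            try session.overrideOutputAudioPort(.none)
        } catch {
            print("Audio route override failed: \(error)")
        }
    }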

Play default iOS system sound with Monotouch

I want to play the keyboard 'click' sound when pressing buttons in my app.
How do I access this sound clip with Monotouch? I don't want to pass my own sound using AudioToolbox SystemSound.FromFile(). So far all my searches have led to this solution or Objective-C code using 'AudioServicesCreateSystemSoundID' which I'm having trouble translating to C#.
With iOS 7.0 and higher, Stephane Delcroix's answer no longer seems to work...
But you can easily find the path to all the sounds in this nice project:
https://github.com/TUNER88/iOSSystemSoundsLibrary
Here is the code I used under iOS 7 (take care, it might not work in the simulator!):
private const string NotificationSoundPath = @"/System/Library/Audio/UISounds/New/Fanfare.caf";

public static void TriggerSoundAndViber()
{
    SystemSound notificationSound = SystemSound.FromFile(NotificationSoundPath);
    notificationSound.AddSystemSoundCompletion(SystemSound.Vibrate.PlaySystemSound);
    notificationSound.PlaySystemSound();
}
Also, the using() construct in the answer above caused trouble in my case... it seems to release the sound too early. I could only hear it (and even then not the complete sound with vibration) with a breakpoint on PlaySystemSound().
Well, that's not a 1:1 port of the code in Playing system sound without importing your own, but this should do the work (note the bundle identifier is case-sensitive: "com.apple.UIKit"):
var path = NSBundle.FromIdentifier ("com.apple.UIKit").PathForResource ("Tock", "aiff");
using (var systemSound = new SystemSound (NSUrl.FromFilename (path))) {
    systemSound.PlaySystemSound ();
}
SystemSound is defined in MonoTouch.AudioToolbox. Make sure to also look at MonoTouch Play System Sound and MonoTouch: Playing sound
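If the using() block disposes of the sound before playback finishes (as reported above), an alternative is to keep the SystemSound alive for the lifetime of the screen and dispose of it on teardown. A sketch; the field and method names are illustrative:

    SystemSound clickSound;   // keep a reference so the sound isn't released mid-play

    void LoadClickSound()
    {
        var path = NSBundle.FromIdentifier("com.apple.UIKit").PathForResource("Tock", "aiff");
        clickSound = new SystemSound(NSUrl.FromFilename(path));
    }

    void PlayClick()
    {
        clickSound.PlaySystemSound();
    }

    // Dispose of clickSound when the owning view or controller goes away,
    // not immediately after each play.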

No sound on BlackBerry with openfl

I'm trying to use Haxe (OpenFL) for BlackBerry development.
I tested the PlayingSound sample and it works.
But when I try to load a sound from a URL, it doesn't work.
Here is my code:
public function PlaySong(url:String):Void {
    var _url:URLRequest = new URLRequest(url);
    if (_soundChannel != null) _soundChannel.stop();
    _song = new Sound();
    _song.load(_url); // <-- does not work
    //_song = Assets.getSound("assets/stars.mp3"); // <-- works
    _soundChannel = _song.play(0);
}
On the flash target this code plays my sound from the URL, but when I deploy the app to my device, there is no sound. On the device, the sound plays correctly only if I load it from the assets folder.
Also, I see that the sound channel's position is always 0 (on the device).
I also tried loading the sound with a loader first and then playing it once loading completed, but that didn't help either.
Help me, please.
PS Sorry for my English.
Have you tried loading it like this:
var loader:URLLoader = new URLLoader();
loader.dataFormat = URLLoaderDataFormat.BINARY;
loader.addEventListener(Event.COMPLETE, onComplete);
loader.load(new URLRequest(url));

function onComplete(e:Event):Void
{
    // loader.data is a ByteArray when dataFormat is BINARY
    sound.loadCompressedDataFromByteArray(loader.data, loader.data.length);
}
Try to load the bytes first, then create the sound from them.
Anyway, if your code works on other mobile devices (or emulators), then open a new issue here:
https://github.com/openfl/openfl

Native video in Air on iOS stopping sound and microphone

I'm trying to play a native video in a StageWebView in an Air for iPad app that plays sound and requires user interaction via the microphone.
Everything seems to work, but when I start playing the video, sound and microphone stop.
If I dispose the StageWebView, sound and mic get back but only after 15 seconds (and I need sound and mic to work at least straight after the StageWebView is released).
I tried to get this working on an iOS 5 iPad 1 and an iOS 6 iPad 2, using AIR 3.4, 3.5 and 3.6 beta. I tried switching the iPad's mute button, and I also tried changing SoundMixer.audioPlaybackMode to Media and Ambient.
But it didn't work and I'm stuck.
Here is my code that deals with the microphone :
var microphone:Microphone = Microphone.getMicrophone();
microphone.addEventListener(SampleDataEvent.SAMPLE_DATA, micHandler);

private function micHandler(event : SampleDataEvent) : void {
    trace("mic is working!");
}
For the audio :
_snd = new Sound();
_snd.load(new URLRequest(path));
_sndChannel = _snd.play();
private function soundStopHandler(event : MouseEvent) : void {
    if (_sndChannel) _sndChannel.stop();
}
And for the video player :
_webview = new StageWebView();
_webview.stage = stage;
_webview.viewPort = new Rectangle(10, 120, 480, 300);
_webview.loadURL(path);
private function videoStopHandler(event : MouseEvent) : void {
    if (_webview) {
        _webview.dispose();
        _webview = null;
    }
}
Has anyone faced this problem before? Is there anything I forgot or did wrong?
This problem may be related to iOS itself.
From an iOS perspective, as far as I know, this is officially not possible with the default session configuration: a playback session and a recording session cannot occupy the audio hardware at the same time. To play back and record simultaneously, the audio session must be configured for both.
Refer to Apple's documentation: Audio Session
AVAudioSessionCategoryPlayAndRecord or the equivalent kAudioSessionCategory_PlayAndRecord—Use this category for an application that inputs and outputs audio. The input and output need not occur simultaneously, but can if needed. This is the category to use for audio chat applications.
If there is no problem in your AIR code, this is most likely the reason.
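AIR does not expose the audio session category directly, but SoundMixer.audioPlaybackMode gives a coarse mapping onto it. A hedged sketch: to my understanding, AudioPlaybackMode.VOICE is the mode intended for apps that record and play at once and corresponds to a play-and-record session on iOS (the question only mentions trying Media and Ambient):

    import flash.media.SoundMixer;
    import flash.media.AudioPlaybackMode;

    // VOICE is meant for voice-chat-style apps that use the
    // microphone and speaker simultaneously.
    SoundMixer.audioPlaybackMode = AudioPlaybackMode.VOICE;
    // Route voice audio to the loudspeaker rather than the receiver.
    SoundMixer.useSpeakerphoneForVoice = true;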

PhoneGap Media API cannot overwrite recorded audio on iPhone device

I have an audio file 'default.wav' in the 'www/audio/' folder. In the iOS simulator, I can record audio, overwrite 'www/audio/default.wav', and play it. However, on a real device, 'default.wav' cannot be overwritten: the device always plays the unchanged audio file no matter how many times I record.
Could anyone please tell me how to save the newly recorded audio, or overwrite 'default.wav' in the 'www/audio/' folder, on the device? I am using PhoneGap 1.3.0 on an iPhone running iOS 5.
function recordAudio() {
    var src = "audio/default.wav";
    mediaRec = new Media(src, onSuccess, onError);
    mediaRec.startRecord();
}

function stopRecordAudio() {
    mediaRec.stopRecord();
    mediaRec.release();
    alert('stopRecordAudio');
}

function playAudio() {
    var src = "audio/default.wav";
    mediaPlay = new Media(src, onSuccessPlay, onErrorPlay);
    mediaPlay.play();
}
You can't overwrite files in the www directory; these are part of your application bundle, which is read-only on the device (the simulator is more permissive, which is why it appears to work there). Record to a writable location, such as the app's Documents directory, instead.
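In later PhoneGap/Cordova versions, the Media plugin on iOS accepts a "documents://" prefix to address the app's writable Documents directory; whether PhoneGap 1.3.0 already supports it is an assumption to verify. A sketch of recording to a writable file instead of www:

    // Record to the writable Documents directory instead of the
    // read-only www folder. The "documents://" prefix is an
    // iOS-specific Media plugin convention (verify support in your
    // PhoneGap version).
    var src = "documents://recording.wav";
    var mediaRec = new Media(src, onSuccess, onError);

    function recordAudio() {
        mediaRec.startRecord();
    }

    function playAudio() {
        // Play back the same writable file that was just recorded.
        new Media(src, onSuccessPlay, onErrorPlay).play();
    }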
