I have an app that plays a sound file every time the screen is touched. For some reason, the app will crash every once in a while with the following error:
reason: 'Resource tick.mp3 can not be loaded'
In case you need it, here is how I play the file each time the screen is tapped:
runAction(SKAction.playSoundFileNamed("tick.mp3", waitForCompletion: false))
This does not happen very often, maybe 1 in 10 runs of the app. Most of the time everything works as expected. I wish I knew what I am doing to cause the crash but I have no clue! I am just tapping away seemingly no different than the times when it doesn't crash. Then all of a sudden I get this issue...
If you play the sound via a playSound function, it will work:

var soundFile = SKAction.playSoundFileNamed("bark.wav", waitForCompletion: false)
playSound(soundFile)

playSound:

func playSound(_ soundVariable: SKAction) {
    runAction(soundVariable)
}
First of all, it looks like you are using an mp3 file to play (short) sound effects. mp3 audio is compressed, so in memory it will have a different, larger size. There is also a decoding performance penalty (decoding takes CPU time). The most important thing, and the reason I bring up mp3 files, can be found in the docs:
When using hardware-assisted decoding, the device can play only a
single instance of one of the supported formats at a time. For
example, if you are playing a stereo MP3 sound using the hardware
codec, a second simultaneous MP3 sound will use software decoding.
Similarly, you cannot simultaneously play an AAC and an ALAC sound
using hardware. If the iPod application is playing an AAC or MP3 sound
in the background, it has claimed the hardware codec; your application
then plays AAC, ALAC, and MP3 audio using software decoding.
As you can see, the problem is that only one mp3 file at a time can be played using the hardware codec. If you play more than one mp3 at a time, the extra ones are decoded in software, and that is slow.
So, I would recommend using .wav or .caf files for sound effects; mp3 is probably fine for background music.
About the crashing issue:
try to use .wav or .caf files instead of .mp3
try to hold a strong reference to the SKAction and reuse it, as suggested by Reece Kenney.
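A minimal sketch of the second suggestion, caching the SKAction once and reusing it instead of recreating it on every tap (the scene subclass, file name, and the modern run(_:) spelling of runAction are my assumptions):

```swift
import SpriteKit

class GameScene: SKScene {
    // Created once; playSoundFileNamed preloads the file, so reusing this
    // stored action avoids re-resolving the resource on every touch.
    let tickSound = SKAction.playSoundFileNamed("tick.wav", waitForCompletion: false)

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        run(tickSound)  // run(_:) replaced runAction(_:) in Swift 3
    }
}
```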
Related
The main source of my worries is the m3u8 manifests I use to play video in my app.
They contain variants with audio and video, and some containing only audio.
The problem occurs when I load this type of manifest into AVPlayerViewController/AVPlayer while on a bad network or with low bandwidth.
In every case, AVPlayer switches to the tracks with no video to save bandwidth, resulting in no image appearing in the player.
I would like to know if there is a way to force AVPlayer or AVPlayerViewController to only consider the manifest tracks containing both audio and video, and to forbid audio-only tracks from being selected, loaded, and played.
Thank you in advance for any tips or pointers you can give me on this subject.
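I am not aware of a public AVPlayer switch that forbids audio-only variants outright, so one workaround is to fetch the master playlist yourself, strip the EXT-X-STREAM-INF entries that carry no RESOLUTION attribute (audio-only variants typically omit it), and hand the rewritten manifest to the player. A rough sketch of just the filtering step; the function name and the RESOLUTION heuristic are my own assumptions:

```swift
import Foundation

/// Removes variant streams that lack a RESOLUTION attribute from an HLS
/// master playlist. Audio-only variants usually omit RESOLUTION, so this
/// is a heuristic, not a guarantee.
func strippingAudioOnlyVariants(from masterPlaylist: String) -> String {
    var output: [String] = []
    var skipNextLine = false
    for line in masterPlaylist.components(separatedBy: "\n") {
        if skipNextLine {
            skipNextLine = false
            continue  // drop the URI line belonging to the skipped variant
        }
        if line.hasPrefix("#EXT-X-STREAM-INF") && !line.contains("RESOLUTION=") {
            skipNextLine = true  // drop this tag and its following URI line
            continue
        }
        output.append(line)
    }
    return output.joined(separator: "\n")
}
```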
I am using AudioKit 4.3 with Xcode 9.4.1 and Swift, and have so far managed to get a sequencer playing an AKSynthKick in classic house fashion: a 4-beat loop with the kick on each beat. But I am clueless about adding .wav, .caf, or .aiff files to play with the sequencer. AKMIDISampler asks for notes, which does not make sense to me for a single file...
AKMIDISampler will work for you here. AKMIDISampler is a subclass of AKAppleSampler, and it has the methods loadWav() and loadAudioFile(). The noteNumber parameter, when using an audio file, will control playback speed/pitch. Using MIDINoteNumber 60 will play back the file at its actual speed, 72 will be double speed (and an octave higher, if it's a pitched sample), 48 will be half speed (an octave lower) and so on.
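A rough sketch of wiring this together in an AudioKit 4.x style 4-beat kick loop (the file name, tempo, and loop length are assumptions, and exact method signatures vary slightly between 4.x minor versions):

```swift
import AudioKit

let sampler = AKMIDISampler()
try sampler.loadWav("kick")            // loads kick.wav from the app bundle

AudioKit.output = sampler
AudioKit.start()

let sequencer = AKSequencer()
let track = sequencer.newTrack()
track?.setMIDIOutput(sampler.midiIn)   // route the track's MIDI into the sampler

// MIDINoteNumber 60 plays the sample at its original speed/pitch.
for beat in 0..<4 {
    track?.add(noteNumber: 60, velocity: 127,
               position: AKDuration(beats: Double(beat)),
               duration: AKDuration(beats: 0.5))
}
sequencer.setLength(AKDuration(beats: 4))
sequencer.setTempo(124)
sequencer.enableLooping()
sequencer.play()
```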
I have a completely silent audio file made in Audacity. But when I convert it to 8-bit/16-bit WAV and play it, it still plays some noise at max volume. How do I make a completely silent audio file using Audacity?
I have another audio file where I added a few seconds of silence in between. I converted that audio to 8-bit/16-bit WAV as well. When I play it at full volume, it still plays some noise.
When I play the audio on Android using AudioTrack.write(), it plays a buzzing sound.
The noise might be generated by the electronic equipment you are using to play the track, not by the digital file itself. There is always a bit of noise, and it cannot be avoided completely. You can reduce it by buying more sophisticated headphones, sound cards, woofers, etc.
Take a look at the signal-to-noise ratio (SNR) concept. If there is no signal (a silent track) at max volume, it is easier to hear noise that is otherwise masked by the signal.
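One thing worth checking besides analog noise: in 8-bit WAV, PCM samples are unsigned, so digital silence is the midpoint value 128, not 0. If a converter (or your own AudioTrack.write() buffer) fills 8-bit audio with zeros, you get a large DC offset that can produce clicks and buzzing. A small sketch of generating correct silence at both depths; the function names are my own:

```swift
import Foundation

/// Digital silence for unsigned 8-bit PCM sits at the midpoint, 128.
/// A buffer of zeros would be a full negative DC offset, not silence.
func silence8Bit(sampleCount: Int) -> [UInt8] {
    return [UInt8](repeating: 128, count: sampleCount)
}

/// Digital silence for signed 16-bit PCM is simply 0.
func silence16Bit(sampleCount: Int) -> [Int16] {
    return [Int16](repeating: 0, count: sampleCount)
}
```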
I am using The Amazing Audio Engine to simply play an audio file, but I find that when the channel starts playing, there is some automatic fade in happening.
You can see the top waveform is the output of my iPad, and the bottom waveform is the actual raw audio file. There is definitely a 30ms microfade being done.
There is nothing doing that within The Amazing Audio Engine library itself, so it must be happening internally in Apple's mixer audio unit. Is there any way to turn off this behavior?
I suspect that the AudioFilePlayer (used by TAAE) uses Extended Audio File Services under the hood. ExtAudioFileRef will do that on the first read after a seek if there is any decoding or sample rate conversion involved. I had to use Audio File Services directly to get rid of the implicit fading.
We're building an iPhone game using PhoneGap.
iOS devices support many audio formats, and we are thinking about using .mp3 or .caf files for the sound effects.
Does it matter which audio format is used? What are the differences between using one versus another?
CAF for Sound Effects
MP3 for soundtracks (MP3 files can't be looped seamlessly; there is a small pause between repeats)
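If you do need a seamlessly looping soundtrack, one common approach is a CAF (or WAV) file played through AVAudioPlayer with numberOfLoops set to -1; those formats avoid the encoder padding that causes MP3's gap. A minimal sketch, assuming a hypothetical loop.caf bundled with the app:

```swift
import AVFoundation

// Hypothetical bundled file: loop.caf.
if let url = Bundle.main.url(forResource: "loop", withExtension: "caf") {
    let player = try? AVAudioPlayer(contentsOf: url)
    player?.numberOfLoops = -1   // -1 means loop indefinitely
    player?.prepareToPlay()
    player?.play()
}
```

In a real app, keep a strong reference to the player (e.g. a property); a local like the one above would be deallocated and playback would stop.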