I'm developing an iPhone app in which I want to play audio in the left and right channels separately (the audio being played is multi-channel). So far I have tried many approaches, for example looking for properties I could set (e.g. setPan:), but without success. How should I deal with this problem? Could you please give me some suggestions? Thank you very much!
For manipulating audio at the channel level, see the AVAudioSession class in AVFoundation, in the documentation that ships with Xcode.
In particular, see the Audio Session Programming Guide.
I think the Novocaine library will be helpful; you can go through
this example.
In the example, you can alter the following method
- (void)filterData:(float *)data numFrames:(UInt32)numFrames numChannels:(UInt32)numChannels
in the NVDSP.mm file to get what you want.
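For a sense of what such a filter does internally, here is a minimal sketch of the idea in Swift (the function name and signature here are hypothetical; Novocaine's actual code is Objective-C++): with interleaved stereo samples laid out [L, R, L, R, ...], zeroing every other sample routes the audio entirely to one channel.
// Hypothetical sketch, not Novocaine's actual API: mute the right
// channel of an interleaved stereo float buffer.
func muteRightChannel(_ data: UnsafeMutablePointer<Float>,
                      numFrames: UInt32, numChannels: UInt32) {
    guard numChannels == 2 else { return }
    for frame in 0..<Int(numFrames) {
        data[frame * 2 + 1] = 0   // index 2n+1 is the right-channel sample
    }
}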
One easy, powerful, free, and maintained solution for manipulating audio on iOS is AudioKit. With it you can create something like this (AudioKit 4 syntax):
let input = AKMicrophone()        // any AKNode source works here
let leftSignal = AKBooster(input) // booster with per-channel gain
leftSignal.rightGain = 0          // keep only the left channel
// AKPanner's pan runs from -1 (hard left) to 1 (hard right)
let leftPannedRight = AKPanner(leftSignal, pan: 0.5)
let mixer = AKMixer(leftPannedRight)
AudioKit.output = mixer
try AudioKit.start()
It's a great solution for working with audio without having to deal with low-level frameworks. To help you get started, there are lots of tutorials online and answered questions about AudioKit here on Stack Overflow. A nice starting point is AudioKit's playgrounds.
I would like to modulate the signal from the mic input with a sine wave at 200 Hz (FM only). Does anyone know of any good tutorials/articles that will help me get started?
Any info is very welcome
Thanks
I suggest you start with the Audio File Stream Services Reference.
Here you can also find some basic tutorials: Getting Started with Audio & Video.
The SpeakHere example app in particular could be interesting.
Hope that helps you.
The standard way to do audio processing in iOS or OSX is Core Audio. Here's Apple's overview of the framework.
However, Core Audio has a reputation for being very difficult to learn, especially if you don't have experience with C. If you still want to learn Core Audio, then this book is the way to go: Learning Core Audio.
There are simpler ways to work with audio on iOS and OSX, one of them being AudioKit, which was developed specifically so developers can quickly prototype audio without having to deal with lower-level memory management, buffers, and pointer arithmetic.
There are examples showing both FM synthesis and audio input via the microphone, so you should have everything you need :)
Full disclosure: I am one of the developers of AudioKit.
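For the FM part specifically, here is a minimal sketch using AudioKit's AKFMOscillator (AudioKit 4 syntax; the parameter values are illustrative, not tuned):
import AudioKit

// FM synthesis: a sine carrier frequency-modulated by another sine.
let oscillator = AKFMOscillator(waveform: AKTable(.sine),
                                baseFrequency: 440,          // carrier in Hz
                                carrierMultiplier: 1,
                                modulatingMultiplier: 0.455, // 440 * 0.455 ≈ 200 Hz modulator
                                modulationIndex: 10)
AudioKit.output = oscillator
try AudioKit.start()
oscillator.start()
Note that this synthesizes the FM signal from scratch; frequency-modulating the live microphone input itself would need a custom DSP node or a lower-level Core Audio tap.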
I would like my Xamarin-based iPhone app to play a custom tone, but I'm new to iOS development and am struggling to find a simple way to do so.
Ultimately I'd like to be able to make a metal detector type of sound, where infrequent beeps become more frequent and eventually continuous but, to get started, a simple sine wave will suffice.
I've found the objectal-monotouch library (https://github.com/tescott/objectal-monotouch) and an example project in Objective-C (http://www.cocoawithlove.com/2010/10/ios-tone-generator-introduction-to.html), but the former has outdated references to MonoTouch and the latter is quite a bit of code to convert for a non-Objective-C programmer.
Before I set off on either of these paths, can anyone recommend any sample code or an up-to-date library to achieve this more simply?
Many thanks,
Richard
Edit: I went ahead and ported the cocoawithlove example. Please contact me if it's of interest. It wasn't rocket science, but it wasn't trivial either, due to significant differences in the Xamarin API. If anyone knows of any resources to aid such conversions (e.g. mappings from the native API to Xamarin or better Xamarin docs!) please let me know!
NSUrl soundURL = NSUrl.FromFilename(soundfile);
// Keep the player referenced (e.g. in a field on your class) until
// playback finishes; wrapping it in a "using" block disposes it as soon
// as Play() returns and can cut the sound off.
player = AVAudioPlayer.FromUrl(soundURL);
player.Volume = 1.0f;
player.PrepareToPlay();
player.Play();
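If you can target iOS 13 or later, there is also a much shorter native route for generating a tone via AVAudioSourceNode. A minimal sketch in Swift (Xamarin binds the same classes under the same names, so the port should be mechanical):
import AVFoundation

let engine = AVAudioEngine()
let sampleRate = engine.outputNode.outputFormat(forBus: 0).sampleRate
let frequency = 440.0          // tone pitch in Hz
var phase = 0.0

// Render block: fill each requested buffer with a sine wave.
let source = AVAudioSourceNode { _, _, frameCount, audioBufferList -> OSStatus in
    let buffers = UnsafeMutableAudioBufferListPointer(audioBufferList)
    let phaseStep = 2.0 * Double.pi * frequency / sampleRate
    for frame in 0..<Int(frameCount) {
        let sample = Float(sin(phase))
        phase += phaseStep
        if phase > 2.0 * Double.pi { phase -= 2.0 * Double.pi }
        for buffer in buffers {
            let buf = UnsafeMutableBufferPointer<Float>(buffer)
            buf[frame] = sample
        }
    }
    return noErr
}

engine.attach(source)
engine.connect(source, to: engine.mainMixerNode, format: nil)
try engine.start()             // tone plays until the engine is stopped
From there, the metal-detector effect is a matter of gating the tone on and off with a timer whose interval shrinks over time.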
I am trying to create a video player for iOS with some additional audio track reading. I have been looking at MPMoviePlayerController and also AVPlayer in AV Foundation, but it's all kinda vague.
What I am trying to do is play a video (from a local .mp4) and, while the movie is playing, get the current audio buffer/frames so I can do some calculations and other (not video/audio related) actions that depend on the currently playing audio. This means the video should keep playing, with its audio tracks, but I also want the live raw audio data for calculations (e.g. getting the amplitude at certain frequencies).
Does anyone have an example or hints on how to do this? Of course I checked Apple's AV Foundation documentation, but it was not clear enough for me.
After a really (really) long time Googling, I found a blog post that describes MTAudioProcessingTap. Introduced in iOS 6.0, it solves my problem perfectly.
The how-to/blog post can be found here: http://chritto.wordpress.com/2013/01/07/processing-avplayers-audio-with-mtaudioprocessingtap/
I hope it helps someone else now. The only thing that kept popping up for me on Google (with a lot of different search terms) was my own post here, and as long as you don't know MTAudioProcessingTap exists, you don't know how to Google for it :-)
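For later readers, here is a minimal Swift sketch of the approach the post describes: create a tap, attach it to the player item's audio mix, and read the raw samples in the process callback. videoURL is assumed to point at your local .mp4, and error handling is omitted.
import AVFoundation
import MediaToolbox

var callbacks = MTAudioProcessingTapCallbacks(
    version: kMTAudioProcessingTapCallbacksVersion_0,
    clientInfo: nil,
    init: nil, finalize: nil, prepare: nil, unprepare: nil,
    process: { tap, numberFrames, _, bufferListInOut, numberFramesOut, flagsOut in
        // Pull the audio through the tap; bufferListInOut now holds the
        // raw samples currently being played, ready for analysis.
        _ = MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut,
                                               flagsOut, nil, numberFramesOut)
    })

var tap: Unmanaged<MTAudioProcessingTap>?
_ = MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks,
                               kMTAudioProcessingTapCreationFlag_PostEffects, &tap)

let asset = AVURLAsset(url: videoURL)   // videoURL: your local .mp4
let item = AVPlayerItem(asset: asset)
if let audioTrack = asset.tracks(withMediaType: .audio).first,
   let tap = tap {
    let params = AVMutableAudioMixInputParameters(track: audioTrack)
    params.audioTapProcessor = tap.takeRetainedValue()
    let mix = AVMutableAudioMix()
    mix.inputParameters = [params]
    item.audioMix = mix
}
let player = AVPlayer(playerItem: item)
player.play()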
I need to implement a wavetable player in my app. It needs note-on and note-off features for different notes (polyphony), including looping for the relevant sounds.
The samples are available, or I can convert them myself; what I need is a class capable of playing, looping, and stopping the samples or waves.
I found some open source projects like FluidSynth, but my question here is about sample code available for iOS or OpenAL.
Thank you in advance for any hints or snippets,
regards, Koen.
You could take a look at the new Sampler audio unit in iOS 5. It lets you play samples with pitch control at low latency.
There is some sample code from Apple.
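The modern Swift-facing wrapper around that Sampler unit is AVAudioUnitSampler (iOS 8 and later). A minimal sketch, assuming a hypothetical bundled SoundFont named instrument.sf2 (loop points live inside the bank):
import AVFoundation
import AudioToolbox

let engine = AVAudioEngine()
let sampler = AVAudioUnitSampler()   // wraps the AUSampler audio unit
engine.attach(sampler)
engine.connect(sampler, to: engine.mainMixerNode, format: nil)
try engine.start()

// Load a SoundFont/DLS bank; loadAudioFiles(at:) can load raw
// .wav/.aif samples instead.
let bankURL = Bundle.main.url(forResource: "instrument", withExtension: "sf2")!
try sampler.loadSoundBankInstrument(at: bankURL, program: 0,
                                    bankMSB: UInt8(kAUSampler_DefaultMelodicBankMSB),
                                    bankLSB: UInt8(kAUSampler_DefaultBankLSB))

// Polyphonic note on / note off
sampler.startNote(60, withVelocity: 100, onChannel: 0)  // middle C on
sampler.startNote(64, withVelocity: 100, onChannel: 0)  // E on (second voice)
sampler.stopNote(60, onChannel: 0)                      // note off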
I am creating a musical app which generates some music. I have already used MIDI functions on the Mac to create a MIDI file with MIDI events (unfortunately, I don't remember the names of those functions).
I am looking for a way to create instrumental notes (MIDI or anything else) programmatically in order to play them. I would also like to have multiple channels playing those notes at the same time.
I already tried SoundBankPlayer, but apparently it can't play multiple instruments at the same time.
Does anyone have an idea?
This answer might be a bit more work than you intended, but you can use PD on iOS to do this. More precisely, you can use libpd for iOS for the synthesis, and then use any number of community-donated patches for the sound you're looking for.
In iOS 5, MusicSequence, MusicTrack, and MusicPlayer will do what you want:
http://developer.apple.com/library/ios/#documentation/AudioToolbox/Reference/MusicSequence_Reference/Reference/reference.html#//apple_ref/doc/uid/TP40009331
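A minimal sketch of that approach in Swift (the underlying APIs are C, so the calls look the same in Objective-C); it builds a one-note sequence and plays it, ignoring the returned OSStatus codes for brevity:
import AudioToolbox

var sequence: MusicSequence?
NewMusicSequence(&sequence)

var track: MusicTrack?
MusicSequenceNewTrack(sequence!, &track)

// One note: middle C, velocity 100, starting at beat 0, lasting 1 beat.
var note = MIDINoteMessage(channel: 0, note: 60, velocity: 100,
                           releaseVelocity: 0, duration: 1.0)
MusicTrackNewMIDINoteEvent(track!, 0.0, &note)

var player: MusicPlayer?
NewMusicPlayer(&player)
MusicPlayerSetSequence(player!, sequence)
MusicPlayerPreroll(player!)
MusicPlayerStart(player!)
By default the sequence plays through a simple built-in synth; for real instruments, and for multiple instruments at once, give each MusicTrack its own channel and attach the sequence to an AUGraph of AUSampler nodes with MusicSequenceSetAUGraph.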
Check out the AUSampler audio unit for iOS; you'll probably have to delve into Core Audio, which has some learning curve. ;)