I am using UIImagePickerController to record a video and AVPlayer to play a video, adding an AVPlayerLayer to UIImagePickerController's cameraOverlayView so that I can see the video while recording.
My requirements are:
1. I need to watch a video while recording another video using UIImagePickerController.
2. I need to listen to the playing video's audio through a headset.
3. My voice needs to be recorded into the video being recorded.
4. Only my voice should be recorded, not the playing video's audio.
Everything works except 4: the playing video's audio also gets mixed into my voice recording. How do I handle this case? My final goal is:
- Output for the playing video goes to the headset.
- Input for the recording comes from the headset's mic.
Please help me to get this done.
Your requirement is interesting. So you need to play and record at the same time, right?
To do that, you will need to initialize the audio session with the category AVAudioSessionCategoryPlayAndRecord:
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];
Because you are using UIImagePickerController to record, you don't have much control over the speaker and the mic, so test and see if it works.
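As a hedged sketch of that session setup (the category and option constants below are standard AVAudioSession API; whether Bluetooth routing actually isolates the headset mic from playback on your device is something you'd have to test):

```objc
#import <AVFoundation/AVFoundation.h>

// Sketch: configure the shared session for simultaneous play/record and
// allow routing to a Bluetooth headset. Whether this keeps the playing
// video's audio out of the recording depends on the active route.
NSError *error = nil;
AVAudioSession *session = [AVAudioSession sharedInstance];
[session setCategory:AVAudioSessionCategoryPlayAndRecord
         withOptions:AVAudioSessionCategoryOptionAllowBluetooth
               error:&error];
if (error) {
    NSLog(@"Category error: %@", error.localizedDescription);
}
[session setActive:YES error:&error];
if (error) {
    NSLog(@"Activation error: %@", error.localizedDescription);
}
```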
In case you still have problems, I suggest you use AVCaptureSession to record video without audio. Look at this example of how to use it: record-video-with-avcapturesession-2.
UPDATE: In my VoIP application, I use an Audio Unit to record while playing back. So I think the only way is to record video and audio separately and then use AVComposition to compose them into a single movie. Use AVCaptureSession to record video only and use EZAudio to record audio. EZAudio records through an Audio Unit, so it should work. You can test it by recording audio while playing a movie and seeing if it works. I hope it will help.
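A minimal sketch of that composition step, assuming you ended up with a video-only file and an audio-only file (videoURL, audioURL, and outputURL are placeholders; the AVMutableComposition and AVAssetExportSession calls are standard AVFoundation):

```objc
#import <AVFoundation/AVFoundation.h>

// Sketch: merge a video-only file and an audio-only file into one movie.
AVURLAsset *videoAsset = [AVURLAsset assetWithURL:videoURL]; // placeholder
AVURLAsset *audioAsset = [AVURLAsset assetWithURL:audioURL]; // placeholder

AVMutableComposition *composition = [AVMutableComposition composition];
AVMutableCompositionTrack *videoTrack =
    [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                             preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *audioTrack =
    [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                             preferredTrackID:kCMPersistentTrackID_Invalid];

NSError *error = nil;
[videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                    ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] firstObject]
                     atTime:kCMTimeZero error:&error];
[audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioAsset.duration)
                    ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] firstObject]
                     atTime:kCMTimeZero error:&error];

// Export the combined movie to disk.
AVAssetExportSession *export =
    [[AVAssetExportSession alloc] initWithAsset:composition
                                     presetName:AVAssetExportPresetHighestQuality];
export.outputURL = outputURL; // placeholder destination
export.outputFileType = AVFileTypeQuickTimeMovie;
[export exportAsynchronouslyWithCompletionHandler:^{
    // Check export.status / export.error here.
}];
```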
UPDATE: I tested it, and it only works if you use headphones or select the back microphone.
Here is the tested code:
NSString *moviePath = [[NSBundle mainBundle] pathForResource:@"videoviewdemo" ofType:@"mp4"];
NSURL *url = [NSURL fileURLWithPath:moviePath];
// You may find a test stream at <http://devimages.apple.com/iphone/samples/bipbop/bipbopall.m3u8>.
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithURL:url];
AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
AVPlayerLayer *layer = [[AVPlayerLayer alloc] init];
[layer setPlayer:player];
[layer setFrame:CGRectMake(0, 0, 100, 100)];
[self.view.layer addSublayer:layer];
[player play];
dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(1 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
//
// Setup the AVAudioSession. EZMicrophone will not work properly on iOS
// if you don't do this!
//
AVAudioSession *session = [AVAudioSession sharedInstance];
NSError *error;
[session setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];
if (error)
{
NSLog(@"Error setting up audio session category: %@", error.localizedDescription);
}
[session setActive:YES error:&error];
if (error)
{
NSLog(@"Error setting up audio session active: %@", error.localizedDescription);
}
//
// Customizing the audio plot's look
//
// Background color
self.audioPlot.backgroundColor = [UIColor colorWithRed:0.984 green:0.471 blue:0.525 alpha:1.0];
// Waveform color
self.audioPlot.color = [UIColor colorWithRed:1.0 green:1.0 blue:1.0 alpha:1.0];
// Plot type
self.audioPlot.plotType = EZPlotTypeBuffer;
//
// Create the microphone
//
self.microphone = [EZMicrophone microphoneWithDelegate:self];
//
// Set up the microphone input UIPickerView items to select
// between different microphone inputs. Here what we're doing behind the hood
// is enumerating the available inputs provided by the AVAudioSession.
//
self.inputs = [EZAudioDevice inputDevices];
self.microphoneInputPickerView.dataSource = self;
self.microphoneInputPickerView.delegate = self;
//
// Start the microphone
//
[self.microphone startFetchingAudio];
self.microphoneTextLabel.text = @"Microphone On";
[[AVAudioSession sharedInstance] overrideOutputAudioPort:AVAudioSessionPortOverrideSpeaker error:nil];
});
Take a look at PBJVision library. It allows you to record video while you are watching the preview, and at the end, you can do whatever you want with audio and video footage.
Related
I am doing an iOS application with a video player in it. I have to change the video URL each time the user switches the view angle. But when I switch to a new URL, my UI freezes for some time. The following is my code for the URL-switching function.
// Create the AVPlayer using the playeritem
m_pMoviePlayer = [AVPlayer playerWithURL:[NSURL URLWithString:pUrl]];
NSError *_error = nil;
[[AVAudioSession sharedInstance] setCategory: AVAudioSessionCategoryPlayback error: &_error];
m_pMoviePlayer.allowsExternalPlayback = NO;
// You can play/pause using the AVPlayer object
[m_pMoviePlayer play];
[self registerMoviePlayerObservers];
Thanks in advance.
How do I upload an audio clip to a server in real time while it is recording? Basically my requirement is to upload an audio clip as chunks/packets while it is recording.
I have already done the recording part using IQAudioRecorderController https://github.com/hackiftekhar/IQAudioRecorderController. It records the audio and saves it to the temporary directory.
I want to know how to upload in real time, without saving the audio clip first.
This is the recording part
//Unique recording URL
NSString *fileName = [[NSProcessInfo processInfo] globallyUniqueString];
_recordingFilePath = [NSTemporaryDirectory() stringByAppendingPathComponent:[NSString stringWithFormat:@"%@.m4a",fileName]];
// Initiate and prepare the recorder
_audioRecorder = [[AVAudioRecorder alloc] initWithURL:[NSURL fileURLWithPath:_recordingFilePath] settings:recordSetting error:nil];
_audioRecorder.delegate = self;
_audioRecorder.meteringEnabled = YES;
// Recording start
- (void)recordingButtonAction:(UIBarButtonItem *)item
{
if (_isRecording == NO)
{
_isRecording = YES;
//UI Update
{
[self showNavigationButton:NO];
_recordButton.tintColor = _recordingTintColor;
_playButton.enabled = NO;
_trashButton.enabled = NO;
}
/*
Create the recorder
*/
if ([[NSFileManager defaultManager] fileExistsAtPath:_recordingFilePath])
{
[[NSFileManager defaultManager] removeItemAtPath:_recordingFilePath error:nil];
}
_oldSessionCategory = [[AVAudioSession sharedInstance] category];
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryRecord error:nil];
[_audioRecorder prepareToRecord];
[_audioRecorder record];
}
else
{
_isRecording = NO;
//UI Update
{
[self showNavigationButton:YES];
_recordButton.tintColor = _normalTintColor;
_playButton.enabled = YES;
_trashButton.enabled = YES;
}
[_audioRecorder stop];
[[AVAudioSession sharedInstance] setCategory:_oldSessionCategory error:nil];
}
}
// Recording done
-(void)doneAction:(UIBarButtonItem*)item
{
if ([self.delegate respondsToSelector:@selector(audioRecorderController:didFinishWithAudioAtPath:)])
{
IQAudioRecorderController *controller = (IQAudioRecorderController*)[self navigationController];
[self.delegate audioRecorderController:controller didFinishWithAudioAtPath:_recordingFilePath];
}
[self dismissViewControllerAnimated:YES completion:nil];
}
There are various ways of solving this; one way is to create your own audio graph. The audio graph can grab samples from the microphone or from a file. You then proceed to an output unit, but install a callback to get the sampled frames. These you then push to your network class, which can upload packet by packet.
A good example that shows you how to write these captured packets to disk is AVCaptureAudioDataOutput.
In that example, packets are written using ExtAudioFileWriteAsync. You have to replace this with your own logic for uploading to a server. Note that while you can do that easily, one problem is that this will give you raw audio samples. If you need them as a WAVE file or similar, you may need to wait until recording is finished, since the file header needs information about the contained audio samples.
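As a rough sketch of that replacement (the `self.uploader` object and its `enqueueAudioChunk:` method are hypothetical placeholders for your networking layer; the Core Media calls are standard), the AVCaptureAudioDataOutput delegate callback could copy the raw bytes out of each sample buffer and hand them off instead of calling ExtAudioFileWriteAsync:

```objc
#import <AVFoundation/AVFoundation.h>
#import <CoreMedia/CoreMedia.h>

// Sketch: AVCaptureAudioDataOutputSampleBufferDelegate callback that copies
// the raw PCM bytes out of each sample buffer and passes them to a
// hypothetical uploader object for chunked network upload.
- (void)captureOutput:(AVCaptureOutput *)output
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CMBlockBufferRef blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer);
    if (blockBuffer == NULL) {
        return;
    }
    size_t length = CMBlockBufferGetDataLength(blockBuffer);
    NSMutableData *chunk = [NSMutableData dataWithLength:length];
    CMBlockBufferCopyDataBytes(blockBuffer, 0, length, chunk.mutableBytes);

    // Hand the chunk to your own networking layer (hypothetical method).
    [self.uploader enqueueAudioChunk:chunk];
}
```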
The code you are currently using will work for you if you want to upload the recorded file after recording is done, as it gives you the final recorded file.
If you want to upload a live audio recording to the server, then I think you have to go with a combination of:
AudioSession for the recording stuff
ffmpeg for uploading your live audio to the server
You can get good help for recording audio and managing audio buffers from here.
For ffmpeg, I think you have to learn a lot. It will be easy to send a static/saved audio file to the server using ffmpeg, but sending a live audio buffer to the server will be a tricky job.
Our app records and plays (those) videos, but somehow some of the videos are played without sound.
If I copy the video files to the Mac via iTunes, it plays the videos with sound, so the videos do have sound.
I checked the videos with a tool (called GSpot) and they all have the same audio codec and bitrate.
I tried about everything I found on SO, but some of the videos don't play audio, even though the code is always the same and the videos do have sound, as you can hear them on a Mac.
This is how i setup the player
AVAudioSession *audioSession = [AVAudioSession sharedInstance];
NSError *setCategoryError = nil;
[audioSession setActive:YES error:&setCategoryError];
[audioSession setCategory:AVAudioSessionCategoryPlayback
error:&setCategoryError];
_mpviewController = [[MPMoviePlayerViewController alloc] initWithContentURL:[NSURL fileURLWithPath:moviepath]];
_mpviewController.moviePlayer.movieSourceType = MPMovieSourceTypeFile;
_mpviewController.moviePlayer.controlStyle = MPMovieControlStyleNone;
[_mpviewController.moviePlayer setScalingMode:MPMovieScalingModeAspectFill];
_mpviewController.moviePlayer.shouldAutoplay = YES;
[_mpviewController.moviePlayer prepareToPlay];
_mpviewController.view.frame = CGRectMake(0, 0, _screenWidth, _screenHeight);
[self addSubview:_mpviewController.view];
Some time later, on a button press, it starts playing.
The record settings
[_session beginConfiguration];
[_session setSessionPreset: AVCaptureSessionPreset640x480];
AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if(videoDevice == nil){
assert(0);
}
BOOL lock = [videoDevice lockForConfiguration:&error];
if(lock) videoDevice.focusMode = AVCaptureFocusModeContinuousAutoFocus;
[videoDevice unlockForConfiguration];
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
if(error){
assert(0);
}
[_session addInput:input];
AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
AVCaptureDeviceInput * audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:nil];
if(audioInput != nil) {
[_session addInput:audioInput];
}
[_session commitConfiguration];
I don't know if it is related, but the audio seems not to play on videos that had a different view overlaid during the recording. It is possible to overlay a fullscreen view over the video preview while recording; it records fine, the file itself has audio, and the videos play fine on a PC/Mac. Only on the device, within our app, do those videos (again, not sure if related) have no audio. The app uses the same code for every video, so I can rename them, swap them, etc.; the behaviour has something to do with the audio channel of the video.
edit
Additional observation:
For the videos with no sound, QuickTime (only on the Mac) also plays no audio. QuickTime on Windows does play audio, and all other players on Windows play audio as well.
Somehow the phone corrupts its own recording so that afterwards it doesn't recognize the audio channel it just recorded.
edit 2
iPhone 6, iOS 8.1 beta 2 -> as described
iPhone 4S, iOS 7.1.2 -> NO issue
iPad mini, iOS 8.0.2 -> always has the issue; not one video's audio channel can be read, but the channel always exists and Windows can play the audio
I want to wrap this up.
The file was saved as ".mp4", and this caused the sporadic error.
Saving the file as ".mov" works. Same codec etc., but without errors.
NSString *path = [[documentsDirectoryPath stringByAppendingPathComponent:
[NSString stringWithFormat:@"/%@", name]] stringByAppendingString:@".mov"];
outputURL = [[NSURL alloc] initFileURLWithPath:path];
[mAVCaptureMovieFileOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self];
I've developed an iOS app that plays two separate audio streams simultaneously. To do this I am using AVPlayer thus:
//Use URL to set up the speech radio player.
NSURL *urlStreamSpeech = [NSURL URLWithString:self.urlAddressSpeech];
playerSpeech = [AVPlayer playerWithURL:urlStreamSpeech];
//Use URL to set up the music player.
NSURL *urlStreamMusic = [NSURL URLWithString:self.urlAddressMusic];
playerMusic = [AVPlayer playerWithURL:urlStreamMusic];
I am now trying to implement a volume control so that the user will be able to control the volume for the audio streams individually. As I've tried this I have come to the conclusion that there is no volume property for the AVPlayer class. I also looked at the AVAudioPlayer class, but from what I gather that class is only able to play local audio files, not streamed audio.
So my question is: how can I control the volume of streamed audio on iOS?
NSString* resourcePath = url; //your url
NSData *_objectData = [NSData dataWithContentsOfURL:[NSURL URLWithString:resourcePath]];
NSError *error;
app.audioPlayer = [[AVAudioPlayer alloc] initWithData:_objectData error:&error];
app.audioPlayer.numberOfLoops = 0;
app.audioPlayer.volume = 1.0f;
[app.audioPlayer prepareToPlay];
if (app.audioPlayer == nil)
NSLog(@"%@", [error description]);
else
[app.audioPlayer play];
Look into the AVPlayer class, part of the AVFoundation framework... this is the objective-c (iOS) class you use for streaming audio.
Look at this post: Live streaming.
MPMusicPlayerController* musicPlayer = [MPMusicPlayerController iPodMusicPlayer];
[musicPlayer setVolume:[SliderVolumen value]];
This only works on the device, not in the simulator.
It is compatible with AVPlayer.
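Another hedged alternative sketch for per-stream volume is to attach an AVMutableAudioMix to each AVPlayerItem (the AVFoundation calls below are standard; streamURL is a placeholder, and note that for HTTP Live Streaming assets the audio track list may be empty, in which case this approach does not apply):

```objc
#import <AVFoundation/AVFoundation.h>

// Sketch: control the volume of one AVPlayer stream through its item's audioMix.
AVPlayerItem *item = [AVPlayerItem playerItemWithURL:streamURL]; // placeholder URL

NSMutableArray *allParams = [NSMutableArray array];
for (AVAssetTrack *track in [item.asset tracksWithMediaType:AVMediaTypeAudio]) {
    AVMutableAudioMixInputParameters *params =
        [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:track];
    [params setVolume:0.5f atTime:kCMTimeZero]; // 50% volume for this stream only
    [allParams addObject:params];
}

AVMutableAudioMix *mix = [AVMutableAudioMix audioMix];
mix.inputParameters = allParams;
item.audioMix = mix;

AVPlayer *player = [AVPlayer playerWithPlayerItem:item];
[player play];
```

Because the mix is attached per item, two players built this way can be faded independently.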
I have a problem with AVPlayer playing in background mode, like a lot of people here and everywhere on the web. I've done what I think is supposed to work, but it still doesn't... I think my problem might be where I set up and use my audio session and AVPlayer.
1) The "audio" key is in UIBackgroundModes of my Info.plist
2) The audio session is set up like this in the AppDelegate (initialized in didFinishLaunchingWithOptions):
AVAudioSession *audio = [AVAudioSession sharedInstance];
[audio setCategory:AVAudioSessionCategoryPlayback error:nil];
[audio setActive:YES error:nil];
3) I use an AVPlayer (not AVAudioPlayer), also implemented in the AppDelegate (initialized in didFinishLaunchingWithOptions, right after the audio session):
// Load the array with the sample file
NSString *urlAddress = @"http://mystreamadress.mp3";
//Create a URL object.
self.urlStream = [NSURL URLWithString:urlAddress];
self.player = [AVPlayer playerWithURL:urlStream];
//Starts playback
[player play];
And still, the audio is suspended every time the app goes into the background (when I press the "Home" button).
By the way, the problem was just with the simulator. It works fine on iOS devices.