PBJVision setCameraMode - ios

How do I switch between video and photo mode in PBJVision? This is what I have now:
PBJVision *vision = [PBJVision sharedInstance];
vision.delegate = self;
[vision setCameraMode:PBJCameraModePhoto];
[vision setCameraOrientation:PBJCameraOrientationPortrait];
[vision setFocusMode:PBJFocusModeAutoFocus];
[vision setOutputFormat:PBJOutputFormatPreset];
[[PBJVision sharedInstance] capturePhoto];

You can change the camera mode by adding just one line. The answer already exists in your code:
[vision setCameraMode:PBJCameraModeVideo];
And use these to start and stop video recording:
[[PBJVision sharedInstance] startVideoCapture];
[[PBJVision sharedInstance] endVideoCapture];
It is also worth knowing the following. Switching from one camera mode to another seems to take a little time. When I called it like this, an error occurred (in my case, switching from video mode to photo mode):
[vision setCameraMode:PBJCameraModePhoto];
[vision capturePhoto];
The cause is that the session reconfiguration for the new camera mode has not finished yet:
- (void)capturePhoto
{
if (![self _canSessionCaptureWithOutput:_currentOutput] || _cameraMode != PBJCameraModePhoto) {
DLog(#"session is not setup properly for capture");
return; <--- I'm returned;
}
....
}
So be careful when you change the camera mode and call capture back-to-back. :)
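If you need to capture right after switching modes, one option is to wait for PBJVision's delegate callback instead of calling capture immediately. A minimal sketch, assuming your version of the library exposes a visionModeDidChange: method on PBJVisionDelegate (check the header of your copy) and that you declare the _shouldCaptureAfterModeChange flag yourself:
// Request the mode change, then capture once the session has finished reconfiguring.
- (void)switchToPhotoAndCapture
{
    [[PBJVision sharedInstance] setCameraMode:PBJCameraModePhoto];
    _shouldCaptureAfterModeChange = YES; // hypothetical BOOL ivar in your class
}

// PBJVisionDelegate (assumed callback name; verify it against your PBJVision version)
- (void)visionModeDidChange:(PBJVision *)vision
{
    if (_shouldCaptureAfterModeChange && vision.cameraMode == PBJCameraModePhoto) {
        _shouldCaptureAfterModeChange = NO;
        [vision capturePhoto];
    }
}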

Related

How to change AVCaptureMovieFileOutput video orientation during running session?

I have written code that captures device video input and so far it is working fine. Here is what I have set up:
// add preview layer
_previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:_session];
_previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[self.videoView.layer addSublayer:_previewLayer];
// add movie output
_movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
[_session addOutput:_movieFileOutput];
AVCaptureConnection *movieFileOutputConnection = [_movieFileOutput connectionWithMediaType:AVMediaTypeVideo];
movieFileOutputConnection.videoOrientation = [self videoOrientationFromCurrentDeviceOrientation];
// start session
[_session startRunning];
where:
- (AVCaptureVideoOrientation) videoOrientationFromCurrentDeviceOrientation {
switch ([[UIApplication sharedApplication] statusBarOrientation]) {
case UIInterfaceOrientationPortrait: {
return AVCaptureVideoOrientationPortrait;
}
case UIInterfaceOrientationLandscapeLeft: {
return AVCaptureVideoOrientationLandscapeLeft;
}
case UIInterfaceOrientationLandscapeRight: {
return AVCaptureVideoOrientationLandscapeRight;
}
case UIInterfaceOrientationPortraitUpsideDown: {
return AVCaptureVideoOrientationPortraitUpsideDown;
}
case UIInterfaceOrientationUnknown: {
return 0;
}
}
}
Now when interface orientation changes I want my output also to change, so I have this:
- (void) updatePreviewLayer {
_previewLayer.frame = CGRectMake(0, 0, self.videoView.frame.size.width, self.videoView.frame.size.height);
_previewLayer.connection.videoOrientation = [self videoOrientationFromCurrentDeviceOrientation];
[_session beginConfiguration];
AVCaptureConnection *movieFileOutputConnection = [_movieFileOutput connectionWithMediaType:AVMediaTypeVideo];
movieFileOutputConnection.videoOrientation = [self videoOrientationFromCurrentDeviceOrientation];
[_session commitConfiguration];
}
But alas it is not working. It seems that once I first set the video orientation on the movie output, it stays like that and cannot be changed later. So if I start filming in landscape mode and then change to portrait, the video will be fine for the landscape part, but the portrait part will be rotated. It is the same if I start in portrait mode: the landscape part will then be rotated.
Is there any way to do this right?
Try adding this before you start your session:
[_movieFileOutput setRecordsVideoOrientationAndMirroringChanges:YES asMetadataTrackForConnection:movieFileOutputConnection];
The header file documentation for this method makes it sound very much like what you're looking for:
Controls whether or not the movie file output will create a timed metadata track that records samples which reflect changes made to the given connection's videoOrientation and videoMirrored properties during recording.
There's more interesting information there, I'd read it all.
However, this method doesn't actually rotate your frames, it uses timed metadata to instruct players to do it at playback time, so it's possible that not all players will support this feature. If that's a deal breaker, then you can abandon AVCaptureMovieFileOutput in favour of the lower level AVCaptureVideoDataOutput + AVAssetWriter combination, where your videoOrientation changes actually rotate the frames, resulting in files that will playback correctly in any player:
If an AVCaptureVideoDataOutput instance's connection's videoOrientation or videoMirrored properties are set to non-default values, the output applies the desired mirroring and orientation by physically rotating and or flipping sample buffers as they pass through it.
p.s. I don't think you need the beginConfiguration/commitConfiguration pair if you're only changing one property as that's for batching multiple modifications into one atomic update.
Have you tried pausing the session before changing configuration?

isVideoOrientationSupported always Returns NO AVCaptureConnection

I'm using AVCaptureMetadataOutput to detect faces on iOS, and I'm trying to set the orientation of the video after the user rotates their device. However, it appears that I can't do this as every time I call the getter isVideoOrientationSupported on the only AVCaptureConnection that my AVCaptureMetadataOutput has, it always returns false. I've tried the code below in every place imaginable, yet it always returns no. Is there any way to set orientation for my metadata?
AVCaptureConnection *conn = [self.metadataOutput connectionWithMediaType:AVMediaTypeMetadataObject];
NSLog(#"%#",self.metadataOutput.connections);
if (!conn) {
NSLog(#"NULL CONNECTION OBJ");
}
if ([conn isVideoOrientationSupported]) {
NSLog(#"Supported!");
}
else {
NSLog(#"Not supported");
}
An Apple Engineer solved this for me over on the Apple Developer Forums. Here's a link. This was their response:
If you want to translate your metadata objects' coordinate space to that of another AVCaptureOutput (such as the AVCaptureVideoDataOutput), use
- (AVMetadataObject *)transformedMetadataObjectForMetadataObject:(AVMetadataObject *)metadataObject connection:(AVCaptureConnection *)connection NS_AVAILABLE_IOS(6_0);
It's in AVCaptureOutput.h. If you want to translate the coordinates to the coordinate space of your video preview layer, use AVCaptureVideoPreviewLayer.h's
- (AVMetadataObject *)transformedMetadataObjectForMetadataObject:(AVMetadataObject *)metadataObject NS_AVAILABLE_IOS(6_0);
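To illustrate the preview-layer variant, here is a minimal sketch, assuming _previewLayer is your AVCaptureVideoPreviewLayer and you handle face metadata in the standard metadata-objects delegate callback:
// AVCaptureMetadataOutputObjectsDelegate
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputMetadataObjects:(NSArray *)metadataObjects
       fromConnection:(AVCaptureConnection *)connection
{
    for (AVMetadataObject *object in metadataObjects) {
        if ([object.type isEqualToString:AVMetadataObjectTypeFace]) {
            // Convert from the output's coordinate space into the preview layer's,
            // which already accounts for the layer's orientation and video gravity.
            AVMetadataFaceObject *face =
                (AVMetadataFaceObject *)[_previewLayer transformedMetadataObjectForMetadataObject:object];
            NSLog(@"Face in preview coordinates: %@", NSStringFromCGRect(face.bounds));
        }
    }
}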

UIImagePickerController captureMode not assigning correctly

I'm using UIImagePickerController to take pictures and videos from my app. Toggling between the two isn't too bad. If the user chooses to record a video, I first check this:
if (picker.cameraCaptureMode == UIImagePickerControllerCameraCaptureModeVideo)
{
[self captureVideo];
}
else
{
picker.cameraCaptureMode = UIImagePickerControllerCameraCaptureModeVideo;
[self captureVideo];
}
This usually works totally fine. Here's the catch. I'm also using OpenTok by Tokbox to do video calls, and it seems like the captureMode assignment doesn't work after a video call. It seems completely crazy, but I made this modification to do some debugging:
if (picker.cameraCaptureMode == UIImagePickerControllerCameraCaptureModeVideo)
{
[self captureVideo];
}
else
{
picker.cameraCaptureMode = UIImagePickerControllerCameraCaptureModeVideo;
if (picker.cameraCaptureMode != UIImagePickerControllerCameraCaptureModeVideo)
{
NSLog(#"Assignment unsuccessful???")
}
[self captureVideo];
}
And I get this "Assignment unsuccessful???" log every single time. UIImagePickerController must not be allowing the assignment or something. I really can't figure it out. I've also made a forum post on OpenTok's site to see if they're possibly not releasing some camera resources, but I don't think it's their problem.
Any insight here?
Use:
+ (NSArray *)availableCaptureModesForCameraDevice:(UIImagePickerControllerCameraDevice)cameraDevice
to check which capture modes are available for the current camera device. Also, if you're using the simulator, the assignment will never work.
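For example, a guarded assignment might look roughly like this (a sketch; picker is your UIImagePickerController and captureVideo is your own method from the question):
NSArray *modes = [UIImagePickerController availableCaptureModesForCameraDevice:picker.cameraDevice];
if ([modes containsObject:@(UIImagePickerControllerCameraCaptureModeVideo)]) {
    picker.cameraCaptureMode = UIImagePickerControllerCameraCaptureModeVideo;
    [self captureVideo];
} else {
    NSLog(@"Video capture mode is not available on this camera device");
}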
Solved with a solution on the TokBox forum. I needed to first change my audio session before trying to access the microphone.
AVAudioSession *mySession = [AVAudioSession sharedInstance];
[mySession setCategory:AVAudioSessionCategorySoloAmbient error:nil];
[self presentViewController:picker animated:YES completion:NULL];

Camera in iOS app Tutorial

I was wondering if anyone was willing to share how to put a Camera feature into an iOS app, or if anyone knew of a simple tutorial. NOT one with any buttons, just showing on the screen what the camera is seeing. I tried Apple's Documentation, but it was too complex for my needs.
Thank you so much!
EDIT: Any simple tutorial will do fine. Like I said, I don't need anything else, besides it to display what the camera is seeing.
I don't know about a simple tutorial but adding a view that shows what the camera sees is super easy.
First:
Add a UIView in Interface Builder; this is where the camera preview will be shown.
Second:
Add the AVFoundation framework to your project, and add its import to your ViewController .m file.
#import <AVFoundation/AVFoundation.h>
Third:
Add these 2 variables to your interface variable declarations
AVCaptureVideoPreviewLayer *_previewLayer;
AVCaptureSession *_captureSession;
Fourth:
Add this code to your viewDidLoad. (The explanation of what it does is commented)
//-- Setup Capture Session.
_captureSession = [[AVCaptureSession alloc] init];
//-- Create a video device and an input from that device. Add the input to the capture session.
AVCaptureDevice * videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if(videoDevice == nil)
assert(0);
//-- Add the device to the session.
NSError *error;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice
error:&error];
if(error)
assert(0);
[_captureSession addInput:input];
//-- Configure the preview layer
_previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:_captureSession];
_previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[_previewLayer setFrame:CGRectMake(0, 0,
self.cameraPreviewView.frame.size.width,
self.cameraPreviewView.frame.size.height)];
//-- Add the layer to the view that should display the camera input
[self.cameraPreviewView.layer addSublayer:_previewLayer];
//-- Start the camera
[_captureSession startRunning];
Notes:
The asserts will make the program exit in places where a camera is not available.
This only shows the preview of what the camera sees. If you want to manipulate the input, take a picture, or record video, you need to configure additional things like the session preset and add the corresponding capture outputs and delegates. But in that case you should follow a proper tutorial or read the documentation.
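As a rough illustration of those "additional things", here is a sketch of setting a session preset and adding a still image output so you could actually take a picture (these are the pre-iOS-10 class names; newer apps would use AVCapturePhotoOutput instead):
//-- Pick a preset appropriate for photos.
_captureSession.sessionPreset = AVCaptureSessionPresetPhoto;
//-- Add a still image output so the session can deliver photos.
AVCaptureStillImageOutput *stillOutput = [[AVCaptureStillImageOutput alloc] init];
stillOutput.outputSettings = @{AVVideoCodecKey : AVVideoCodecJPEG};
if ([_captureSession canAddOutput:stillOutput]) {
    [_captureSession addOutput:stillOutput];
}
//-- Later, when the user wants a photo:
AVCaptureConnection *photoConnection = [stillOutput connectionWithMediaType:AVMediaTypeVideo];
[stillOutput captureStillImageAsynchronouslyFromConnection:photoConnection
                                          completionHandler:^(CMSampleBufferRef buffer, NSError *error) {
    if (buffer != NULL) {
        NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:buffer];
        UIImage *image = [UIImage imageWithData:jpegData];
        // Save or display the image here.
    }
}];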

Hide Red Recording Status Bar In iOS App When Not Recording

I can't get the red "Recording" status bar to hide in my app when the app is in the background and not recording.
I happen to be using The Amazing Audio Engine, but I think this question could be tackled without knowledge of that library. It gets set up like this:
audioController = [[AEAudioController alloc] initWithAudioDescription:desc inputEnabled:YES];
audioController.audioSessionCategory = kAudioSessionCategory_MediaPlayback;
When the user wants to record, I turn on the mic like this:
[audioController addInputReceiver:mic];
audioController.audioSessionCategory = kAudioSessionCategory_PlayAndRecord;
When the user wants to stop recording, I turn it off:
[audioController removeInputReceiver:mic];
audioController.audioSessionCategory = kAudioSessionCategory_MediaPlayback;
The problem is, when the app isn't recording & the user leaves the app, the red "Recording" status bar still shows up. I can't stop/dispose the audioController because the app may still be playing audio.
I don't want the red recording status bar to show if I'm not recording. Any ideas how to do this?
Update
I set up the following block of code to run every 2 seconds in my app.
audioController.audioSessionCategory = kAudioSessionCategory_MediaPlayback;
AVAudioSession *audioSession = [AVAudioSession sharedInstance];
NSError* error = nil;
[audioSession setActive:NO error: &error];
NSLog(#"error: %#", error);
[audioSession setCategory:AVAudioSessionCategoryPlayback error:nil];
This logs:
TAAE: Setting audio session category to MediaPlayback
AudioSessionSetProperty(kAudioSessionProperty_OverrideCategoryEnableBluetoothInput) result 2003329396 77686174 what
Error Domain=NSOSStatusErrorDomain Code=560030580 "The operation couldn’t be completed. (OSStatus error 560030580.)"
Obviously it fails to disable the mic because of something TAAE is holding on to. I have not added any inputs to the controller, so I don't know what it could be.
Resolved, see edit 2
The bar will never disappear as long as the mic is in use, recording or not. It's a security measure to let the user know that an app is listening to the microphone, not to show that the phone is recording.
The only way to make it go away is to remove the mic from the input receivers.
I see that your mic isn't being removed, so there must be some bug somewhere.
The point is, you cannot hide the red bar as long as the microphone is open.
If you want to temporarily disable it, you could try this, maybe:
[audioController setInputEnabled:NO]
What are you trying to accomplish anyway? There might be a better way to handle things.
Edit 1: Added other workarounds
I didn't know that setInputEnabled was readonly, sorry.
Well, another thing to try is to stop the controller completely, try this:
[audioController stop]
If not, try to release it if you're not using ARC, or simply:
audioController = nil;
Hope that fixes the issue, but I'd rather try to find out why it's not removing the mic from the input receivers.. perhaps mic is nil when you call [audioController removeInputReceiver:mic]?
Edit 2:Added solution
The problem arises when you initialize with inputEnabled set to YES. Since it's read-only, you can't disable the input afterwards; the only way is to actually release audioController.
If you're using ARC, just set it to nil; if not, just call [audioController release].
Try setting the audio session inactive when you stop recording:
AudioSessionSetActive(false);
I solved the problem by:
Having two audio controllers (as suggested in the comments): one for recording and one for playback. When the app entered background, I called [recordAudioController stop] and when I wanted to start recording again, I re-started this audio controller.
Removing a line of code from AEAudioController.m that tried to stop the audio session, making TAAE stop only the AUGraph:
- (void)stopInternal {
NSLog(#"TAAE: Stopping Engine");
checkResult(AUGraphStop(_audioGraph), "AUGraphStop");
if ( self.running ) {
// Ensure top IO unit is stopped (AUGraphStop may fail to stop it)
checkResult(AudioOutputUnitStop(_ioAudioUnit), "AudioOutputUnitStop");
}
// --- I removed the following section ---
if ( !_interrupted ) {
NSError *error = nil;
if ( ![((AVAudioSession*)[AVAudioSession sharedInstance]) setActive:NO error:&error] ) {
NSLog(#"TAAE: Couldn't deactivate audio session: %#", error);
}
}
processPendingMessagesOnRealtimeThread(self);
if ( _pollThread ) {
[_pollThread cancel];
while ( [_pollThread isExecuting] ) {
[NSThread sleepForTimeInterval:0.01];
}
_pollThread = nil;
}
}
I know the question is a little old, but in case other people run into the same problem as me, here is my final solution. The problem happened in the following recording flow:
start recording
[session setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];
stop recording
[session setCategory:AVAudioSessionCategoryPlayback error:&error];
When I commented out the code that resets the session category to playback, the problem disappeared. I also changed the start-recording code to:
[session setCategory:AVAudioSessionCategoryPlayAndRecord withOptions:AVAudioSessionCategoryOptionDefaultToSpeaker error:&error];
so that audio plays through the speaker instead of the receiver.
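Putting the above together, the flow described looks roughly like this (a sketch with minimal error handling):
AVAudioSession *session = [AVAudioSession sharedInstance];
NSError *error = nil;
// Start recording: play-and-record, routed to the speaker instead of the receiver.
[session setCategory:AVAudioSessionCategoryPlayAndRecord
         withOptions:AVAudioSessionCategoryOptionDefaultToSpeaker
               error:&error];
// ... record ...
// Stop recording: do NOT reset the category back to Playback here;
// in my case that reset was what kept the red recording bar around.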
