How to set the orientation of AVCaptureFileOutput? - ios

How can I set the orientation of AVCaptureFileOutput? I mean the final output; it must be landscape.

When you set up the camera, you have an AVCaptureVideoDataOutput (videoDataOutput) and an AVCaptureVideoPreviewLayer (previewLayer). You can configure them in the following way:
//set the orientation of the preview layer, which is presented to the user
AVCaptureConnection *connection = self.previewLayer.connection;
connection.videoOrientation = AVCaptureVideoOrientationLandscapeRight;
//set the output frame orientation
AVCaptureConnection *conn = [self.videoDataOutput connectionWithMediaType:AVMediaTypeVideo];
conn.videoOrientation = AVCaptureVideoOrientationLandscapeRight;
In the AVCam sample code, when recording and saving to a file, both connections are set to the same orientation, which is the view controller's interfaceOrientation:
[[[self movieFileOutput] connectionWithMediaType:AVMediaTypeVideo] setVideoOrientation:[[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection] videoOrientation]];
And it observes device orientation, which is nice.
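For reference, here is a minimal sketch (not the AVCam code verbatim) of how you might observe device orientation and keep both connections in sync; self.previewLayer and self.movieFileOutput are assumed to already exist on your controller:
- (void)startObservingOrientation
{
    [[UIDevice currentDevice] beginGeneratingDeviceOrientationNotifications];
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(deviceOrientationDidChange:)
                                                 name:UIDeviceOrientationDidChangeNotification
                                               object:nil];
}

- (void)deviceOrientationDidChange:(NSNotification *)note
{
    UIDeviceOrientation deviceOrientation = [[UIDevice currentDevice] orientation];
    AVCaptureVideoOrientation videoOrientation;
    // Only apply landscape orientations, since the final file must be landscape.
    // Note the mirroring: device landscape-left corresponds to video landscape-right.
    if (deviceOrientation == UIDeviceOrientationLandscapeLeft) {
        videoOrientation = AVCaptureVideoOrientationLandscapeRight;
    } else if (deviceOrientation == UIDeviceOrientationLandscapeRight) {
        videoOrientation = AVCaptureVideoOrientationLandscapeLeft;
    } else {
        return; // ignore portrait and face-up/face-down
    }
    self.previewLayer.connection.videoOrientation = videoOrientation;
    AVCaptureConnection *fileConnection = [self.movieFileOutput connectionWithMediaType:AVMediaTypeVideo];
    if (fileConnection.isVideoOrientationSupported) {
        fileConnection.videoOrientation = videoOrientation;
    }
}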

Related

AVCaptureVideoPreviewLayer blinks when changing sessionPreset on a running AVCaptureSession

I'm running a session with AVCaptureSessionPresetMedium to process the frames. When I find what I need, I want to capture a still image with AVCaptureSessionPresetPhoto. I'm changing the sessionPreset with this code:
dispatch_async(_captureSessionQueue, ^{
    [_captureSession beginConfiguration];
    if ([_captureSession canSetSessionPreset:AVCaptureSessionPresetPhoto])
    {
        _captureSession.sessionPreset = AVCaptureSessionPresetPhoto;
    }
    [_captureSession commitConfiguration];
});
When this code is called, the screen (AVCaptureVideoPreviewLayer) "blinks".
I can't use highResolutionStillImageOutputEnabled because I need to support iOS 7 and devices older than the iPhone 6. Does anyone have an idea why this blink happens?

Is it possible to auto-adjust the image brightness in iPhone?

I use the iPhone camera to capture images in my iOS app.
AVCaptureVideoDataOutputSampleBufferDelegate is used to get images from the camera.
Part of the program is shown below:
AVCaptureDevice *videoDevice = [AVCamViewController deviceWithMediaType:AVMediaTypeVideo preferringPosition:AVCaptureDevicePositionBack];
AVCaptureDeviceInput *videoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
if (error)
{
    NSLog(@"%@", error);
}
if ([session canAddInput:videoDeviceInput])
{
    [session addInput:videoDeviceInput];
    [self setVideoDeviceInput:videoDeviceInput];
    dispatch_async(dispatch_get_main_queue(), ^{
        // Why are we dispatching this to the main queue?
        // Because AVCaptureVideoPreviewLayer is the backing layer for AVCamPreviewView and UIView can only be manipulated on main thread.
        // Note: As an exception to the above rule, it is not necessary to serialize video orientation changes on the AVCaptureVideoPreviewLayer’s connection with other session manipulation.
        //[self previewView] layer
        [[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection] setVideoOrientation:(AVCaptureVideoOrientation)[self interfaceOrientation]];
    });
}
Sometimes the environment is a bit dark. In the iPhone Camera app, you can brighten the image by tapping the darker part of the screen, but I don't want to require that kind of user involvement.
I check the RGB intensity, and once I realise the brightness is not enough, I would like to brighten the image by adjusting the camera parameters (such as exposure) in my program. Is this possible in iPhone programming?
Thanks
I haven't worked much with AVFoundation, but you can find more details here.
I hope this will be useful for you.
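As a starting point, here is a minimal sketch of the kind of adjustment that documentation covers, assuming videoDevice is the AVCaptureDevice you already added to the session and interestPoint is a normalized point (for example, the dark region you detected from the RGB intensities); whether continuous auto-exposure at a chosen point is enough for your scene is something you would have to test:
// Hypothetical helper: steer auto-exposure toward a point of interest.
- (void)exposeAtPoint:(CGPoint)interestPoint onDevice:(AVCaptureDevice *)videoDevice
{
    NSError *error = nil;
    // Exclusive access is required before changing exposure settings.
    if (![videoDevice lockForConfiguration:&error]) {
        NSLog(@"Could not lock device for configuration: %@", error);
        return;
    }
    if (videoDevice.isExposurePointOfInterestSupported &&
        [videoDevice isExposureModeSupported:AVCaptureExposureModeContinuousAutoExposure]) {
        // interestPoint uses the capture-device coordinate space: (0,0) top-left to (1,1) bottom-right.
        videoDevice.exposurePointOfInterest = interestPoint;
        videoDevice.exposureMode = AVCaptureExposureModeContinuousAutoExposure;
    }
    [videoDevice unlockForConfiguration];
}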

unrecognized selector sent to instance | Objective C

The app crashes on startup and I have this error in the console:
Terminating app due to uncaught exception
'NSInvalidArgumentException', reason: '-[Chartboost showInterstitial]:
unrecognized selector sent to instance 0x7f844c9b74e0'
My code is:
#import "cocos2d.h"
#import "AppDelegate.h"
#import "IntroLayer.h"
#import "AppSpecificValues.h"
#import <RevMobAds/RevMobAds.h>
#import <Chartboost/Chartboost.h>
@implementation AppController
@synthesize gameCenterManager=gameCenterManager_, currentLeaderBoard=currentLeaderBoard_;
@synthesize window=window_, navController=navController_, director=director_;
@synthesize cb;
@synthesize nScore;
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
// Create the main window
window_ = [[UIWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]];
// Create an CCGLView with a RGB565 color buffer, and a depth buffer of 0-bits
CCGLView *glView = [CCGLView viewWithFrame:[window_ bounds]
pixelFormat:kEAGLColorFormatRGB565 //kEAGLColorFormatRGBA8
depthFormat:0 //GL_DEPTH_COMPONENT24_OES
preserveBackbuffer:NO
sharegroup:nil
multiSampling:NO
numberOfSamples:0];
director_ = (CCDirectorIOS*) [CCDirector sharedDirector];
director_.wantsFullScreenLayout = YES;
// Display FPS and SPF
// [director_ setDisplayStats:YES];
// set FPS at 60
[director_ setAnimationInterval:1.0/60];
// attach the openglView to the director
[director_ setView:glView];
// for rotation and other messages
[director_ setDelegate:self];
// 2D projection
[director_ setProjection:kCCDirectorProjection2D];
// [director setProjection:kCCDirectorProjection3D];
// Enables High Res mode (Retina Display) on iPhone 4 and maintains low res on all other devices
if( ! [director_ enableRetinaDisplay:YES] )
CCLOG(@"Retina Display Not supported");
// Default texture format for PNG/BMP/TIFF/JPEG/GIF images
// It can be RGBA8888, RGBA4444, RGB5_A1, RGB565
// You can change anytime.
[CCTexture2D setDefaultAlphaPixelFormat:kCCTexture2DPixelFormat_RGBA8888];
// If the 1st suffix is not found and if fallback is enabled then fallback suffixes are going to searched. If none is found, it will try with the name without suffix.
// On iPad HD : "-ipadhd", "-ipad", "-hd"
// On iPad : "-ipad", "-hd"
// On iPhone HD: "-hd"
CCFileUtils *sharedFileUtils = [CCFileUtils sharedFileUtils];
[sharedFileUtils setEnableFallbackSuffixes:NO]; // Default: NO. No fallback suffixes are going to be used
[sharedFileUtils setiPhoneRetinaDisplaySuffix:@"-hd"]; // Default on iPhone RetinaDisplay is "-hd"
[sharedFileUtils setiPadSuffix:@"-ipad"]; // Default on iPad is "ipad"
[sharedFileUtils setiPadRetinaDisplaySuffix:@"-ipad-hd"]; // Default on iPad RetinaDisplay is "-ipadhd"
// Assume that PVR images have premultiplied alpha
[CCTexture2D PVRImagesHavePremultipliedAlpha:YES];
self.nScore = 0;
//Revmobs
[RevMobAds startSessionWithAppID:[[NSBundle mainBundle] objectForInfoDictionaryKey:@"RevMobAPI"]];
// and add the scene to the stack. The director will run it automatically when the view is displayed.
[director_ pushScene: [IntroLayer scene]];
[self initGameCenter];
// Create a Navigation Controller with the Director
navController_ = [[UINavigationController alloc] initWithRootViewController:director_];
navController_.navigationBarHidden = YES;
// set the Navigation Controller as the root view controller
// [window_ addSubview:navController_.view]; // Generates flicker.
[window_ setRootViewController:navController_];
// make main window visible
[window_ makeKeyAndVisible];
return YES;
}
-(void) lunchRevmobADLink
{
[[RevMobAds session] openAdLinkWithDelegate:self];
}
- (void) setUpRevMob {
[[RevMobAds session] showFullscreen];
}
-(void) launchChartboost
{
// Initialize the Chartboost library
[Chartboost startWithAppId:@"53be6ed01873dc9741aafa"
appSignature:@"fcd1715a73c97b22c5ad557323a59d7348476"
delegate:self];
// Begin a user session. This should be done once per boot
[cb startSession];
[cb cacheInterstitial];
[cb cacheMoreApps];
}
-(void) showChartboostInterestitial
{
// Show an interstitial
[[Chartboost sharedChartboost] showInterstitial];
}
-(void) showChartboostMoreApps
{
[[Chartboost sharedChartboost] showMoreApps];
}
// Supported orientations: Landscape. Customize it for your own needs
- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation
{
return UIInterfaceOrientationIsPortrait(interfaceOrientation);
}
// getting a call, pause the game
-(void) applicationWillResignActive:(UIApplication *)application
{
if( [navController_ visibleViewController] == director_ )
[director_ pause];
}
// call got rejected
-(void) applicationDidBecomeActive:(UIApplication *)application
{
if( [navController_ visibleViewController] == director_ )
[director_ resume];
[self launchChartboost];
#ifdef FREE_VERSION
[self showChartboostInterestitial];
[self setUpRevMob];
[self hideAdBanner:YES];
#endif
}
and other code.
I tried a lot of things! So I'm asking you now: how can I resolve this?
Thanks in advance!
According to the Chartboost documentation, you are not showing the interstitial the right way:
To show a static or interstitial video ad:
// Show interstitial at location HomeScreen. See Chartboost.h for available location options.
[Chartboost showInterstitial:CBLocationHomeScreen];
The way you are doing it has been removed in their SDK 5.x:
All references to [Chartboost sharedChartboost] are now changed to Chartboost:
[[Chartboost sharedChartboost] showInterstitial:CBLocationHomeScreen]; is now [Chartboost showInterstitial:CBLocationHomeScreen];
You are calling showInterstitial on an instance although it is now a class method. It should be something like [Chartboost showInterstitial:CBLocationHomeScreen].
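As a hedged sketch, the affected methods from the question could look something like this against the 5.x class-method API; the CBLocationHomeScreen constant comes from the quoted documentation, and the caching calls are assumptions, so check the Chartboost.h that ships with your SDK version for the exact names:
-(void) launchChartboost
{
    // SDK 5.x: starting the session is done through the class method; the cb instance and
    // the separate startSession call from the question are no longer needed.
    [Chartboost startWithAppId:@"53be6ed01873dc9741aafa"
                  appSignature:@"fcd1715a73c97b22c5ad557323a59d7348476"
                      delegate:self];
    // Cache ads up front so they can be shown without delay.
    [Chartboost cacheInterstitial:CBLocationHomeScreen];
    [Chartboost cacheMoreApps:CBLocationHomeScreen];
}

-(void) showChartboostInterestitial
{
    // Show an interstitial at the home-screen location (class method, no shared instance).
    [Chartboost showInterstitial:CBLocationHomeScreen];
}

-(void) showChartboostMoreApps
{
    [Chartboost showMoreApps:CBLocationHomeScreen];
}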

Locking iOS Camera Exposure Crashes

I have the following code:
-(void) startCameraCapture {
    // start capturing frames
    // Create the AVCapture Session
    session = [[AVCaptureSession alloc] init];
    // create a preview layer to show the output from the camera
    AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
    // Specify that the video should be stretched to fill the layer’s bounds.
    previewLayer.videoGravity = AVLayerVideoGravityResize;
    //previewView.frame = CGRectMake(126.0, 164.0, 64.0, 75.0);
    previewLayer.frame = previewView.frame;
    [previewView.layer addSublayer:previewLayer];
    // Get the default camera device
    AVCaptureDevice* camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    // get the current settings
    [appDelegate loadSettings];
    [session startRunning];
}
Is there a way I can lock the exposure of the camera device so that the preview only adjusts to a certain brightness?
Sorry if I am not asking this right.
Thank you.
EDIT:
I tried adding:
[camera setExposureModeCustomWithDuration:CMTimeMake(1,1) ISO:100 completionHandler:nil];
but that only results in the app crashing at that line.
According to Apple's doc:
An NSGenericException exception is thrown if this method is invoked without first obtaining exclusive access to the receiver using lockForConfiguration:.
So add this line in front:
[camera lockForConfiguration:nil];
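A slightly fuller sketch, assuming camera is the AVCaptureDevice from the code above; custom exposure requires iOS 8, and the duration/ISO values here are illustrative, so clamp them to what the active format supports:
NSError *error = nil;
if ([camera lockForConfiguration:&error]) {
    if ([camera isExposureModeSupported:AVCaptureExposureModeCustom]) {
        // Clamp the requested ISO to the range the current format actually supports.
        float iso = MAX(camera.activeFormat.minISO, MIN(100.0f, camera.activeFormat.maxISO));
        // 1/30 s; note that CMTimeMake(1, 1) from the question is a full one-second exposure.
        CMTime duration = CMTimeMake(1, 30);
        [camera setExposureModeCustomWithDuration:duration ISO:iso completionHandler:nil];
    }
    [camera unlockForConfiguration];
} else {
    NSLog(@"Could not lock camera for configuration: %@", error);
}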

setAlpha not triggering after touchUpInside event occurs

I'm developing a camera app that has a video function. I want the button the user presses to both begin and stop recording, and that part works properly. I also want to raise the alpha of the recordButton to 1.0 when the button is pressed. This works, but when the button is pressed again to stop recording, the alpha remains at 1.0. The if...else statement that stops recording should trigger [self.recordButton setAlpha:0.50], but for some reason nothing happens, aside from recording stopping properly. Any clarification would be greatly appreciated.
- (IBAction)toggleMovieRecording:(id)sender
{
    [[self recordButton] setEnabled:NO];
    dispatch_async([self sessionQueue], ^{
        if (![[self movieFileOutput] isRecording])
        {
            [self setLockInterfaceRotation:YES];
            [self.recordButton setAlpha:1.0];
            if ([[UIDevice currentDevice] isMultitaskingSupported])
            {
                // Setup background task. This is needed because the captureOutput:didFinishRecordingToOutputFileAtURL: callback is not received until the app returns to the foreground unless you request background execution time. This also ensures that there will be time to write the file to the assets library when the app is backgrounded. To conclude this background execution, -endBackgroundTask is called in -recorder:recordingDidFinishToOutputFileURL:error: after the recorded file has been saved.
                [self setBackgroundRecordingID:[[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:nil]];
            }
            // Update the orientation on the movie file output video connection before starting recording.
            [[[self movieFileOutput] connectionWithMediaType:AVMediaTypeVideo] setVideoOrientation:[[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection] videoOrientation]];
            // Turn OFF flash for video recording
            [AAPLCameraViewController setFlashMode:AVCaptureFlashModeOff forDevice:[self videoDevice]];
            // Start recording to a temporary file.
            NSString *outputFilePath = [NSTemporaryDirectory() stringByAppendingPathComponent:[@"movie" stringByAppendingPathExtension:@"mov"]];
            [[self movieFileOutput] startRecordingToOutputFileURL:[NSURL fileURLWithPath:outputFilePath] recordingDelegate:self];
        }
        else
        {
            [[self movieFileOutput] stopRecording];
            [self.recordButton setAlpha:0.50];
        }
    });
}
You have to make sure that anything related to the UI is done on the main queue. Check this answer:
iPhone - Grand Central Dispatch main thread
To fix it, wrap the alpha change in a dispatch to the main queue:
dispatch_async(dispatch_get_main_queue(), ^{
    [self.recordButton setAlpha:0.50];
});
Let me know if that worked.
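For completeness, here is a sketch of how the else branch from the question might look with that change applied (the same treatment applies to the setAlpha:1.0 call in the recording branch):
else
{
    [[self movieFileOutput] stopRecording];
    // UIKit must be touched on the main queue; the enclosing block runs on sessionQueue.
    dispatch_async(dispatch_get_main_queue(), ^{
        [self.recordButton setAlpha:0.50];
    });
}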
