I have a problem with AVCaptureSession's startRunning. It occurs only on the iPhone 5 with iOS 7. My app should record video and show a live preview on an AVCaptureVideoPreviewLayer, but when I test on the iPhone 5, the first call fails with error = AVErrorMediaServicesWereReset.
This is my code, where I create the capture manager and call startRunning:
-(void)startVideoTranslation {
    CGRect r = videoBackView.bounds;
    videoPreviewView = [[UIView alloc] initWithFrame:r];
    [videoBackView addSubview:videoPreviewView];
    if (currentGameType == MPGameTypeCrocodile) {
        if (captureManager == nil) {
            captureManager = [[AVCamCaptureManager alloc] init];
            [captureManager setDelegate:self];
            if ([captureManager setupSession]) {
                // Create video preview layer and add it to the UI
                AVCaptureVideoPreviewLayer *newCaptureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:[captureManager session]];
                CALayer *viewLayer = [videoPreviewView layer];
                [viewLayer setMasksToBounds:YES];
                CGRect bounds = [videoPreviewView bounds];
                bounds.origin.y = bounds.origin.y + 1;
                [newCaptureVideoPreviewLayer setFrame:bounds];
                if ([newCaptureVideoPreviewLayer.connection isVideoOrientationSupported]) {
                    [newCaptureVideoPreviewLayer.connection setVideoOrientation:AVCaptureVideoOrientationLandscapeRight];
                }
                [newCaptureVideoPreviewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
                [viewLayer insertSublayer:newCaptureVideoPreviewLayer below:[[viewLayer sublayers] objectAtIndex:0]];
                captureVideoPreviewLayer = newCaptureVideoPreviewLayer;
                [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(isErrorSession:) name:AVCaptureSessionRuntimeErrorNotification object:nil];
                dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
                    [[captureManager session] startRunning];
                });
            }
        }
    }
}
And I call this method from here:
-(void)showPrepareView {
    [self playSound:@"start"];
    BGView.frame = videoBackView.bounds;
    [prepareView addSubview:videoBackView];
    [self startVideoTranslation];
    [videoBackView addSubview:BGView];
}
In this code, [self playSound:] uses AVAudioPlayer, and the preview layer is added to videoBackView.
Does anyone know the reason for this problem?
For now I have worked around the problem by moving the following code into -(id)initWithFrame:
captureManager = [[AVCamCaptureManager alloc] init];
[captureManager setDelegate:self];
[captureManager setupSession];
[[captureManager session] startRunning];
But I don't understand why this works.
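For reference, here is a minimal sketch of what the isErrorSession: handler registered above could look like. The restart-on-reset logic is an assumption on my part, not code from the original app: media services can be reset by the system (for example after an audio session conflict), and a common recovery is simply to start the session again.
// Hypothetical handler for AVCaptureSessionRuntimeErrorNotification.
- (void)isErrorSession:(NSNotification *)notification
{
    NSError *error = [[notification userInfo] objectForKey:AVCaptureSessionErrorKey];
    NSLog(@"Capture session runtime error: %@", error);
    if (error.code == AVErrorMediaServicesWereReset) {
        // Assumption: restarting the session is enough to recover here.
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
            [[captureManager session] startRunning];
        });
    }
}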
We have an AVCaptureVideoPreviewLayer that should display the camera image.
On the iPhone X (and iPhone 7) the image is displayed correctly.
However, on the iPhone XR it is displayed in pink.
Here is the code showing how I set up the camera session:
UIView *scanView = [[UIView alloc] initWithFrame:[self view].frame];
[self setCameraScanView:scanView];
[[self cameraScanView] setContentMode:UIViewContentModeScaleAspectFit];

[self setScanCaptureSession:[[AVCaptureSession alloc] init]];
[[self scanCaptureSession] setSessionPreset:AVCaptureSessionPresetLow];

AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
[[videoOutput connectionWithMediaType:AVMediaTypeVideo] setEnabled:YES];
[videoOutput setAlwaysDiscardsLateVideoFrames:YES];
[videoOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];
[[self scanCaptureSession] addOutput:videoOutput];

NSError *error;
AVCaptureDeviceInput *input = [[AVCaptureDeviceInput alloc] initWithDevice:camera error:&error];
[[self scanCaptureSession] addInput:input];
if (error) {
    NSLog(@"There was an error setting up Device Input!");
}

AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:[self scanCaptureSession]];
[self setCameraLayer:previewLayer];
[[self cameraLayer] setContentsFormat:kCAContentsFormatRGBA16Float];
[[self cameraLayer] setOpaque:YES];

CALayer *rootLayer = [[self cameraScanView] layer];
[[self cameraLayer] setFrame:[self cameraScanView].bounds];
[[self cameraLayer] setVideoGravity:AVLayerVideoGravityResizeAspectFill];
[rootLayer addSublayer:[self cameraLayer]];

[[self scanCaptureSession] commitConfiguration];
[[self view] addSubview:[self cameraScanView]];
Here is an image of how it looks on the iPhone XR.
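Not an authoritative fix, but one assumption worth testing: the contentsFormat override. AVCaptureVideoPreviewLayer manages its own pixel format, and forcing kCAContentsFormatRGBA16Float may be what skews the colors on a wide-gamut (P3) display like the XR's. A minimal sketch of the same layer setup with the override removed:
AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:[self scanCaptureSession]];
[self setCameraLayer:previewLayer];
// Assumption: leave contentsFormat at its default; do not call
// setContentsFormat:kCAContentsFormatRGBA16Float on the preview layer.
[[self cameraLayer] setOpaque:YES];
[[self cameraLayer] setFrame:[self cameraScanView].bounds];
[[self cameraLayer] setVideoGravity:AVLayerVideoGravityResizeAspectFill];
[[[self cameraScanView] layer] addSublayer:[self cameraLayer]];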
I have been working on a project for a while now, and there is one thing I really want to get working that I haven't been able to figure out.
In the application, when taking a front-facing picture, I would like the front flash to actually make the picture brighter.
I am using a custom AVCaptureSession camera, and it is full screen. Here is the code that makes a flash happen; it does flash, but the picture isn't any brighter.
// Here is the code for a front flash on the picture button press.
// It does flash, it just doesn't help.
UIWindow *wnd = [UIApplication sharedApplication].keyWindow;
UIView *v = [[UIView alloc] initWithFrame:CGRectMake(0, 0, wnd.frame.size.width, wnd.frame.size.height)];
[wnd addSubview:v];
v.backgroundColor = [UIColor whiteColor];
[UIView beginAnimations:nil context:nil];
[UIView setAnimationDuration:1.0];
v.alpha = 0.0f;
[UIView commitAnimations];

// imageView is just the view that the camera's image fills.
imageView.hidden = NO;

AVCaptureConnection *videoConnection = nil;
for (AVCaptureConnection *connection in stillImageOutput.connections) {
    for (AVCaptureInputPort *port in [connection inputPorts]) {
        if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
            videoConnection = connection;
            break;
        }
    }
    if (videoConnection) {
        break;
    }
}

[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
    if (imageDataSampleBuffer != NULL) {
        imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
        UIImage *thePicture = [UIImage imageWithData:imageData];
        self.imageView.image = thePicture;
        // After the picture is on the screen, make sure the buttons
        // are where they are supposed to be.
        saveButtonOutlet.hidden = NO;
        saveButtonOutlet.enabled = YES;
        diaryEntryOutlet.hidden = YES;
        diaryEntryOutlet.enabled = NO;
    }
}];
}
You need to set the screen to white before the image is captured, wait for the capture to complete and then remove the white screen in the completion block.
You should also dispatch the capture after a short delay to ensure the screen has turned white:
UIWindow *wnd = [UIApplication sharedApplication].keyWindow;
UIView *v = [[UIView alloc] initWithFrame:CGRectMake(0, 0, wnd.frame.size.width, wnd.frame.size.height)];
[wnd addSubview:v];
v.backgroundColor = [UIColor whiteColor];

dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(0.1 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        if (imageDataSampleBuffer != NULL) {
            imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            UIImage *thePicture = [UIImage imageWithData:imageData];
            dispatch_async(dispatch_get_main_queue(), ^{
                self.imageView.image = thePicture;
                [v removeFromSuperview];
                // After the picture is on the screen, make sure the
                // buttons are where they are supposed to be. These are
                // UI updates, so they belong on the main queue too.
                saveButtonOutlet.hidden = NO;
                saveButtonOutlet.enabled = YES;
                diaryEntryOutlet.hidden = YES;
                diaryEntryOutlet.enabled = NO;
            });
        }
    }];
});
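One further tweak, offered as an assumption rather than a guaranteed fix: drive the display to maximum brightness while the white view is up. Apple's own Retina Flash does this too, and the brightness matters at least as much for exposure as the white overlay itself. A sketch:
// Assumption: temporarily maxing out screen brightness makes the white
// overlay act as a usable front "flash"; restore the old value once
// the capture completes.
CGFloat previousBrightness = [UIScreen mainScreen].brightness;
[UIScreen mainScreen].brightness = 1.0;
// ... add the white view and capture the still image as above ...
// Then, inside the capture completion handler on the main queue:
[UIScreen mainScreen].brightness = previousBrightness;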
I am working on audio player functionality using the AVAudioPlayer framework, and I have a problem. I want to show an activity indicator during the song's loading time, so that when you touch play the activity indicator starts, and when the audio starts the activity indicator stops. That is my requirement. I have written some code, but it is not working for me. Can anybody help?
-(void)viewDidLoad {
    [super viewDidLoad];
    [self temp];
    [self loadData];
    loadingLabel = [[UILabel alloc] initWithFrame:CGRectMake(94, 28, 130, 22)];
    loadingLabel.backgroundColor = [UIColor clearColor];
    loadingLabel.textColor = [UIColor blackColor];
    [loadingLabel setFont:[UIFont systemFontOfSize:10.0]];
    loadingLabel.adjustsFontSizeToFitWidth = YES;
    loadingLabel.textAlignment = NSTextAlignmentCenter;
    loadingLabel.text = @"loading In...";
    loadingLabel.autoresizingMask = UIViewAutoresizingFlexibleLeftMargin | UIViewAutoresizingFlexibleRightMargin | UIViewAutoresizingFlexibleBottomMargin | UIViewAutoresizingFlexibleTopMargin;
}
-(void)temp {
    // loading view
    loadingView = [[UILabel alloc] initWithFrame:CGRectMake(135, 200, 40, 40)];
    loadingView.backgroundColor = [UIColor clearColor];
    loadingView.clipsToBounds = YES;
    loadingView.layer.cornerRadius = 10.0;
    activityView = [[UIActivityIndicatorView alloc] initWithActivityIndicatorStyle:UIActivityIndicatorViewStyleWhite];
    activityView.tag = 960;
    activityView.frame = CGRectMake(10, 11, activityView.bounds.size.width, activityView.bounds.size.height);
    [loadingView addSubview:activityView];
}
-(void)playOrPauseButtonPressed:(id)sender {
    if (playing == NO)
    {
        [playButton setBackgroundImage:[UIImage imageNamed:@"Pause.png"] forState:UIControlStateNormal];
        // Here Pause.png is an image showing the pause button.
        NSError *err = nil;
        if (!audioPlayer)
        {
            audioSession = [AVAudioSession sharedInstance];
            [audioSession setCategory:AVAudioSessionCategoryPlayback error:nil];
            NSLog(@"%@ %d", urlsArray, selectedIndex);
            NSString *sourcePath = [urlsArray objectAtIndex:selectedIndex];
            NSData *objectData = [NSData dataWithContentsOfURL:[NSURL URLWithString:sourcePath]];
            NSLog(@"%@", objectData);
            audioPlayer = [[AVAudioPlayer alloc] initWithData:objectData error:&err];
            [UIApplication sharedApplication].networkActivityIndicatorVisible = YES;
            [loadingView addSubview:activityView];
            [loadingView addSubview:loadingLabel];
            [self.view addSubview:loadingView];
            [activityView startAnimating];
            [self.view addSubview:loadingView];
        }
        if (err)
        {
            NSLog(@"Error %ld, %@", (long)err.code, err.localizedDescription);
        }
        NSTimeInterval bufferDuration = 0.005;
        [audioSession setPreferredIOBufferDuration:bufferDuration error:&err];
        if (err)
        {
            NSLog(@"Error %ld, %@", (long)err.code, err.localizedDescription);
        }
        double sampleRate = 44100.0;
        [audioSession setPreferredSampleRate:sampleRate error:&err];
        if (err)
        {
            NSLog(@"Error %ld, %@", (long)err.code, err.localizedDescription);
        }
        [audioSession setActive:YES error:&err];
        if (err)
        {
            NSLog(@"Error %ld, %@", (long)err.code, err.localizedDescription);
        }
        sampRate = audioSession.sampleRate;
        bufferDuration = audioSession.IOBufferDuration;
        NSLog(@"SampleRate: %0.0f Hz, I/O buffer duration: %f", sampRate, bufferDuration);
        audioPlayer.numberOfLoops = 0;
        [audioPlayer prepareToPlay];
        [audioPlayer play];
        audioPlayer.delegate = self;
        if (!audioPlayer.playing)
        {
            [audioPlayer play];
        }
        playing = YES;
    }
    else if (playing == YES)
    {
        [playButton setBackgroundImage:[UIImage imageNamed:@"play12.png"] forState:UIControlStateNormal];
        [audioPlayer pause];
        playing = NO;
        timer = [NSTimer scheduledTimerWithTimeInterval:1.0 target:self selector:@selector(updateViewForPlayerState) userInfo:nil repeats:YES];
    }
    if (self.audioPlayer)
    {
        [self updateViewForPlayerInfo];
        [self updateViewForPlayerState];
        [self.audioPlayer setDelegate:self];
    }
}
-(void)loadData
{
    [UIApplication sharedApplication].networkActivityIndicatorVisible = YES;
    [self.view addSubview:loadingView];
    [activityView startAnimating];
    loadingConnection = [[NSURLConnection alloc] initWithRequest:request delegate:self startImmediately:YES];
}
-(void)connectionDidFinishLoading:(NSURLConnection *)connection
{
    [UIApplication sharedApplication].networkActivityIndicatorVisible = NO;
    [activityView stopAnimating];
    [loadingView removeFromSuperview];
}
You can create a custom loading view like the one below:
-(void)showLoadingView
{
    CGRect screenRect = [UIScreen mainScreen].bounds;
    if (loadingView == nil)
    {
        loadingView = [[UIView alloc] initWithFrame:screenRect];
        loadingView.opaque = NO;
        loadingView.backgroundColor = [UIColor darkGrayColor];
        loadingView.alpha = 0.5;
        UIActivityIndicatorView *spinningWheel = [[UIActivityIndicatorView alloc] initWithFrame:CGRectMake(self.window.center.x - 18.0, self.window.center.y - 18.0, 37.0, 37.0)];
        [spinningWheel startAnimating];
        spinningWheel.activityIndicatorViewStyle = UIActivityIndicatorViewStyleWhiteLarge;
        spinningWheel.alpha = 1.0;
        [loadingView addSubview:spinningWheel];
        [spinningWheel release];
    }
    [window addSubview:loadingView];
}
-(void)showLoadingViewMessage:(NSString *)msg
{
    CGRect screenRect = [UIScreen mainScreen].bounds;
    if (loadingView == nil)
    {
        loadingView = [[UIView alloc] initWithFrame:screenRect];
        loadingView.opaque = NO;
        loadingView.backgroundColor = [UIColor darkGrayColor];
        loadingView.alpha = 0.5;
        UIActivityIndicatorView *spinningWheel = [[UIActivityIndicatorView alloc] initWithFrame:CGRectMake(self.window.center.x - 18.0, self.window.center.y - 18.0, 37.0, 37.0)];
        [spinningWheel startAnimating];
        spinningWheel.activityIndicatorViewStyle = UIActivityIndicatorViewStyleWhiteLarge;
        spinningWheel.alpha = 1.0;
        [loadingView addSubview:spinningWheel];
        [spinningWheel release];
    }
    UILabel *lblTitle = [[UILabel alloc] initWithFrame:CGRectMake(0, 250.0, 320.0, 80.0)];
    lblTitle.text = msg;
    lblTitle.textAlignment = UITextAlignmentCenter;
    lblTitle.textColor = [UIColor whiteColor];
    lblTitle.alpha = 1.0;
    lblTitle.backgroundColor = [UIColor clearColor];
    lblTitle.numberOfLines = 0;
    //lblTitle.layer.borderColor = [UIColor blueColor].CGColor;
    //lblTitle.layer.borderWidth = 1.0;
    [loadingView addSubview:lblTitle];
    [lblTitle release];
    [window addSubview:loadingView];
}
-(void)hideLoadingView
{
    if (loadingView)
    {
        [loadingView removeFromSuperview];
        loadingView = nil;
    }
}
Put all three methods in your AppDelegate, then call showLoadingView whenever you want the loading view and hideLoadingView whenever you don't.
I hope it is helpful for you.
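One note on why the indicator never appears in the original code: [NSData dataWithContentsOfURL:] downloads synchronously on the main thread, so the UI is frozen until the whole file has arrived. Here is a minimal sketch of loading the audio data on a background queue instead, assuming the same audioPlayer ivar and sourcePath string as in the question:
// Show the indicator first, on the main thread.
[self showLoadingView];
NSURL *audioURL = [NSURL URLWithString:sourcePath];
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    // The blocking download now runs off the main thread, so the
    // activity indicator keeps animating.
    NSData *objectData = [NSData dataWithContentsOfURL:audioURL];
    dispatch_async(dispatch_get_main_queue(), ^{
        NSError *err = nil;
        audioPlayer = [[AVAudioPlayer alloc] initWithData:objectData error:&err];
        audioPlayer.delegate = self;
        [audioPlayer prepareToPlay];
        [audioPlayer play];
        // Audio has started; dismiss the indicator.
        [self hideLoadingView];
    });
});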
I am using an AVCaptureSession to capture images.
When I capture the first image, I redirect the user to apply effects to it, then return to my capture screen, where I preview the image. I then allow the user to capture three more images and preview all of them on the capture screen. While capturing the third image, I receive a memory warning.
After capturing each image, I crop it to resize it.
I have written the code below in viewDidLoad:
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if ([device hasTorch] && [device hasFlash]) {
    [device lockForConfiguration:nil];
    [device setTorchMode:cameraFlashMode];
    [device unlockForConfiguration];
    [self.btnFlashLight setHidden:NO];
}
else
    [self.btnFlashLight setHidden:YES];

cameraMode = AVCaptureDevicePositionBack;

firstImage.layer.borderWidth = 2.0f;
firstImage.layer.borderColor = [UIColor grayColor].CGColor;
secondImage.layer.borderWidth = 2.0f;
secondImage.layer.borderColor = [UIColor grayColor].CGColor;
thirdImage.layer.borderWidth = 2.0f;
thirdImage.layer.borderColor = [UIColor grayColor].CGColor;
forthImage.layer.borderWidth = 2.0f;
forthImage.layer.borderColor = [UIColor grayColor].CGColor;

if ([capturedImages count] == 0)
    [self.btnNext setEnabled:NO];

if ([self captureManager] == nil) {
    CamCaptureManager *manager = [[CamCaptureManager alloc] init];
    [self setCaptureManager:manager];
    [[self captureManager] setDelegate:self];
    if ([[self captureManager] setupSession]) {
        // Create video preview layer and add it to the UI
        newCaptureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:[[self captureManager] session]];
        UIView *view = [self videoPreviewView];
        CALayer *viewLayer = [view layer];
        [viewLayer setMasksToBounds:YES];
        CGRect bounds = [view bounds];
        [newCaptureVideoPreviewLayer setFrame:bounds];
        [newCaptureVideoPreviewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
        [viewLayer insertSublayer:newCaptureVideoPreviewLayer below:[[viewLayer sublayers] objectAtIndex:0]];
        UIBezierPath *fullScreenPath = [UIBezierPath bezierPathWithRect:view.bounds];
        CAShapeLayer *maskLayer = [CAShapeLayer layer];
        [fullScreenPath appendPath:[UIBezierPath bezierPathWithRoundedRect:[GeneralDeclaration generalDeclaration].cameraMaskFrame byRoundingCorners:UIRectCornerAllCorners cornerRadii:CGSizeMake(0.0f, 0.0f)]];
        maskLayer.fillRule = kCAFillRuleEvenOdd;
        maskLayer.path = fullScreenPath.CGPath;
        maskLayer.fillColor = [UIColor blackColor].CGColor;
        maskLayer.opacity = 0.6f;
        maskLayer.lineWidth = 1.0f;
        maskLayer.strokeColor = [UIColor whiteColor].CGColor;
        [viewLayer insertSublayer:maskLayer above:[[viewLayer sublayers] objectAtIndex:0]];
        [self setCaptureVideoPreviewLayer:newCaptureVideoPreviewLayer];
        [[self captureManager] session].sessionPreset = AVCaptureSessionPresetPhoto;
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
            [[[self captureManager] session] startRunning];
        });
    }
}
I have written the code below to process the captured image:
[[self captureManager] captureStillImage];
UIView *flashView = [[UIView alloc] initWithFrame:[[self videoPreviewView] frame]];
[flashView setBackgroundColor:[UIColor whiteColor]];
[[[self view] window] addSubview:flashView];
[UIView animateWithDuration:.4f
                 animations:^{
                     [flashView setAlpha:0.f];
                 }
                 completion:^(BOOL finished){
                     [flashView removeFromSuperview];
                 }];

firstImage.image = nil;
secondImage.image = nil;
thirdImage.image = nil;
forthImage.image = nil;

//firstImage.userInteractionEnabled = YES;
secondImage.userInteractionEnabled = YES;
thirdImage.userInteractionEnabled = YES;
forthImage.userInteractionEnabled = YES;

for (UIView *subview in [firstImage subviews]) {
    [subview removeFromSuperview];
}
for (UIView *subview in [secondImage subviews]) {
    [subview removeFromSuperview];
}
for (UIView *subview in [thirdImage subviews]) {
    [subview removeFromSuperview];
}
for (UIView *subview in [forthImage subviews]) {
    [subview removeFromSuperview];
}

for (int i = 0; i < [capturedImages count]; i++)
{
    UIImage *removeImage = [UIImage imageNamed:@"remove_image.png"];
    UIImageView *removeImageView = [[UIImageView alloc] initWithImage:removeImage];
    removeImageView.tag = i;
    [removeImageView setCenter:CGPointMake(firstImage.bounds.size.width / 2, firstImage.bounds.size.height / 2)];
    UITapGestureRecognizer *removeImageTap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(removeImage_Tapped:)];
    removeImageTap.numberOfTapsRequired = 1;
    removeImageView.userInteractionEnabled = YES;
    [removeImageView addGestureRecognizer:removeImageTap];
    switch (i) {
        case 0:
            firstImage.image = [capturedImages objectAtIndex:i];
            //[firstImage addSubview:removeImageView];
            break;
        case 1:
            secondImage.image = [capturedImages objectAtIndex:i];
            [secondImage addSubview:removeImageView];
            break;
        case 2:
            thirdImage.image = [capturedImages objectAtIndex:i];
            [thirdImage addSubview:removeImageView];
            break;
        case 3:
            forthImage.image = [capturedImages objectAtIndex:i];
            [forthImage addSubview:removeImageView];
            break;
        default:
            break;
    }
}
Please help me solve this.
Set the sessionPreset property like this:
[yourcamera session].sessionPreset = AVCaptureSessionPresetHigh;
This will not totally remove the memory warning, but it may help reduce it.
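Two more things may help, offered as assumptions rather than a confirmed fix: wrap each crop/resize in an @autoreleasepool so intermediate full-resolution bitmaps are released promptly, and keep a downscaled copy in capturedImages instead of the raw capture. A sketch under those assumptions (the helper name and the 640-point target width are hypothetical):
// Hypothetical helper (ARC assumed): downscale a freshly captured
// image before storing it, inside an autorelease pool so the
// full-resolution bitmap is freed as soon as possible.
- (UIImage *)storableImageFromCapture:(UIImage *)capture
{
    @autoreleasepool {
        CGFloat targetWidth = 640.0;
        CGSize targetSize = CGSizeMake(targetWidth, targetWidth * capture.size.height / capture.size.width);
        UIGraphicsBeginImageContextWithOptions(targetSize, YES, 1.0);
        [capture drawInRect:CGRectMake(0, 0, targetSize.width, targetSize.height)];
        UIImage *scaled = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        return scaled;
    }
}
You would then add [self storableImageFromCapture:newImage] to capturedImages rather than the full-size image.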
I've written an application that takes advantage of the new AVCaptureMetadataOutput APIs in iOS 7 for barcode scanning.
I have the following code in one of my view controllers:
- (void)viewDidLoad
{
    [super viewDidLoad];

    highlightView = [[UIView alloc] init];
    [highlightView setAutoresizingMask:UIViewAutoresizingFlexibleTopMargin | UIViewAutoresizingFlexibleLeftMargin | UIViewAutoresizingFlexibleRightMargin | UIViewAutoresizingFlexibleBottomMargin];
    [[highlightView layer] setBorderColor:[[UIColor greenColor] CGColor]];
    [[highlightView layer] setBorderWidth:3.0];
    [[self view] addSubview:highlightView];

    session = [[AVCaptureSession alloc] init];
    device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    input = [AVCaptureDeviceInput deviceInputWithDevice:device error:nil];
    [session addInput:input];

    output = [[AVCaptureMetadataOutput alloc] init];
    [output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
    [session addOutput:output];
    [output setMetadataObjectTypes:[output availableMetadataObjectTypes]];
    [output setRectOfInterest:[[self view] bounds]];

    previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
    [previewLayer setFrame:[[self view] bounds]];
    [previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    if ([[previewLayer connection] isVideoOrientationSupported]) {
        [[previewLayer connection] setVideoOrientation:(AVCaptureVideoOrientation)[[UIApplication sharedApplication] statusBarOrientation]];
    }
    [[[self view] layer] insertSublayer:previewLayer above:[[self view] layer]];

    [session startRunning];
    [[self view] bringSubviewToFront:highlightView];
}
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection
{
    CGRect highlightViewRect = CGRectZero;
    AVMetadataMachineReadableCodeObject *barcode;
    NSArray *barCodeTypes = @[AVMetadataObjectTypeUPCECode, AVMetadataObjectTypeCode39Code, AVMetadataObjectTypeCode39Mod43Code, AVMetadataObjectTypeEAN13Code, AVMetadataObjectTypeEAN8Code, AVMetadataObjectTypeCode93Code, AVMetadataObjectTypeCode128Code, AVMetadataObjectTypePDF417Code, AVMetadataObjectTypeQRCode, AVMetadataObjectTypeAztecCode];

    for (AVMetadataObject *metadata in metadataObjects) {
        if ([barCodeTypes containsObject:[metadata type]]) {
            barcode = (AVMetadataMachineReadableCodeObject *)[previewLayer transformedMetadataObjectForMetadataObject:(AVMetadataMachineReadableCodeObject *)metadata];
            highlightViewRect = [barcode bounds];
            break;
        }
    }

    [highlightView setFrame:highlightViewRect];
    [delegate barcodeScannerController:self didFinishScanningWithBarcode:[barcode stringValue]];
}
This code works in that it can detect various barcode types and convert them into their string values. What I'm wondering is why, on iPhones, only barcodes near the center of the view are detected, while on iPads only barcodes near the bottom of the view are detected. It's very peculiar behaviour, and in the iPad's case not intuitive at all.
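A likely explanation, though worth verifying on your devices: rectOfInterest does not take view coordinates. It expects a normalized rectangle in the metadata output's coordinate space, where (0, 0) is the top-left of the unrotated video frame, so passing [[self view] bounds] selects a region that lands differently on the iPhone's and iPad's aspect ratios. AVCaptureVideoPreviewLayer can do the conversion for you; a sketch:
// Convert the visible layer rect into the normalized, rotated
// coordinate space that AVCaptureMetadataOutput expects. Do this after
// the session is running, so the conversion knows the video dimensions.
[session startRunning];
CGRect visibleRect = [previewLayer metadataOutputRectOfInterestForRect:[previewLayer bounds]];
[output setRectOfInterest:visibleRect];
If you simply want to scan the whole visible preview, leaving rectOfInterest at its default of CGRectMake(0, 0, 1, 1) also works; only set a custom rect when you need to restrict scanning to a sub-region.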