I am trying to copy all of the functionality of this example app provided by Apple called AVCam: https://developer.apple.com/library/ios/samplecode/AVCam/Introduction/Intro.html#//apple_ref/doc/uid/DTS40010112-Intro-DontLinkElementID_2
The only thing I am having trouble with is changing the size and location of a UIView object. The problem is I am using Apple's sample code, and it just isn't making sense to me.
Apple provides a sample UIView subclass that I have called MediaCapturePreviewView. Here is the header file code:
#import <UIKit/UIKit.h>
@class AVCaptureSession;
@interface MediaCapturePreviewView : UIView
@property (nonatomic) AVCaptureSession *session;
@end
Here is its implementation file:
#import "MediaCapturePreviewView.h"
#import <AVFoundation/AVFoundation.h>
@implementation MediaCapturePreviewView
/*
- (void)drawRect:(CGRect)rect
{
// Drawing code
}
*/
+ (Class)layerClass
{
return [AVCaptureVideoPreviewLayer class];
}
- (AVCaptureSession *)session
{
return [(AVCaptureVideoPreviewLayer *)[self layer] session];
}
- (void)setSession:(AVCaptureSession *)session
{
[(AVCaptureVideoPreviewLayer *)[self layer] setSession:session];
}
@end
Then in my app's central view controller I have imported the header file of "MediaCapturePreviewView" shown above. Last but not least, in that view controller's implementation file I have an IBOutlet in the interface area that looks like this:
@property (nonatomic, weak) IBOutlet MediaCapturePreviewView *previewView;
The above IBOutlet has been connected to a UIView object in Interface Builder that covers the entire iPhone screen.
And then here is a small example of how previewView is used when you tap the "take a picture" button:
- (IBAction)snapStillImage:(id)sender
{
dispatch_async([self sessionQueue], ^{
// Update the orientation on the still image output video connection before capturing.
[[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo] setVideoOrientation:[[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection] videoOrientation]];
I have tried adding this code right after the above method call, but it is not helping:
_previewView.frame = CGRectMake(0, 0, self.view.bounds.size.width, self.view.frame.size.height/2);
I know that you can edit properties of CALayer objects such as frame, bounds, and position, but I'm stuck and don't know exactly where I need to change them.
This is what the current UIView looks like when taking a picture:
And this is what I need it to look like:
I basically need it to take up exactly the top 50% of the iPhone's screen.
Any help is greatly appreciated thank you.
The video preview has a default aspect ratio that fits the iPhone's screen. If you try to scale it to a different ratio (such as half the screen) you will get a distorted or cropped image.
In order to achieve what you're trying to do, I think it would be easier to just cover up half of the screen with a UIView containing the controls you want (such as a take-a-photo button) and then crop the captured full-screen image to half its size with something like this:
CGRect clippedRect = CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height/2.0);
CGImageRef imageRef = CGImageCreateWithImageInRect([image CGImage], clippedRect);
UIImage *newImage = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef);
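One thing to watch out for: CGImageCreateWithImageInRect works in the image's pixel coordinates, while the rect above is in view points, so if the captured photo's resolution differs from the screen you may need to scale the rect first. A minimal sketch, assuming the photo spans the full screen width:
// Scale the crop rect from view points to image pixels before cropping.
CGFloat scale = (image.size.width * image.scale) / self.view.frame.size.width;
CGRect clippedRect = CGRectMake(0, 0,
                                self.view.frame.size.width * scale,
                                (self.view.frame.size.height / 2.0) * scale);
CGImageRef imageRef = CGImageCreateWithImageInRect([image CGImage], clippedRect);
UIImage *newImage = [UIImage imageWithCGImage:imageRef
                                        scale:image.scale
                                  orientation:image.imageOrientation];
CGImageRelease(imageRef);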
Related
I want to have a UIActionSheet with custom UIAlertActions, but it seems a custom UIAlertAction is not allowed.
The UIAlertController class is intended to be used as-is and does not support subclassing. The view hierarchy for this class is private and must not be modified.
So I'm trying to create a view that looks exactly like UIActionSheet. By using Debug View Hierarchy, I can see how Apple sets up the hierarchy, constraints, and colors for UIActionSheet, and reproduce exactly what they did.
I can create everything except UIActionSheet's background. It's a view that contains a non-transparent view and a UIVisualEffectView.
The UIVisualEffectView overlaps the non-transparent view, but somehow the UIVisualEffectView still works.
How can the UIVisualEffectView still work when there is a non-transparent view below it? If it's possible, how can I make something like this?
Note: the background of UIActionSheet is not just a UIVisualEffectView, so please don't give an answer that assumes it is.
The view below the UIVisualEffectView (the one with the red arrow in the question) contains a layer that has a CAFilter "overlayBlendMode" (a private class) set as its compositing filter. It's applied over the layer behind it. In the screenshot below I changed the background colour of the _UIDimmingKnockoutBackdropView to green, and the filter is applied over that.
Private method
To create the same effect, you need to put a white opaque view below the UIVisualEffectView and apply a CAFilter to it:
CAFilter *filter = [CAFilter filterWithName:@"overlayBlendMode"];
[[contentView layer] setCompositingFilter:filter];
Since CAFilter is private, we need a header for it:
#import <Foundation/Foundation.h>
@interface CAFilter : NSObject <NSCoding, NSCopying, NSMutableCopying> {
void * _attr;
void * _cache;
unsigned int _flags;
NSString * _name;
unsigned int _type;
}
@property BOOL cachesInputImage;
@property (getter=isEnabled) BOOL enabled;
@property (copy) NSString *name;
@property (readonly) NSString *type;
// Image: /System/Library/Frameworks/QuartzCore.framework/QuartzCore
+ (void)CAMLParserStartElement:(id)arg1;
+ (BOOL)automaticallyNotifiesObserversForKey:(id)arg1;
+ (id)filterTypes;
+ (id)filterWithName:(id)arg1;
+ (id)filterWithType:(id)arg1;
- (void)CAMLParser:(id)arg1 setValue:(id)arg2 forKey:(id)arg3;
- (id)CAMLTypeForKey:(id)arg1;
- (struct Object { int (**x1)(); struct Atomic { struct { int x_1_2_1; } x_2_1_1; } x2; }*)CA_copyRenderValue;
- (BOOL)cachesInputImage;
- (id)copyWithZone:(struct _NSZone { }*)arg1;
- (void)dealloc;
- (BOOL)enabled;
- (void)encodeWithCAMLWriter:(id)arg1;
- (void)encodeWithCoder:(id)arg1;
- (id)initWithCoder:(id)arg1;
- (id)initWithName:(id)arg1;
- (id)initWithType:(id)arg1;
- (BOOL)isEnabled;
- (id)mutableCopyWithZone:(struct _NSZone { }*)arg1;
- (id)name;
- (void)setCachesInputImage:(BOOL)arg1;
- (void)setDefaults;
- (void)setEnabled:(BOOL)arg1;
- (void)setName:(id)arg1;
- (void)setValue:(id)arg1 forKey:(id)arg2;
- (id)type;
- (id)valueForKey:(id)arg1;
// Image: /System/Library/PrivateFrameworks/PhotosUICore.framework/PhotosUICore
+ (id)px_filterWithPXCompositingFilterType:(int)arg1;
@end
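Putting the pieces together, the setup described above would look roughly like this; containerView and sheetFrame are hypothetical placeholders for your own action-sheet container and frame, and keep in mind CAFilter is private API:
// The opaque white view that sits below the blur.
UIView *contentView = [[UIView alloc] initWithFrame:sheetFrame];
contentView.backgroundColor = [UIColor whiteColor];
contentView.opaque = YES;
[containerView addSubview:contentView];
// The blur sits on top of the white view.
UIVisualEffectView *blurView = [[UIVisualEffectView alloc] initWithEffect:[UIBlurEffect effectWithStyle:UIBlurEffectStyleLight]];
blurView.frame = contentView.frame;
[containerView addSubview:blurView];
// Private API: composite the white view over whatever lies behind it.
CAFilter *filter = [CAFilter filterWithName:@"overlayBlendMode"];
[[contentView layer] setCompositingFilter:filter];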
The result:
The view I've created prints the same compositing filter as Apple used:
Workaround
As a workaround, you can create an image of the view in the background and an image of the white view, and apply the overlay blend mode to those. This gives a result comparable to using the private class.
Example:
UIImage *image = [self imageForView:contentView];
UIView *superview = [contentView superview];
[contentView setHidden:YES];
UIImage *backgroundImage = [self imageForView:superview];
CGImageRef imageRef = CGImageCreateWithImageInRect([backgroundImage CGImage], [contentView frame]);
backgroundImage = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef);
[contentView setHidden:NO];
CIFilter *filter = [CIFilter filterWithName:@"CIOverlayBlendMode"];
// -[UIImage CIImage] is nil for CGImage-backed images, so wrap the CGImages explicitly.
[filter setValue:[CIImage imageWithCGImage:backgroundImage.CGImage] forKey:kCIInputImageKey];
[filter setValue:[CIImage imageWithCGImage:image.CGImage] forKey:kCIInputBackgroundImageKey];
[contentView setBackgroundColor:[UIColor colorWithPatternImage:[UIImage imageWithCIImage:[filter outputImage]]]];
}
- (UIImage *)imageForView:(UIView *)view
{
UIGraphicsBeginImageContextWithOptions([view bounds].size, [view isOpaque], 1.0);
[[view layer] renderInContext:UIGraphicsGetCurrentContext()];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return image;
}
Result:
Please check my storyboard view setup and the result.
I used Apple's actual action sheet with the same image to compare against my result.
Storyboard set up
Result
Original apple's action sheet
And here is one approach to implementing a custom action sheet.
You can check Apple's docs on dynamic stack views: Dynamically Changing the Stack View's Content.
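For reference, dynamically changing a stack view's content boils down to adding and removing arranged subviews; a small sketch (self.stackView and the button title are made-up names):
// Add a new action-sheet-style button to the stack.
UIButton *deleteButton = [UIButton buttonWithType:UIButtonTypeSystem];
[deleteButton setTitle:@"Delete Photo" forState:UIControlStateNormal];
[self.stackView addArrangedSubview:deleteButton];
// Removing: take it out of the arrangement *and* the view hierarchy.
[self.stackView removeArrangedSubview:deleteButton];
[deleteButton removeFromSuperview];
// Hiding an arranged subview collapses the space it occupied, and can be animated.
[UIView animateWithDuration:0.25 animations:^{
    deleteButton.hidden = YES;
}];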
I have reviewed almost all the questions on SO related to a custom UINavigationBar, but I haven't found their solutions helpful. Please have a look at the following screenshot.
The red portion represents an icon (a small image) in the center of the navigation bar.
Please suggest a solution. Thanks in advance.
EDIT: I want to implement this for every UINavigationBar in the app, i.e. on each view.
NOTE: I do not know who is downvoting my question, but I want to ask those people: do you have a solution for my problem? If not, why are you downvoting? I won't be able to get help this way; that's just wrong.
You can subclass UINavigationBar.
#import "VSNavigationBar.h"
@interface VSNavigationBar ()
@property (nonatomic, weak) UIView *noseView;
@end
@implementation VSNavigationBar
-(void)layoutSubviews
{
[super layoutSubviews];
static CGFloat width = 80;
if (!_noseView) {
UIView *noseView = [[UIView alloc] initWithFrame:CGRectMake(self.bounds.size.width / 2 - width / 2, 0, width, width)];
self.noseView = noseView;
self.noseView.backgroundColor = self.barTintColor;
self.noseView.layer.cornerRadius = self.noseView.frame.size.width / 2;
[self addSubview:_noseView];
}
_noseView.frame = CGRectMake(self.bounds.size.width / 2 - width / 2, 0, width, width);
}
@end
And in your storyboard you would select the Navigation Controller scene, in the document outline on the left select the Navigation Bar, and in the Identity inspector on the right change the class to the subclass.
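If you create the navigation controller in code instead of in a storyboard, you can pass the subclass directly (rootViewController here is whatever controller you want at the root):
// Programmatic alternative to changing the class in Interface Builder.
UINavigationController *nav = [[UINavigationController alloc] initWithNavigationBarClass:[VSNavigationBar class] toolbarClass:nil];
[nav setViewControllers:@[rootViewController] animated:NO];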
You need to create an image that looks like the central part of the image you have provided and then do:
UIImage *image = [UIImage imageNamed: @"logo.png"];
UIImageView *imageview = [[UIImageView alloc] initWithImage: image];
self.navigationItem.titleView = imageview;
If you are trying to get the circular effect like the bottom of the image you provided, you need to have the same kind of image as above and set it as the background image of the navigation bar:
[navigationBar setBackgroundImage:[UIImage imageNamed: @"yourImage"] forBarMetrics:UIBarMetricsDefault];
I know there are lots of similar questions here, and I checked the famous one and then grasped the difference between bounds and frame.
Now I have a problem related to them. I played around with them, but the result wasn't what I expected.
What I don't understand here is:
Why is the frame's Y origin 44.000000 below the top even though I set the UIImageView at the top-left corner?
Because bounds should be
"The bounds of an UIView is the rectangle, expressed as a location (x,y) and size (width,height) relative to its own coordinate system (0,0)." (Cocoa: What's the difference between the frame and the bounds?)
I thought the frame should also start at the top-left corner here.
#import "ViewController.h"
@interface ViewController ()
//@property (weak, nonatomic) IBOutlet UIImageView *image;
@property (weak, nonatomic) IBOutlet UIImageView *image2;
@end
@implementation ViewController
- (void)viewDidLoad
{
[super viewDidLoad];
// Do any additional setup after loading the view, typically from a nib.
//UIImage *imageView = [UIImage imageNamed:@"stanford"];
// self.image.alpha = 1;
// self.image.contentMode = 1;
// self.image.image = imageView;
// [self.image setAlpha:0.1];
// [self.image setContentMode:UIViewContentModeCenter];
// [self.image setImage:imageView];
NSURL *url = [[NSURL alloc] initWithString:@"http://upload.wikimedia.org/wikipedia/commons/thumb/e/ed/Pitbull_2%2C_2012.jpg/472px-Pitbull_2%2C_2012.jpg"];
NSData *data = [NSData dataWithContentsOfURL:url];
UIImage *imageData = [UIImage imageWithData:data];
[self.image2 setBounds:CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height)];
[self.image2 setImage:imageData];
NSLog(#"frame.orign.x:%f,frame.origin.y:%f",self.image2.frame.origin.x,self.image2.frame.origin.y);
}
The magic number 44 actually comes from the navigation bar :) whereas the height of the status bar is 20 points.
If you want your image to cover the entire screen then you will need to either get rid of your status bar, or make your status bar translucent so that content can be displayed underneath.
If you don't set your status bar/navigation bar to translucent, then the point of origin 0,0 would start just underneath the bars as you are experiencing now.
The status bar is set using:
[[UIApplication sharedApplication] setStatusBarStyle: UIStatusBarStyleBlackTranslucent];
The navigation bar, if you have one displayed, can be set using this:
theNavigationController.navigationBar.barStyle = UIBarStyleBlackTranslucent;
Only then will your following code display your content across the full screen
[self.image2 setFrame:CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height)];
This is because setting the bars to translucent will have them displayed as overlays instead.
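On iOS 7 and later you can also control this per view controller instead of (or in addition to) making the bars translucent, for example:
// iOS 7+: lay this view controller's content out under the bars.
self.edgesForExtendedLayout = UIRectEdgeAll;
self.extendedLayoutIncludesOpaqueBars = YES; // extend even under opaque bars
self.automaticallyAdjustsScrollViewInsets = NO;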
The 44 comes from the nav bar, which takes up that much height.
I am applying blur to various sections of my app using the UIImage+ImageEffects.h sample code that was provided in one of the apps from the WWDC in 2013. It works well and I'm able to recreate the iOS7 effects for any UIImage.
However, I would like to recreate the effect of the UINavigationBar blurred transparency but using any view that I choose. Similar to the screenshot shown below.
For instance, say that I have a UITableView that takes up half of the screen. I also have a UIImageView background as a separate view behind it that occupies the entire screen. I would only like to blur the UIImageView background for just that section of the screen that's under the tableview.
Here's my question: how do I create a UIImage by taking a "screenshot" of whatever is behind a UIView that is displayed? Is this even possible?
Here is my screen hierarchy. Nothing complex. I would like the "Blurred Image View" to contain a blurred image of the section of the "Image View" that is sitting as the main UIImageView in the hierarchy.
If you are deploying only on iOS 7 you can use the new APIs, which are a lot faster than -renderInContext:, and then use the ImageEffects category on the image taken from the view. Add this as a category on UIView:
@interface UIView (RenderView)
- (UIImage *) imageByRenderingView;
- (UIImage *) imageByRenderingViewOpaque:(BOOL) yesOrNO;
@end
@implementation UIView (RenderView)
- (UIImage *) imageByRenderingViewOpaque:(BOOL) yesOrNO {
UIGraphicsBeginImageContextWithOptions(self.bounds.size, yesOrNO, 0);
if ([self respondsToSelector:@selector(drawViewHierarchyInRect:afterScreenUpdates:)]) {
[self drawViewHierarchyInRect:self.bounds afterScreenUpdates:NO];
}
else {
[self.layer renderInContext:UIGraphicsGetCurrentContext()];
}
UIImage *resultingImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return resultingImage;
}
- (UIImage *) imageByRenderingView{
return [self imageByRenderingViewOpaque:NO];
}
@end
This snippet is a UIView category that also works on systems lower than iOS 7. It takes an image of a view and its subviews.
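Usage together with the UIImage+ImageEffects category mentioned in the question might then look roughly like this; backgroundImageView, blurredImageView and applyLightEffect (from Apple's WWDC 2013 sample) are assumed to exist in your project:
// Snapshot the background, crop the region behind the front view, blur it, and display it.
UIImage *snapshot = [self.backgroundImageView imageByRenderingView];
// Convert the front view's bounds into the background view's coordinates,
// then from points to pixels for CGImageCreateWithImageInRect.
CGRect region = [self.blurredImageView convertRect:self.blurredImageView.bounds toView:self.backgroundImageView];
CGFloat scale = snapshot.scale;
CGRect pixelRegion = CGRectMake(region.origin.x * scale,
                                region.origin.y * scale,
                                region.size.width * scale,
                                region.size.height * scale);
CGImageRef croppedRef = CGImageCreateWithImageInRect(snapshot.CGImage, pixelRegion);
UIImage *cropped = [UIImage imageWithCGImage:croppedRef scale:scale orientation:UIImageOrientationUp];
CGImageRelease(croppedRef);
// applyLightEffect comes from the UIImage+ImageEffects WWDC sample.
self.blurredImageView.image = [cropped applyLightEffect];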
Use the below piece of code for iOS < 7
#import <QuartzCore/QuartzCore.h>
+ (UIImage *) imageWithView:(UIView *)view
{
UIGraphicsBeginImageContextWithOptions(view.bounds.size, view.opaque, 0.0);
[view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage * img = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return img;
}
I am playing around with Xcode 5 and storyboarding. I'm at the point where I have captured a UIImage using the camera, but I can't figure out how to display the image on the phone after I've captured it. I have specified an IBOutlet UIImageView and set its value to the image that I captured from the camera, but that doesn't seem to do anything at all.
This is my interface:
@interface HCViewController ()
@property (nonatomic, retain) IBOutlet UIImageView *image;
@end
And this is my didFinishPickingMediaWithInfo: method:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
// Get the picture from the camera
UIImage *imageFromCamera = [info objectForKey:UIImagePickerControllerEditedImage];
// Set the IBOutlet UIImageView to the picture from the camera
// This does nothing as far as I can tell
self.image = [[UIImageView alloc] initWithImage:imageFromCamera];
// Let's take a closer look at this
NSLog(#"%#", self.image);
// Close the camera view controller
[self dismissViewControllerAnimated:YES completion:nil];
}
This is what I see in the logger for self.image:
2013-09-10 17:17:45.169 YourMom[6136:60b] <UIImageView: 0x1556fdc0; frame = (0 0; 0 0); userInteractionEnabled = NO; layer = <CALayer: 0x155d5b10>>
My storyboard has a View Controller with two different scenes that can be swiped back and forth. Both of the scenes have a "View" with an "Image View" as a sub-item. The Image Views each have "Referencing Outlets" that seem to be connected to the image variable that I defined in my interface. However, simply setting the value of image doesn't change the phone display.
After reading this SO question: How can I change the image displayed in an UIImageView programmatically? I tried [self.image setImage:image], but that didn't appear to do anything either. How do I tell Xcode that I want image to show up in the view?
Your mistake is in this:
self.image = [[UIImageView alloc] initWithImage:imageFromCamera];
Because you are using a storyboard, your UIImageView will be initialized for you. By executing this line of code you are throwing away the old UIImageView and replacing it with a new UIImageView that has no frame.
You just need to do this:
[self.image setImage:imageFromCamera];
Also, you might be getting the wrong image from info. Try UIImagePickerControllerOriginalImage instead of UIImagePickerControllerEditedImage.
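A small, defensive way to handle both cases is to prefer the edited image and fall back to the original:
// Use the edited image when the picker provides one (requires allowsEditing),
// otherwise fall back to the original capture.
UIImage *picked = info[UIImagePickerControllerEditedImage] ?: info[UIImagePickerControllerOriginalImage];
[self.image setImage:picked];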
Hope this is helpful, cheers!
UIImageView has a property called image.
This property is of type UIImage.
So if you have a UIImage, you can set the property as follows:
imageView.image = image;
// this is your delegate method to set the image taken from your camera
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
// Get the picture from the camera
UIImage *imageFromCamera = [info objectForKey:UIImagePickerControllerEditedImage];
// alloc an image view, as you have to set the image on it
UIImageView *imgView = [[UIImageView alloc] init];
imgView.image = imageFromCamera;
// you can also set the image on a button like this
// (imgBtn is a UIButton property)
[self.imgBtn setImage:imageFromCamera forState:UIControlStateNormal];
// Let's take a closer look at this
NSLog(#"%#", self.image);
// Close the camera view controller
[self dismissViewControllerAnimated:YES completion:nil];
}
You can try this code to display your image.
imageView.image = UIImage(named: "YourImageName")
Hope this can help you with your problems.