iOS Mapbox MGLMapView: fit an image to an MGLMapView

I used a UIImage to initialize an MGLImageSource, then used the MGLImageSource to initialize an MGLRasterStyleLayer and added the layer to an MGLMapView. It turned out that the image is too big to fit entirely into the map view. How do I tell MGLMapView to scale the image automatically so that it fits entirely into the view, like 'scale to fit' in iOS? Thanks!
Here is how I add the image into the map view:
UIImage *radarImage = [UIImage imageNamed:@"radar.png"];
MGLImageSource *source = [[MGLImageSource alloc] initWithIdentifier:@"radar" coordinateQuad:coordinates image:radarImage];
[style addSource:source];
MGLRasterStyleLayer *radarLayer = [[MGLRasterStyleLayer alloc] initWithIdentifier:@"radar-layer" source:source];
[style addLayer:radarLayer];

Set the map view's visibleCoordinateBounds property; the bounds value has two members, sw and ne, which stand for the southwest (bottom-left) and northeast (top-right) corners of the visible region. Setting it to the bounds of the image's coordinate quad makes the map zoom so that the image fits entirely inside the view.
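For reference, a minimal sketch of that idea, assuming `coordinates` is the same MGLCoordinateQuad passed to the image source (and that the quad is not rotated) and `mapView` is your MGLMapView:
MGLCoordinateBounds bounds = MGLCoordinateBoundsMake(coordinates.bottomLeft, coordinates.topRight); // southwest and northeast corners of the image
[mapView setVisibleCoordinateBounds:bounds animated:YES]; // zooms/pans the map so the whole image is visible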

Related

Why does UIImageView always show the image stretched on iOS?

I am using an image view that displays different images, but some of them appear stretched. How do I display the images without any stretching? I have set the contentMode to aspect fit, but the images are not the same size as the image view and they come in different sizes. I want to show the images at the same size without stretching.
imageview1 = [[UIImageView alloc] initWithFrame:CGRectMake(30, 10, 190, 190)];
imageview1.contentMode = UIViewContentModeScaleAspectFit;
imageview1.clipsToBounds = YES;
Try setting the content mode before setting the frame, like below:
imageview1 = [[UIImageView alloc] init];
imageview1.contentMode = UIViewContentModeScaleAspectFit;
imageview1.frame = CGRectMake(30, 10, 190, 190);
imageview1.clipsToBounds = YES;
If you want to display each image without stretching and at the same size, then set the contentMode property of the UIImageView instance to UIViewContentModeScaleAspectFill. For instance:
imageview1 = [[UIImageView alloc] initWithFrame:CGRectMake(30, 10, 190, 190)];
imageview1.contentMode = UIViewContentModeScaleAspectFill;
imageview1.clipsToBounds = YES;
UIViewContentModeScaleAspectFill fills the entire area of the image view while keeping the aspect ratio intact. In filling the entire image view, one dimension (vertical or horizontal) is covered first, and the scaling continues until the other dimension is fully filled as well. In the process, some of your content (vertical or horizontal) will end up outside the frame. To clip this extra content, we set the image view's clipsToBounds property to YES.
UIViewContentModeScaleAspectFit scales the image until one dimension (vertical or horizontal) fills the image view, keeping the aspect ratio intact. It is useful if you do not need every image to appear at the same size, because the dimension that is not fully covered will show blank space.
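As a quick side-by-side sketch of the two modes (the frames are just illustrative):
UIImageView *fillView = [[UIImageView alloc] initWithFrame:CGRectMake(30, 10, 190, 190)];
fillView.contentMode = UIViewContentModeScaleAspectFill; // fills the whole frame, may crop one dimension
fillView.clipsToBounds = YES;                            // hides whatever spills outside the frame
UIImageView *fitView = [[UIImageView alloc] initWithFrame:CGRectMake(30, 210, 190, 190)];
fitView.contentMode = UIViewContentModeScaleAspectFit;   // shows the whole image, may leave blank bars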

iOS: understanding frame and views

I am working on an application for iOS based on a ViewController. I am doing everything programmatically because I want to understand the underlying concepts.
I have created a subclass of UIImageView and initialized it using an image. In the initialization method I also added a second UIImageView, as I would like to handle the two differently but have them be part of the same object. Ultimately I would like to be able to scale the object (and hence the two images) according to the device screen resolution (e.g. if the resolution is low, I will scale the two images by 50%). I want to do this because I would like to implement a zoom in/zoom out feature as well as support multiple resolutions and screen layouts.
Additional information:
The two images have different sizes: 500x500 pixels and 350x350 pixels.
My questions are:
how do I position the second image exactly in the center of the first? (I used the center property of the main image view, but I think I got it wrong. I thought center was the exact center of the square, but either I am using it incorrectly or there is something I am missing.)
are there any negative side effects of this approach (a UIView subclass containing an additional UIView)? (E.g. is it going to create confusion when applying transformations? Does it reduce rendering speed? Or, more simply, is it a bad design pattern?)
I find it difficult to understand the positioning of the second image. See the code snippet below; this is what I use:
CGRect innerButtonFrame = CGRectMake(self.center.x/2, self.center.y/2,innerButtonSelectedImage.size.width,innerButtonSelectedImage.size.height);
Taken from:
- (id)initWithImage:(UIImage *)image
{
    if (self = [super initWithImage:image]) {
        //
        self.userInteractionEnabled = true;
        // Initialize gesture recognizers
        UITapGestureRecognizer *tapInView = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tapInImageView:)];
        [self addGestureRecognizer:tapInView];
        UILongPressGestureRecognizer *longPress = [[UILongPressGestureRecognizer alloc] initWithTarget:self action:@selector(longPressInView:)];
        [self addGestureRecognizer:longPress];
        // Initialize labels
        ..
        // Inner circle image
        innerButtonView = [[UIImageView alloc] init];
        innerButtonSelectedImage = [UIImage imageNamed:@"inner circle.png"];
        CGRect innerButtonFrame = CGRectMake(self.center.x/2, self.center.y/2, innerButtonSelectedImage.size.width, innerButtonSelectedImage.size.height);
        innerButtonView.frame = innerButtonFrame;
        [innerButtonView setImage:innerButtonSelectedImage];
        // Add additional ui components to view
        [self addSubview:innerButtonView];
        ..
        [self addSubview:descriptionLabel];
    }
    return self;
}
EDIT: This is how it looks if I change the positioning code to the following:
CGRect innerButtonFrame = CGRectMake(0, 0,innerButtonSelectedImage.size.width,innerButtonSelectedImage.size.height);
innerButtonView.frame = innerButtonFrame;
I also don't understand why the image is bigger than the screen, as the blue one should be 500x500 pixels and the iPhone 6 screen is 1334 x 750.
How about:
CGRect innerButtonFrame = CGRectMake(0, 0, innerButtonSelectedImage.size.width, innerButtonSelectedImage.size.height);
innerButtonView.frame = innerButtonFrame;
// A CGRect has no center field; center the subview using the parent's own coordinate space (bounds), not self.center
innerButtonView.center = CGPointMake(CGRectGetMidX(self.bounds), CGRectGetMidY(self.bounds));
If you need a 500x500 circle, then use half that size for the view: replace 500x500 with 250x250, and for the small circle replace 350x350 with 175x175. View frames are measured in points, and on a 2x Retina screen like the iPhone 6 one point is two pixels, which is why the full-size frames overflow the screen. I hope that solves your problem. Enjoy, and thanks.
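A sketch of the arithmetic behind that suggestion, assuming a 2x Retina device such as the iPhone 6:
CGFloat scale = [UIScreen mainScreen].scale;   // 2.0 on an iPhone 6
CGFloat pixelWidth = 500.0;                    // image width in pixels
CGFloat pointWidth = pixelWidth / scale;       // 250 points on screen
CGRect outerCircleFrame = CGRectMake(0, 0, pointWidth, pointWidth); // use a 250x250-point frame for the 500x500-pixel image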

How to Clone A UIImageView

I have a UIImageView and I want to make a copy of it and place it somewhere on the screen. How do I do this?
I currently only know how to copy and paste the image manually and make a separate IBOutlet for each one, but this is very inefficient because I want to make a game that generates obstacles (UIImageViews) forever, so I can't do it the manual way.
You want to make sure you match all of the properties up as well, like size, clipping, image aspect, opacity, etc.
CGPoint locationOfCloneImageView = CGPointMake(0, 0);//x and y coordinates of where you want your image (more specifically, the x and y coordinates of where you want the CENTER of your image to be)
UIImageView *cloneImageView = [[UIImageView alloc] initWithImage:originalImageView.image];
cloneImageView.frame = CGRectMake(0, 0, originalImageView.frame.size.width, originalImageView.frame.size.height);//same size as old image view
cloneImageView.alpha = originalImageView.alpha;//same view opacity
cloneImageView.layer.opacity = originalImageView.layer.opacity;//same layer opacity
cloneImageView.clipsToBounds = originalImageView.clipsToBounds;//same clipping settings
cloneImageView.backgroundColor = originalImageView.backgroundColor;//same BG color
cloneImageView.tintColor = originalImageView.tintColor;//matches tint color.
cloneImageView.contentMode = originalImageView.contentMode;//matches up things like aspectFill and stuff.
cloneImageView.highlighted = originalImageView.highlighted;//matches whether it's highlighted or not
cloneImageView.opaque = originalImageView.opaque;//matches can-be-opaque BOOL
cloneImageView.userInteractionEnabled = originalImageView.userInteractionEnabled;//touches are detected or not
cloneImageView.multipleTouchEnabled = originalImageView.multipleTouchEnabled;//multi-touches are detected or not
cloneImageView.autoresizesSubviews = originalImageView.autoresizesSubviews;//matches whether or not subviews resize upon bounds change of image view.
//cloneImageView.hidden = originalImageView.hidden;//commented out because you probably never need this one haha... But if the first one is hidden, so is this clone (if uncommented)
cloneImageView.layer.zPosition = originalImageView.layer.zPosition+1;//places it above other views in the parent view and above the original image. You can also just use `insertSubview: aboveSubview:` in code below to achieve this.
[originalImageView.superview addSubview:cloneImageView];//adds this image view to the same parent view that the other image view is in.
cloneImageView.center = locationOfCloneImageView;//set at start of code.
You will need to create one new UIImageView with a new frame wherever you want to place it, set its image property to your existing image view's image, and then add it to your view.
UIImageView *newImageView = [[UIImageView alloc] initWithFrame:CGRectMake(0,0,50,50)];
newImageView.image = oldImageView.image;
[self.view addSubview:newImageView];
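For the game use case in the question, the same idea can be wrapped in a small spawn method and called repeatedly (for example from a timer). Here `obstacleTemplate` is a hypothetical property holding the original image view, and the frame values are illustrative:
- (void)spawnObstacle {
    // Create a fresh image view that shares the template's UIImage and relevant settings
    UIImageView *obstacle = [[UIImageView alloc] initWithImage:self.obstacleTemplate.image];
    obstacle.contentMode = self.obstacleTemplate.contentMode;
    obstacle.frame = CGRectMake(arc4random_uniform(320), -50, 50, 50); // start above the screen at a random x
    [self.view addSubview:obstacle];
}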

Need a very tiny (rectangular in shape) overlay over UIImagePickerController, and then crop the image accordingly - UPDATED

In my application, I need the user to take a snap of just a 10-letter word (using an overlay, which should be right in the centre of the UIImagePicker screen), and then I need to show him that image (only the part of the image covered by the rectangle). So, I need to crop the image according to the overlay.
Here, I have taken a picture using UIImagePickerController. Now, I want to see the dimensions of the image that I have taken.
UIImage *imageToprocess = [info objectForKey:UIImagePickerControllerOriginalImage];
NSLog(#"image width %f", imageToprocess.size.width);
NSLog(#"image height %f", imageToprocess.size.height);
I see the following result on the console. But how is this possible? The dimensions of the image exceed the dimensions of the iPhone screen (which is 320 x 568).
UsingTesseractOCR[524:60b] image width 2448.000000
2013-12-17 16:02:18.962 UsingTesseractOCR[524:60b] image height 3264.000000
Can anybody help me out here? I have gone through several questions here, but did not understand how to do it. Please help.
Refer to this sample code for image capturing and cropping:
https://github.com/kishikawakatsumi/CropImageSample
For creating the overlay, first create a custom view (with the full dimensions of the camera preview) and give it a transparent background image with just a rectangle drawn on it. Use this view as the overlay view.
myview = [[UIImageView alloc] init];
// why 431? because height = device height - height of the tab bar at the bottom for the picker's camera controls
// for iPhone 4: 480 - 49
myview.frame = CGRectMake(0, 0, 320, 431);
myview.backgroundColor = [UIColor clearColor];
myview.opaque = NO;
myview.image = [UIImage imageNamed:@"A45Box.png"];
myview.userInteractionEnabled = YES;
Note that you should create the background image with appropriate dimensions. You could also draw the rectangle programmatically, but this is the easier way.
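If you would rather draw the rectangle programmatically instead of using a background image, here is a minimal sketch (the frame values are illustrative, and `picker` is assumed to be your UIImagePickerController):
// #import <QuartzCore/QuartzCore.h> may be needed on older SDKs for the layer properties
UIView *overlay = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 320, 431)];
overlay.backgroundColor = [UIColor clearColor];
UIView *captureBox = [[UIView alloc] initWithFrame:CGRectMake(40, 180, 240, 70)]; // the region the user should aim the word at
captureBox.backgroundColor = [UIColor clearColor];
captureBox.layer.borderColor = [UIColor whiteColor].CGColor;
captureBox.layer.borderWidth = 2.0;
[overlay addSubview:captureBox];
picker.cameraOverlayView = overlay;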
Secondly, regarding your cropping issue, you will have to get your hands dirty. Try these links for help:
https://github.com/iosdeveloper/ImageCropper
https://github.com/barrettj/BJImageCropper
https://github.com/ardalahmet/SSPhotoCropperViewController
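As a starting point for the cropping itself, a minimal sketch using Core Graphics; the crop rect is illustrative and must be given in the photo's pixel coordinates (note that camera photos also carry an orientation, which may require adjusting the rect):
CGRect cropRect = CGRectMake(700, 1400, 1000, 300); // region of the 2448x3264 photo to keep
CGImageRef croppedRef = CGImageCreateWithImageInRect(imageToprocess.CGImage, cropRect);
UIImage *croppedImage = [UIImage imageWithCGImage:croppedRef scale:imageToprocess.scale orientation:imageToprocess.imageOrientation];
CGImageRelease(croppedRef);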

Instantiate an image to appear at several different points on the screen?

I have an image called Empty.png; it is a smallish square tile. How could I instantiate the image so that it appears at several different points on the screen?
Thank you for any help in advance :)
You can place UIImageViews wherever you want the image to appear, and then set the image property of each image view to this image (a UIImage object).
If you are using Interface Builder, then you just have to type the name of the file into the attributes inspector of the image view.
Or you could do this:
UIImage *img = [UIImage imageNamed:@"Empty.png"];
imageView.image = img; //Assuming this is your outlet to the image view.
It depends on how you want to use it.
If you just draw it with Core Graphics, let's say in drawInRect: or so, then you simply draw it several times.
If you want to display it within one or more image views, then instantiate your UIImageViews and assign the same image object to all of them, or let Interface Builder do the instantiation for you. But you cannot add a single UIView object several times to a view hierarchy: if you add a UIView as a subview of a view, it will disappear from the position where it was before.
10 UIImageViews may share the same UIImage, but you need 10 UIImageViews to display all 10 of them.
The same applies to UIButtons and every UI-thing that has an image or background image.
This will get you one image into some view:
CGPoint position = CGPointMake(x, y);
UIImageView *img = [[UIImageView alloc] init];
img.image = [UIImage imageNamed:@"Empty.png"];
img.frame = CGRectMake(position.x, position.y, img.image.size.width, img.image.size.height);
[someUIView addSubview:img];
If you make an array of the positions (x, y) for all the images, you can run this in a for loop and it will place the images into the view at the positions you want.
Note: a CGPoint can't be stored directly in an NSArray since it is not an NSObject; either use a C/C++ array or wrap each point in an NSValue (valueWithCGPoint:), as in the sketch below.
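Putting that together, a minimal sketch that wraps the points in NSValue objects and places one tile per point (the positions are illustrative, and someUIView is the containing view from the snippet above):
NSArray *positions = @[[NSValue valueWithCGPoint:CGPointMake(20, 40)],
                       [NSValue valueWithCGPoint:CGPointMake(90, 40)],
                       [NSValue valueWithCGPoint:CGPointMake(160, 40)]];
UIImage *tileImage = [UIImage imageNamed:@"Empty.png"];
for (NSValue *value in positions) {
    CGPoint position = [value CGPointValue];
    UIImageView *tile = [[UIImageView alloc] initWithImage:tileImage]; // each view shares the same UIImage
    tile.frame = CGRectMake(position.x, position.y, tileImage.size.width, tileImage.size.height);
    [someUIView addSubview:tile];
}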
