I want to show download progress as an image that gradually fills in. A simple example: when you install an app from the App Store, the app icon fills in as the download completes. I know the total number of bytes to be downloaded, and I want the icon image to fill in according to the bytes completed so far. Has anybody done that kind of progress view? Please help.
You need a sequence of images, from an empty image through partially filled images up to a fully filled one. Animating them makes it look like the image is filling up.
Sample code would look like this:
UIImageView *animationView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 320, 460)]; // specify your size here
animationView.contentMode = UIViewContentModeCenter;
animationView.animationImages = [NSArray arrayWithObjects:
                                 [UIImage imageNamed:@"1.png"],
                                 [UIImage imageNamed:@"2.png"],
                                 [UIImage imageNamed:@"3.png"],
                                 [UIImage imageNamed:@"4.png"],
                                 [UIImage imageNamed:@"5.png"],
                                 [UIImage imageNamed:@"6.png"],
                                 [UIImage imageNamed:@"7.png"],
                                 [UIImage imageNamed:@"8.png"],
                                 [UIImage imageNamed:@"9.png"],
                                 [UIImage imageNamed:@"10.png"], nil];
animationView.animationDuration = 1.5f;
animationView.animationRepeatCount = 0;
[animationView startAnimating];
[self.view addSubview:animationView];
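If you want the frames to track the actual download instead of looping on a timer, you can set the image view's image directly from the progress fraction. This is a sketch, assuming an NSURLSession download task, a ten-frame sequence named 1.png through 10.png, and an `animationView` property:

```objectivec
// NSURLSessionDownloadDelegate callback: pick the frame that matches
// the fraction of bytes downloaded so far.
- (void)URLSession:(NSURLSession *)session
      downloadTask:(NSURLSessionDownloadTask *)downloadTask
      didWriteData:(int64_t)bytesWritten
 totalBytesWritten:(int64_t)totalBytesWritten
totalBytesExpectedToWrite:(int64_t)totalBytesExpectedToWrite
{
    CGFloat progress = (CGFloat)totalBytesWritten / (CGFloat)totalBytesExpectedToWrite;
    // Map 0.0–1.0 onto frames 1.png … 10.png.
    NSInteger frame = MAX(1, MIN(10, (NSInteger)ceil(progress * 10.0)));
    dispatch_async(dispatch_get_main_queue(), ^{
        self.animationView.image = [UIImage imageNamed:
            [NSString stringWithFormat:@"%ld.png", (long)frame]];
    });
}
```

With this approach you would not call startAnimating at all; the image only advances when bytes actually arrive.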
If you're not downloading the image itself, then why not use a CAShapeLayer as a mask over your image? In other words, your image is there the whole time, but more of it becomes visible as the mask changes.
If your mask is a single slice of a circle, you could use the NSURLSession/NSURLConnection delegate callbacks to apply a CATransform around the centre point of the circle and recalculate the Bézier path of the mask. Apply the new mask before the delegate returns, and you end up with a circular progress view where more of your image becomes visible with each call to the delegate.
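A minimal sketch of that mask idea, rebuilding the slice from the progress fraction on each callback rather than transforming it. The `imageView` and `maskLayer` property names are assumptions:

```objectivec
// Setup, e.g. in viewDidLoad:
//   self.maskLayer = [CAShapeLayer layer];
//   self.imageView.layer.mask = self.maskLayer;

// Call from the download delegate with progress in 0.0–1.0.
- (void)updateProgress:(CGFloat)progress
{
    CGPoint center = CGPointMake(CGRectGetMidX(self.imageView.bounds),
                                 CGRectGetMidY(self.imageView.bounds));
    CGFloat radius = hypot(center.x, center.y); // long enough to reach the corners

    UIBezierPath *path = [UIBezierPath bezierPath];
    [path moveToPoint:center];
    [path addArcWithCenter:center
                    radius:radius
                startAngle:-M_PI_2                          // start at 12 o'clock
                  endAngle:-M_PI_2 + progress * 2.0 * M_PI
                 clockwise:YES];
    [path closePath];

    self.maskLayer.path = path.CGPath; // painted mask area = visible image
}
```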
In my application I want to repeat an image across an image view, but when I repeat it, the image displays as many small tiles. How can I resolve this?
I tried the code below:
self.bgimg.backgroundColor = [UIColor colorWithPatternImage:[UIImage imageNamed:@"pattern-dots"]];
That happens because your image is smaller than the view, so it is tiled as a pattern. Provide an image at least as large as self.view.
You can check with the code fragment below:
UIImage *image = [UIImage imageNamed:@"pattern-dots"]; // or @"oval"
if (self.view.frame.size.width <= image.size.width) {
    self.view.backgroundColor = [UIColor colorWithPatternImage:image];
} else {
    NSLog(@"Your image is too small to fit.");
}
NSLog(@"view size: %@", NSStringFromCGSize(self.view.frame.size));
NSLog(@"image size: %@", NSStringFromCGSize(image.size));
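If you can't supply a larger asset, one option (an assumption on my part, not from the answer above) is to redraw the image at the view's size before using it as a pattern, so a single stretched copy fills the background instead of tiles:

```objectivec
// Redraw the image at the view's size, then use the result as the pattern.
UIImage *image = [UIImage imageNamed:@"pattern-dots"];
CGSize target = self.view.bounds.size;

UIGraphicsBeginImageContextWithOptions(target, NO, 0.0); // 0.0 = screen scale
[image drawInRect:CGRectMake(0, 0, target.width, target.height)];
UIImage *scaled = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

self.view.backgroundColor = [UIColor colorWithPatternImage:scaled];
```

Note that stretching a small bitmap this way can look blurry; if the image really is a repeating dot pattern, tiling may actually be the intended behaviour.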
I am using AV Foundation and have created a main layer and a sub layer. The main layer displays a live "preview" of what the camera sees before the user takes a photo. After the user takes a photo, I want to set the value of the sublayer's contents property to the captured photo. Everything works perfectly, except for setting the contents of the sublayer.
And I know the sublayer is working because I am able to give it a background color of blue and when I take a photo in the app it will successfully turn the sublayer blue.
Here is my code where I am trying to set the sublayer to be the captured image:
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
UIImage *image = [[UIImage alloc]initWithData:imageData];
CALayer *subLayer = [CALayer layer];
subLayer.contents = (id)[UIImage imageWithData:imageData];
subLayer.frame = _previewLayer.frame;
[_previewLayer addSublayer:subLayer];
I have tried several different ways of setting the sublayer's contents property like these, but none of them work:
subLayer.contents = (id) [UIImage imageWithData:imageData];
subLayer.contents = [UIImage imageWithData:imageData];
subLayer.contents = image;
Also, I know the sublayer is setup properly because if I add this statement it will turn the sublayer completely blue when I take a photo:
subLayer.backgroundColor = [UIColor blueColor].CGColor;
Any ideas how I can update the sublayer and make it display the photo that is being captured?
On a whim I tried adding .CGImage to the imageWithData: call, and it is now working. I sure wish the Xcode documentation had listed that as a requirement.
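For reference, the assignment that ended up working would look like this (the __bridge cast keeps ARC happy when assigning a CGImageRef to the id-typed contents property):

```objectivec
// CALayer.contents expects a CGImageRef, not a UIImage.
subLayer.contents = (__bridge id)[UIImage imageWithData:imageData].CGImage;
```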
I'm trying to develop a simple app that shows a stack of images: each image on top of the other. And when you swipe or click a button, the first image will disappear and the image below it will have the same size as the one that was on the top (as in the image below, but without the rotation effect)
I've tried to fetch the images and create the frame of the images at each for loop iteration:
UIImageView *picture = [[UIImageView alloc] initWithFrame:CGRectMake(40 + 10 * i, 101 - 10 * i, 240 - 20 * i, 200)];
UIImage *image = [UIImage imageWithData:imageData];
picture.image = image;
[dView addSubview:picture];
and it worked. But I still can't find how to make the second image in the same size as the one that was on top of it.
CGRect nextImageFrame = nextImage.frame;
nextImageFrame.size.width = 240;
nextImageFrame.size.height = 200;
nextImage.frame = nextImageFrame;
This assumes that nextImage is the image view you want to show next.
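Putting the two pieces together, advancing the stack on a swipe could be sketched like this, assuming the image views created in the loop are also kept in a mutable array (`stackedViews`, topmost last; all names are assumptions):

```objectivec
// Remove the top image view and promote the next one to the top slot.
- (void)showNextImage
{
    UIImageView *top = [self.stackedViews lastObject];
    [top removeFromSuperview];
    [self.stackedViews removeLastObject];

    UIImageView *next = [self.stackedViews lastObject];
    CGRect frame = next.frame;
    frame.size.width = 240;                 // the size the removed top view had
    frame.size.height = 200;
    frame.origin = CGPointMake(40, 101);    // the top view's origin from the loop
    next.frame = frame;
}
```

Wrapping the frame change in a UIView animation block would make the promotion look smoother.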
Can anyone help me understand how I apply a background image object to a UIView please?
I have created a background image which is a blurred version of a background and I would like to apply it to be the background of a uiView in the foreground which would ideally mask the background image.
I have the following code so far -
_blurImage = [source stackBlur:50];
[_HPBlurView.backgroundColor = [UIColor colorWithPatternImage:[_blurImage]]];
I would like to apply the image object(_blurImage) to be the background image of _hpBlurView but i'm struggling to get it working!
At first glance, you are using too many brackets. Here is a working version of your code:
_blurImage = [source stackBlur:50];
_HPBlurView.backgroundColor = [UIColor colorWithPatternImage:_blurImage];
I can't see what stackBlur:50 returns, so start from the beginning. colorWithPatternImage: takes a UIImage as a parameter, so start by adding a picture, any picture, to your application. Let's imagine the image is called image.png. This is one way to do it:
UIImage *image = [UIImage imageNamed:@"image.png"];
_HPBlurView.backgroundColor = [UIColor colorWithPatternImage:image];
This should help to get you going.
Create an image and add it to the background:
UIImage *image = [UIImage imageNamed:@"yourImage"];
self.view.backgroundColor = [UIColor colorWithPatternImage:image];
That's it.
To make sure everything resizes properly regardless of rotation, device size, and iOS version, I just set up a UIImageView:
// Create the UIImageView
UIImageView *backgroundImageView = [[UIImageView alloc] initWithFrame:self.view.frame]; // or in your case use your _blurView
backgroundImageView.image = [UIImage imageNamed:@"image.png"];
// Add it as a subview
[self.view addSubview:backgroundImageView]; // in your case, again, use _blurView
// Just in case
[self.view sendSubviewToBack:backgroundImageView];
I have a UIImageView that can be moved/scaled (self.imageForEditing). On top of this image view I have an overlay with a hole cut out, which is static and can't be moved. I need to save just the part of the underlying image that is visible through the hole at the time a button is pressed. My current attempt:
- (IBAction)saveImage
{
UIImage *image = self.imageForEditing.image;
CGImageRef originalMask = [UIImage imageNamed:@"picOverlay"].CGImage;
CGImageRef mask = CGImageMaskCreate(CGImageGetWidth(originalMask),
CGImageGetHeight(originalMask),
CGImageGetBitsPerComponent(originalMask),
CGImageGetBitsPerPixel(originalMask),
CGImageGetBytesPerRow(originalMask),
CGImageGetDataProvider(originalMask), nil, YES);
CGImageRef maskedImageRef = CGImageCreateWithMask(image.CGImage, mask);
UIImage *maskedImage = [UIImage imageWithCGImage:maskedImageRef scale:image.scale orientation:image.imageOrientation];
CGImageRelease(mask);
CGImageRelease(maskedImageRef);
UIImageView *test = [[UIImageView alloc] initWithImage:maskedImage];
[self.view addSubview:test];
}
As a test I'm just trying to add the newly created image to the top left of the screen. Theoretically it should be a small round image (the part that was visible through the overlay). But I'm just getting the whole image created again. What am I doing wrong? And how can I account for the fact that self.imageForEditing can be moved around?
CGImageCreateWithMask returns an image of the same size as the original.
That is why you get the whole original image back (I assume), just with the mask applied.
You can apply the mask and then trim away the invisible border. See the advice from this question: iOS: How to trim an image to the useful parts (remove transparent border).
In short: find the bounds of the non-transparent part of the image and redraw that part into a new image.
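Alternatively, since the overlay's hole is static, you could skip masking entirely and crop the visible region directly. This sketch assumes a known `holeRect` in the overlay's coordinates and a scale-to-fill image view; the conversion also accounts for the image view having been moved or scaled:

```objectivec
// Hypothetical hole position, in the overlay/superview's coordinates.
CGRect holeRect = CGRectMake(100, 200, 120, 120);
UIImage *image = self.imageForEditing.image;

// Map the hole into the (possibly moved/scaled) image view's coordinates...
CGRect rectInView = [self.view convertRect:holeRect toView:self.imageForEditing];

// ...then into the bitmap's pixel coordinates.
CGFloat scaleX = image.size.width  / self.imageForEditing.bounds.size.width;
CGFloat scaleY = image.size.height / self.imageForEditing.bounds.size.height;
CGRect pixelRect = CGRectMake(rectInView.origin.x * scaleX * image.scale,
                              rectInView.origin.y * scaleY * image.scale,
                              rectInView.size.width  * scaleX * image.scale,
                              rectInView.size.height * scaleY * image.scale);

CGImageRef croppedRef = CGImageCreateWithImageInRect(image.CGImage, pixelRect);
UIImage *cropped = [UIImage imageWithCGImage:croppedRef
                                       scale:image.scale
                                 orientation:image.imageOrientation];
CGImageRelease(croppedRef);
```

The result is a rectangular crop of the hole area; to get the round shape you can mask the cropped image with a circle, which is much cheaper than masking the full image.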