Core Plot - stretchable images

I was not able to find any example, nor any other question, regarding stretchable images in Core Plot. I'm trying to use an annotation in my graph whose image must be stretchable (the corners of the image must remain untouched).
I set the fill of the annotation with fillWithImage:, which I believe is the right approach, but the image is being resized uniformly, stretching the corners as well.
I have tried every combination of UIImage and CPTImage that I know of or have seen, but to no avail.
One example is:
UIImage *annotationImage = [[UIImage imageNamed:@"image.png"] stretchableImageWithLeftCapWidth:13 topCapHeight:12];
CPTFill *annotationFill = [CPTFill fillWithImage:[CPTImage imageWithCGImage:annotationImage.CGImage]];
borderLayer.fill = annotationFill;
Please tell me this is possible and I'm missing something.
Thanks in advance.

This is not supported in the current version of Core Plot. Please open an enhancement request on the Core Plot issue tracker.
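In the meantime, one possible workaround (a sketch under assumptions, not part of Core Plot's API, and the cap-inset values here are just the ones from the question) is to pre-render the resizable image at the border layer's current size, so the corners are preserved before Core Plot ever sees the bitmap, and hand the flattened result to CPTFill. This must be redone whenever the layer's size changes:

```objc
// Hedged sketch: render the cap-inset image at the layer's current size
// first (UIKit preserves the corners here), then use the flattened bitmap
// as a plain, non-stretching fill image.
UIImage *stretchable = [[UIImage imageNamed:@"image.png"]
    resizableImageWithCapInsets:UIEdgeInsetsMake(12, 13, 12, 13)];

CGSize size = borderLayer.bounds.size;
UIGraphicsBeginImageContextWithOptions(size, NO, 0.0);
[stretchable drawInRect:CGRectMake(0, 0, size.width, size.height)];
UIImage *flattened = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

// The corners are already correct in 'flattened', so uniform scaling by
// the fill no longer matters as long as the sizes match.
borderLayer.fill = [CPTFill fillWithImage:
    [CPTImage imageWithCGImage:flattened.CGImage]];
```

The obvious drawback is that you must observe size changes on the layer and regenerate the fill each time.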

Related

Fill color on specific portion of image?

I want to fill a specific color on a specific area of an image.
EX:
In the Joker image above: if I touch the Joker's hair, fill the hair with a specific color; if I touch the nose, fill the nose; and so on. I hope you understand what I am trying to say.
After googling, it seems this may be achievable with UIBezierPath or Core Graphics (CGContext), but I am very new to both. I tried to read the documentation, but I did not understand it (it takes too long), and I have limited time for this project, so I cannot spend much more time on it.
I also found that a flood fill algorithm can be used, but I don't know how to apply it in my case.
NOTE: I don't want to split the original image into pieces (hair, nose, cap, etc.), because that would put many images in the bundle, each needing both normal and Retina versions, so that option is not practical for me.
So please give me your valuable suggestions: which is better for my case, UIBezierPath or CGContext? How can I fill a color on a specific portion of an image? And can a fill be constrained to the area inside a black border? I am new to Quartz 2D programming.
Use the GitHub library below; it implements the flood fill algorithm: UIImageScanlineFloodfill
Objective-C implementation: ObjFloodFill
For a detailed explanation of the algorithm, see: Recursion Explained with the Flood Fill Algorithm (and Zombies and Cats)
A few tutorials in other languages: Lode's Computer Graphics Tutorial : Flood Fill
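For intuition, the core of a flood fill is independent of UIKit; a minimal 4-way, stack-based version over a plain buffer of color indices (a sketch for illustration, not the linked library's code) looks like this:

```c
#include <stdio.h>
#include <stdlib.h>

/* Minimal 4-way flood fill over a width*height buffer of color indices.
 * An explicit stack avoids deep recursion on large regions. */
static void flood_fill(unsigned char *pixels, int width, int height,
                       int x, int y, unsigned char newColor)
{
    unsigned char oldColor = pixels[y * width + x];
    if (oldColor == newColor) return; /* nothing to do, avoids looping */

    /* Each pixel is pushed at most once per neighbour, so this bound is safe. */
    int *stack = malloc(sizeof(int) * ((size_t)8 * width * height + 2));
    int top = 0;
    stack[top++] = x;
    stack[top++] = y;

    while (top > 0) {
        int cy = stack[--top];
        int cx = stack[--top];
        if (cx < 0 || cx >= width || cy < 0 || cy >= height) continue;
        if (pixels[cy * width + cx] != oldColor) continue;
        pixels[cy * width + cx] = newColor;
        /* push the 4 neighbours */
        stack[top++] = cx + 1; stack[top++] = cy;
        stack[top++] = cx - 1; stack[top++] = cy;
        stack[top++] = cx;     stack[top++] = cy + 1;
        stack[top++] = cx;     stack[top++] = cy - 1;
    }
    free(stack);
}
```

The linked UIImageScanlineFloodfill library applies the same idea to a UIImage's RGBA bitmap, scanning whole horizontal runs at a time for speed.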
Rather than attempting to flood fill an area of a raster image, a better approach (requiring far less data) would be to create vector images. Once you have a vector image, you can stroke the outline to draw it uncolored, or fill the outline to draw it colored.
I recommend using Core Graphics calls like CGContextStrokePath() and CGContextFillPath() to do the drawing. This will likely look better than a flood fill because you will get nicely anti-aliased edges.
Apple has good documentation on how to draw with Quartz 2D. The section on Paths is particularly relevant to what you're trying to do.
You can clip the context using the image's alpha channel as a mask.
I created a quick test image with black shapes on a transparent background,
then used the code below:
- (void)drawRect:(CGRect)rect
{
    // Get the current drawing context
    CGContextRef context = UIGraphicsGetCurrentContext();
    // Set the fill color to blue
    CGContextSetFillColorWithColor(context, [UIColor blueColor].CGColor);
    // Flip the context so that the image is not drawn upside down
    CGContextTranslateCTM(context, 0, rect.size.height);
    CGContextScaleCTM(context, 1.0, -1.0);
    // Clip the context with a mask created from the image's alpha channel
    UIImage *image = [UIImage imageNamed:@"image.png"];
    CGContextClipToMask(context, rect, image.CGImage);
    // Finally, fill the clipped context with the color set earlier
    CGContextFillRect(context, rect);
}
The result
This is a quick sketch of what you can do. Note that you will first need to edit your image so that the parts you want to remove are alpha-transparent.
After a quick search I found these links
How can I change the color 'white' from a UIImage to transparent
How to make one colour transparent in UIImage
If you create your image as an SVG vector-based image, it will be very light (smaller than PNG or JPG) and easy to render with Quartz 2D using bezier paths. Bezier paths can be filled (white to start with). UIBezierPath has a method (containsPoint:) that helps determine whether a touch falls inside a path.
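The hit-testing idea above can be sketched like this (a hedged example: the regionPaths and selectedPath properties are hypothetical names for state you would maintain yourself, with one UIBezierPath traced per region of the artwork):

```objc
// Sketch only: assumes the view has these (illustrative) properties:
//   @property (nonatomic, strong) NSArray *regionPaths;   // UIBezierPath per region
//   @property (nonatomic, strong) UIBezierPath *selectedPath;

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint point = [[touches anyObject] locationInView:self];
    for (UIBezierPath *regionPath in self.regionPaths) {
        if ([regionPath containsPoint:point]) {
            self.selectedPath = regionPath; // remember which region was hit
            [self setNeedsDisplay];         // redraw with the fill applied
            break;
        }
    }
}

- (void)drawRect:(CGRect)rect
{
    [[UIColor redColor] setFill];
    [self.selectedPath fill];               // fill only the touched region
    [[UIColor blackColor] setStroke];
    for (UIBezierPath *regionPath in self.regionPaths) {
        [regionPath stroke];                // outline every region
    }
}
```

Because the regions are paths rather than bitmaps, the same data works at any scale, which sidesteps the normal/Retina duplication problem from the question.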

iOS. Complex stretching?

I want to stretch an image using 2 stretch areas. So I need to achieve something like this:
But by default in iOS I can define one rect only.
Is it possible to solve this without cutting the image into 2 separate images, each with its own single stretchable rect?
As stated, I would definitely do this with 2 images, or add a category on UIImage that does the job. The key question is what parameters you would give that method.
The only thing iOS provides out of the box is cap insets (as described in this post):
// Image with cap insets
UIImage *image = [[UIImage imageNamed:@"image"] resizableImageWithCapInsets:UIEdgeInsetsMake(0, 16, 0, 16)];
There is no way to do what you are describing without splitting the image or writing a custom image-rendering UIView subclass. Be careful if you go with the latter, as you will be throwing away a lot of the optimisations present in UIImageView.
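The image-splitting route can be done in code rather than in the asset catalog, so only one file ships in the bundle. A hedged sketch (the asset name, inset values, and target size are illustrative, not from the original post):

```objc
// Emulate two stretchable regions by splitting the image in code into left
// and right halves, giving each half its own cap insets, and compositing
// the stretched halves at the target size.
UIImage *whole = [UIImage imageNamed:@"image"]; // placeholder asset name
size_t pixelWidth  = CGImageGetWidth(whole.CGImage);
size_t pixelHeight = CGImageGetHeight(whole.CGImage);

// CGImageCreateWithImageInRect works in pixel coordinates
CGImageRef leftRef  = CGImageCreateWithImageInRect(whole.CGImage,
    CGRectMake(0, 0, pixelWidth / 2, pixelHeight));
CGImageRef rightRef = CGImageCreateWithImageInRect(whole.CGImage,
    CGRectMake(pixelWidth / 2, 0, pixelWidth / 2, pixelHeight));

UIImage *left  = [UIImage imageWithCGImage:leftRef  scale:whole.scale
                               orientation:UIImageOrientationUp];
UIImage *right = [UIImage imageWithCGImage:rightRef scale:whole.scale
                               orientation:UIImageOrientationUp];
CGImageRelease(leftRef);
CGImageRelease(rightRef);

// One stretchable column per half (inset values are guesses for the artwork)
left  = [left  resizableImageWithCapInsets:UIEdgeInsetsMake(0, 16, 0, 4)];
right = [right resizableImageWithCapInsets:UIEdgeInsetsMake(0, 4, 0, 16)];

// Composite both stretched halves side by side at the final size
CGSize target = CGSizeMake(300, whole.size.height);
UIGraphicsBeginImageContextWithOptions(target, NO, 0.0);
[left  drawInRect:CGRectMake(0, 0, target.width / 2, target.height)];
[right drawInRect:CGRectMake(target.width / 2, 0, target.width / 2, target.height)];
UIImage *stretched = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
```

Wrapping this in a UIImage category (e.g. taking two sets of cap insets as parameters) would answer the "what parameters" question above.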

How can I emboss a UIImage?

I have a UITableView whose cells contain an image. I'm trying to make the image look like a UITabBarItem image does when it is selected. I was going to follow this little tutorial on clipping images to a gradient: http://mobiledevelopertips.com/cocoa/how-to-mask-an-image.html
I wanted to emboss the clipped image to give it more life, but I haven't been able to find a simple explanation of how to do that with a UIImage.
I found this, but I had a hard time understanding the embossing process: http://javieralog.blogspot.com/2012/01/nice-emboss-effect-using-core-graphics.html
Any help or a lead would be greatly appreciated.
In addition to the Core Graphics implementations and NYXImagesKit, I have an emboss filter in my open source GPUImage framework. To emboss a UIImage, you can simply use the following code:
GPUImageEmbossFilter *embossFilter = [[GPUImageEmbossFilter alloc] init];
embossFilter.intensity = 2.0;
UIImage *embossedImage = [embossFilter imageByFilteringImage:inputImage];
I wrote up a method in Cocoa Touch - Adding texture with overlay view; you might find it helpful. It does require a gray-scale overlay view that produces the "emboss" look. If you know Photoshop or another image editor, you should be able to create an appropriate overlay for your needs.
After a long time digging through Google searches, I believe I found what I need.
This is a set of categories that allow you to manipulate UIImages easily.
http://www.cocoaintheshell.com/2012/01/nyximagesutilities-nyximageskit/

How to mask a UIView to highlight a selection?

The problem that I am facing is simple (and less abstract than the question itself). I am looking for a way to highlight an area of an image (the selection) while the rest of the image is faded or grayed out. You can compare the effect with the interface you see in, for example, Photoshop when you crop an image: the image is grayed out, and the area that will be cropped is clear.
My initial idea was to use masking for this (hence the question), but I am not sure if this is a viable approach and, if it is, how to proceed.
Not sure if this is the best way, but it should work.
First, you create a screenshot of the view.
UIGraphicsBeginImageContextWithOptions(captureView.bounds.size, captureView.opaque, 0.0);
[captureView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *screenshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
This snippet is 'stolen' and slightly modified from here:
Low quality of capture view context on iPad
Then you could create a grayscale mask image with the same dimensions as the original (the screenshot).
Follow the clear & simple instructions on How to Mask an Image.
Then you create a UIImageView, set the masked image as its image, and add it on top of your original view. You might also want to set the backgroundColor of this UIImageView to your liking.
EDIT:
A simpler way would probably be using view.layer.mask, which is "an optional layer whose alpha channel is used as a mask to select between the layer's background and the result of compositing the layer's contents with its filtered background." (from CALayer class reference)
Some literature:
UIView Class Reference
CALayer Class Reference
And a simple example of how a mask can be made with another (possibly hidden) UIView:
Masking a UIView
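The view.layer.mask approach can be sketched as follows (a hedged example: the overlay alpha, selection rect, and imageView name are illustrative placeholders). A dark overlay is masked with an even-odd path so it covers everything except the selection:

```objc
// Dim everything outside a selection rect, Photoshop-crop style.
UIView *overlay = [[UIView alloc] initWithFrame:imageView.bounds];
overlay.backgroundColor = [[UIColor blackColor] colorWithAlphaComponent:0.6];
[imageView addSubview:overlay];

CGRect selection = CGRectMake(60, 80, 200, 150); // area to keep clear

// Outer rect + inner rect with the even-odd rule covers everything
// EXCEPT the selection, so the overlay's alpha only dims the outside.
UIBezierPath *maskPath = [UIBezierPath bezierPathWithRect:overlay.bounds];
[maskPath appendPath:[UIBezierPath bezierPathWithRect:selection]];

CAShapeLayer *maskLayer = [CAShapeLayer layer];
maskLayer.frame = overlay.bounds;
maskLayer.fillRule = kCAFillRuleEvenOdd;
maskLayer.path = maskPath.CGPath;
overlay.layer.mask = maskLayer; // overlay shows only where mask is opaque
```

Moving the selection is then just a matter of rebuilding maskLayer.path, with no screenshotting or image masking needed.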

How to do curved corners on UIImageView even when setting the UIImage at runtime?

I found examples on Stack Overflow showing how to do curved corners on a UIImageView, and this worked fine when I was setting the image in Xcode's Interface Builder. When I then added code to change the image at runtime, the curved corners only partially worked.
Here is the code I am using to try and achieve curved corners:
thumbnail.image = [UIImage imageNamed:self.video.thumbnailFileName];
thumbnail.contentMode = UIViewContentModeScaleAspectFill;
thumbnail.clipsToBounds = YES;
// round the corners:
thumbnail.layer.cornerRadius = 10.0;
thumbnail.layer.masksToBounds = YES;
// add a border:
thumbnail.layer.borderColor = [UIColor lightGrayColor].CGColor;
thumbnail.layer.borderWidth = 3.0;
BEFORE: When setting the image via the gui builder in Xcode:
AFTER: Setting the UIImage programmatically. Notice the curved corners look terrible:
The original PNG image being used in the app:
The issue here was a series of strange coincidences...
The first issue was that I had two versions of this image at one point, and somehow the first version was still being copied into my app bundle (or was left over from previous debug builds). That first one was a different size and wasn't lining up correctly with the curved corners. Once I cleaned my build, that original file was gone.
The second issue was that the second file had a slightly different filename, which I didn't notice, so that was the second reason I couldn't get this working properly.
So ... the code was right; I just had issues with my image files! Sorry for the confusion.
It looks like the layer order has gone wrong: the image, while rounded at the corners, now appears above the border instead of below it.
I'm weak on layers, but a guess: reverse the order of your operations.
For anyone trying to round the corners of a UIImageView, note that you need to add
#import <QuartzCore/QuartzCore.h>
first, in order to use the layer properties.