Core Plot annotation hiding my plot - core-plot

I've set up a scatter plot with time as the x axis. I'm drawing a little sun image in the background at noon:
CGRect imageRect = CGRectMake(0.0, 0.0, 50.0, 50.0);
CPTBorderedLayer *imageLayer = [[CPTBorderedLayer alloc] initWithFrame:imageRect];
imageLayer.fill = [CPTFill fillWithImage:[CPTImage imageWithCGImage:[[UIImage imageNamed:@"SunIcon.png"] CGImage]]];
NSArray *anchorPoint = [NSArray arrayWithObjects:
[NSNumber numberWithInt:12*60*60], [NSNumber numberWithInt:100],
nil];
CPTPlotSpaceAnnotation *imageAnnotation = [[CPTPlotSpaceAnnotation alloc] initWithPlotSpace:(CPTXYPlotSpace *)graph.defaultPlotSpace anchorPlotPoint:anchorPoint];
imageAnnotation.contentLayer = imageLayer;
[graph addAnnotation:imageAnnotation];
[graph addSublayer:imageLayer];
The problem is that the sun now hides my plots. How can I send the image to the background behind all plots?
Edit: The opposite case is that my right-aligned y axis gets hidden by the plot. Can I bring that layer to the front?

Add the annotation to the plot area and then move the image layer to the back of the sublayer order so it draws behind the plots.
[graph.plotAreaFrame.plotArea addAnnotation:imageAnnotation];
[imageLayer removeFromSuperlayer];
[graph.plotAreaFrame.plotArea insertSublayer:imageLayer atIndex:0];

Related

Scrolling through data only in core plot iOS

I'm currently using the Core Plot CocoaPod to draw a scatter graph. I need scrolling to be enabled, but at the same time I need the x and y axes to stay fixed at the corner of the screen while only the data moves. Is there any way to do this?
My current implementation (the following code) lets me scroll through the data, but the x-axis or y-axis disappears when I scroll far through the graph:
-(void) viewDidAppear:(BOOL)animated{
[super viewDidAppear:animated];
CPTGraphHostingView* hostView = [[CPTGraphHostingView alloc] initWithFrame:self.view.frame];
[self.view addSubview: hostView];
// Create a CPTGraph object and add to hostView
CPTGraph* graph = [[CPTXYGraph alloc] initWithFrame:hostView.bounds];
hostView.hostedGraph = graph;
// Get the (default) plotspace from the graph so we can set its x/y ranges
CPTXYPlotSpace *plotSpace = (CPTXYPlotSpace *) graph.defaultPlotSpace;
plotSpace.allowsUserInteraction=YES;
// Note that these CPTPlotRange are defined by START and LENGTH (not START and END) !!
[plotSpace setYRange:[CPTPlotRange plotRangeWithLocation:[NSNumber numberWithInt:0] length:[NSNumber numberWithInt:10]]];
[plotSpace setXRange:[CPTPlotRange plotRangeWithLocation:[NSNumber numberWithInt:0] length:[NSNumber numberWithInt:10]]];
plotSpace.delegate = self;
graph.paddingLeft = 10.0;
graph.paddingRight = 10.0;
graph.paddingTop = 10.0;
graph.paddingBottom = 10.0;
// Create the plot (we do not define actual x/y values yet, these will be supplied by the datasource...)
CPTScatterPlot* plot = [[CPTScatterPlot alloc] initWithFrame:CGRectZero];
// Let's keep it simple and let this class act as the datasource (therefore we implement <CPTPlotDataSource>)
plot.dataSource = self;
// Finally, add the created plot to the default plot space of the CPTGraph object we created before
[graph addPlot:plot toPlotSpace:graph.defaultPlotSpace];
}
Thanks in advance :)
Use the axisConstraints:
x.axisConstraints = [CPTConstraints constraintWithRelativeOffset:0.0];
y.axisConstraints = [CPTConstraints constraintWithRelativeOffset:0.0];
Use 0.0 to place the axis to the left (y-axis) or bottom (x-axis). Use 1.0 to place the axis to the right (y-axis) or top (x-axis). Any fraction between 0.0 and 1.0 is valid.
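For context, x and y here are the graph's axes. A minimal sketch of wiring this up, assuming it runs right after the graph setup in viewDidAppear above and uses the stock CPTXYAxisSet that comes with the graph:
CPTXYAxisSet *axisSet = (CPTXYAxisSet *)graph.axisSet;
CPTXYAxis *x = axisSet.xAxis;
CPTXYAxis *y = axisSet.yAxis;
// Pin each axis to an edge of the visible plot area so it stays put while the plot space scrolls.
x.axisConstraints = [CPTConstraints constraintWithRelativeOffset:0.0]; // bottom edge
y.axisConstraints = [CPTConstraints constraintWithRelativeOffset:0.0]; // left edge
With constraints set, the axes are positioned relative to the visible plot area instead of a fixed data coordinate, so they no longer scroll out of view.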

Remove blank space around scatter graph in core plot

I'm using Core Plot for a line graph. In the scatter graph I'm getting some blank space around the plot. As you can see in the screenshot (taken in debug mode), my graph's plot area is surrounded by empty space. How can I remove it?
Initialising host view
self.hostView = [[CPTGraphHostingView alloc] initWithFrame:CGRectMake(0, 0, self.frame.size.width, self.frame.size.height - 10)];
self.hostView.backgroundColor = [UIColor clearColor];
self.hostView.allowPinchScaling = NO;
[self addSubview:self.hostView];
Initializing graph
graph = [[CPTXYGraph alloc] initWithFrame:self.hostView.bounds];
[graph applyTheme:[CPTTheme themeNamed:kCPTPlainWhiteTheme]];
self.hostView.hostedGraph = graph;
EDIT
graph.plotAreaFrame.paddingLeft = 55.0f;
graph.plotAreaFrame.paddingRight = 15.0f;
graph.plotAreaFrame.paddingTop = 10.0f;
graph.plotAreaFrame.paddingBottom = 28.0f;
These are the paddings I have provided.
The graph starts with 20 pixels of padding on each edge, but that's easy to change:
graph.paddingLeft = 0.0;
graph.paddingTop = 0.0;
graph.paddingRight = 0.0;
graph.paddingBottom = 0.0;

CorePlot horizontal gradient on axis

I'm trying to create a horizontal gradient on my y axis in CorePlot:
CPTGradient *axisGradient = [CPTGradient gradientWithBeginningColor:[CPTColor redColor] endingColor:[CPTColor yellowColor]];
axisGradient.angle = 180.f;
CPTMutableLineStyle *axisStyle = [y.axisLineStyle mutableCopy];
axisStyle.lineWidth = 6.f;
axisStyle.lineGradient = axisGradient;
y.axisLineStyle = axisStyle;
No matter what I set the gradient angle to, the gradient stays vertical. Anyone have any ideas?
For a horizontal gradient you can use:
CPTColor *areaColorStart = [CPTColor colorWithComponentRed:0.74 green:0.78 blue:0.82 alpha:1];
CPTColor *areaColorEnd = [CPTColor colorWithComponentRed:0.9 green:0.91 blue:0.92 alpha:.2];
CPTGradient *areaGradient1 = [CPTGradient gradientWithBeginningColor:areaColorStart endingColor:areaColorEnd];
areaGradient1.angle = 0.0f;
CPTFill *areaGradientFill = [CPTFill fillWithGradient:areaGradient1];
plot.areaFill = areaGradientFill;
plot.areaBaseValue = CPTDecimalFromFloat(yRange);
Hope it helps.
Core Plot draws gradient line fills along the line path. If you want a tall, skinny rectangle filled with a gradient that fades from one color at the top to another at the bottom, use a plot space annotation. Create a CPTBorderedLayer for the annotation content and give it a gradient fill ([CPTFill fillWithGradient:axisGradient]).
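A rough sketch of that annotation approach follows; the frame size, anchor point, and layer names are just placeholders, and the angle may need tweaking for the exact fade direction you want:
CPTGradient *barGradient = [CPTGradient gradientWithBeginningColor:[CPTColor redColor] endingColor:[CPTColor yellowColor]];
barGradient.angle = 90.0; // run the fade vertically inside the rectangle
CPTBorderedLayer *gradientLayer = [[CPTBorderedLayer alloc] initWithFrame:CGRectMake(0.0, 0.0, 6.0, 200.0)];
gradientLayer.fill = [CPTFill fillWithGradient:barGradient];
// Anchor the tall, skinny rectangle at a plot coordinate near the y axis.
NSArray *anchor = @[@0, @0];
CPTPlotSpaceAnnotation *gradientAnnotation = [[CPTPlotSpaceAnnotation alloc] initWithPlotSpace:graph.defaultPlotSpace anchorPlotPoint:anchor];
gradientAnnotation.contentLayer = gradientLayer;
[graph.plotAreaFrame.plotArea addAnnotation:gradientAnnotation];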

How can I use the found face coordinates? (Core Image)

UIImage *image = [UIImage imageNamed:@"face.png"];
UIImageView *testImage = [[UIImageView alloc] initWithImage:image];
[testImage setTransform:CGAffineTransformMakeScale(1, -1)];
[[[UIApplication sharedApplication] delegate].window setTransform:
CGAffineTransformMakeScale(1, -1)];
[testImage setFrame:CGRectMake(0, 0, testImage.image.size.width,
testImage.image.size.height)];
[self.view addSubview:testImage];
CIImage *ciimage = [CIImage imageWithCGImage:image.CGImage];
NSDictionary* opts = [NSDictionary dictionaryWithObject:
CIDetectorAccuracyHigh forKey:CIDetectorAccuracy];
CIDetector* detector = [CIDetector detectorOfType:CIDetectorTypeFace
context:nil options:opts];
NSArray* features = [detector featuresInImage:ciimage];
for (CIFaceFeature *faceFeature in features)
{
CGFloat faceWidth = faceFeature.bounds.size.width;
UIView* faceView = [[UIView alloc] initWithFrame:faceFeature.bounds];
faceView.layer.borderWidth = 1;
faceView.layer.borderColor = [[UIColor redColor] CGColor];
[self.view addSubview:faceView];
}
How can I find the face coordinates?
I tried using faceFeature.bounds.origin.x and faceFeature.bounds.origin.y, but sometimes they are not the correct coordinates.
How can I find the right ones?
Edit 2016/04/10: This is my problem on iOS: the x and y coordinates come out the opposite of what I get in C#.
Here's the basic idea behind it: CIDetector lets you pull out points for the left eye, right eye, and mouth from the image. From those we can do some basic math to create a rectangle that spans between these points, e.g.
for (CIFaceFeature *faceFeature in features)
{
CGPoint lefteye = faceFeature.leftEyePosition;
CGPoint righteye = faceFeature.rightEyePosition;
CGPoint mouth = faceFeature.mouthPosition;
//Face Rectangle
CGRect faceRectangle = CGRectMake(lefteye.x, lefteye.y, righteye.x - lefteye.x, mouth.y - righteye.y);
//Face Center
CGPoint faceCenter = CGPointMake(faceRectangle.origin.x + (faceRectangle.size.width / 2), faceRectangle.origin.y + (faceRectangle.size.height / 2));
UIView* faceView = [[UIView alloc] initWithFrame:faceRectangle];
faceView.layer.borderWidth = 1;
faceView.layer.borderColor = [[UIColor redColor] CGColor];
[self.view addSubview:faceView];
}
Keep in mind, I'm not at a computer right now to test this part for you, but I believe the coordinates output by the detector match the resolution of the input image. That causes inaccuracy when you apply the created rect to an on-screen view using iOS's point coordinate system. This being said, all you should have to do is run the newly created rectangle through a coordinate conversion to get the proper on-screen values.
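One common way to do that conversion (a sketch, assuming the image is displayed at its native size as in the question's code; image and faceFeature come from the code above) is to flip Core Image's bottom-left origin into UIKit's top-left origin:
// Core Image uses a bottom-left origin; UIKit uses top-left, so flip vertically around the image height.
CGAffineTransform flip = CGAffineTransformMakeScale(1.0, -1.0);
flip = CGAffineTransformTranslate(flip, 0.0, -image.size.height);
CGRect faceRectInView = CGRectApplyAffineTransform(faceFeature.bounds, flip);
// If the image is shown scaled inside a view, also scale the rect by (view size / image size) before using it on screen.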

CorePlot: Data Labels with multiple layers

I'm trying to create a plot with both text and an image as the data label for each point. The code I'm using looks like this:
//Point symbol
NSInteger symbolIndex = interval.symbolIndex;
UIImage *img = [Common getPointSymbol:symbolIndex];
CGRect imageRect = CGRectMake(0, 0, 60, 50);
CPTBorderedLayer * imageLayer = [[CPTBorderedLayer alloc] initWithFrame:imageRect];
imageLayer.fill = [CPTFill fillWithImage:[CPTImage imageWithCGImage:[img CGImage] scale:img.scale]];
//Point text
CPTMutableTextStyle *dataLabelTextStyle = [CPTMutableTextStyle textStyle];
dataLabelTextStyle.color = [CPTColor blackColor];
dataLabelTextStyle.fontSize = 12.0f;
dataLabelTextStyle.fontName = @"Helvetica";
CPTTextLayer *textLayer = [[CPTTextLayer alloc] initWithText:point.title style:dataLabelTextStyle];
[imageLayer addSublayer:textLayer];
return imageLayer;
This works well for the data points that are initially visible on the plot, but for points outside the starting plot area only the symbol images are drawn. If I zoom out and force the plot to redraw, both layers appear again. The same problem occurs if I reverse the order of the layers; then only the text layer is drawn. In effect, it seems sublayers are not rendered for labels that start outside the plot area.
Is this a core plot bug? Is it possible to merge layers into one to mitigate this problem?
Thankful for help!
CPTTextLayer is a subclass of CPTBorderedLayer. You can add the background fill to the text layer and then use the padding properties to position the text within the layer.
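A minimal sketch of that single-layer approach, reusing img, point.title, and dataLabelTextStyle from the question's code (the padding values are just guesses to make room for the symbol):
CPTTextLayer *labelLayer = [[CPTTextLayer alloc] initWithText:point.title style:dataLabelTextStyle];
// Use the symbol image as the background fill of the text layer instead of adding a separate sublayer.
labelLayer.fill = [CPTFill fillWithImage:[CPTImage imageWithCGImage:[img CGImage] scale:img.scale]];
// Position the text within the layer with the padding properties.
labelLayer.paddingTop = 30.0;
labelLayer.paddingLeft = 5.0;
labelLayer.paddingRight = 5.0;
labelLayer.paddingBottom = 5.0;
return labelLayer;
Because everything lives in one layer, there is no sublayer to drop when the label starts outside the visible plot area.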
