I have code that draws a moving line over a TImage, and I want the previous (old) line to disappear each time, but this code makes all the lines stay there forever... When I enable DoubleBuffered on the form, the code below works OK, but the background bitmap held in myimg does not show at all.
// Draw the current line segment onto the offscreen bitmap.
FBitmap.Canvas.Pen.Color := clRed;
FBitmap.Canvas.Pen.Width := 2;
FBitmap.Canvas.MoveTo(Xo, Yo);
FBitmap.Canvas.LineTo(Xs, Ys);
// Copy the bitmap onto the TImage.
myimg.Canvas.CopyRect(Rect(0, 0, Width, Height), FBitmap.Canvas, Rect(0, 0, Width, Height));
It's not surprising that this happens. TImage is intended for static images. When you draw on its canvas, what you draw stays there. That is by design.
It seems to me that you have chosen the wrong control. Obvious candidates for this are:
A TPaintBox. In the OnPaint handler draw the background, and then the lines.
A custom control that draws the background in response to WM_ERASEBKGND, and the foreground in response to WM_PAINT.
The latter option may be overkill for you but in my experience is the best defence against flickering.
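A minimal sketch of the TPaintBox approach, assuming the background picture is kept in a TBitmap field named FBackground and the current line endpoints live in fields (all names here are illustrative, not from the question):

procedure TForm1.PaintBox1Paint(Sender: TObject);
begin
  // The background is repainted first, so nothing from the previous frame survives.
  PaintBox1.Canvas.Draw(0, 0, FBackground);
  // Then the current line is drawn on top.
  PaintBox1.Canvas.Pen.Color := clRed;
  PaintBox1.Canvas.Pen.Width := 2;
  PaintBox1.Canvas.MoveTo(Xo, Yo);
  PaintBox1.Canvas.LineTo(Xs, Ys);
end;

Whenever the line coordinates change, call PaintBox1.Invalidate to trigger a repaint.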
There are several ways you can accomplish this. The easiest one is to leave the image as is, and to add another component (TShape) which you will move over the image.
If you have to draw the line on the image, you need to keep the original image in memory and use CopyRect to restore it over the line before drawing a new one. A faster variation of this method is to keep in memory only the part of the image where the line will be drawn, so you can copy it back over the line later, thus deleting it.
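A minimal sketch of that idea, assuming FOriginal holds an untouched copy of the picture and FPrevRect remembers where the last line was drawn (illustrative names; Min and Max come from the Math unit):

// Restore the area covered by the previous line from the untouched copy.
FBitmap.Canvas.CopyRect(FPrevRect, FOriginal.Canvas, FPrevRect);
// Draw the new line and remember its bounding rectangle for next time.
FBitmap.Canvas.Pen.Color := clRed;
FBitmap.Canvas.Pen.Width := 2;
FBitmap.Canvas.MoveTo(Xo, Yo);
FBitmap.Canvas.LineTo(Xs, Ys);
FPrevRect := Rect(Min(Xo, Xs) - 2, Min(Yo, Ys) - 2, Max(Xo, Xs) + 2, Max(Yo, Ys) + 2);
// Copy the result onto the TImage as before.
myimg.Canvas.CopyRect(Rect(0, 0, Width, Height), FBitmap.Canvas, Rect(0, 0, Width, Height));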
Related
The question relates to: Drawing on a paintbox - How to keep up with mouse movements without delay?
I was going to ask at some point how to repaint only part of a paintbox without invalidating the whole paintbox, which is slow when there is a lot of drawing going on, or in my case when there are a lot of tiles drawn on screen.
From the link above Peter Kostov did touch on the subject briefly in one of his comments:
you can partly BitBlt the offscreen bitmap (only regions where it is changed). This will improve the performance dramatically.
I have limited graphic skills and have never really used BitBlt before but I will be reading more about it after I post this question.
With that said, I wanted to know how exactly could you determine if regions of a bitmap have changed? Does this involve something simple or is there more magic so to speak involved in determining which regions have changed?
Right now I am still painting directly to the paintbox, but once I draw to the offscreen buffer bitmap my next step is to optimise the painting, and the above comment sounds exactly like what I need; only the part about determining which regions have changed has confused me slightly.
Of course if there are other ways of doing this please feel free to comment.
Thanks.
You don't have to use BitBlt() directly if you draw to an offscreen TBitmap; you can use TCanvas.CopyRect() instead (which uses StretchBlt() internally). Either way, when you need to invalidate just a portion of the TPaintBox (the portion corresponding to the section of the offscreen bitmap that you drew on), you can call InvalidateRect() directly with the appropriate rectangle of the TPaintBox, instead of calling TControl.Invalidate() (which calls InvalidateRect() with the lpRect parameter set to NULL). Whenever the TPaintBox.OnPaint event is triggered, InvalidateRect() will have established a clipping rectangle within the canvas; any drawing you do outside of that rectangle is ignored. If you want to manually optimize your drawing beyond that, you can use the TCanvas.ClipRect property to determine the rectangle of the TPaintBox that needs to be drawn, and just copy that portion from your offscreen bitmap.
The only gotcha is that TPaintBox is a TGraphicControl descendant, so it does not have its own HWND that you can pass to InvalidateRect(). You would have to use its Parent.Handle HWND instead, which means you would have to translate TPaintBox-relative coordinates into Parent-relative coordinates and vice versa when needed.
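A minimal sketch of both halves, assuming FBackBuffer is the offscreen TBitmap and DirtyRect is the changed area in paintbox coordinates (illustrative names):

procedure TForm1.InvalidatePaintBoxRect(const DirtyRect: TRect);
var
  R: TRect;
begin
  // TPaintBox has no HWND of its own, so invalidate through the parent's handle,
  // translating paintbox coordinates into parent coordinates first.
  R := DirtyRect;
  OffsetRect(R, PaintBox1.Left, PaintBox1.Top);
  InvalidateRect(PaintBox1.Parent.Handle, @R, False);
end;

procedure TForm1.PaintBox1Paint(Sender: TObject);
begin
  // Only the clipped portion needs to be copied from the offscreen bitmap.
  PaintBox1.Canvas.CopyRect(PaintBox1.Canvas.ClipRect,
    FBackBuffer.Canvas, PaintBox1.Canvas.ClipRect);
end;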
You need to take charge of the painting in order to do this:
Call InvalidateRect to invalidate portions of a window.
When handling WM_PAINT you call BeginPaint which yields a paint struct containing the rect to be painted.
All of this requires a window, and unfortunately for you, TPaintBox is not windowed. So you could use the parent control's window handle, but frankly it would be cleaner to use a windowed control.
You could use my windowed paint control from this question as a starting point: How could I fade in/out a TImage? Use the ClipRect of the control's canvas when painting to determine the part of the canvas that needs re-painting.
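For illustration, a bare-bones windowed control along those lines might look like this (a sketch only, not the control from the linked question; FBuffer is an assumed offscreen bitmap created and drawn to elsewhere):

type
  TBufferedPaintPanel = class(TCustomControl)
  private
    FBuffer: TBitmap; // assumed to be created and filled elsewhere
  protected
    procedure WMEraseBkgnd(var Msg: TWMEraseBkgnd); message WM_ERASEBKGND;
    procedure Paint; override;
  end;

procedure TBufferedPaintPanel.WMEraseBkgnd(var Msg: TWMEraseBkgnd);
begin
  Msg.Result := 1; // the background is painted in Paint, so skip erasing to avoid flicker
end;

procedure TBufferedPaintPanel.Paint;
begin
  // Canvas.ClipRect reflects the rectangle passed to InvalidateRect,
  // so only that part of the offscreen bitmap is copied.
  Canvas.CopyRect(Canvas.ClipRect, FBuffer.Canvas, Canvas.ClipRect);
end;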
I am drawing many audio meters on a view and finding that drawRect cannot keep up with the speed of the audio changes. In practice only a very small part of the image changes at a time, so I really only want to draw the incremental changes.
I have created a CGLayer and when the data changes I use CGContextBeginPath, CGContextMoveToPoint, CGContextAddLineToPoint and CGContextStrokePath to draw in the CGLayer.
In drawRect in the view I use CGContextDrawLayerAtPoint to display the layer.
When the data changes I draw just the difference by drawing a line over the top in the CGLayer. I had assumed it was like photoshop and the new data just draws over the old but I now believe that all the lines I have ever drawn remain present in the layer. Is that correct?
If so is there a way to remove lines from a CGLayer?
What exactly do you mean by 'audio meter'? Show some snapshots of your intended design, and show us some code...
These are my suggestions:
1) Yes, the new data just draws on top of the CGLayer unless you release it with CGLayerRelease(layer) (a minimal sketch follows below).
2) CGContextStrokePath is an expensive operation. You may want to render a generic line stroke once, store it in a UIImage, and reuse that UIImage every time your data changes.
3) Simplest solution: use UIProgressView if you just want to show audio levels.
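For suggestion 1, a rough sketch of starting over with a fresh layer whenever the data changes might look like this (the _meterLayer and _currentLevel instance variables and the method name are assumptions, not from the question):

// Assumed ivars: CGLayerRef _meterLayer; CGFloat _currentLevel;

- (void)updateMeterWithLevel:(CGFloat)level
{
    // Drop the old layer so previously drawn lines don't accumulate.
    if (_meterLayer != NULL) {
        CGLayerRelease(_meterLayer);
        _meterLayer = NULL;
    }
    _currentLevel = level;
    [self setNeedsDisplay];
}

- (void)drawRect:(CGRect)rect
{
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    if (_meterLayer == NULL) {
        _meterLayer = CGLayerCreateWithContext(ctx, self.bounds.size, NULL);
        CGContextRef layerCtx = CGLayerGetContext(_meterLayer);
        // Redraw only the current meter state into the fresh layer.
        CGContextSetFillColorWithColor(layerCtx, [UIColor greenColor].CGColor);
        CGContextFillRect(layerCtx, CGRectMake(0, 0,
            self.bounds.size.width * _currentLevel, self.bounds.size.height));
    }
    CGContextDrawLayerAtPoint(ctx, CGPointZero, _meterLayer);
}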
I now believe that all the lines I have ever drawn remain present in the layer. Is that correct?
Yes.
If so is there a way to remove lines from a CGLayer?
No. There is not. You would create a new layer. Generally, you create a layer for what is drawn repeatedly.
Your drawing may be simplified by drawing rects rather than paths.
For some audio meters, dividing the meter into multiple pieces may help (you could use a CGLayer here). Similarly, you may be able to just draw rectangles selectively and/or clip drawing, images, and/or layers.
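As an illustration of the rectangles-plus-selective-invalidation idea, here is a sketch of a meter view that fills rects instead of stroking paths and only invalidates the band that changed (setLevel: and _level are assumed names):

- (void)setLevel:(CGFloat)newLevel
{
    // Invalidate only the horizontal band of the meter that actually changed.
    CGFloat oldX = self.bounds.size.width * _level;
    CGFloat newX = self.bounds.size.width * newLevel;
    CGRect changed = CGRectMake(MIN(oldX, newX), 0,
                                fabs(newX - oldX), self.bounds.size.height);
    _level = newLevel;
    [self setNeedsDisplayInRect:changed];
}

- (void)drawRect:(CGRect)rect
{
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    // Filled rects instead of stroked paths: background first, then the lit portion.
    [[UIColor darkGrayColor] setFill];
    CGContextFillRect(ctx, rect);
    CGRect lit = CGRectMake(0, 0, self.bounds.size.width * _level,
                            self.bounds.size.height);
    [[UIColor greenColor] setFill];
    CGContextFillRect(ctx, CGRectIntersection(lit, rect));
}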
We are developing a drawing app. When we draw crossed lines, the intersection point clears the previously drawn pixels where the two lines cross each other. We are using setNeedsDisplayInRect to refresh the drawing data.
How can we overcome this issue?
tl;dr: You need to store the previous line segments and redraw them when you draw in the same rectangle again.
We are using setNeedsDisplayInRect to refresh the drawing data
That is a good thing. Can you see any side effects of doing that? If not, try passing the entire rectangle and see what happens. You will see that only the very last segment is drawn.
Now you know that you need to store and possibly redraw previous line segments (or just their image).
Naive approach
The first and simplest solution would be to store all the lines in an array and redraw them. You would notice that this slows down your app a lot, especially after you have been drawing for a while. It also doesn't go very well with only drawing what you need.
Only drawing lines inside the refreshed rectangle
You could speed up the above implementation by filtering all the lines in the array to only redraw those that intersect the refreshed rect. This could for example be done by getting the bounding box for the line segment (using CGPathGetBoundingBox(path)) and checking if it intersects the refreshed rectangle (using CGRectIntersectsRect(refreshRect, boundingBox)).
That would reduce some of the drawing but you would still end up with a very long array of lines and see performance problems after a while.
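A sketch of that filter, assuming the segments are kept as CGPathRef values wrapped in NSValues in a segmentPaths property (illustrative names):

- (void)drawRect:(CGRect)rect
{
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    CGContextSetLineWidth(ctx, 2.0f);
    CGContextSetStrokeColorWithColor(ctx, [UIColor blackColor].CGColor);

    for (NSValue *wrapped in self.segmentPaths) {
        CGPathRef path = (CGPathRef)[wrapped pointerValue];
        // Skip segments whose bounding box does not touch the refreshed rect.
        if (!CGRectIntersectsRect(rect, CGPathGetBoundingBox(path)))
            continue;
        CGContextAddPath(ctx, path);
        CGContextStrokePath(ctx);
    }
}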
Rasterize old line segments
One good way of not having to store all previous lines is to draw them into a bitmap (a separate image context; see UIGraphicsBeginImageContextWithOptions(...)) and draw that image before drawing the new line segment. However, that would almost double the drawing you have to do, so it should not be done for every frame.
One thing you could do is to store the last 100 line segments (or maybe the last 1000, or whatever your own performance investigation shows; you should investigate these things yourself) and draw the rest into an image. Every time you have 100 new lines, you add them to the image context – by first drawing the image and then drawing the 100 new lines – and save that as the new image.
One other thing you could do is to take all the new lines and draw them to the image every time the user lifts their finger. This can work well in combination with the above suggestion.
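A rough sketch of that flattening step, e.g. run when the user lifts their finger (canvasImage and pendingPaths are assumed properties, the latter holding CGPathRef values wrapped in NSValues):

- (void)flattenPendingSegments
{
    UIGraphicsBeginImageContextWithOptions(self.bounds.size, NO, 0.0);
    CGContextRef ctx = UIGraphicsGetCurrentContext();

    // Draw the previously flattened image first, then the new segments on top.
    [self.canvasImage drawAtPoint:CGPointZero];
    CGContextSetLineWidth(ctx, 2.0f);
    CGContextSetStrokeColorWithColor(ctx, [UIColor blackColor].CGColor);
    for (NSValue *wrapped in self.pendingPaths) {
        CGContextAddPath(ctx, (CGPathRef)[wrapped pointerValue]);
        CGContextStrokePath(ctx);
    }

    self.canvasImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    [self.pendingPaths removeAllObjects];
}

drawRect: then only has to draw canvasImage plus the short list of still-pending segments.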
Depending on the complexity of your app you may need one or more of these suggestions to keep your app responsive (which is very important for a drawing app) but investigate first and don't make your solution overly complex if you don't have to.
I have been trying so much but have not found a solution yet. I have to implement painting and erasing on iOS, and I successfully implemented the painting logic using UIBezierPath. The problem is with erasing: I implemented the same logic as for painting by using kCGBlendModeClear, but I can't redraw on the erased area, because on each pass in drawRect I have to stroke both the painting and erasing paths. So is there any way we can subtract the erasing path from the drawing path to get the resultant path and then stroke it? I am very new to Core Graphics and am looking forward to your replies and comments. Or any other logic to implement the same. I can't use the eraser as the background color because my background is textured.
You don't need to stroke the path every time, in fact doing so is a huge performance hit. I guarantee if you try it on an iPad 3 you will be met with a nearly unresponsive screen after a few strokes. You only need to add and stroke the path once. After that, it will be stored as pixel data. So don't keep track of your strokes, just add them, stroke them, and get rid of them. Also look into using a CGLayer (you can draw to that outside the main loop, and only render it to your rect in the main loop so it saves lots of time).
These are the steps that I use, and I am doing the exact same thing (I use a CGPath instead of UIBezierPath, but the idea is the same):
1) In touches began, store the touch point and set the context to either erase or draw, depending on what the user has selected.
2) In touches moved, if the point is a certain arbitrary distance away from the last point, then move to the last point (CGContextMoveToPoint) and draw a line to the new point (CGContextAddLineToPoint) in my CGLayer. Calculate the rectangle that was changed (i.e. contains the two points) and call setNeedsDisplayInRect: with that rectangle.
3) In drawRect, render the CGLayer into the current window context (UIGraphicsGetCurrentContext()); a rough sketch of steps 2 and 3 follows below.
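A sketch of those two steps, assuming _drawingLayer, _lastPoint and _erasing instance variables (all illustrative; the layer itself is created elsewhere):

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint p = [[touches anyObject] locationInView:self];
    CGContextRef layerCtx = CGLayerGetContext(_drawingLayer);

    // Draw (or erase) the new segment into the offscreen layer.
    CGContextSetBlendMode(layerCtx, _erasing ? kCGBlendModeClear : kCGBlendModeNormal);
    CGContextSetLineWidth(layerCtx, 4.0f);
    CGContextBeginPath(layerCtx);
    CGContextMoveToPoint(layerCtx, _lastPoint.x, _lastPoint.y);
    CGContextAddLineToPoint(layerCtx, p.x, p.y);
    CGContextStrokePath(layerCtx);

    // Invalidate only the rectangle containing the two points, padded by the line width.
    CGRect dirty = CGRectUnion(CGRectMake(_lastPoint.x, _lastPoint.y, 1, 1),
                               CGRectMake(p.x, p.y, 1, 1));
    _lastPoint = p;
    [self setNeedsDisplayInRect:CGRectInset(dirty, -4, -4)];
}

- (void)drawRect:(CGRect)rect
{
    // Render the layer; the clip set up by setNeedsDisplayInRect: limits the work done.
    CGContextDrawLayerAtPoint(UIGraphicsGetCurrentContext(), CGPointZero, _drawingLayer);
}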
On an iPad 3 (the one that everyone has the most trouble with due to its enormous pixel count) this process takes between 0.05 ms and 0.15 ms per render (depending on how fast you swipe). There is one caveat though: if you don't take the proper precautions, the entire frame rectangle will be redrawn even if you only use setNeedsDisplayInRect:. My hacky way to combat this (thanks to the dev forums) is described in my self-answered question here. Otherwise, if your view takes a long time to draw the entire frame (mine took an unacceptable 150 ms), you will get a short stutter under certain conditions while the view buffer gets recreated.
EDIT With the new info from your comments, it seems that the answer to this question will benefit you -> Use a CoreGraphic Stroke as Alpha Mask in iPhone App
Hi, here is the code for painting, erasing, undo, redo, and saving as a picture. You can check the sample code and implement it in your project.
Here
I am deriving a custom control from TVirtualDrawTree and I am overriding the DoPaintBackground event to draw a background gradient effect for the tree view.
I am also overriding the DoBeforeItemPaint function so I can custom draw the tree view items. However, I can't quite manage to get the items to paint with a transparent background.
Looking into the source for TVirtualDrawTree, it looks as though the item is copied to a TBitmap image and then copied onto the canvas, however, I have tried editing the source and setting the transparency options on the bitmap itself and it still doesn't seem to be working.
I have also tried clearing the canvas before drawing (e.g. Canvas.Brush.Style := bsClear) and filling the item rect, but no joy.
Don't do transparency, cheat!
Drawing the gradient sounds like a lot of work: draw it to a temporary bitmap so you don't need to regenerate it every time DoPaintBackground() is called. Once you have the background in a bitmap, you can BitBlt the relevant portion into the Canvas when you handle DoBeforeItemPaint, and you can BitBlt the bitmap to the whole virtual tree when you need the whole background.
This way you don't need to deal with expensive transparency, yet to the end user it looks as if your items had been painted with transparency. It's a win-win situation.
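A rough sketch of the cheat, assuming FGradient is a TBitmap field holding the pre-rendered gradient; the exact parameter lists of the overridden methods vary between Virtual TreeView versions, so treat the signatures below as placeholders:

function TMyDrawTree.DoPaintBackground(Canvas: TCanvas; R: TRect): Boolean;
begin
  // Blit the cached gradient instead of regenerating it on every call.
  BitBlt(Canvas.Handle, R.Left, R.Top, R.Right - R.Left, R.Bottom - R.Top,
    FGradient.Canvas.Handle, R.Left, R.Top, SRCCOPY);
  Result := True; // tell the tree the background has already been painted
end;

function TMyDrawTree.DoBeforeItemPaint(Canvas: TCanvas; Node: PVirtualNode;
  ItemRect: TRect): Boolean;
begin
  // Copy the matching slice of the gradient behind the item, so the item looks
  // transparent without any real transparency handling.
  BitBlt(Canvas.Handle, ItemRect.Left, ItemRect.Top,
    ItemRect.Right - ItemRect.Left, ItemRect.Bottom - ItemRect.Top,
    FGradient.Canvas.Handle, ItemRect.Left, ItemRect.Top, SRCCOPY);
  Result := inherited DoBeforeItemPaint(Canvas, Node, ItemRect);
end;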