Recreate Photoshop's "Mixer Brush Tool" for an iOS painting app

I'm attempting to create a painting app for iOS. One feature I hope to include is something like Photoshop's "Mixer Brush Tool": in Photoshop you can adjust the wetness, mix, and flow of the brush.
I looked at all the Core Image effects, even the distortion effects, but did not see anything similar. Does anyone know of sample code that does this? If not, any direction to guide me would be great. I'm stuck at the moment.
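For what it's worth, the essence of a mixer brush isn't a Core Image filter but a per-dab blend: the brush carries a reservoir of paint, picks up some canvas color as it moves (wetness), keeps some of its own paint (mix), and deposits the result at a given strength (flow). A toy sketch of that state update, ignoring actual pixel rendering; the blending scheme is a guess at the behavior, not Adobe's actual algorithm:

```swift
import CoreGraphics

/// Toy model of a mixer-brush dab. `wetness`, `mix`, and `flow` are in 0...1
/// and loosely mirror Photoshop's sliders.
struct MixerBrush {
    var paint: (r: CGFloat, g: CGFloat, b: CGFloat)  // the brush's reservoir color
    var wetness: CGFloat  // how much canvas color the brush picks up
    var mix: CGFloat      // how much reservoir paint survives each pickup
    var flow: CGFloat     // deposit strength, used as the dab's alpha

    /// Updates the reservoir from the canvas color under the dab (the brush
    /// gets "dirty") and returns the RGBA color to deposit.
    mutating func dab(over canvas: (r: CGFloat, g: CGFloat, b: CGFloat))
        -> (r: CGFloat, g: CGFloat, b: CGFloat, a: CGFloat)
    {
        func lerp(_ a: CGFloat, _ b: CGFloat, _ t: CGFloat) -> CGFloat { a + (b - a) * t }
        // Pick up canvas color in proportion to wetness.
        let pickedUp = (r: lerp(paint.r, canvas.r, wetness),
                        g: lerp(paint.g, canvas.g, wetness),
                        b: lerp(paint.b, canvas.b, wetness))
        // Mix controls how much of the original reservoir survives.
        paint = (r: lerp(pickedUp.r, paint.r, mix),
                 g: lerp(pickedUp.g, paint.g, mix),
                 b: lerp(pickedUp.b, paint.b, mix))
        return (paint.r, paint.g, paint.b, flow)
    }
}
```

Rendering each dab is then an ordinary stamp-and-composite step; the interesting part is that, unlike a normal brush, the reservoir color keeps changing as the stroke crosses existing paint.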

Related

Does Apple's PencilKit support "Draw and Hold to Create Perfect Shapes," and if not, how can it be implemented?

What I want:
The picture is from the Notes app on iPad. I drew the blurred red path with the Apple Pencil; after holding, the perfect circle shape was generated.
I checked Apple's developer documentation for PencilKit. There seems to be no related API to support this.
But Apple's built-in Notes app on iPad can do it. Does that mean Apple has not opened up the relevant API? Either way, I really need to implement "draw and hold to create perfect shapes" in our app. How can it be done?
The only thing I can think of is to use Core ML to reshape the pencil path, but I know very little about it.
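One plausible DIY approach, without Core ML: detect that the pencil has stayed still at the end of a stroke, then fit an ideal shape to the accumulated points and swap it in. A minimal sketch under those assumptions (HoldDetector and fitCircle are my names, not any Apple API):

```swift
import Foundation
import CoreGraphics

/// Crude "perfect circle" fit: centroid as center, mean distance as radius.
/// A least-squares fit would be more robust, but this works for near-circular strokes.
func fitCircle(to points: [CGPoint]) -> (center: CGPoint, radius: CGFloat)? {
    guard points.count >= 3 else { return nil }
    let n = CGFloat(points.count)
    let center = CGPoint(x: points.reduce(0) { $0 + $1.x } / n,
                         y: points.reduce(0) { $0 + $1.y } / n)
    let radius = points.reduce(0) { $0 + hypot($1.x - center.x, $1.y - center.y) } / n
    return (center, radius)
}

/// Reports a "hold" once the pencil has stayed within `tolerance` points of
/// one position for at least `duration` seconds.
struct HoldDetector {
    var tolerance: CGFloat = 4
    var duration: TimeInterval = 0.6
    private var anchor: CGPoint?
    private var anchorTime: TimeInterval = 0

    mutating func update(point: CGPoint, time: TimeInterval) -> Bool {
        if let a = anchor, hypot(point.x - a.x, point.y - a.y) <= tolerance {
            return time - anchorTime >= duration  // still holding in place
        }
        anchor = point       // the pencil moved: restart the timer
        anchorTime = time
        return false
    }
}
```

From touchesMoved you would feed touch.location(in:) and touch.timestamp into update(point:time:); once it returns true, fit the circle to the stroke's points and replace the drawn path. Deciding which shape to fit (circle vs. line vs. rectangle) is the harder part, and presumably where Notes applies its own recognizer.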

Best approach for coding a painting app on iOS / iPad

I’m trying to build a drawing/painting app for the iPad, with textured brush tips and paper.
So far, all the drawing-app example code I've come across works by stroking a path. However, I'd like to actually apply a texture all along the path, to simulate, say, an oil brush or charcoal.
Here is an example of a brush tip texture: Brush tip
The result when painting with the same brush tip: Result
In the results, the top output is what it looks like when the "brush tip" texture is applied far apart along the path.
The bottom result is the texture applied with very small steps along the path. Those who've worked in Photoshop with custom brushes will find this familiar.
I had once prototyped this in Processing years ago (I've since lost the source code), and got it to work in real-time.
In Processing, I converted both the brush tip PNG and the canvas (or the image I'm painting onto) into an array of integers. Then I simply copied the values from the brush tip to the canvas texture at the appropriate index. At the end of the cycle, I displayed the image for that time-step. Repeat this dozens of times in between each point returned by the mouse.
How would I approach this in iOS, and in real-time? I tried this (https://blog.avenuecode.com/how-to-use-uikit-for-low-level-image-processing-in-swift) but it's way too slow.
This makes me believe Metal might be the only way forward. Is that true, or am I complicating this unnecessarily?
Thank you for any guidance!
PS. I'm coding in Swift 5, targeting iOS 13, in Xcode 11.5.
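For reference, here is the Processing approach described above transliterated to Swift and Core Graphics, as a minimal sketch (stampSegment is an illustrative name, not a framework API):

```swift
import UIKit

/// Returns a new canvas with `tip` stamped along the segment from `a` to `b`
/// at fixed `spacing`. Small spacing gives a continuous stroke (the bottom
/// result above); large spacing gives separated dabs (the top result).
func stampSegment(onto canvas: UIImage, tip: UIImage,
                  from a: CGPoint, to b: CGPoint, spacing: CGFloat) -> UIImage {
    let renderer = UIGraphicsImageRenderer(size: canvas.size)
    return renderer.image { _ in
        canvas.draw(at: .zero)
        let distance = hypot(b.x - a.x, b.y - a.y)
        let steps = max(Int(distance / spacing), 1)
        for i in 0...steps {
            let t = CGFloat(i) / CGFloat(steps)
            let p = CGPoint(x: a.x + (b.x - a.x) * t,
                            y: a.y + (b.y - a.y) * t)
            // Center the tip on each interpolated point.
            tip.draw(in: CGRect(x: p.x - tip.size.width / 2,
                                y: p.y - tip.size.height / 2,
                                width: tip.size.width,
                                height: tip.size.height))
        }
    }
}
```

Redrawing the full canvas per segment like this is exactly what gets slow on the CPU; in Metal the same loop becomes a batch of textured quads (one per stamp), which is what makes real-time feasible.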
Welcome!
I recommend you check out Core Image. It's Apple's framework for image processing (on a higher level than Metal, though it can integrate with Metal). Unfortunately, the documentation is a bit outdated, but I'm sure you can translate it into Swift.
Here Apple describes how you would realize a painting app with Core Image and here you can download the corresponding sample project.
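As a taste of what that looks like in code, here is a minimal sketch of one brush dab composited with Core Image's built-in source-over filter (applyDab is my name, not from Apple's sample project; CIFilterBuiltins requires iOS 13):

```swift
import CoreImage
import CoreImage.CIFilterBuiltins

/// Composites one brush dab over the canvas at `position` using Core Image.
/// Sketch only: a real app would reuse a single CIContext and render the
/// result into a Metal-backed view rather than rebuilding images per dab.
func applyDab(brush: CIImage, canvas: CIImage, at position: CGPoint) -> CIImage {
    // Move the dab so it is centered on the touch position.
    let moved = brush.transformed(by: CGAffineTransform(
        translationX: position.x - brush.extent.midX,
        y: position.y - brush.extent.midY))
    let filter = CIFilter.sourceOverCompositing()
    filter.inputImage = moved
    filter.backgroundImage = canvas
    return filter.outputImage ?? canvas
}
```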

AR.js Custom Markers

I am new to WebAR, but I have experience with building AR scenes with Unity3D and software such as Vuforia and 8thWall.
I have a question about the markers in AR.js: why are they stuck with the thick black border? Is the software not able to just recognize a unique image, the way Vuforia and Wikitude work? I apologize for how naive I may be when it comes to WebAR; however, I see this as an issue for the adoption of this technology if developers cannot use truly custom images and patterns. Is there a solution available that I may have missed somewhere? And what happens if someone deletes/erases the big black border on the marker? Does it still work?
Thanks to anyone who can shed some light on this!
AR.js uses ARToolKit and is therefore marker-based. If you want to use it, there is not much you can do about that. You can still have unique images inside the black box, but not much else.
Worth mentioning: it is possible to make the borders thinner; there's even a work-in-progress branch worth looking into.
aframe-argon tried integrating A-Frame and Vuforia, but I'm not sure if it's up to date.

How to achieve paint brush strokes in iOS

I am developing an iOS app that uses a paint brush to draw on a board. I have created a simple paint tool with which the user can paint, but what I really need is a paint tool that draws with a brush stroke. Is there any way to achieve this in native iOS app development?
Here is a sample image of the brush stroke I would like to create:
The fact is, you'll have to go to GLPaint:
https://developer.apple.com/library/ios/samplecode/GLPaint/Introduction/Intro.html
consider also https://github.com/rbuussyghin/glpaint
and https://stackoverflow.com/a/2045262/294884
This is not easy.
Indeed, it's a huge P.I.T.A. Surprisingly, nobody has a ready-made solution for this in iOS.
For 2019 ...
Surprisingly, as far as I know, there is STILL no ready-made solution for this. Strange thing!
For 2021 ...
It appears there are finally some Metal libraries; for example:
https://github.com/Harley-xk/MaLiang
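The core trick in GLPaint (and, as I understand it, in Metal libraries like MaLiang) is interpolating evenly spaced stamp positions between successive touch samples so the textured dabs read as one continuous stroke. A renderer-agnostic sketch of just that spacing logic (StrokeSpacer is my name for it, not an API from either project):

```swift
import Foundation
import CoreGraphics

/// Emits evenly spaced stamp points between two touch samples, carrying the
/// leftover distance into the next segment so spacing stays uniform across
/// the whole stroke instead of restarting at every touch event.
struct StrokeSpacer {
    let spacing: CGFloat
    private var leftover: CGFloat = 0

    init(spacing: CGFloat) { self.spacing = spacing }

    mutating func points(from a: CGPoint, to b: CGPoint) -> [CGPoint] {
        let dx = b.x - a.x, dy = b.y - a.y
        let length = hypot(dx, dy)
        guard length > 0 else { return [] }
        var result: [CGPoint] = []
        var d = spacing - leftover  // distance along this segment to the first stamp
        while d <= length {
            let t = d / length
            result.append(CGPoint(x: a.x + dx * t, y: a.y + dy * t))
            d += spacing
        }
        leftover = length - (d - spacing)  // distance covered since the last stamp
        return result
    }
}
```

Each returned point then gets one textured dab, whatever the renderer (Core Graphics, OpenGL, or Metal).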

Any idea about simulating the stroke in iOS Paint app?

I'm working on a paint app for the iOS platform, and I used CGContextAddQuadCurveToPoint to make the line smoother. But I'm stuck on how to simulate the stroke.
Paper by FiftyThree is a really cool app. I just want to simulate the stroke the way Paper does.
Any idea about changing the width of the line dynamically and smoothly during drawing?
I found an article about this stuff.
http://www.merowing.info/2012/04/drawing-smooth-lines-with-cocos2d-ios-inspired-by-paper/
This one is really reliable, and I used its algorithm to simulate a smooth stroke with Quartz 2D successfully.
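As I understand that approach, the gist is: the stroke width follows drawing speed (fast = thin, slow = thick), low-pass filtered so it never jumps, and each segment is stroked from midpoint to midpoint with the raw touch point as the quadratic control point (the same CGContextAddQuadCurveToPoint trick the question already uses). A sketch of the width part; the constants are illustrative, not from the article:

```swift
import UIKit

/// Maps drawing speed to stroke width and low-pass filters the result so
/// the width changes smoothly mid-stroke.
final class VariableWidthStroke {
    private var lastPoint: CGPoint?
    private var lastTime: TimeInterval = 0
    private var currentWidth: CGFloat = 6

    func width(for point: CGPoint, time: TimeInterval) -> CGFloat {
        defer { lastPoint = point; lastTime = time }
        guard let last = lastPoint, time > lastTime else { return currentWidth }
        let speed = hypot(point.x - last.x, point.y - last.y) / CGFloat(time - lastTime)
        let target = max(2, min(12, 12 - speed / 100))  // clamp to [2, 12] points
        currentWidth += (target - currentWidth) * 0.2   // smooth toward the target
        return currentWidth
    }
}
```

Each new quad-curve segment is then stroked at the width returned for its endpoint, which is what produces the tapering, pressure-like look.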
