Detect whether an image has good contrast and rich texture - iOS

I have a requirement where the user takes a picture, and I have to check whether that image will be suitable as an AR image-tracking target.
Example of a good image as shown at WWDC

ARReferenceImage's validate() function will determine whether an image is suitable for ARKit image tracking, for cases where you can't add the image through Xcode's asset catalog.
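A minimal sketch of validating a user-captured photo at run time, assuming iOS 13+ (where `validate(completionHandler:)` is available); the 0.1 m physical width is a placeholder assumption:

```swift
import ARKit

// Check whether a user-captured photo would make a usable tracking target.
// The physicalWidth (in meters) is a placeholder; use the real-world size.
func checkTrackability(of image: UIImage, completion: @escaping (Bool) -> Void) {
    guard let cgImage = image.cgImage else {
        completion(false)
        return
    }
    let reference = ARReferenceImage(cgImage,
                                     orientation: .up,
                                     physicalWidth: 0.1)
    // validate(completionHandler:) requires iOS 13 or later.
    reference.validate { error in
        // A nil error means ARKit considers the image trackable
        // (sufficient contrast and texture detail).
        completion(error == nil)
    }
}
```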

Related

Image Rotates and zooms while editing

I am using a library called SnapSliderFilters to apply filters to an image. I need the filters to be applied exactly the way they are presented in the library's example.
When I implement the code exactly as directed on an image captured with a custom camera, sliding sideways to apply a filter also rotates the image by 90 degrees and zooms it.
On the other hand, when I select an image from the gallery that has exactly the same dimensions as the screen, it works perfectly fine.
So I think the cause is the image size, but I don't know where to begin.
Here is how it displays:
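One likely cause (an assumption, not something the library documents) is that photos from a custom camera carry a non-`.up` EXIF orientation and a pixel size different from the screen, while gallery images at screen size have neither issue. A sketch that redraws the image at a target size with the orientation baked in, before handing it to the filter slider:

```swift
import UIKit

// Redraw the image so its orientation becomes .up and its size matches
// the target (e.g. the screen bounds), removing the 90° rotation and zoom.
func normalized(_ image: UIImage, to targetSize: CGSize) -> UIImage {
    let renderer = UIGraphicsImageRenderer(size: targetSize)
    return renderer.image { _ in
        // draw(in:) applies imageOrientation while rendering.
        image.draw(in: CGRect(origin: .zero, size: targetSize))
    }
}
```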

Is it possible to move the crop square inside a UIImagePickerController camera?

In the above image ([screenshot](https://i.stack.imgur.com/NCMd0.jpg)), how can I make the crop square movable, rather than the image?
My advice is to use a library to achieve this; there are several open-source iOS image-cropping libraries to choose from.

Choosing photos from the Camera Roll on every device

I have an app that chooses an image from the Camera Roll or takes a photo using the camera. The problem is that when I run the app on an iPad and select an image, the full image isn't displayed later on.
Image you select: (notice that the waterfall is pretty much centered)
Image displayed: (the image is displayed wrong)
The problem only occurs on big devices like iPads. I'm saving the image using Core Data and then recovering it. How can I get it to display the full image?
Niall
You need to set the UIImageView's contentMode to .scaleAspectFit
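For example (the image name is a placeholder):

```swift
import UIKit

let imageView = UIImageView(image: UIImage(named: "waterfall"))
// .scaleAspectFit shows the entire image, letterboxing rather than
// cropping when the aspect ratios of image and view differ.
imageView.contentMode = .scaleAspectFit
```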

Color selected part of image taken from camera on touch

My requirement is to fill a specific color on a specific area of an image, where the image is taken from the iPhone camera or the photo gallery. For example, I could take a picture of myself in a blue shirt, and the app should allow me to change the color of the shirt to red.
Exactly the functionality of Photoshop's "Paint bucket" tool.
I found a couple of approaches:
1) Using MASKS with prepared images
color selected part of image on touch
Fill color on specific portion of image?
Scanline Flood Fill Algorithm
https://github.com/Chintan-Dave/UIImageScanlineFloodfill
2) Using GLPaint (actually this is NOT the solution I'm after)
My question is:
Is it possible to color a specific area of an image WITHOUT prepared masks, or by generating masks for the image at run time?
The Scanline Flood Fill algorithm does that to a certain level, but will it work correctly on real-world images (like selfies)?
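For illustration, here is a minimal 4-way flood fill with a tolerance threshold, sketched on a grayscale grid rather than real pixel data (the grid, function name, and tolerance value are hypothetical; a production version such as UIImageScanlineFloodfill operates on the image's raw bitmap instead). The tolerance is what lets the fill cope with the gradual shading of real photos:

```swift
// Flood-fill all 4-connected pixels whose value lies within `tolerance`
// of the starting pixel, replacing them with `newValue`.
func floodFill(_ grid: inout [[Int]], startRow: Int, startCol: Int,
               newValue: Int, tolerance: Int) {
    let rows = grid.count
    let cols = grid[0].count
    let target = grid[startRow][startCol]
    var stack: [(Int, Int)] = [(startRow, startCol)]
    while let (r, c) = stack.popLast() {
        guard r >= 0, r < rows, c >= 0, c < cols else { continue }
        // Skip pixels already filled or outside the tolerance band.
        guard grid[r][c] != newValue,
              abs(grid[r][c] - target) <= tolerance else { continue }
        grid[r][c] = newValue
        stack.append((r + 1, c))
        stack.append((r - 1, c))
        stack.append((r, c + 1))
        stack.append((r, c - 1))
    }
}
```

On a real selfie, a fixed tolerance over RGB values tends to bleed across soft edges, which is why edge-aware masking (e.g. a segmentation mask generated at run time) usually gives better results than a plain flood fill.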

iOS Native Image Cropping with image OUTSIDE of UIImagePickerController

So, if I select an image with the UIImagePicker from the camera or Camera Roll, it provides this great cropping feature before returning to my app.
I was hoping there was a way to call this crop UI directly without going through the camera or Camera Roll. If I have an existing image in my app (for instance, a user profile photo) and the user wants to re-crop that image, can I open the crop tool?
I haven't seen anything about invoking this crop tool outside of UIImagePicker.
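As far as I know, the picker's built-in crop UI isn't exposed as a standalone public API, so apps typically either use a third-party crop view controller or crop manually. A minimal manual-crop sketch using `CGImage.cropping(to:)` (the rect is a placeholder in pixel coordinates):

```swift
import UIKit

// Crop an existing image to a rect given in pixel coordinates.
// For a rect in points, multiply its components by image.scale first.
func crop(_ image: UIImage, to cropRect: CGRect) -> UIImage? {
    guard let cgImage = image.cgImage?.cropping(to: cropRect) else {
        return nil
    }
    return UIImage(cgImage: cgImage,
                   scale: image.scale,
                   orientation: image.imageOrientation)
}
```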
