How to preview automatically edited image after picking with UIImagePickerController - ios

In my app users can take and save photos, but before I save them to disk I have to compress and downscale them. Is it possible to show the automatically edited image in the standard preview screen right after the user captures the image with UIImagePickerController? Or should I build my own camera with AVFoundation? If so, could anyone suggest a lightweight open-source camera for my purposes?

You're going to have to build your own solution with AVCaptureSession, which is not hard, since you will more than likely want to keep the original photo in a temp file, compress it, show it on a custom view containing an image view, and then ask the user whether they want to save it or not.
Here are Apple's docs, but there are plenty of tutorials on how to do this.
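If you do end up keeping UIImagePickerController and only want to skip its stock preview, a minimal sketch of that alternative looks like this: intercept the captured image in the delegate, downscale and compress it there, and present it in your own preview UI. The class name, the 1280 pt limit, and the 0.7 JPEG quality are illustrative assumptions, not anything from the question.

```swift
import UIKit

// Sketch: intercept the picked image, downscale + compress, then show your own preview.
final class PhotoCaptureCoordinator: NSObject,
    UIImagePickerControllerDelegate, UINavigationControllerDelegate {

    // Called with the downscaled image and its compressed JPEG data.
    var onEditedImage: ((UIImage, Data?) -> Void)?

    func imagePickerController(_ picker: UIImagePickerController,
                               didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
        guard let original = info[.originalImage] as? UIImage else { return }

        // Downscale so the longer side is at most 1280 pt (arbitrary example value).
        let maxSide: CGFloat = 1280
        let scale = min(1, maxSide / max(original.size.width, original.size.height))
        let targetSize = CGSize(width: original.size.width * scale,
                                height: original.size.height * scale)
        let resized = UIGraphicsImageRenderer(size: targetSize).image { _ in
            original.draw(in: CGRect(origin: .zero, size: targetSize))
        }

        // Compress before writing anything to disk.
        let jpeg = resized.jpegData(compressionQuality: 0.7)

        // Dismiss the picker and hand the *edited* image to your own preview screen.
        picker.dismiss(animated: true) { [weak self] in
            self?.onEditedImage?(resized, jpeg)
        }
    }
}
```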

Related

How to capture a photo automatically in iPhone and iPad

How to capture photo automatically in android phone? is about how to take a picture automatically, without the user's interaction. This feature is needed in many applications. For example, when you are taking a picture of a document, you expect the camera to capture it automatically once the full document (or its four corners) is inside the frame. So my question is: how would you do this on iPhone or iPad?
Recently I have been working with Cordova; does anyone know of existing plugins for this kind of camera operation? Thanks
EDIT:
This operation will be done in an app that has full access to the camera; the task is how to develop such an app.
Instead of capturing a photo, you should capture video frames. When a captured frame satisfies your requirements, stop capturing video and proceed.
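A rough sketch of that frame-based approach, assuming an AVCaptureVideoDataOutput feeding a delegate; frameLooksLikeFullDocument(_:) is a hypothetical placeholder for whatever corner or rectangle detection you plug in:

```swift
import AVFoundation

// Sketch: stream video frames, inspect each one, and stop when a frame qualifies.
final class AutoCaptureController: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {

    let session = AVCaptureSession()
    private let videoOutput = AVCaptureVideoDataOutput()

    func start() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        let input = try AVCaptureDeviceInput(device: camera)

        session.beginConfiguration()
        if session.canAddInput(input) { session.addInput(input) }
        videoOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "frames"))
        if session.canAddOutput(videoOutput) { session.addOutput(videoOutput) }
        session.commitConfiguration()

        session.startRunning()
    }

    // Called for every frame the camera delivers.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        if frameLooksLikeFullDocument(pixelBuffer) {
            session.stopRunning()
            // Hand the pixel buffer (or an image made from it) to the rest of the app.
        }
    }

    private func frameLooksLikeFullDocument(_ buffer: CVPixelBuffer) -> Bool {
        // Placeholder: run your own edge/corner (or Vision rectangle) detection here.
        return false
    }
}
```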

How to implement a camera taking GIF in iOS?

I want to implement a feature that enables users to make a GIF directly from their camera.
In detail, I want to show users a camera view and a record button. When the button is tapped, the camera starts to record video. Behind the scenes, however, the camera is actually taking photos at a constant rate, say one shot per 0.5 second. When the recording ends, we have an array of images and can then combine them into a GIF.
I think there might be 2 approaches:
1. Directly take images: use AVCaptureStillImageOutput's -captureStillImageAsynchronouslyFromConnection method. But it blocks the UI every time it is called.
2. Take a video and extract several images from it. I have looked at video-recording libraries such as PBJVision and SCRecorder, and noticed that taking video typically means writing data to an .mp4 file locally. I cannot figure out how to extract images at specific time intervals from a video file. Also, is there a way to keep the video in memory?
Could anyone help?
Creating Gif
Create and export an animated gif via iOS?
Convert Images to gif using ios
Extract Images from Video
Get a particular frame by time value using AVAssetReader
Similar: Creating a Movie from Images
How do I export UIImage array as a movie?
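For the "images to GIF" step specifically, ImageIO can write an animated GIF directly, so a third-party library is optional. A hedged sketch, assuming you already have the UIImage array and a destination URL:

```swift
import UIKit
import ImageIO
import MobileCoreServices

// Sketch: assemble an array of UIImages into an animated GIF on disk.
func writeGIF(images: [UIImage], frameDelay: TimeInterval, to url: URL) -> Bool {
    // Loop count 0 = loop forever.
    let fileProperties = [kCGImagePropertyGIFDictionary as String:
                              [kCGImagePropertyGIFLoopCount as String: 0]] as CFDictionary
    let frameProperties = [kCGImagePropertyGIFDictionary as String:
                               [kCGImagePropertyGIFDelayTime as String: frameDelay]] as CFDictionary

    guard let destination = CGImageDestinationCreateWithURL(url as CFURL,
                                                            kUTTypeGIF,
                                                            images.count,
                                                            nil) else { return false }
    CGImageDestinationSetProperties(destination, fileProperties)

    for image in images {
        if let cgImage = image.cgImage {
            CGImageDestinationAddImage(destination, cgImage, frameProperties)
        }
    }
    return CGImageDestinationFinalize(destination)
}
```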
You can use a library called 'Regift' by Matthew Palmer, which will convert video to GIF.
Here it is: https://github.com/matthewpalmer/Regift
You can also check out the following answer here on SO:
https://stackoverflow.com/a/28150109/3288936
Hope this will help! :)
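And for the part of the question about pulling images out of a recorded video at fixed intervals, AVAssetImageGenerator can do that once the clip is on disk. A rough sketch, assuming the recording is an .mp4 at videoURL:

```swift
import AVFoundation
import UIKit

// Sketch: grab one frame every `interval` seconds from a video file.
func extractFrames(from videoURL: URL, every interval: TimeInterval) -> [UIImage] {
    let asset = AVAsset(url: videoURL)
    let generator = AVAssetImageGenerator(asset: asset)
    generator.appliesPreferredTrackTransform = true
    generator.requestedTimeToleranceBefore = .zero   // ask for exact frame times
    generator.requestedTimeToleranceAfter = .zero

    var images: [UIImage] = []
    var time: TimeInterval = 0
    let duration = CMTimeGetSeconds(asset.duration)

    while time < duration {
        let cmTime = CMTime(seconds: time, preferredTimescale: 600)
        if let cgImage = try? generator.copyCGImage(at: cmTime, actualTime: nil) {
            images.append(UIImage(cgImage: cgImage))
        }
        time += interval
    }
    return images
}
```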

Create custom UIImagePickerController to set aspect ratio of camera

Stack Overflow probably has 1000 threads addressing and re-addressing cropping images on iOS, many of which contain answers claiming to work just like Instagram (my guess is they never opened Instagram). So instead of simply using the word Instagram, let me describe the functionality I am trying to build:
I want to create a custom UIImagePickerController such that:
CAMERA: I can either set the exact dimensions of the image that the camera takes, or, on the very next screen (i.e. retake/use image), have a custom rectangle that the user can move to crop the image just taken.
GALLERY: (same as above) set the dimensions of the frame that the user will use to crop the image.
So far on SO one answer comes close: the answer points to https://github.com/gekitz/GKImagePicker.
But the crucial problem with that project is that it only works with the gallery; it does not work with the camera.
So lastly, if you look at the Instagram app on iOS 7, it takes complete control of the picture-taking experience. How do I do that? I don't want my users to go through the whole standard iOS UIImagePickerController experience and then, on top of that, go through my own cropper just to load an image. That's simply a terrible user experience. How many steps or screens should it take to capture or load a picture? I know I can't be the only developer who would love to solve this problem. I don't mind being the one to find the solution and then share it with the world, but at present I don't even know where to start.
Does anyone have any ideas where I might start?
BTW: the following are not answers:
how to change the UIImagePickerController crop frame
Set dimensions for UIImagePickerController "move and scale" cropbox
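One possible starting point (not a complete answer): UIImagePickerController can hide its stock controls and take an overlay view, which is the usual first step toward owning the capture experience. A minimal sketch, with the overlay view, delegate, and the 60 pt preview offset all supplied or tuned by your own code:

```swift
import UIKit

// Sketch: a camera picker with the stock chrome hidden and a custom overlay on top.
func makeCustomCameraPicker(delegate: UIImagePickerControllerDelegate & UINavigationControllerDelegate,
                            overlay: UIView) -> UIImagePickerController {
    let picker = UIImagePickerController()
    picker.sourceType = .camera
    picker.showsCameraControls = false        // hide shutter, flash, cancel, etc.
    picker.cameraOverlayView = overlay        // your own shutter button + crop guides
    picker.delegate = delegate

    // Optionally shift/scale the live preview so only the region you want is visible;
    // the exact transform depends on the device and the aspect ratio you are after.
    picker.cameraViewTransform = CGAffineTransform(translationX: 0, y: 60)
    return picker
}
```

Your overlay's shutter button would then call takePicture() on the picker and the delegate receives the image as usual; the fixed-aspect crop itself still has to be applied to the returned UIImage.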

ALAsset of an image taken from the camera without saving it

Hi, I was wondering if there's a way to extract an ALAsset for an image taken with the camera, but without saving it...
I've come across various examples that use writeImageToSavedPhotosAlbum and then fetch the ALAsset, but I don't think it should be necessary to save the image to the camera roll; I was just wondering whether this could be done another way.
No ALAsset exists until the image has been successfully saved to the photo library. Until then you just have a UIImage that came from the picker. There is no requirement that the image be saved into the library; any decision about whether to save it should be based on what the app tells the user and on whether the user would naturally expect to find the image in the library after taking or saving it in the app.
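A minimal sketch of what that looks like in practice, assuming the made-up class below is the picker's delegate: you keep the UIImage in memory or write it to your app's own sandbox, and nothing touches the photo library unless you explicitly save it.

```swift
import UIKit

// Sketch: use the picked UIImage directly; no ALAsset, no camera-roll write.
final class InMemoryPickerDelegate: NSObject,
    UIImagePickerControllerDelegate, UINavigationControllerDelegate {

    func imagePickerController(_ picker: UIImagePickerController,
                               didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
        guard let image = info[.originalImage] as? UIImage else { return }

        // Keep it in memory, or persist it in your own sandbox (file name is just an example).
        if let data = image.jpegData(compressionQuality: 0.9) {
            let url = FileManager.default.temporaryDirectory.appendingPathComponent("capture.jpg")
            try? data.write(to: url)
        }

        // Only do this if the user actually expects the photo in their library:
        // UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)

        picker.dismiss(animated: true)
    }
}
```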

Simple way to capture a high quality image with AVCapture

All I need to do is capture an image, and all I can find is complicated code on capturing video or multiple frames. I can't use UIImagePickerController because I do not want to see the camera shutter animation and I have a custom overlay, and my app is landscape only. What is the simplest way to manually capture an image from the front or back live camera view in the correct orientation? I don't want to save it to the camera roll, I want to present it in a view controller for editing.
Take a look at the SquareCam example from Apple (http://developer.apple.com/library/ios/#samplecode/SquareCam/Introduction/Intro.html). It contains everything you need for high-quality image capture. I recently copied code from that project myself to solve the same task as you. It works well :)
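Note that SquareCam is built on the older AVCaptureStillImageOutput API; on current iOS the equivalent minimal setup uses AVCapturePhotoOutput instead. A hedged sketch (the class name is made up, and you would still add a preview layer and the camera usage-description key yourself):

```swift
import AVFoundation
import UIKit

// Sketch: one session, one photo output, image delivered in memory, nothing saved.
final class SimpleStillCamera: NSObject, AVCapturePhotoCaptureDelegate {

    let session = AVCaptureSession()
    private let photoOutput = AVCapturePhotoOutput()
    var onCapture: ((UIImage) -> Void)?

    func configure(position: AVCaptureDevice.Position = .back) throws {
        guard let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video,
                                                   position: position) else { return }
        let input = try AVCaptureDeviceInput(device: device)

        session.beginConfiguration()
        session.sessionPreset = .photo
        if session.canAddInput(input) { session.addInput(input) }
        if session.canAddOutput(photoOutput) { session.addOutput(photoOutput) }
        session.commitConfiguration()

        session.startRunning()
    }

    func snap() {
        photoOutput.capturePhoto(with: AVCapturePhotoSettings(), delegate: self)
    }

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard error == nil,
              let data = photo.fileDataRepresentation(),
              let image = UIImage(data: data) else { return }
        onCapture?(image)   // present it in your editing view controller; nothing is saved
    }
}
```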
