Picking images from camera and album in iOS Swift 5

I'm working on an application where users will select multiple images from the album, or take pictures with the camera, and send them to a server via a REST API. The size limit is 5 MB for all images. Is there any way to determine the size of the images while selecting them from UIImagePicker, before loading them into the application? Please also suggest a better approach for doing this.

You can't do it with the built-in UIImagePickerController. You can write your own asset picker.
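If you build your own picker on top of PhotoKit, you can check each asset's byte count before committing to an upload. A minimal sketch, assuming photo-library authorization has already been granted (requestImageDataAndOrientation requires iOS 13+); the 5 MB constant mirrors the question's limit:

```swift
import Photos

// Reject any asset whose encoded image data exceeds the upload limit.
let maxUploadSize = 5 * 1024 * 1024

func checkSize(of asset: PHAsset, completion: @escaping (Bool) -> Void) {
    let options = PHImageRequestOptions()
    options.isNetworkAccessAllowed = true  // fetch from iCloud if needed
    PHImageManager.default().requestImageDataAndOrientation(for: asset,
                                                            options: options) { data, _, _, _ in
        guard let data = data else { completion(false); return }
        completion(data.count <= maxUploadSize)
    }
}
```

This does load the image data in order to measure it, so for large libraries you would want to run the check lazily, per selected asset.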

Related

Storing UIImage for later use

I am making what can be described as an image manipulation app. My idea is that the user imports UIImages through the pod YPImagePicker, and they are stored in two arrays: one for thumbnails and one for the full-sized images. A UICollectionView is then populated with the thumbnails, and when the user taps a thumbnail the full-sized image is displayed in a UIImageView.
I am having memory issues with this solution. RAM usage hits 300 MB on an iPhone X when I have roughly 10-12 images imported, which I understand is too much. I guess it is because I store all the full-sized images in an array? Should I store the full-sized images on the user's hard drive rather than in RAM? Or is there some way I can access the images from the user's photo library and fetch the image when the user taps the thumbnail?
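One way to relieve the memory pressure, sketched below with hypothetical helpers (not code from the question), is to write each full-sized image to disk on import, keep only the thumbnails in RAM, and load a full-sized image from disk when its thumbnail is tapped:

```swift
import UIKit

// Persist a full-sized image to the caches directory and return its URL,
// so the array only needs to hold lightweight file URLs.
func persistFullSize(_ image: UIImage, id: UUID) -> URL? {
    guard let data = image.jpegData(compressionQuality: 0.9) else { return nil }
    let url = FileManager.default.urls(for: .cachesDirectory,
                                       in: .userDomainMask)[0]
        .appendingPathComponent("\(id.uuidString).jpg")
    do {
        try data.write(to: url)
        return url
    } catch {
        return nil
    }
}

// Decode the full-sized image only when its thumbnail is tapped.
func loadFullSize(at url: URL) -> UIImage? {
    guard let data = try? Data(contentsOf: url) else { return nil }
    return UIImage(data: data)
}
```

This keeps at most one decoded full-sized image in memory at a time instead of ten or twelve.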

Is "slicing" a video attached to iOS app assets possible?

With PNG images, one can attach img.png, img@2x.png, and img@3x.png via assets and then load the correct one at runtime based on the device screen dimensions. In addition, the iTunes upload/download process will "slice" asset resources so that a large app containing 1x, 2x, and 3x assets can avoid having to download the 1x and 3x assets when running on a 2x device. What I am wondering is this: can videos attached to an app also take advantage of slicing, so that the app need not download three sets of video resources in those cases where videos created for specific screen sizes are included in the app? I see that assets do contain a generic "data" type, but it seems to be able to mark data files as differing only in terms of the Metal API version.
So, I looked all around the internets but was unable to find anything on how to solve this tricky issue. In the end, I rolled my own custom solution that basically wraps the .m4v video file and treats the binary file as a PNG image, so that it can be included in an iOS asset catalog. This makes it possible to load @2x or @3x assets on iPhone and use iPad-specific video asset dimensions on iPad, while also supporting slicing. For a working example, take a peek at my AlphaOverVideo framework on GitHub; the Bloom demo shows the client-side logic to decode from PNG and then load the decoded .m4v video into a looping player. This demo is cool because it shows off the slicing idea, but it also contains a full-screen, 1-to-1 pixel aspect ratio video of a flower blooming that shows how amazing perfectly rendered video can look. If anyone is interested in the command-line encoder, I uploaded it to GitHub as PNGRewrap.
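For completeness, the generic "data" asset type mentioned in the question can at least be loaded and played at runtime along these lines. A minimal sketch, not the AlphaOverVideo code; "IntroVideo" is a hypothetical asset name, and as noted in the question, data assets cannot be varied by scale factor:

```swift
import UIKit
import AVFoundation

// Load a movie stored as an NSDataAsset, write it to a temporary file
// (AVPlayer needs a URL), and return a player for it.
func playerForDataAsset(named name: String) -> AVPlayer? {
    guard let asset = NSDataAsset(name: name) else { return nil }
    let url = FileManager.default.temporaryDirectory
        .appendingPathComponent("\(name).m4v")
    do {
        try asset.data.write(to: url)
        return AVPlayer(url: url)
    } catch {
        return nil
    }
}

let player = playerForDataAsset(named: "IntroVideo")
```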

Difference in quality when video is uploaded directly from camera and when it is selected from gallery

Problem
I am currently developing an iPhone app where I need to upload a video with a multipart POST request.
I am using AFNetworking and UIImagePickerController.
I set the quality of the UIImagePickerController to high.
When I try to upload a video that has just been recorded in the app, the quality is significantly lower than when I choose a previously recorded video from the gallery.
Question
Does anybody have experience with an issue like this one?
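For reference, the picker configuration described above looks roughly like this (a sketch, not the asker's exact code). One possible explanation, though not confirmed in this thread, is that the picker transcodes in-app recordings to its quality preset, while a video picked from the gallery can come through as originally recorded by the Camera app:

```swift
import UIKit
import MobileCoreServices

// Configure the picker to record video in-app at the highest quality preset.
let picker = UIImagePickerController()
if UIImagePickerController.isSourceTypeAvailable(.camera) {
    picker.sourceType = .camera
    picker.mediaTypes = [kUTTypeMovie as String]
    picker.cameraCaptureMode = .video
    picker.videoQuality = .typeHigh  // default is .typeMedium
}
```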

Take panoramic photos from iPhone app and split into several photos

I am developing an application that allows the user to take panoramic photos with an iPhone. These photos will be used for a virtual tour by the user. I have done some exploration on this and found some good open-source libraries that build a panoramic view from n photos (it feels like Google Maps Street View).
I captured a photo in panorama mode with an iPhone 5S, but importing it does not give me many images; it just gives a single large image.
Is it possible to split iPhone panoramic images into multiple images? If so, how can I do this?
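There is no API that recovers the individual source frames; as far as I know, the Camera app only saves the final stitched image. But you can crop the single wide image into equal-width tiles yourself with Core Graphics. A minimal sketch (hypothetical helper, not from the question):

```swift
import UIKit

// Slice a wide panorama into `count` equal-width tiles.
func splitPanorama(_ image: UIImage, into count: Int) -> [UIImage] {
    guard let cgImage = image.cgImage, count > 0 else { return [] }
    let tileWidth = cgImage.width / count
    return (0..<count).compactMap { i in
        let rect = CGRect(x: i * tileWidth, y: 0,
                          width: tileWidth, height: cgImage.height)
        return cgImage.cropping(to: rect).map { UIImage(cgImage: $0) }
    }
}
```

Note that flat tiles cut from a panorama are not the same as the perspective-corrected frames a virtual-tour library expects, so you may still need to reproject them.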

Are asset thumbnails pre-saved in iOS?

I am writing an app that relays an image saved on the iOS device to a larger screen. Since a full-resolution image is too large and takes too long to transfer (using an on-device CocoaHTTP server), I am trying to load a thumbnail first.
In Windows we have thumbs.db, which means that if we access it there is no image resizing, etc.; it's a thumbnail version of the image pre-saved by the OS.
Does [UIImage imageWithCGImage:asset.aspectRatioThumbnail] for the ALAsset class in iOS do the same, or does it load the complete hi-res image and then scale it down before returning?
The documentation does not specify, but in my experiments reading the thumbnail is about 5 times faster than loading the image from disk (even without decoding or scaling it). I assume iOS stores the pre-made thumbnails somewhere for fast access.
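For the modern equivalent (ALAssetsLibrary has since been deprecated in favor of PhotoKit), requesting a small target size with the fast delivery mode returns a quickly loadable rendition, which in practice appears to come from those pre-made thumbnails. A minimal sketch:

```swift
import Photos
import UIKit

// Request a small, fast-loading rendition of the asset rather than
// decoding the full-resolution original.
func loadThumbnail(for asset: PHAsset, side: CGFloat = 150,
                   completion: @escaping (UIImage?) -> Void) {
    let options = PHImageRequestOptions()
    options.deliveryMode = .fastFormat  // prefer speed over quality
    options.resizeMode = .fast
    PHImageManager.default().requestImage(for: asset,
                                          targetSize: CGSize(width: side, height: side),
                                          contentMode: .aspectFit,
                                          options: options) { image, _ in
        completion(image)
    }
}
```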
