Swift - force square photo from library and camera - iOS

I'm building an app and I need to FORCE the user to upload square pictures (just like Instagram does), but I'd like to avoid programming an interface from scratch as we're short on time.
It is important to note that the USER must CHOOSE which part of the image he/she wants to show, so cropping the image programmatically without asking the user is out of the question.
I've managed to get this to work via the camera, but via the library I can't seem to force the user to use a square image. Here's the code I have:
func presentGallery() {
    // from library
    picker.allowsEditing = true
    picker.sourceType = UIImagePickerControllerSourceType.PhotoLibrary
    presentViewController(picker, animated: true, completion: nil)
}
Then on my imagepickercontroller:
var chosenImage = info[UIImagePickerControllerEditedImage] as! UIImage
However, I don't get the desired result. It would be fine if the "minimum zoom" showed 100% of the height of the image, or if I could add a white/black background to the top and bottom of the image.
(Screenshots comparing the current result with the desired square crop are omitted.)
My app needs to work starting from iOS 7.

You should do some sort of check to make sure that the picture is square if they're picking from their library.
Once you receive the image in imagePickerController(_:didFinishPickingMediaWithInfo:), get it from info[UIImagePickerControllerOriginalImage]. Once you've done this, perform the check:
if image.size.height != image.size.width { /* show some alert */ }
What might be an even better solution is creating a view which allows the user to pick any photo, and then choose a square part of the photo to import into your app, like Instagram does.
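In Swift, the squareness check can be pulled into a small helper with a tolerance, so a fraction-of-a-point rounding difference from the picker's crop doesn't reject an effectively square image. A minimal sketch; `isSquare` is a hypothetical helper name, not a UIKit API:

```swift
import Foundation

// Hypothetical helper: treat an image size as square if the sides
// differ by at most `tolerance` points (rounding during the picker's
// crop can make a square image off by a fraction of a point).
func isSquare(_ size: CGSize, tolerance: CGFloat = 1.0) -> Bool {
    return abs(size.width - size.height) <= tolerance
}
```

In the picker delegate you would call this with the size of the image from info[UIImagePickerControllerOriginalImage] and present an alert when it returns false.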

Share full-size image from remote URL

I found this tutorial in order to let my users share images from my app:
https://www.hackingwithswift.com/articles/118/uiactivityviewcontroller-by-example
Here is the relevant code:
let items = [yourImage]
let ac = UIActivityViewController(activityItems: items, applicationActivities: nil)
present(ac, animated: true)
It looks like I need to pass a UIImage in the items array.
In my app, I show a feed of images. In the API call for the feed, each image has a thumbnail URL (smaller size) and a full-size URL. My app displays the thumbnail images in the feed in order to keep my app speedy.
This activity view controller is triggered when the user long-presses one of the images in the feed. The problem is that I want the user to be able to share the full-size image (which isn't loaded into any ImageView in the feed), instead of the thumbnail image.
How can I do this?
I've considered fetching the full-size image on long press by following this tutorial, but this causes issues in that the activity view controller doesn't actually show until the image is fully downloaded.
What else can I do?
As an additional question, would the same apply to if I wanted to let the user share a video (mp4)?
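One way to avoid blocking the share sheet on the download is to hand UIActivityViewController a UIActivityItemProvider subclass: the sheet is presented immediately with a placeholder (for example the thumbnail you already have), and the provider's item property is only evaluated, on a background queue, once the user picks an activity. A rough sketch, assuming a blocking download is acceptable inside item (the class and variable names are placeholders):

```swift
import UIKit

// Hypothetical provider: the share sheet appears immediately with the
// thumbnail as a placeholder; the full-size image is fetched lazily,
// off the main thread, when the user actually chooses an activity.
final class FullSizeImageProvider: UIActivityItemProvider {
    let fullSizeURL: URL

    init(thumbnail: UIImage, fullSizeURL: URL) {
        self.fullSizeURL = fullSizeURL
        super.init(placeholderItem: thumbnail)
    }

    // UIKit calls this on a background queue, so a synchronous
    // download is tolerable here; fall back to the placeholder
    // (the thumbnail) if the fetch fails.
    override var item: Any {
        guard let data = try? Data(contentsOf: fullSizeURL),
              let image = UIImage(data: data) else {
            return placeholderItem ?? UIImage()
        }
        return image
    }
}

// Usage:
// let provider = FullSizeImageProvider(thumbnail: thumbImage,
//                                      fullSizeURL: fullURL)
// let ac = UIActivityViewController(activityItems: [provider],
//                                   applicationActivities: nil)
// present(ac, animated: true)
```

The same pattern should apply to a video: download the mp4 to a temporary file in item and return the file URL instead of a UIImage.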

Share image and/or text through Share Menu on iOS (Swift)

I am looking for a way to share both an image (a screenshot generated by the app) and text, with a preference to the image.
When I try to achieve this, I see it only works in Apple's own Messages app and maybe a few other apps. Apps like Snapchat, TikTok and other popular apps don't show up in the default share menu provided by Apple; for the other apps that do appear, only the text shows up and the image is ignored.
However, when I leave the text out of the UIActivityViewController so it contains only the image to share, all the other apps show up and most of them work perfectly with my image.
How can I make my code share BOTH the image and the text where supported, and otherwise give preference to the image over the text? I still want all the apps (like Snapchat, TikTok and other social media apps) to show up in the share menu, which currently only happens when the image alone is in the UIActivityViewController.
This is the code to make sharing work in my app.
func shareMenu() {
    if var top = scene?.view?.window?.rootViewController {
        while let presentedViewController = top.presentedViewController {
            top = presentedViewController
        }
        let screenshotImage = getScreenshot(scene: scene!)
        let activityVC = UIActivityViewController(activityItems: [screenshotImage, "The text that needs to be shared."], applicationActivities: nil)
        activityVC.popoverPresentationController?.sourceView = view
        top.present(activityVC, animated: true, completion: nil)
    }
}
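One commonly suggested approach is to wrap the text in a UIActivityItemSource, so it can be supplied or withheld per activity type while the image stays a plain item in the array. That way image-only apps still appear in the sheet and receive the image, and you decide case by case whether the text comes along. A sketch; the excluded activity type below is an arbitrary example, adjust it to the apps you observe misbehaving:

```swift
import UIKit

// Wraps the optional share text so it can be dropped for specific
// activity types. The image remains a separate, plain activity item.
final class OptionalTextItem: NSObject, UIActivityItemSource {
    let text: String
    init(text: String) { self.text = text }

    func activityViewControllerPlaceholderItem(
        _ activityViewController: UIActivityViewController) -> Any {
        return text
    }

    func activityViewController(
        _ activityViewController: UIActivityViewController,
        itemForActivityType activityType: UIActivity.ActivityType?) -> Any? {
        // Returning nil withholds the text for activities that would
        // otherwise ignore the image in favor of it.
        switch activityType {
        case UIActivity.ActivityType.postToFacebook?:
            return nil
        default:
            return text
        }
    }
}

// let items: [Any] = [screenshotImage, OptionalTextItem(text: "Shared text")]
// let activityVC = UIActivityViewController(activityItems: items,
//                                           applicationActivities: nil)
```

Note that third-party share extensions decide for themselves which attachment they take, so this controls what you offer, not what each app ultimately uses.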

UIImageView: How to show scaled image?

I'm new to iOS app development and I began learning from this great tutorial:
Start Developing iOS Apps (Swift)
https://developer.apple.com/library/content/referencelibrary/GettingStarted/DevelopiOSAppsSwift/WorkWithViewControllers.html#//apple_ref/doc/uid/TP40015214-CH6-SW1
Everything looks great, but when I try to load an image from "Camera Roll" in the simulator, the image is shown oversized, filling the whole iPhone simulator screen instead of fitting into the UIImageView box as shown in the tutorial images.
Oddly, I get the same issue even with the completed project provided at the end of the lesson (see the bottom of the page).
Googling around, I found suggestions to insert:
// The info dictionary may contain multiple representations of the image. You want to use the original.
guard let selectedImage = info[UIImagePickerControllerOriginalImage] as? UIImage else {
    fatalError("Expected a dictionary containing an image, but was provided the following: \(info)")
}

// Set photoImageView to display the selected image.
photoImageView.image = selectedImage
photoImageView.contentMode = UIViewContentMode.scaleAspectFit

// Dismiss the picker.
dismiss(animated: true, completion: nil)
after loading the image from the Camera Roll, but this led to no results.
I'm using Xcode 9.2 with deployment target 11.2 and iPhone 8 Plus as the simulator target.
Any help is appreciated.
It happens because the sample code only constrains the proportions of the UIImageView (the 1:1 aspect-ratio constraint you added in the storyboard). This always keeps your image view square but does not limit its size.
As the tutorial notes, the placeholder size you entered is only kept until you set an image. When you assign a new image to the UIImageView, the view adopts that image's size (in your case, probably a much larger photo from the Camera Roll). That's probably why it gets so large.
I think the easiest way to fix this is to add further constraints in the storyboard that limit the size (explicit width/height constraints, or constraints to the superview's edges).
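As a sketch of that fix in code rather than in the storyboard (the class name is hypothetical, photoImageView matches the tutorial's outlet, and the 320-point width is an arbitrary choice):

```swift
import UIKit

class PhotoViewController: UIViewController {
    // Matches the tutorial's outlet; hooked up in the storyboard.
    @IBOutlet weak var photoImageView: UIImageView!

    override func viewDidLoad() {
        super.viewDidLoad()
        // An explicit width, combined with the storyboard's 1:1
        // aspect-ratio constraint, fixes the height too, so assigning
        // a large image can no longer grow the view. 320 is arbitrary.
        photoImageView.widthAnchor.constraint(equalToConstant: 320).isActive = true
        // Keep the whole photo visible inside that box.
        photoImageView.contentMode = .scaleAspectFit
    }
}
```

The same effect is achieved in the storyboard by adding a width constraint to the image view, which may be the simpler route for this tutorial.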

Screenshot everything on display

I am trying to take a screenshot of absolutely everything that is displayed on the iPhone display, in the same way that pressing the home + power buttons together does. The code I currently have to screenshot is this:
func screenShotMethod() {
    // hide UI
    buttonTrigger.hidden = true
    // take screenshot
    let layer = UIApplication.sharedApplication().keyWindow!.layer
    let scale = UIScreen.mainScreen().scale
    UIGraphicsBeginImageContextWithOptions(view.frame.size, false, scale)
    layer.renderInContext(UIGraphicsGetCurrentContext()!)
    let screenshot = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    UIImageWriteToSavedPhotosAlbum(screenshot, nil, nil, nil)
    // show UI
    buttonTrigger.hidden = false
}
The point is that I am using the camera and putting a picture over any face it detects, and the easiest way to save that to the camera roll is just to hide the UI and screenshot the screen. With this method, however, I get a screenshot of the face-tracking picture, in the correct position and size, but not of what the camera sees underneath: just white. I am enabling the camera using the CameraEngine framework in viewDidLoad() like this:
override func viewDidLoad() {
    super.viewDidLoad()
    self.cameraEngine.startSession()
}
Is there a better way to screenshot everything to behave like the hardware induced method? Or how could I include what the camera sees in the screenshot?
Thank you!
UPDATE: In case anyone in the future wants to know how I fixed this: because I can't screenshot things I'm not drawing myself, I solved the issue by taking a picture with the camera, setting that image as the background of the view, and then running the screenshot function. It works!
Starting in iOS 9 it is no longer possible to take a screenshot that includes elements of the screen not drawn by your program. You can only capture your application's views and layers. Apple doesn't expose the function that is triggered by power+home to third party developers.
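Following the approach in the update, the overlay can also be composited onto the captured photo directly, without putting it behind the UI and screenshotting. A sketch in the same UIGraphics style as the question's code; cameraImage, overlayImage, and overlayFrame are placeholders for the captured photo, the face-tracking picture, and the rect where detection placed it:

```swift
import UIKit

// Draw the captured camera frame and the overlay into one context
// instead of screenshotting the screen.
func composite(cameraImage: UIImage, overlayImage: UIImage,
               overlayFrame: CGRect) -> UIImage? {
    UIGraphicsBeginImageContextWithOptions(cameraImage.size, true, 0)
    // The photo fills the canvas; the overlay is drawn on top at
    // the rect where face detection placed it.
    cameraImage.draw(in: CGRect(origin: .zero, size: cameraImage.size))
    overlayImage.draw(in: overlayFrame)
    let result = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return result
}
```

The result can then be passed to UIImageWriteToSavedPhotosAlbum as in the original code. Note that overlayFrame must be expressed in the photo's coordinate space, not in screen points, so the on-screen rect needs to be scaled accordingly.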

Screenshotting on iPhone in Swift only produces a white background

Some background: I am just trying to write a simple program, using Xcode 6 beta 7 and Swift, that screenshots the iPhone after I press a button. It is done in SpriteKit, in the game scene. The background is a random PNG image plus the default "hello world" sample text. I programmatically put a pressable button (the default spaceship image is the button) in the game scene's didMoveToView function using the following code:
button.setScale(0.2)
screen.frame = CGRect(origin: CGPointMake(self.size.width/4, self.size.height/1.5), size: button.size)
screen.setImage(playAgainButton, forState: UIControlState.Normal)
screen.addTarget(self, action: "action:", forControlEvents: UIControlEvents.TouchUpInside)
self.view!.addSubview(screen)
This puts my pressable button on the screen, linked to a function that takes a screenshot using this next code:
func action(sender: UIButton!) {
    UIGraphicsBeginImageContext(self.view!.bounds.size)
    self.view!.layer.renderInContext(UIGraphicsGetCurrentContext())
    let screenshot = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    UIImageWriteToSavedPhotosAlbum(screenshot, nil, nil, nil)
}
So this code does take a screenshot, but when I look in Photos, only the pressable button is shown and the rest of the image is white.
I think the screenshot should instead look like what the simulator shows on screen (I just used some random images/text as the background; screenshots omitted).
Can someone explain to me why the screenshot code is not also capturing the background, and how to fix it?
I've looked online and have not seen any question that solves my problem. I found some similar problems and tried their fixes: I imported QuartzCore, CoreGraphics, and CoreImage, but with no luck. I also tried using the ViewController and putting a UIButton there with an IBAction to take the screenshot, but I still get the same white-background image. I've tried different background images with the same result.
I am fairly new to programming so, any help would be appreciated! Thank you in advance!
Have you set the background as a pattern background color? That may not work as expected with renderInContext.
Is the background image rendered in self.view, either as a subview or in its drawRect: method? If not, it will of course not be rendered.
The easiest way to get a screenshot is the function
UIImage *_UICreateScreenUIImage();
You have to declare it first, because it's an undocumented function. I'm not sure if Apple will accept an app that contains a call to this function (the review guidelines say you must not use undocumented functions), but I think they will not care in this case. It's available and working in iOS 5 - iOS 8 (I have tested it; I don't care about iOS 4).
Another thing that should also work is to render the whole screen instead of just one particular view:
UIGraphicsBeginImageContext(self.view!.window!.bounds.size)
self.view!.window!.layer.renderInContext(UIGraphicsGetCurrentContext())
let screenshot = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
UIImageWriteToSavedPhotosAlbum(screenshot, nil, nil, nil)
(note that I replaced view! with view!.window! to get the one and only window instance.)
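Another thing worth trying on iOS 7 and later is drawViewHierarchyInRect(_:afterScreenUpdates:), which snapshots what is actually displayed rather than re-rendering the layer tree the way renderInContext does. Whether it captures an SKView's GPU-rendered scene varies by iOS version, so treat this as something to test rather than a guaranteed fix:

```swift
UIGraphicsBeginImageContextWithOptions(self.view!.bounds.size, false, 0)
// Snapshot what is on screen instead of re-rendering the layers.
self.view!.drawViewHierarchyInRect(self.view!.bounds, afterScreenUpdates: true)
let screenshot = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
UIImageWriteToSavedPhotosAlbum(screenshot, nil, nil, nil)
```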
