Screenshotting on iPhone in Swift only has a white background - ios

Some background: I am just trying to write a simple program, using Xcode 6 beta 7 and Swift, that screenshots the iPhone after I press a button. It is done in SpriteKit, in the game scene. The background is a random PNG image plus the default "Hello, World" sample text. I programmatically add a pressable button (the default spaceship image is the button) in GameScene's didMoveToView function using the following code:
// 'button' is the spaceship sprite used for sizing; 'screen' is the UIButton overlay
button.setScale(0.2)
screen.frame = CGRect(origin: CGPointMake(self.size.width/4, self.size.height/1.5), size: button.size)
screen.setImage(playAgainButton, forState: UIControlState.Normal)
screen.addTarget(self, action: "action:", forControlEvents: UIControlEvents.TouchUpInside)
self.view!.addSubview(screen)
This sets my press-able button on the screen where it is linked to a function to take a screenshot using this next code:
func action(sender: UIButton!) {
    // Render the view's layer into an image context, grab the image, and save it
    UIGraphicsBeginImageContext(self.view!.bounds.size)
    self.view!.layer.renderInContext(UIGraphicsGetCurrentContext())
    let screenshot = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    UIImageWriteToSavedPhotosAlbum(screenshot, nil, nil, nil)
}
So this code does take a screenshot, but when I look in Photos, only the pressable button is shown and the rest of the image is white. The image is shown below:
Below is what I think the screenshot should look like, as this is what the screen looks like in the simulator (I just used some random images/text as the background):
Can someone explain to me why the screenshot program is not also taking a picture of the background and how to fix it?
I've looked online and have not seen any question that solves my problem. I saw some similar problems online and tried out their fixes: I imported QuartzCore, CoreGraphics, and CoreImage, but with no fix. I also tried using the view controller and setting a UIButton there with an IBAction to screenshot, but I still get the same white-background image. I've tried different background images with the same result.
I am fairly new to programming so, any help would be appreciated! Thank you in advance!

Have you set the background as a pattern background color? Maybe this doesn't work as expected with renderInContext.
Is the background image rendered in self.view, either as a subview or in its drawRect: method? If not, of course it will not be rendered.
The easiest way to get a screenshot is with the function
UIImage *_UICreateScreenUIImage();
you have to declare it first, because it's an undocumented function. I'm not sure if Apple will accept an app that contains a call to this function, but I think it will be okay (I know the review guidelines say you must not use undocumented functions, but I think they will not care in this case). It's available and working in iOS 5 - iOS 8 (I have tested it; I don't care about iOS 4).
Another thing that should also work is to render the whole screen instead of just one particular view:
UIGraphicsBeginImageContext(self.view!.window!.bounds.size)
self.view!.window!.layer.renderInContext(UIGraphicsGetCurrentContext())
let screenshot = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
UIImageWriteToSavedPhotosAlbum(screenshot, nil, nil, nil)
(note that I replaced view! with view!.window! to get the one and only window instance.)
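Since the scene here is SpriteKit, one more thing is worth knowing: an SKView draws with OpenGL/Metal, and renderInContext only captures Core Animation content, which is often exactly why such captures come out white. Below is a hedged sketch using drawViewHierarchyInRect(_:afterScreenUpdates:) (iOS 7+), which snapshots what is actually rendered on screen; the helper name is mine, not from the question:

```swift
import UIKit

// Sketch: snapshot a view hierarchy, including GPU-backed content
// such as an SKView, which layer.renderInContext typically misses.
func snapshotView(view: UIView) -> UIImage {
    UIGraphicsBeginImageContextWithOptions(view.bounds.size, false, 0)
    // Unlike layer.renderInContext, this captures the rendered screen content.
    view.drawViewHierarchyInRect(view.bounds, afterScreenUpdates: true)
    let image = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return image
}

// Usage inside the button action:
// let screenshot = snapshotView(self.view!)
// UIImageWriteToSavedPhotosAlbum(screenshot, nil, nil, nil)
```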

Related

UIImageView: How to show scaled image?

I'm new to iOS app development and I began to learn from this great tutorial:
Start Developing iOS Apps (Swift)
https://developer.apple.com/library/content/referencelibrary/GettingStarted/DevelopiOSAppsSwift/WorkWithViewControllers.html#//apple_ref/doc/uid/TP40015214-CH6-SW1
Everything looks great, but when trying to load an image from "Camera Roll" on the simulator, the image is shown oversized, filling the whole iPhone simulator screen instead of fitting into the UIImageView box as shown in the tutorial images.
Oddly, even with the completed project provided at the end of the lesson (see the bottom of the page), I get the same issue.
Googling around gave me the idea to insert:
// The info dictionary may contain multiple representations of the image. You want to use the original
guard let selectedImage = info[UIImagePickerControllerOriginalImage] as? UIImage else {
fatalError("Expected a dictionary containing an image, but was provided the following: \(info)")
}
// Set photoImageView to display the selected image
photoImageView.image = selectedImage
photoImageView.contentMode = UIViewContentMode.scaleAspectFit
// Dismiss the picker
dismiss(animated: true, completion: nil)
after loading the image from the Camera Roll, but it led to no results...
I'm using Xcode 9.2 with deployment target 11.2 and iPhone 8 plus as a target for the simulator.
Any help is appreciated.
It happens because the sample code only constrains the proportion of the UIImageView (the 1:1 ratio constraint you added in the storyboard). It will always keep your image view square, but it will not limit its size.
As the tutorial says, you set a placeholder size, which is only kept until you assign an image. When you set a new image on the UIImageView, its intrinsic size changes to that of the image (in your case, probably a large camera-roll photo). That's probably why it's getting so large.
I think the easy way to fix this is to add other constraints in the storyboard that limit the size (such as explicit width/height or edge constraints).
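If you prefer doing it in code rather than the storyboard, here is a minimal sketch of the same idea, assuming photoImageView is the tutorial's outlet; the 320-point side length is an illustrative value, not something from the lesson:

```swift
// Pin the image view to a fixed square so a large camera-roll photo's
// intrinsic size cannot inflate the layout (320 is an arbitrary example).
photoImageView.translatesAutoresizingMaskIntoConstraints = false
NSLayoutConstraint.activate([
    photoImageView.widthAnchor.constraint(equalToConstant: 320),
    photoImageView.heightAnchor.constraint(equalTo: photoImageView.widthAnchor) // keep 1:1
])
photoImageView.contentMode = .scaleAspectFit // letterbox instead of overflowing
```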

UISlider to change UIImage brightness/contrast in Swift Playground is very laggy.

I am currently creating a filter app. The user has the option to change brightness, contrast, etc.
I have implemented the ability to change these attributes of the image and it currently works but when I test this in Xcode Playgrounds the slider is extremely slow and makes Xcode very laggy.
I assume this is because I have written this in a very inefficient way. I don't want to copy and paste all of my code to Stack Overflow (I want to avoid making this too complicated), so I have uploaded it on GitHub, located here. If you download the repo and open the playground, it is 100% running, and if you test the slider you will see it is very laggy.
I think what is making it so laggy is that I am resetting the image view every time the slider value changes, resulting in a new UIImage being assigned to it every fraction of a second. Here is a tiny snippet, in code, of what I just said. You will probably still have to look at the code I posted on GitHub, since I made some protocols and classes.
slider.addTarget(c, action: #selector(c.updateBrightness(sender:)), for: .valueChanged)

@objc func updateBrightness(sender: UISlider) {
    controls.brightness(sender.value)
    img.image = controls.outputUIImage()
}
I genuinely have no intention of asking for a solution without having tried myself; I have searched all over the interwebs to figure this out with no success.
Thanks, Stack Overflow homies!
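One pattern that often fixes this kind of lag, sketched here under the assumption that the linked repo builds a fresh CIContext and filter on every slider change (the names below are illustrative, not taken from the GitHub code): create the CIContext and CIFilter once, and only re-render the output when the slider moves.

```swift
import UIKit
import CoreImage

// Sketch: reuse one CIContext and one CIFilter across slider events.
// Creating a CIContext per valueChanged tick is a common cause of lag.
final class BrightnessFilter {
    private let context = CIContext()                     // expensive; create once
    private let filter = CIFilter(name: "CIColorControls")!
    private let input: CIImage

    init?(image: UIImage) {
        guard let ci = CIImage(image: image) else { return nil }
        input = ci
        filter.setValue(input, forKey: kCIInputImageKey)
    }

    func image(brightness: Float) -> UIImage? {
        filter.setValue(brightness, forKey: kCIInputBrightnessKey)
        guard let output = filter.outputImage,
              let cg = context.createCGImage(output, from: input.extent) else { return nil }
        return UIImage(cgImage: cg)
    }
}
```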

cannot change duration of animated dynamic images in watchOS 2

UPDATE: I made an Xcode 7.3 sample project displaying the problem
Question:
I am transferring image frames as NSData from the iPhone (Obj-C) to the watch (Swift) via session:didReceiveMessage:replyHandler:. The frames are originally requested by the watch via session.sendMessage(myMessage, replyHandler:). I am converting that data back to images with UIImage(data: frame) and appending them to an array named images. I have a WKInterfaceImage named animationImage where I am able to load the frames and display them like so:
let frames = UIImage.animatedImageWithImages(images, duration: myDuration)
animationImage.setImage(frames)
animationImage.startAnimating()
The problem is that, no matter what value I put in myDuration, I always get the same speed (i.e. super fast):
These animations display properly on the phone:
What am I doing wrong?
Xcode Version 7.3 (7D175) with iOS 9.0 (deployment target)
EDIT:
This is what the docs say in respect to animating a watchOS WKInterfaceImage:
For animations you generate dynamically, use the
animatedImageWithImages:duration: method of UIImage to assemble your
animation in your WatchKit extension, and then set that animation
using the setImage: method.
I can't explain why startAnimating() of WKInterfaceImage doesn't utilize the duration of the animated image, but it does seem to animate appropriately when using this:
animationImage.startAnimatingWithImagesInRange(NSMakeRange(0, images.count), duration: myDuration, repeatCount: 0)
Set the animation duration on the image view, like:
animationImage.animationDuration = myDuration
Hope this will help :)

Screenshot everything on display

I am trying to take a screenshot of absolutely everything that is displayed on the iPhone display, in the same way that pressing the home + power buttons together does. The code I currently have to screenshot is this:
func screenShotMethod() {
//hide UI
buttonTrigger.hidden = true
//take screenshot
let layer = UIApplication.sharedApplication().keyWindow!.layer
let scale = UIScreen.mainScreen().scale
UIGraphicsBeginImageContextWithOptions(view.frame.size, false, scale);
layer.renderInContext(UIGraphicsGetCurrentContext()!)
let screenshot = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
UIImageWriteToSavedPhotosAlbum(screenshot, nil, nil, nil)
//show UI
buttonTrigger.hidden = false
}
The point is that I am using the camera and putting a picture over any face it detects, and the easiest way to save that to the camera roll is just to hide the UI and screenshot the screen. With this method however, I get a screenshot of the face tracking picture, in the correct position and size, but not of what the camera sees underneath - just white. I am enabling the camera using the CameraEngine framework in the viewDidLoad() like this:
override func viewDidLoad() {
    super.viewDidLoad()
    self.cameraEngine.startSession()
}
Is there a better way to screenshot everything to behave like the hardware induced method? Or how could I include what the camera sees in the screenshot?
Thank you!
UPDATE: In case anyone in the future wants to know how I fixed this: because I can't screenshot things I'm not drawing myself, I solved this issue by taking a picture with the camera, setting that image as the background of the view, and then performing the screenshot function, and it works!
Starting in iOS 9 it is no longer possible to take a screenshot that includes elements of the screen not drawn by your program. You can only capture your application's views and layers. Apple doesn't expose the function that is triggered by power+home to third party developers.
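The fix the asker describes in the UPDATE can be sketched as a simple composite: instead of screenshotting the live preview, draw the captured camera photo and the face-tracking overlay into one image. All names below are illustrative, using the same Swift 2 era APIs as the question's code:

```swift
import UIKit

// Sketch: composite the captured photo with the overlay image.
func composite(cameraImage: UIImage, overlay: UIImage, overlayFrame: CGRect) -> UIImage {
    UIGraphicsBeginImageContextWithOptions(cameraImage.size, false, cameraImage.scale)
    cameraImage.drawInRect(CGRect(origin: CGPointZero, size: cameraImage.size))
    overlay.drawInRect(overlayFrame) // face-tracking picture on top
    let result = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return result
}
```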

Swift - force square photo from library and camera

I'm building an app and I need to FORCE the user to upload square pictures (just like Instagram does); however, I'd like to avoid programming an interface from scratch as we're short on time.
It is important to note that the USER must CHOOSE which part of the image he/she wants to show, so cropping the image programmatically without asking the user is out of the question.
I've managed to get this to work via camera, however via library I can't seem to force the user to use a square image. Here's the code I have:
func presentGallery() {
    // from library
    picker.allowsEditing = true
    picker.sourceType = UIImagePickerControllerSourceType.PhotoLibrary
    presentViewController(picker, animated: true, completion: nil)
}
Then on my imagepickercontroller:
var chosenImage = info[UIImagePickerControllerEditedImage] as! UIImage
However, I don't get the desired result. It would be fine if the "minimum zoom" were to show 100% of the height of the image, or if I could add a white/black background to the top and bottom of the image.
Here's the problem:
Instead of something like this:
My app needs to work starting from iOS7.
You should do some sort of check to make sure the picture is square if they're picking from their library.
Once you get the image (in imagePickerController(_:didFinishPickingMediaWithInfo:)), grab it with info[UIImagePickerControllerOriginalImage]. Once you've done this, perform the check:
if image.size.height != image.size.width { /* show some alert */ }
What might be an even better solution is creating a view which allows the user to pick any photo, and then choose a square part of the photo to import into your app, like Instagram does.
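A rough sketch of that kind of view, using the Swift 2 era APIs the question targets: a square, clipping UIScrollView that the user pans and zooms, with the crop taken from whatever region is visible. All names and geometry are illustrative, and a real implementation would also handle image scale and orientation:

```swift
import UIKit

// Sketch: Instagram-style square chooser. The minimum zoom shows 100%
// of the shorter image side inside the square, per the question.
class SquareCropViewController: UIViewController, UIScrollViewDelegate {
    let scrollView = UIScrollView()
    let imageView = UIImageView()
    var image: UIImage!

    override func viewDidLoad() {
        super.viewDidLoad()
        let side = view.bounds.width
        scrollView.frame = CGRect(x: 0, y: 100, width: side, height: side)
        scrollView.delegate = self
        scrollView.clipsToBounds = true
        imageView.image = image
        imageView.frame = CGRect(origin: CGPointZero, size: image.size)
        scrollView.addSubview(imageView)
        scrollView.contentSize = image.size
        scrollView.minimumZoomScale = side / min(image.size.width, image.size.height)
        scrollView.maximumZoomScale = 3
        scrollView.zoomScale = scrollView.minimumZoomScale
        view.addSubview(scrollView)
    }

    func viewForZoomingInScrollView(scrollView: UIScrollView) -> UIView? {
        return imageView
    }

    func croppedImage() -> UIImage {
        // Visible rect in image coordinates = contentOffset scaled back down.
        let scale = scrollView.zoomScale
        let rect = CGRect(x: scrollView.contentOffset.x / scale,
                          y: scrollView.contentOffset.y / scale,
                          width: scrollView.frame.width / scale,
                          height: scrollView.frame.height / scale)
        let cg = CGImageCreateWithImageInRect(image.CGImage, rect)!
        return UIImage(CGImage: cg)
    }
}
```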
