I know how to save a captured photo to the photo library, but I added some extra code because I want the label in my camera view to be composited into the photo before saving. When I save to the photo library, the label is not included. Here is the code I have:
@IBAction func takePicture(sender: AnyObject) {
    if let videoConnection = stillImageOutput!.connectionWithMediaType(AVMediaTypeVideo) {
        videoConnection.videoOrientation = AVCaptureVideoOrientation.Portrait
        stillImageOutput?.captureStillImageAsynchronouslyFromConnection(videoConnection, completionHandler: {(sampleBuffer, error) in
            if (sampleBuffer != nil) {
                let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(sampleBuffer)
                let dataProvider = CGDataProviderCreateWithCFData(imageData)
                let cgImageRef = CGImageCreateWithJPEGDataProvider(dataProvider, nil, true, CGColorRenderingIntent.RenderingIntentDefault)
                let image = UIImage(CGImage: cgImageRef!, scale: 1.0, orientation: UIImageOrientation.Right)
                UIGraphicsBeginImageContextWithOptions(self.previewCamera.bounds.size, self.previewCamera.opaque, 0.0)
                self.previewCamera.layer.renderInContext(UIGraphicsGetCurrentContext()!)
                UIGraphicsEndImageContext()
                self.capturedImage.image = UIGraphicsGetImageFromCurrentImageContext()
                // saves the captured picture to the camera roll
                UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
            }
        })
    }
}
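For reference, two things in the snapshot section are likely why the label is missing: the snapshot is read after UIGraphicsEndImageContext() has already closed the context, and the raw camera image (not the snapshot) is what gets saved. A minimal sketch of the corrected ordering, assuming previewCamera is the view that contains the label:

// Sketch: render the preview view (camera + label) and save that snapshot.
UIGraphicsBeginImageContextWithOptions(self.previewCamera.bounds.size, self.previewCamera.opaque, 0.0)
self.previewCamera.layer.renderInContext(UIGraphicsGetCurrentContext()!)
// Read the image BEFORE closing the context...
let snapshot = UIGraphicsGetImageFromCurrentImageContext()
// ...then end the context.
UIGraphicsEndImageContext()
self.capturedImage.image = snapshot
// Save the composed snapshot, not the raw camera image.
UIImageWriteToSavedPhotosAlbum(snapshot, nil, nil, nil)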
I needed to convert the UIImage that I took with the AVFoundation camera to black and white, so after the user approves the photo, I call this function:
func convertToGrayScale(with originalImage: UIImage, imageStyle: String) -> UIImage {
    let currentFilter = CIFilter(name: imageStyle)
    currentFilter!.setValue(CIImage(image: originalImage), forKey: kCIInputImageKey)
    let output = currentFilter!.outputImage
    let context = CIContext(options: nil)
    let cgimg = context.createCGImage(output!, from: output!.extent)
    let processedImage = UIImage(cgImage: cgimg!)
    return processedImage
}
in this block of code:
// Save Image to Camera Roll
@IBAction func saveButton(_ sender: Any) {
    let newImage = convertToGrayScale(with: image!, imageStyle: "CIPhotoEffectNoir")
    let imageToSave = newImage
    UIImageWriteToSavedPhotosAlbum(imageToSave, nil, nil, nil)
    uploadPhoto()
    // downloadPhoto()
    dismiss(animated: true, completion: nil)
}
which is triggered when the user approves the photo.
Now, the problem: the image is black and white when saved to the camera roll, but not when it reaches storage, because the upload function (below) passes the unconverted image:
// Upload to Firebase Storage
func uploadPhoto() {
    let imageName = NSUUID().uuidString
    let storageRef = Storage.storage().reference().child(MyKeys.imagesFolder).child("\(imageName)")
    if let imageData = image!.jpegData(compressionQuality: 1) {
        storageRef.putData(imageData, metadata: nil, completion: { (metadata, error) in
            if error != nil {
                print(error?.localizedDescription as Any)
                return
            }
            print(metadata as Any)
        })
    } else {
        self.present(alertVC, animated: true, completion: nil)
        return
    }
}
I tried taking the code out of the function and pasting it directly into the @IBAction, but when I ran it on my phone I couldn't save the image to the camera roll or to storage at all. How should I modify the functions so the black-and-white image is saved to storage?
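One likely fix, sketched under the assumption that uploadPhoto can take the image as a parameter instead of reading the view controller's image property:

// Sketch: pass the converted image into the upload function so storage
// receives the same black-and-white image that goes to the camera roll.
@IBAction func saveButton(_ sender: Any) {
    let imageToSave = convertToGrayScale(with: image!, imageStyle: "CIPhotoEffectNoir")
    UIImageWriteToSavedPhotosAlbum(imageToSave, nil, nil, nil)
    uploadPhoto(imageToSave)  // hypothetical parameterized version
    dismiss(animated: true, completion: nil)
}

func uploadPhoto(_ photo: UIImage) {
    let imageName = NSUUID().uuidString
    let storageRef = Storage.storage().reference().child(MyKeys.imagesFolder).child("\(imageName)")
    guard let imageData = photo.jpegData(compressionQuality: 1) else {
        self.present(alertVC, animated: true, completion: nil)
        return
    }
    storageRef.putData(imageData, metadata: nil) { (metadata, error) in
        if let error = error {
            print(error.localizedDescription)
            return
        }
        print(metadata as Any)
    }
}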
I'm currently having an issue with AVCaptureStillImageOutput: when I try to take a picture, the image is nil. While debugging I found that the captureStillImageAsynchronously completion handler isn't being called at all, so I haven't even been able to test whether the sample buffer is nil. I use this method to feed the camera image into another method that combines it with a second image into a single image, and the thread fails in that last method because the camera image is unavailable. What do I need to do to get the camera capture working?
public func capturePhotoOutput()->UIImage
{
var image:UIImage = UIImage()
if let videoConnection = stillImageOutput!.connection(withMediaType: AVMediaTypeVideo)
{
print("Video Connection established ---------------------")
stillImageOutput?.captureStillImageAsynchronously(from: videoConnection, completionHandler: {(sampleBuffer, error) in
if (sampleBuffer != nil)
{
print("Sample Buffer not nil ---------------------")
let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(sampleBuffer)
let dataProvider = CGDataProvider(data: imageData! as CFData)
let cgImageRef = CGImage(jpegDataProviderSource: dataProvider!, decode: nil, shouldInterpolate: true, intent: .defaultIntent)
let camImage = UIImage(cgImage: cgImageRef!, scale: CGFloat(1.0), orientation: UIImageOrientation.right)
image = camImage
}
else
{
print("nil sample buffer ---------------------")
}
})
}
if (stillImageOutput?.isCapturingStillImage)!
{
print("image capture in progress ---------------------")
}
else
{
print("capture not in progress -------------------")
}
return image
}
EDIT: Added below method where the camera image is being used.
func takePicture()-> UIImage
{
/*
videoComponent!.getVideoController().capturePhotoOutput
{ (image) in
//Your code
guard let topImage = image else
{
print("No image")
return
}
}
*/
let topImage = videoComponent!.getVideoController().capturePhotoOutput() //overlay + Camera
let bottomImage = captureTextView() //text
let size = CGSize(width:(topImage.size.width),height:(topImage.size.height)+(bottomImage.size.height))
UIGraphicsBeginImageContextWithOptions(size, false, 0.0)
topImage.draw(in: CGRect(x:0, y:0, width:size.width, height: (topImage.size.height)))
bottomImage.draw(in: CGRect(x:(size.width-bottomImage.size.width)/2, y:(topImage.size.height), width: bottomImage.size.width, height: (bottomImage.size.height)))
let newImage:UIImage = UIGraphicsGetImageFromCurrentImageContext()!
UIGraphicsEndImageContext()
return newImage
}
If you use an asynchronous method, the function returns before the completion handler runs, so it returns the wrong (still-empty) value. Use a completion block instead, like this:
public func capturePhotoOutput(completion: @escaping (UIImage?) -> ())
{
if let videoConnection = stillImageOutput!.connection(withMediaType: AVMediaTypeVideo)
{
print("Video Connection established ---------------------")
stillImageOutput?.captureStillImageAsynchronously(from: videoConnection, completionHandler: {(sampleBuffer, error) in
if (sampleBuffer != nil)
{
print("Sample Buffer not nil ---------------------")
let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(sampleBuffer)
let dataProvider = CGDataProvider(data: imageData! as CFData)
let cgImageRef = CGImage(jpegDataProviderSource: dataProvider!, decode: nil, shouldInterpolate: true, intent: .defaultIntent)
let camImage = UIImage(cgImage: cgImageRef!, scale: CGFloat(1.0), orientation: UIImageOrientation.right)
completion(camImage)
}
else
{
completion(nil)
}
})
}
else
{
completion(nil)
}
}
How to use it:
capturePhotoOutput
{ (image) in
guard let topImage = image else{
print("No image")
return
}
//Your code
}
Edit:
func takePicture()
{
videoComponent!.getVideoController().capturePhotoOutput
{ (image) in
guard let topImage = image else
{
print("No image")
return
}
let bottomImage = self.captureTextView() //text
let size = CGSize(width:(topImage.size.width),height:(topImage.size.height)+(bottomImage.size.height))
UIGraphicsBeginImageContextWithOptions(size, false, 0.0)
topImage.draw(in: CGRect(x:0, y:0, width:size.width, height: (topImage.size.height)))
bottomImage.draw(in: CGRect(x:(size.width-bottomImage.size.width)/2, y:(topImage.size.height), width: bottomImage.size.width, height: (bottomImage.size.height)))
let newImage:UIImage = UIGraphicsGetImageFromCurrentImageContext()!
UIGraphicsEndImageContext()
self.setPicture(image: newImage)
}
}
func setPicture(image:UIImage)
{
//Your code after takePicture
}
I am having trouble taking 6 photos and storing them in an object array in order to animate them. I keep getting an error saying:
Array index out of range
Also, I realized that the "image" object isn't recognized outside the if-statement for some reason. What am I doing wrong?
func didPressTakePhoto(){
var picArray: [UIImage] = []
for index in 1...6 {
if let videoConnection = stillImageOutput?.connectionWithMediaType(AVMediaTypeVideo){
videoConnection.videoOrientation = AVCaptureVideoOrientation.Portrait
stillImageOutput?.captureStillImageAsynchronouslyFromConnection(videoConnection, completionHandler: {
(sampleBuffer, error) in
if sampleBuffer != nil {
let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(sampleBuffer)
let dataProvider = CGDataProviderCreateWithCFData(imageData)
let cgImageRef = CGImageCreateWithJPEGDataProvider(dataProvider, nil, true, CGColorRenderingIntent.RenderingIntentDefault)
let image = UIImage(CGImage: cgImageRef!, scale: 1.0, orientation: UIImageOrientation.Right)
self.tempImageView.image = image
self.tempImageView.hidden = false
UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
picArray[index] = image;
}
})
}
}
self.gifView.animationImages = picArray;
self.gifView.animationDuration = 1.0
self.gifView.startAnimating()
}
picArray is empty, so you can't assign by subscript (picArray[index] = image); use the append method instead.
The reason picArray stays empty is that you add the images inside an asynchronous block: the for loop finishes before any image is appended, because it doesn't wait for the asynchronous blocks to complete.
You have to wait for the asynchronous blocks to complete before animating the image view.
You can achieve this with a dispatch group:
func didPressTakePhoto(){
var picArray: [UIImage] = []
let dispatchGroup = dispatch_group_create()
for index in 1...6 {
dispatch_group_enter(dispatchGroup)
if let videoConnection = stillImageOutput?.connectionWithMediaType(AVMediaTypeVideo){
videoConnection.videoOrientation = AVCaptureVideoOrientation.Portrait
stillImageOutput?.captureStillImageAsynchronouslyFromConnection(videoConnection, completionHandler: {
(sampleBuffer, error) in
if sampleBuffer != nil {
let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(sampleBuffer)
let dataProvider = CGDataProviderCreateWithCFData(imageData)
let cgImageRef = CGImageCreateWithJPEGDataProvider(dataProvider, nil, true, CGColorRenderingIntent.RenderingIntentDefault)
let image = UIImage(CGImage: cgImageRef!, scale: 1.0, orientation: UIImageOrientation.Right)
self.tempImageView.image = image
self.tempImageView.hidden = false
UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
picArray.append(image);
}
dispatch_group_leave(dispatchGroup)
})
} else {
dispatch_group_leave(dispatchGroup)
}
}
dispatch_group_notify(dispatchGroup, dispatch_get_main_queue()) {
self.gifView.animationImages = picArray;
self.gifView.animationDuration = 1.0
self.gifView.startAnimating()
}
}
Hope this helps.
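For reference, the same dispatch-group pattern in Swift 3+ GCD syntax (a sketch assuming the same outlets and capture session; note that appending to picArray from concurrent completion handlers is only safe here if the handlers run on a serial queue):

// Sketch of the answer above, translated to Swift 3+ syntax.
func didPressTakePhoto() {
    var picArray: [UIImage] = []
    let group = DispatchGroup()
    for _ in 1...6 {
        guard let videoConnection = stillImageOutput?.connection(withMediaType: AVMediaTypeVideo) else { continue }
        videoConnection.videoOrientation = .portrait
        group.enter()
        stillImageOutput?.captureStillImageAsynchronously(from: videoConnection) { (sampleBuffer, error) in
            defer { group.leave() }   // always balance the enter()
            guard sampleBuffer != nil,
                  let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(sampleBuffer),
                  let dataProvider = CGDataProvider(data: imageData as CFData),
                  let cgImage = CGImage(jpegDataProviderSource: dataProvider, decode: nil,
                                        shouldInterpolate: true, intent: .defaultIntent)
            else { return }
            picArray.append(UIImage(cgImage: cgImage, scale: 1.0, orientation: .right))
        }
    }
    // Runs on the main queue once all six captures have completed.
    group.notify(queue: .main) {
        self.gifView.animationImages = picArray
        self.gifView.animationDuration = 1.0
        self.gifView.startAnimating()
    }
}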
I'm making a camera app and I want to add a label to the pictures that are taken, like in the app MSQRD, and save the result to the photo album. I got the label to display on the image, but when I go to the photo album the image shows up without the label. What am I doing wrong in my code? Here is the code I'm currently using:
@IBAction func takePicture(sender: AnyObject) {
    if let videoConnection = stillImageOutput!.connectionWithMediaType(AVMediaTypeVideo) {
        videoConnection.videoOrientation = AVCaptureVideoOrientation.Portrait
        stillImageOutput?.captureStillImageAsynchronouslyFromConnection(videoConnection, completionHandler: {(sampleBuffer, error) in
            if (sampleBuffer != nil) {
                let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(sampleBuffer)
                let dataProvider = CGDataProviderCreateWithCFData(imageData)
                let cgImageRef = CGImageCreateWithJPEGDataProvider(dataProvider, nil, true, CGColorRenderingIntent.RenderingIntentDefault)
                let image = UIImage(CGImage: cgImageRef!, scale: 1.0, orientation: UIImageOrientation.Right)
                // edited this part: it saves the entire view to the photo album except for the image that was taken and the label
                UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, self.view.opaque, 0.0)
                self.view.layer.renderInContext(UIGraphicsGetCurrentContext()!)
                let newImage: UIImage = UIGraphicsGetImageFromCurrentImageContext()
                UIGraphicsEndImageContext()
                // saves the captured picture to the camera roll
                UIImageWriteToSavedPhotosAlbum(newImage, nil, nil, nil)
                // fade in the image that was taken
                UIView.animateWithDuration(0.5, delay: 0.1, options: UIViewAnimationOptions.CurveLinear, animations: {
                    self.capturedImage.image = image
                    self.capturedImage.alpha = 1.0
                }, completion: nil)
            }
        })
    }
}
You made the snapshot image from the UIGraphicsImageContext but never used it. Change UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil) to save the snapshot instead of the raw camera image.
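There is also a likely ordering and threading issue: the view is snapshotted before the captured photo is placed into capturedImage, and renderInContext should run on the main thread. A sketch (Swift 2 syntax, matching the code above, and assuming capturedImage and the label both live inside self.view):

// Sketch: snapshot on the main queue, after the photo is in the view.
dispatch_async(dispatch_get_main_queue()) {
    self.capturedImage.image = image
    self.capturedImage.alpha = 1.0
    UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, self.view.opaque, 0.0)
    self.view.layer.renderInContext(UIGraphicsGetCurrentContext()!)
    let newImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    // Now the snapshot contains the photo and the label.
    UIImageWriteToSavedPhotosAlbum(newImage, nil, nil, nil)
}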
I am trying to save an image as a bitmap using AVFoundation. Right now I am saving it as a JPEG, and I was wondering what I would need to change to save it as a bitmap instead. Side note: we plan on storing the byte[] on our Google App Engine backend, and any device (Android or iOS) that needs the image would pull the byte[] from the database and convert it back into an image on the device. Is this a valid way of storing images? If not, what would you suggest?
Here is my code that saves the image as a JPEG:
@IBAction func didPressTakePhoto(sender: AnyObject) {
    if let videoConnection = stillImageOutput!.connectionWithMediaType(AVMediaTypeVideo) {
        videoConnection.videoOrientation = AVCaptureVideoOrientation.Portrait
        stillImageOutput?.captureStillImageAsynchronouslyFromConnection(videoConnection, completionHandler: {(sampleBuffer, error) in
            if (sampleBuffer != nil) {
                var imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(sampleBuffer)
                var dataProvider = CGDataProviderCreateWithCFData(imageData)
                var cgImageRef = CGImageCreateWithJPEGDataProvider(dataProvider, nil, true, kCGRenderingIntentDefault)
                // change orientation based on whether the camera is front or back
                if self.selectedCamera == 1 {
                    var image = UIImage(CGImage: cgImageRef, scale: 1.0, orientation: UIImageOrientation.LeftMirrored)
                    self.capturedImage.image = image
                } else {
                    var image = UIImage(CGImage: cgImageRef, scale: 1.0, orientation: UIImageOrientation.Right)
                    self.capturedImage.image = image
                }