So I'm using AVFoundation to make a camera, but for some reason I keep getting this error when I go to capture. Most of the time it results in my phone losing connection to Xcode and the app crashing. I also use the camera to crop images and get the same sort of error.
Is anybody able to tell me why this error occurs?
Communications error: <OS_xpc_error: <error: 0x19b14ca80> { count = 1,
contents = "XPCErrorDescription" => <string: 0x19b14ce78> {
length = 22, contents = "Connection interrupted" } }>
Here is the code for the capture:
public func capturePictureWithCompletition(imageCompletition: (UIImage?, NSError?) -> Void) {
    let blockHandler = { (imageDataSampleBuffer: CMSampleBuffer?, error: NSError?) -> Void in
        if let imageDataSampleBuffer = imageDataSampleBuffer {
            let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageDataSampleBuffer)
            let image = UIImage(data: imageData)
            imageCompletition(image?.normalizedImages, nil)
        } else {
            imageCompletition(nil, error)
        }
    }

    if self.cameraIsSetup {
        if self.cameraOutputMode == .StillImage {
            if let stillImageOutput = self.stillImageOutput {
                let connection = stillImageOutput.connectionWithMediaType(AVMediaTypeVideo)
                stillImageOutput.captureStillImageAsynchronouslyFromConnection(connection, completionHandler: blockHandler)
            }
        }
    }
}
It seems to be a memory problem: the "Connection interrupted" XPC message usually just means the app process was killed (often for using too much memory) while the debugger was attached.
Try resizing your image before working with it.
This question can help you: How to Resize image in Swift?
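For example, a minimal downscaling helper in the same Swift 2-era style as your capture code (a sketch only; pick a maxDimension that fits your memory budget):
func resizedImage(image: UIImage, maxDimension: CGFloat) -> UIImage? {
    // Scale the longer side down to maxDimension, preserving aspect ratio
    let scale = maxDimension / max(image.size.width, image.size.height)
    if scale >= 1 { return image } // already small enough
    let newSize = CGSize(width: image.size.width * scale, height: image.size.height * scale)
    UIGraphicsBeginImageContextWithOptions(newSize, false, 1.0)
    image.drawInRect(CGRect(origin: CGPointZero, size: newSize))
    let resized = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return resized
}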
I'm trying to create an app that detects the text in a photo taken by the device's camera using MLKit's text detection features. Below is the code in my photoOutput method, as well as the code for the method that it calls:
func photoOutput(_ output: AVCapturePhotoOutput,
                 didFinishProcessingPhoto photo: AVCapturePhoto,
                 error: Error?) {
    print("worked")
    PHPhotoLibrary.shared().performChanges({
        let creationRequest = PHAssetCreationRequest.forAsset()
        creationRequest.addResource(with: PHAssetResourceType.photo, data: photo.fileDataRepresentation()!, options: nil)
    }, completionHandler: nil)
    let cgImage = photo.cgImageRepresentation()!.takeRetainedValue()
    print(cgImage)
    let orientation = photo.metadata[kCGImagePropertyOrientation as String] as! NSNumber
    let uiOrientation = UIImage.Orientation(rawValue: orientation.intValue)!
    let image = UIImage(cgImage: cgImage, scale: 1, orientation: uiOrientation)
    self.runTextRecognition(with: image)
}

func runTextRecognition(with image: UIImage) {
    let visionImage = VisionImage(image: image)
    textRecognizer.process(visionImage) { features, error in
        self.processResult(from: features, error: error)
    }
}

func processResult(from text: VisionText?, error: Error?) {
    guard error == nil, let text = text else {
        print("oops")
        return
    }
    print(text.text)
}
Whenever I run the app and take a photo, everything runs fine up until the line textRecognizer.process(visionImage). The console message is -[Not A Type _cfTypeID]: message sent to deallocated instance 0x106623e20.
Any help or suggestions would be much appreciated! Please let me know if I should include more information.
Never mind, I fixed this! I should have been using .takeUnretainedValue() instead of .takeRetainedValue(): taking the value as retained claimed ownership I didn't have, so ARC released the CGImage before I was using it.
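So the fixed line is just the following (a one-line sketch; on newer SDKs cgImageRepresentation() returns a plain CGImage? and neither call is needed):
// The Unmanaged<CGImage> is not owned by us, so take it unretained
let cgImage = photo.cgImageRepresentation()!.takeUnretainedValue()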
Now that AssetsLibrary has been deprecated, we're supposed to use the Photos framework, specifically PHPhotoLibrary, to save images and videos to a user's camera roll.
Using ReactiveCocoa, such a request would look like:
func saveImageAsAsset(url: NSURL) -> SignalProducer<String, NSError> {
    return SignalProducer { observer, disposable in
        var imageIdentifier: String?
        PHPhotoLibrary.sharedPhotoLibrary().performChanges({
            let changeRequest = PHAssetChangeRequest.creationRequestForAssetFromImageAtFileURL(url)
            let placeholder = changeRequest?.placeholderForCreatedAsset
            imageIdentifier = placeholder?.localIdentifier
        }, completionHandler: { success, error in
            if let identifier = imageIdentifier where success {
                observer.sendNext(identifier)
            } else if let error = error {
                observer.sendFailed(error)
                return
            }
            observer.sendCompleted()
        })
    }
}
I created a gif from a video using Regift, and I can verify that the gif exists inside my temporary directory. However, when I go to save that gif to the camera roll, I get a mysterious error: NSCocoaErrorDomain -1 (null), which is really super helpful.
Has anyone ever experienced this issue?
You can try this.
let data = try? Data(contentsOf: /*Your-File-URL-Path*/)
PHPhotoLibrary.shared().performChanges({
    PHAssetCreationRequest.forAsset().addResource(with: .photo, data: data!, options: nil)
}, completionHandler: nil) // performChanges requires the completionHandler argument
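If the error stays opaque, a completion handler at least surfaces the underlying NSError. A sketch, assuming gifURL is a hypothetical file URL pointing at the gif Regift wrote to your temporary directory:
let gifURL = URL(fileURLWithPath: NSTemporaryDirectory()).appendingPathComponent("output.gif") // hypothetical path
PHPhotoLibrary.shared().performChanges({
    // Adding the resource straight from the file URL avoids loading the whole gif into memory
    PHAssetCreationRequest.forAsset().addResource(with: .photo, fileURL: gifURL, options: nil)
}, completionHandler: { success, error in
    if let error = error {
        print("Save failed: \(error)") // inspect the real failure reason here
    }
})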
My goal is to use an AVCaptureSession to programmatically lock focus, capture one image, activate the flash, then capture a second image after some delay.
I have managed to get the captures to work using an AVCaptureSession instance and an AVCaptureStillImageOutput. However, the images I get when calling captureStillImageAsynchronouslyFromConnection(_:completionHandler:) are 1920 x 1080, not the full 12 megapixel image my iPhone 6S camera is capable of.
Here is my capture function:
func captureImageFromStream(completion: (result: UIImage) -> Void)
{
    if let stillOutput = self.stillImageOutput {
        var videoConnection: AVCaptureConnection?
        for connection in stillOutput.connections {
            for port in connection.inputPorts! {
                if port.mediaType == AVMediaTypeVideo {
                    videoConnection = connection as? AVCaptureConnection
                    break
                }
            }
            if videoConnection != nil {
                break
            }
        }
        if videoConnection != nil {
            stillOutput.captureStillImageAsynchronouslyFromConnection(videoConnection) {
                (imageDataSampleBuffer, error) -> Void in
                if error == nil {
                    let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageDataSampleBuffer)
                    if let image = UIImage(data: imageData) {
                        completion(result: image)
                    }
                }
                else {
                    NSLog("ImageCapture Error: \(error)")
                }
            }
        }
    }
}
What modifications should I make to capture the image I'm looking for? I'm new to Swift, so please excuse any beginner mistakes I've made.
Before you call addOutput with the stillImageOutput and startRunning, you need to set your capture session preset to photo:
captureSession.sessionPreset = AVCaptureSessionPresetPhoto
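In context, the setup order looks something like this (a sketch; captureSession, device, and stillImageOutput are assumed to be your existing properties):
captureSession.sessionPreset = AVCaptureSessionPresetPhoto // full-resolution stills instead of 1920 x 1080
let input = try? AVCaptureDeviceInput(device: device)
if let input = input where captureSession.canAddInput(input) {
    captureSession.addInput(input)
}
if captureSession.canAddOutput(stillImageOutput) {
    captureSession.addOutput(stillImageOutput)
}
captureSession.startRunning()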
After the user has given us permission to access their Camera Roll, we would like to grab the data and upload it to our service from inside our app.
Is there a way to access the video data from the file? The only way to open the video file is to create an AVAsset, but that's not enough for me.
I'm aware of
func requestExportSessionForVideo(_ asset: PHAsset!,
                                  options options: PHVideoRequestOptions!,
                                  exportPreset exportPreset: String!,
                                  resultHandler resultHandler: ((AVAssetExportSession!,
                                      [NSObject : AnyObject]!) -> Void)!) -> PHImageRequestID
But in my case I just want to upload the video to our service. I don't want to:
first copy the video by doing an export into my local app data,
then send that data up to our service,
and then delete the data.
That approach uses a lot of extra space and time, and for users with full 16 GB iPhones it doesn't work well.
OK, this is what I tried so far using the URL:
let anAccess = sourceURL?.startAccessingSecurityScopedResource() // note the parentheses: the method must be called, not just referenced
if !NSFileManager.defaultManager().fileExistsAtPath(sourceURL!.path!) {
    NSLog("not exist")
}
let aFileCoordinator = NSFileCoordinator(filePresenter: nil)
var anError: NSError?
aFileCoordinator.coordinateReadingItemAtURL(sourceURL!, options: .ForUploading, error: &anError, byAccessor: { (newURL: NSURL!) -> Void in
    let data = NSData(contentsOfURL: newURL)
})
if let unError = anError {
    NSLog("Error \(unError)")
}
sourceURL?.stopAccessingSecurityScopedResource()
This logs the following:
2015-02-08 16:20:01.947 Cameo[15706:2288691] not exist
2015-02-08 16:20:01.991 Cameo[15706:2288691] Error Error Domain=NSCocoaErrorDomain Code=257 "The operation couldn’t be completed. (Cocoa error 257.)" UserInfo=0x170876480 {NSURL=file:///var/mobile/Media/DCIM/100APPLE/IMG_0155.MOV, NSFilePath=/var/mobile/Media/DCIM/100APPLE/IMG_0155.MOV, NSUnderlyingError=0x17005a9a0 "The operation couldn’t be completed. Operation not permitted"}
Thanks to Paul's suggestion, I figured it out:
You have to make a PHImageManager requestAVAssetForVideo request; inside its result handler you have access to the file and can read its data from the URL.
let imageManager = PHImageManager.defaultManager()
let videoRequestOptions = PHVideoRequestOptions()
videoRequestOptions.deliveryMode = .HighQualityFormat
videoRequestOptions.version = .Current
videoRequestOptions.networkAccessAllowed = true
videoRequestOptions.progressHandler = { (progress: Double, error: NSError!, stop: UnsafeMutablePointer<ObjCBool>, info: [NSObject : AnyObject]!) -> Void in
    NSLog("Progress: %@", progress.description)
}
imageManager.requestAVAssetForVideo(nextAsset, options: videoRequestOptions, resultHandler: { (avAsset: AVAsset!, avAudioMix: AVAudioMix!, info: [NSObject : AnyObject]!) -> Void in
    if let nextURLAsset = avAsset as? AVURLAsset {
        let sourceURL = nextURLAsset.URL
        if NSFileManager.defaultManager().fileExistsAtPath(sourceURL.path!) {
            NSLog("exist file")
        }
        let data = NSData(contentsOfURL: sourceURL)
        if let aData = data {
            NSLog("length: \(aData.length)")
        }
        else {
            NSLog("no data read.")
        }
    }
})
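For reference, on Swift 3 and later the same request reads roughly like this (a sketch; nextAsset is the PHAsset from the example above):
let options = PHVideoRequestOptions()
options.deliveryMode = .highQualityFormat
options.isNetworkAccessAllowed = true
PHImageManager.default().requestAVAsset(forVideo: nextAsset, options: options) { avAsset, _, _ in
    if let urlAsset = avAsset as? AVURLAsset {
        // Inside the result handler the file is readable; upload from here
        let data = try? Data(contentsOf: urlAsset.url)
        print("length: \(data?.count ?? 0)")
    }
}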
Regarding the issue:
Failed to issue sandbox extension for file file:///var/mobile/Media/DCIM/100APPLE/IMG_0730.MOV, errno = 1
My workaround for this issue was to create a temporary path through which I was able to access the media file:
Future<void> loadAssets() async {
  List<Asset> resultList = <Asset>[];
  String error = 'No Error Detected';
  final temp = await Directory.systemTemp.create();
  List<File> imagesFileList = [];
  try {
    resultList = await MultiImagePicker.pickImages(
      maxImages: 300,
      enableCamera: true,
      selectedAssets: imagesAssetsList,
      cupertinoOptions: CupertinoOptions(takePhotoIcon: "chat"),
      materialOptions: MaterialOptions(
        actionBarColor: "#abcdef",
        actionBarTitle: "Example App",
        allViewTitle: "All Photos",
        useDetailsView: false,
        selectCircleStrokeColor: "#000000",
      ),
    );
  } on Exception catch (e) {
    error = e.toString();
  }
  if (!mounted) return;
  for (int i = 0; i < resultList.length; i++) {
    final data = await resultList[i].getByteData();
    imagesFileList.add(await File('${temp.path}/img$i').writeAsBytes(
        data.buffer.asUint8List(data.offsetInBytes, data.lengthInBytes)));
    print('pathnew: ${imagesFileList[i].path}');
    await uploadFileToStorage(imagesFileList[i].path);
  }
  setState(() {
    imagesAssetsList = resultList;
    _error = error;
  });
}
I hope it will work!
So I wrote a method that is supposed to take a picture with the camera and then return that photo as a UIImage. But I keep getting this weird error: Cannot convert the expression's type 'UIImage?' to type 'Void', and I have no idea what caused it... Here's the code:
func captureAndGetImage() -> UIImage {
    dispatch_async(self.sessionQueue, { () -> Void in
        // Update orientation on the image output connection before capturing
        self.imageOutput!.connectionWithMediaType(AVMediaTypeVideo).videoOrientation = self.previewLayer!.connection.videoOrientation
        if let device = self.captureDevice {
            self.imageOutput!.captureStillImageAsynchronouslyFromConnection(self.imageOutput!.connectionWithMediaType(AVMediaTypeVideo), completionHandler: { (imageDataSampleBuffer, error) -> Void in
                if imageDataSampleBuffer != nil {
                    var imageData: NSData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageDataSampleBuffer)
                    var image = UIImage(data: imageData)
                    return image
                }
            })
        }
    })
}
I've also tried return image as UIImage but it didn't work either.
My guess is that it has something to do with the completion handler.
Thanks!
The problem is that you're treating this as a synchronous operation but it is asynchronous. You can't just return the image from an asynchronous operation. You will have to rewrite your method to take a completion block which then gets executed when you retrieve the image. I'd rewrite it to something like the following:
func captureAndGetImage(completion: (UIImage?) -> Void) {
    dispatch_async(self.sessionQueue, { () -> Void in
        // Update orientation on the image output connection before capturing
        self.imageOutput!.connectionWithMediaType(AVMediaTypeVideo).videoOrientation = self.previewLayer!.connection.videoOrientation
        if let device = self.captureDevice {
            self.imageOutput!.captureStillImageAsynchronouslyFromConnection(self.imageOutput!.connectionWithMediaType(AVMediaTypeVideo), completionHandler: { (imageDataSampleBuffer, error) -> Void in
                var image: UIImage?
                if imageDataSampleBuffer != nil {
                    let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageDataSampleBuffer)
                    image = UIImage(data: imageData)
                }
                completion(image)
            })
        }
    })
}
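A call site then looks something like this (a sketch; imageView is a hypothetical outlet, and hopping back to the main queue matters because the completion fires on sessionQueue):
captureAndGetImage { image in
    dispatch_async(dispatch_get_main_queue()) {
        if let image = image {
            self.imageView.image = image // hypothetical UIImageView outlet
        }
    }
}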