Copy Image with UIPasteboard (Swift) - ios

I recently saw a project in which a user can tap on a GIF from a custom keyboard and see a "copied" tooltip appear. I have one question:
How does one reproduce this tooltip in the product's GIF tutorial?
Could anyone give me some sample code to work with? I understand how to use UIPasteboard and its functions, but I can't seem to get it to work when I pass the UTI type "public.png" into this function. (I noticed that in Objective-C it's written as @"public.png", but I used "public.png"; I couldn't find a source online for this.)
let imageURL = NSString(string:NSBundle.mainBundle().pathForResource("test", ofType: "png")!)
var data = NSData(contentsOfURL: NSURL(string:imageURL)!)
UIPasteboard.generalPasteboard().setData(data!, forPasteboardType: "public.png")

Try using this code:
let image = UIImage(named: "myimage.png")
UIPasteboard.generalPasteboard().image = image
You can read more about how this works in Apple's UIPasteboard documentation.
Hope this helps
Swift 5.1
UIPasteboard.general.image = image

When using UIPasteboard.generalPasteboard().image = image, it seems the image is not copied to the pasteboard. Instead, try the code below; it also shows how you can avoid hard-coding the "public.png" string:
// The pasteboard is nil if full access is not granted
// 'image' is the UIImage you are about to copy to the pasteboard
if let pb = UIPasteboard.generalPasteboard() {
    let type = UIPasteboardTypeListImage[0] as! String
    if !type.isEmpty {
        pb.setData(UIImagePNGRepresentation(image), forPasteboardType: type)
        if let readData = pb.dataForPasteboardType(type) {
            let readImage = UIImage(data: readData, scale: 2)
            println("\(image) == \(pb.image) == \(readImage)")
        }
    }
}
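For comparison, here is a minimal modern-Swift sketch of the same idea using UIPasteboard.general and the "public.png" type; the resource name "test" is taken from the question and is otherwise just a placeholder:

import UIKit

func copyPNGToPasteboard(named name: String) {
    // Load the bundled image and convert it to PNG data.
    guard let image = UIImage(named: name),
          let data = image.pngData() else { return }
    // "public.png" is the UTI the question asks about.
    UIPasteboard.general.setData(data, forPasteboardType: "public.png")
}

// Usage, e.g. after the user taps an image in the keyboard:
copyPNGToPasteboard(named: "test")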

Related

Unable to set <UIImage:0x283942880 anonymous {1080, 1080} renderingMode=automatic> to UIImageView Xcode Swift

I have built an app which fetches contacts from the phone book and saves their name and photo. To save the photo I've used the following code:
if let imageData = contact.thumbnailImageData {
    imageStr = String(describing: UIImage(data: imageData)!)
} else {
    imageStr = "null"
}
and when I print imageStr using print("IMGSTR: \(imageStr)") I get the following output
IMGSTR: <UIImage:0x283942880 anonymous {1080, 1080} renderingMode=automatic>
Now I'm stuck on how to set this string to the UIImageView. I tried
imageview.image = UIImage(named: imageStr)
but it shows nothing.
Could someone please help me with how to set the string <UIImage:0x283942880 anonymous {1080, 1080} renderingMode=automatic> to a UIImageView?
No need to convert it to a String. UserDefaults supports Data objects: store the photo as Data, and when setting it to a UIImageView use let image = UIImage(data: imageData).
If you do want to convert an instance of Data to a String, use the String(decoding:as:) initializer, e.g. let str = String(decoding: data, as: UTF8.self).
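As a minimal sketch of that suggestion (the "contactPhoto" key is just an example name, not from the original code):

import UIKit

// Saving: store the raw thumbnail data instead of a String description.
if let imageData = contact.thumbnailImageData {
    UserDefaults.standard.set(imageData, forKey: "contactPhoto")
}

// Loading: turn the stored Data back into a UIImage for the image view.
if let savedData = UserDefaults.standard.data(forKey: "contactPhoto") {
    imageview.image = UIImage(data: savedData)
}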

Load image from folder inside project

I'm making an app with a folder structure like the image below:
As you can see in the picture, the Stickers folder has 2 subfolders, "1" and "2". Inside them is a bunch of icons, and I want to load them in a collection view where each subfolder is a package of different stickers. For more details, please see this picture:
So, how can I get them from my project folder? I've read about accessing the documents directory, but that doesn't seem to solve my problem.
Please help me. Thanks in advance.
Look at FileManager's API. It offers all you need.
Example to get the contents at a path:
let pathToDir1 = Bundle.main.resourcePath! + "/Stickers/1";
let fileManager = FileManager.default
let contentOfDir1 = try! fileManager.contentsOfDirectory(atPath: pathToDir1)
Example to iterate over everything in the bundle:
let docsPath = Bundle.main.resourcePath!
let enumerator = fileManager.enumerator(atPath: docsPath)
var images = [UIImage]()
while let path = enumerator?.nextObject() as? String {
    // The enumerator returns paths relative to docsPath, so build the full path first.
    let fullPath = (docsPath as NSString).appendingPathComponent(path)
    let contentOfCurDir = try? fileManager.contentsOfDirectory(atPath: fullPath)
    // do whatever you need.
}
For details, see Apple's documentation and sample code.
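Putting those pieces together, a minimal sketch that loads every sticker in one package into UIImage objects could look like this; it assumes the Stickers folder is added to the bundle as a folder reference, so a path like Stickers/1 exists at runtime:

import UIKit

func loadStickers(inPackage package: String) -> [UIImage] {
    // Path to e.g. <bundle>/Stickers/1 inside the app bundle.
    let packagePath = Bundle.main.resourcePath! + "/Stickers/" + package
    let fileManager = FileManager.default
    guard let fileNames = try? fileManager.contentsOfDirectory(atPath: packagePath) else {
        return []
    }
    // Build a UIImage for every file in the package folder.
    return fileNames.compactMap { UIImage(contentsOfFile: packagePath + "/" + $0) }
}

// Usage: one package per collection view section, for example.
let packageOneStickers = loadStickers(inPackage: "1")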
If you're looking to access an image within your project folder you simply need to use the image name. You don't need to define the whole path.
Objective-C
UIImage *img = [UIImage imageNamed:@"IMAGE_NAME"];
Swift
var img = UIImage(named: "IMAGE_NAME")

Issue with custom keyboard with images

I am working on an iOS custom keyboard in which I need to show stickers and GIFs. I'm able to show the images on the keyboard, but the problem is that when the user selects an image from the keyboard, I want to show it in the textDocumentProxy. After doing some research I understood that it's not possible to insert images via textDocumentProxy.insertText, so I tried UIPasteboard to copy the image from the keyboard and paste it into the input field like this.
Here my code :
let pb = UIPasteboard.generalPasteboard()
let image: UIImage = UIImage(named: "1.png")!
let imgData: NSData = UIImagePNGRepresentation(image)!
pb.setData(imgData, forPasteboardType: kUTTypePNG as String)
pb.image = image
I also set RequestsOpenAccess to true in the Info.plist file and allowed full access for the custom keyboard on the device as well, but it's not working for me.
Please suggest the possible ways to do this. Thanks in advance.
You want to use GIF images in your custom keyboard, but in the above code you are using a PNG image.
If you want to copy a non-GIF image, you can use the code below (Swift 3):
@IBOutlet weak var imageview: UIImageView!

var img: UIImage = UIImage(named: "1.png")!
imageview = UIImageView(image: img)
// copy image code
UIPasteboard.general.image = imageview.image!
Or, if you want to use a GIF image, you can use the code below (Swift 3):
let url = Bundle.main.url(forResource: "imagename", withExtension: "gif")
let data = try? Data(contentsOf: url!)
UIPasteboard.general.setData(data!, forPasteboardType: "com.compuserve.gif")
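Because the general pasteboard is unavailable to a keyboard without full access, a minimal sketch of the copy action inside the keyboard's UIInputViewController might look like the following; the resource name and the tooltip comment are placeholders, and hasFullAccess requires iOS 11 or later:

import UIKit

class KeyboardViewController: UIInputViewController {

    func copyGIF(named name: String) {
        // Without full access, writing to the general pasteboard silently does nothing.
        guard hasFullAccess else { return }
        guard let url = Bundle.main.url(forResource: name, withExtension: "gif"),
              let data = try? Data(contentsOf: url) else { return }
        UIPasteboard.general.setData(data, forPasteboardType: "com.compuserve.gif")
        // Show a "copied" tooltip here so the user knows to long-press and paste in the target app.
    }
}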

Get PHAsset from iOS Share Extension

I am developing a share extension for photos for my iOS app. Inside the extension, I am able to successfully retrieve the UIImage object from the NSItemProvider.
However, I would like to be able to share the image with my container app, without having to store the entire image data inside my shared user defaults. Is there a way to get the PHAsset of the image that the user has chosen in the share extension (if they have picked from their device)?
The documentation on the photos framework (https://developer.apple.com/library/ios/documentation/Photos/Reference/Photos_Framework/) has a line that says "This architecture makes it easy, safe, and efficient to work with the same assets from multiple threads or multiple apps and app extensions."
That line makes me think there is a way to share the same PHAsset between the extension and the container app, but I have yet to figure out how. Is there a way to do that?
This only works if the NSItemProvider gives you a URL with the format:
file:///var/mobile/Media/DCIM/100APPLE/IMG_0007.PNG
which is not always the case for all your assets; if instead it returns a URL like:
file:///var/mobile/Media/PhotoData/OutgoingTemp/2AB79E02-C977-4B4A-AFEE-60BC1641A67F.JPG
then PHAsset will never find your asset. Furthermore, the latter is a copy of your file, so if you happen to have a very large image or video, iOS will duplicate it in that OutgoingTemp directory. Nowhere does the documentation say when it will be deleted; hopefully soon enough.
I think this is a big gap Apple has left between share extensions and the PHPhotoLibrary framework. Apple should create an API to close it, and soon.
You can get the PHAsset if the image is shared from the Photos app. The item provider will give you a URL that contains the image's filename; you can use this to match the PHAsset.
/// Assets that were handled through handleImageItem:completionHandler:
private var handledAssets = [PHAsset]()

/// Key is the matched asset's original file name without suffix, e.g. IMG_193
private lazy var imageAssetDictionary: [String: PHAsset] = {
    let options = PHFetchOptions()
    options.includeHiddenAssets = true
    let fetchResult = PHAsset.fetchAssetsWithOptions(options)
    var assetDictionary = [String: PHAsset]()
    for i in 0 ..< fetchResult.count {
        let asset = fetchResult[i] as! PHAsset
        let fileName = asset.valueForKey("filename") as! String
        let fileNameWithoutSuffix = fileName.componentsSeparatedByString(".").first!
        assetDictionary[fileNameWithoutSuffix] = asset
    }
    return assetDictionary
}()
...
provider.loadItemForTypeIdentifier(imageIdentifier, options: nil) { imageItem, _ in
    if let image = imageItem as? UIImage {
        // handle UIImage
    } else if let data = imageItem as? NSData {
        // handle NSData
    } else if let url = imageItem as? NSURL {
        // Prefix check: the image was shared from the Photos app
        if let imageFilePath = url.path where imageFilePath.hasPrefix("/var/mobile/Media/") {
            for component in imageFilePath.componentsSeparatedByString("/") where component.containsString("IMG_") {
                // photo: /var/mobile/Media/DCIM/101APPLE/IMG_1320.PNG
                // edited photo: /var/mobile/Media/PhotoData/Mutations/DCIM/101APPLE/IMG_1309/Adjustments/FullSizeRender.jpg
                // Cut the file's suffix, if any, to get a file name like IMG_1309.
                let fileName = component.componentsSeparatedByString(".").first!
                if let asset = imageAssetDictionary[fileName] {
                    handledAssets.append(asset)
                    imageCreationDate = asset.creationDate // imageCreationDate is a property defined elsewhere
                }
                break
            }
        }
    }
}
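For reference, here is a hedged modern-Swift sketch of the same filename-matching idea; like the answer above, it relies on the undocumented "filename" key read through key-value coding, so treat it as a best-effort workaround rather than a supported API:

import Photos

func matchAsset(forFileURL url: URL) -> PHAsset? {
    // IMG_1320.PNG -> IMG_1320
    let targetName = url.deletingPathExtension().lastPathComponent
    let options = PHFetchOptions()
    options.includeHiddenAssets = true
    let fetchResult = PHAsset.fetchAssets(with: options)
    var match: PHAsset?
    fetchResult.enumerateObjects { asset, _, stop in
        if let fileName = asset.value(forKey: "filename") as? String,
           fileName.components(separatedBy: ".").first == targetName {
            match = asset
            stop.pointee = true
        }
    }
    return match
}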

Unable to edit screenshots, performChanges block fails

I'm developing an app that allows users to edit photos using PhotoKit. I was previously saving the edited photo to disk as a JPEG. I would like to avoid converting to JPEG and have implemented the modifications needed to do that. It works great for photos taken with the camera, but if you try to edit a screenshot, the PHPhotoLibrary.sharedPhotoLibrary().performChanges block will fail and log The operation couldn't be completed. (Cocoa error -1.). I am not sure why this causes the performChanges block to fail. What have I done wrong here?
I've created a sample app available to download that demonstrates the problem, and I've included the relevant code below. The app attempts to edit the newest photo in your photo library. If it succeeds it will prompt for access to edit the photo, otherwise nothing will happen and you'll see the console log. To reproduce the issue, take a screenshot then run the app.
Current code that works with screenshots:
let jpegData: NSData = outputPhoto.jpegRepresentationWithCompressionQuality(0.9)
let contentEditingOutput = PHContentEditingOutput(contentEditingInput: self.input)
var error: NSError?
let success = jpegData.writeToURL(contentEditingOutput.renderedContentURL, options: NSDataWritingOptions.AtomicWrite, error: &error)
if success {
    return contentEditingOutput
} else {
    return nil
}
Replacement code that causes screenshots to fail:
let url = self.input.fullSizeImageURL
let orientation = self.input.fullSizeImageOrientation
var inputImage = CIImage(contentsOfURL: url)
inputImage = inputImage.imageByApplyingOrientation(orientation)
let outputPhoto = createOutputImageFromInputImage(inputImage)!

let originalImageData = NSData(contentsOfURL: self.input.fullSizeImageURL)!
let imageSource = CGImageSourceCreateWithData(originalImageData, nil)
let dataRef = CFDataCreateMutable(nil, 0)
// CGImageSourceGetType automatically selects JPG, PNG, etc. based on the original format
let destination = CGImageDestinationCreateWithData(dataRef, CGImageSourceGetType(imageSource), 1, nil)

struct ContextStruct {
    static var ciContext: CIContext? = nil
}

if ContextStruct.ciContext == nil {
    let eaglContext = EAGLContext(API: .OpenGLES2)
    ContextStruct.ciContext = CIContext(EAGLContext: eaglContext)
}

let cgImage = ContextStruct.ciContext!.createCGImage(outputPhoto, fromRect: outputPhoto.extent())
CGImageDestinationAddImage(destination, cgImage, nil)

if CGImageDestinationFinalize(destination) {
    let contentEditingOutput = PHContentEditingOutput(contentEditingInput: self.input)
    var error: NSError?
    let imageData: NSData = dataRef
    let success = imageData.writeToURL(contentEditingOutput.renderedContentURL, options: .AtomicWrite, error: &error)
    if success {
        // it does succeed
        return contentEditingOutput
    } else {
        return nil
    }
}
The problem happens because adjusted photos are always saved as JPG files, and screenshots are in fact PNG files.
It occurred to me while I was debugging your sample project: in the PhotoEditor, contentEditingOutput.renderedContentURL is a URL to a JPG, while if you examine the result of CGImageSourceGetType(imageSource) it is clear that it's a PNG (it returns the PNG UTI: public.png).
So I went and read the documentation for renderedContentURL, which states that when editing a photo asset, the altered image is written in JPEG format, which clearly won't work if your image is a PNG. This leads me to think that Apple doesn't support editing PNG files, or doesn't want you to. Go figure.
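In practice, the fix implied here (and consistent with the questioner's original working JPEG path) is to always write the adjusted image as JPEG, whatever the source format. A minimal modern-Swift sketch of that idea follows; createOutputImageFromInputImage and self.input come from the question's code, and everything else is an assumption for illustration:

import CoreImage
import ImageIO
import MobileCoreServices
import Photos

func makeJPEGOutput(from outputPhoto: CIImage, input: PHContentEditingInput) -> PHContentEditingOutput? {
    let output = PHContentEditingOutput(contentEditingInput: input)
    let context = CIContext()
    guard let cgImage = context.createCGImage(outputPhoto, from: outputPhoto.extent),
          // Force JPEG regardless of the source type, since renderedContentURL expects JPEG for photos.
          let destination = CGImageDestinationCreateWithURL(output.renderedContentURL as CFURL, kUTTypeJPEG, 1, nil) else {
        return nil
    }
    CGImageDestinationAddImage(destination, cgImage, nil)
    return CGImageDestinationFinalize(destination) ? output : nil
}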
