I know that you can fetch metadata using LPMetadataProvider, and you can get the title of an article from the metadata. But can you get the link of the header image?
let metadataProvider = LPMetadataProvider()
let url = URL(string: "https://www.apple.com/ipad")!
metadataProvider.startFetchingMetadata(for: url) { metadata, error in
    if error != nil { return }
    // trying to get the image url from metadata
}
I believe the answer is no. You can only get the actual image, not its URL. Image URLs may change, and I believe the idea is not to have cached image URLs.
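What you can get is the image itself, through the metadata's imageProvider. A minimal sketch (iOS 13+, error handling elided):

metadataProvider.startFetchingMetadata(for: url) { metadata, error in
    guard let imageProvider = metadata?.imageProvider else { return }
    // Loads the header image as a UIImage; LinkPresentation never exposes its URL
    imageProvider.loadObject(ofClass: UIImage.self) { object, _ in
        guard let image = object as? UIImage else { return }
        DispatchQueue.main.async {
            // use the image, e.g. imageView.image = image
        }
    }
}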
I'm using the SDWebImage library to load thumbnail images, and it works seamlessly.
However, when I navigate from the thumbnail to the controller where I play the video, I need to show the thumbnail once again, so I need an image path to pass to the player.
The problem is that if I pass the same URL, the player will download the image again. To avoid this, I'm trying to get the image from disk, where it has already been stored by the SDWebImage library.
/// get thumbnail from cache
var thumbnail: String?
if video?.hasThumbnail == true {
    let urlString = "https://test.com/image/001.png"
    if let path = SDImageCache.shared.cachePath(forKey: urlString) {
        thumbnail = path
    } else {
        thumbnail = urlString
    }
}
This works on the simulator, but NOT on a real device.
Please read more about how to use SDWebImage's sd_setImage(with:placeholderImage:) API:
imageView.sd_setImage(with: URL(string: "http://www.example.com/path/to/image.jpg"), placeholderImage: UIImage(named: "placeholder.png"))
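If you do need the cached file itself, keep in mind that cachePath(forKey:) only tells you where SDWebImage would store the image; it does not guarantee that the file exists yet, and a bare filesystem path is not a valid URL string. A sketch of the idea, assuming the SDWebImage 5.x API already used in the question:

let urlString = "https://test.com/image/001.png"
if let cachedPath = SDImageCache.shared.cachePath(forKey: urlString),
   FileManager.default.fileExists(atPath: cachedPath) {
    // The image is already on disk: hand the player a proper file:// URL
    let thumbnailUrl = URL(fileURLWithPath: cachedPath)
    // pass thumbnailUrl to the player
} else {
    // Not cached yet: fall back to the remote URL and let SDWebImage cache it
    let thumbnailUrl = URL(string: urlString)
    // pass thumbnailUrl to the player
}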
I have an application where the user can upload multiple images; all the images are stored on a server and displayed in a web view in my iOS application.
Everything worked just about fine up until iOS 10, but then we started seeing some pictures/images not being displayed. After a little debugging we found that the problem is caused by Apple's new image format (HEIC).
I tried switching back to the native UIImagePickerController (which only picks one image) and those images are displayed fine; I guess Apple converts the image from HEIC to JPG when the user picks it. But this is not the case when I use third-party libraries, which I need for a multiple-image picker.
Though we are hard at work adding the conversion on the server side, so that users who have not updated the app don't run into trouble, I would also like to know whether there is any way to convert the image format locally in my application.
There's a workaround to convert HEIC photos to JPEG before uploading them to the server:
NSData *jpgImageData = UIImageJPEGRepresentation(image, 0.7);
If you use PHAsset, then, in order to get the image object, you'll need to call this method from PHImageManager:
- (PHImageRequestID)requestImageForAsset:(PHAsset *)asset targetSize:(CGSize)targetSize contentMode:(PHImageContentMode)contentMode options:(nullable PHImageRequestOptions *)options resultHandler:(void (^)(UIImage *__nullable result, NSDictionary *__nullable info))resultHandler;
On the server side you also have the option of using a conversion API or an online converter directly.
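In Swift, the same approach might look roughly like this (just a sketch; jpegData(for:completion:) is a hypothetical helper, and it assumes you already have the PHAsset from your multi-image picker):

import Photos
import UIKit

func jpegData(for asset: PHAsset, completion: @escaping (Data?) -> Void) {
    let options = PHImageRequestOptions()
    options.isNetworkAccessAllowed = true        // allow fetching the original from iCloud
    options.deliveryMode = .highQualityFormat
    PHImageManager.default().requestImage(for: asset,
                                          targetSize: PHImageManagerMaximumSize,
                                          contentMode: .aspectFit,
                                          options: options) { image, _ in
        // Re-encoding through UIImage yields JPEG data regardless of the source format (HEIC included)
        completion(image?.jpegData(compressionQuality: 0.7))
    }
}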
I've done it this way:
let jpegData = Utility.getJpegData(imageData: imageData!, referenceUrl: referenceUrl!)

/// Converts HEIC image data to JPEG data, or returns nil if the conversion fails.
public static func getJpegData(imageData: Data, referenceUrl: NSURL) -> Data? {
    // Make sure the asset at the reference URL is still readable before converting.
    guard (try? Data(contentsOf: referenceUrl as URL)) != nil,
          let image = UIImage(data: imageData) else {
        return nil
    }
    // Re-encoding through UIImage yields JPEG data regardless of the source format.
    return image.jpegData(compressionQuality: 1.0)
}
In Swift 3, given an input path to an existing HEIF picture and an output path where the JPG file should be saved:
func fromHeicToJpg(heicPath: String, jpgPath: String) -> UIImage? {
    // UIImage(named:) looks up images in the asset catalog/bundle by name,
    // so load from the file path instead.
    guard let heicImage = UIImage(contentsOfFile: heicPath),
          let jpgImageData = UIImageJPEGRepresentation(heicImage, 1.0) else {
        return nil
    }
    FileManager.default.createFile(atPath: jpgPath, contents: jpgImageData, attributes: nil)
    return UIImage(contentsOfFile: jpgPath)
}
It returns the UIImage at jpgPath, or nil if something went wrong.
I have found the existing answers helpful, but I've decided to post my take on the solution as well. Hopefully it's a bit clearer and more "complete".
This solution saves the image to a file.
private let fileManager: FileManager

func save(asset: PHAsset, to destination: URL) {
    let options = PHContentEditingInputRequestOptions()
    options.isNetworkAccessAllowed = true
    asset.requestContentEditingInput(with: options) { input, info in
        guard let input = input, let url = input.fullSizeImageURL else {
            return // you might want to handle this case
        }
        do {
            try self.copy(input, at: url, to: destination)
            // success!
        } catch {
            // failure, handle the error!
        }
    }
}

private func copy(
    _ input: PHContentEditingInput, at url: URL, to destination: URL
) throws {
    let uniformType = input.uniformTypeIdentifier ?? ""
    switch uniformType {
    case UTType.jpeg.identifier: // UTType needs `import UniformTypeIdentifiers` (iOS 14+)
        // Copy JPEG files directly
        try fileManager.copyItem(at: url, to: destination)
    default:
        // Convert HEIC/PNG and other formats to JPEG and save to file
        let image = UIImage(data: try Data(contentsOf: url))
        guard let data = image?.jpegData(compressionQuality: 1) else {
            return // you might want to handle this case
        }
        try data.write(to: destination)
    }
}
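A call site might look like this (pickedAsset and the temporary-file destination are placeholders for whatever your picker and upload flow actually use):

let destination = FileManager.default.temporaryDirectory
    .appendingPathComponent(UUID().uuidString)
    .appendingPathExtension("jpg")
save(asset: pickedAsset, to: destination)
// destination now points at a JPEG file (or a direct copy, for JPEG assets) ready to upload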
I'm trying to get the last frame from a video. The last frame, not the last second (I have very fast videos, so one second can contain several different scenes).
I've written the following code for testing:
private func getLastFrame(from item: AVPlayerItem) -> UIImage? {
    let imageGenerator = AVAssetImageGenerator(asset: item.asset)
    imageGenerator.requestedTimeToleranceAfter = kCMTimeZero
    imageGenerator.requestedTimeToleranceBefore = kCMTimeZero

    let composition = AVVideoComposition(propertiesOf: item.asset)
    let time = CMTimeMakeWithSeconds(item.asset.duration.seconds, composition.frameDuration.timescale)

    do {
        let cgImage = try imageGenerator.copyCGImage(at: time, actualTime: nil)
        return UIImage(cgImage: cgImage)
    } catch {
        print("\(error)")
        return nil
    }
}
But I always receive this error when I try to execute it:
Domain=AVFoundationErrorDomain Code=-11832 "Cannot Open"
UserInfo={NSUnderlyingError=0x170240180 {Error
Domain=NSOSStatusErrorDomain Code=-12431 "(null)"},
NSLocalizedFailureReason=This media cannot be used.,
NSLocalizedDescription=Cannot Open}
If I remove requestedTimeTolerance (so it falls back to the default, infinite tolerance), everything is okay, but I always receive a brighter image than in the video (maybe because it isn't the latest frame that was captured? Or does the CGImage → UIImage conversion have some issues?)
Questions:
Why do I receive an error when zero tolerance is specified? How can I get exactly the last frame?
Why might the captured image be brighter than in the video? For example, if I write this code:
self.videoLayer.removeFromSuperlayer()
self.backgroundImageView.image = getLastFrame(from: playerItem)
I see "brightness jump" (video was darker, image is brighter).
Update 1
I found a related issue: AVAssetImageGenerator fails at copying image, but that question hasn't been solved either.
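One direction to explore (a sketch of an idea, not a confirmed fix): with zero tolerance, a requested time equal to the asset's duration can fall past the presentation time of the last frame, leaving no frame to snap to, so asking for a time one frame before the end keeps the request inside the video:

// inside getLastFrame(from:), instead of requesting the full duration:
let lastFrameTime = CMTimeSubtract(item.asset.duration, composition.frameDuration)
let cgImage = try imageGenerator.copyCGImage(at: lastFrameTime, actualTime: nil)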
I have an iMessage app that displays some remote content using SDWebImage. The images are downloaded and cached on disk. After choosing an image, I want to attach it to the message as a plain image (not an MSMessage).
Here's the code I'm using:
// image is already downloaded
let cache = SDImageCache.shared()
let key = remoteImageUrl
let fileUrlString = cache.defaultCachePath(forKey: key)!
let fileUrl = URL(string: fileUrlString)!
// image holds the correct UIImage
let image = UIImage(contentsOfFile: fileUrlString)
activeConversation?.insertAttachment(fileUrl, withAlternateFilename: "a funny gif", completionHandler: { (error) in
    // error is nil here
    print("error: \(error)")
})
Here's what happens: the attachment doesn't display in the message, and it seems like the Messages framework can't find the image at that path.
Note: after tapping send, the iMessage app crashes with "MobileSMS quit unexpectedly."
I found out that I needed to use
let fileUrl = URL(fileURLWithPath: fileUrlString)
URL(string:) expects a string that already includes a scheme (such as https:// or file://), while defaultCachePath(forKey:) returns a plain filesystem path, so the file URL initializer is the one to use. Hope this helps someone else.
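For reference, the working flow from the question then looks roughly like this (same SDWebImage 4.x API as in the question):

let cache = SDImageCache.shared()
let fileUrlString = cache.defaultCachePath(forKey: remoteImageUrl)!
// defaultCachePath(forKey:) returns a plain path, so build a file:// URL from it
let fileUrl = URL(fileURLWithPath: fileUrlString)
activeConversation?.insertAttachment(fileUrl, withAlternateFilename: "a funny gif", completionHandler: { error in
    if let error = error {
        print("insertAttachment failed: \(error)")
    }
})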