I'm implementing a share extension that sends images to a server for computation.
In addition to my progress bar issues, I'm not able to use images shared from the Twitter app. Here is the code I'm using, which works in many other third-party apps.
if let inputItem = extensionContext!.inputItems.first as? NSExtensionItem {
    if let itemProvider = inputItem.attachments?.first as? NSItemProvider {
        if itemProvider.hasItemConformingToTypeIdentifier(kUTTypeImage as String) {
            itemProvider.loadItem(forTypeIdentifier: kUTTypeImage as String) { [unowned self] (imageData, error) in
                let url = imageData as! URL
                self.sendToServer(localUrl: url)
            }
        }
    }
}
The error I get is the following: Could not cast value of type 'NSConcreteData' (0x1a8d45700) to 'NSURL' (0x1a8d36a10), and it occurs on this line: let url = imageData as! URL.
It seems that the item is of type image. Any idea why that is happening while it works for the other apps?
Thanks for your help.
I managed to fix that by using the following code, which takes into account different ways the photo is passed by the itemProvider.
itemProvider.loadItem(forTypeIdentifier: kUTTypeImage as String) { [unowned self] (data, error) in
    let myImage: UIImage?
    switch data {
    case let image as UIImage:
        myImage = image
    case let data as Data:
        myImage = UIImage(data: data)
    case let url as URL:
        // Load the file the provider pointed us to
        if let imageData = try? Data(contentsOf: url) {
            myImage = UIImage(data: imageData)
        } else {
            myImage = nil
        }
    default:
        // There may be other cases...
        print("Unexpected data:", type(of: data))
        myImage = nil
    }
    guard let image = myImage else { return }
    // myImageName is defined elsewhere in my extension
    self.sendToServer(imageData: image, imageName: myImageName)
}
I am a fairly decent Objective-C developer, and I am now learning Swift (which I am finding quite difficult, not only because of new concepts such as optionals, but also because Swift is continually evolving and many of the available tutorials are severely outdated).
Currently I am trying to parse JSON from a URL into an NSDictionary and then use one of its values to display an image (which is also a URL). Something like this:
URL -> NSDictionary -> init UIImage from url -> display UIImage in UIImageView
This is quite easy in Objective-C (and there may even be a shorter answer):
NSURL *url = [NSURL URLWithString:@"https://api.nasa.gov/planetary/apod?api_key=DEMO_KEY"];
NSData *apodData = [NSData dataWithContentsOfURL:url];
NSDictionary *apodDict = [NSJSONSerialization JSONObjectWithData:apodData options:0 error:nil];
The above code snippet gives me back a standard NSDictionary, in which I can refer to the "url" key to get the address of the image I want to display:
"url" : "https://apod.nasa.gov/apod/image/1811/hillpan_apollo15_4000.jpg"
This I then convert into a UIImage and give it to a UIImageView:
NSURL *imageURL = [NSURL URLWithString: [apodDict objectForKey:@"url"]];
NSData *imageData = [NSData dataWithContentsOfURL:imageURL];
UIImage *apodImage = [UIImage imageWithData:imageData];
UIImageView *apodView = [[UIImageView alloc] initWithImage: apodImage];
Now, I am basically trying to replicate the above Objective-C code in Swift but continuously run into walls. I have tried several tutorials (one of which actually did the exact same thing: display a NASA image), as well as a few Stack Overflow answers, but none could help because they are either outdated or do things differently from what I need.
So, I would like to ask the community to provide the Swift 4 code for these problems:
1. Convert data from url into a Dictionary
2. Use key:value pair from dict to get url to display an image
If it is not too much to ask, I would also like detailed descriptions alongside the code, because I would like the answer to be the one comprehensive "tutorial" for this task that I believe is currently not available anywhere.
Thank you!
First of all I'm pretty sure that in half a year you will find Objective-C very complicated and difficult. 😉
Second of all, even your ObjC code is discouraged: don't load data from a remote URL with the synchronous Data(contentsOf:) initializer. Regardless of the language, use an asynchronous API like (NS)URLSession.
And don't use the Foundation collection types NSArray and NSDictionary in Swift. Basically, don't use NS... classes at all if there is a native Swift counterpart.
In Swift 4 you can easily decode the JSON with the Decodable protocol directly into a (Swift) struct; the URL string can even be decoded as a URL.
Create a struct
struct Item: Decodable {
    // let copyright, date, explanation: String
    // let hdurl: String
    // let mediaType, serviceVersion, title: String
    let url: URL
}
Uncomment the lines if you need more than the URL.
And load the data with two data tasks.
let url = URL(string: "https://api.nasa.gov/planetary/apod?api_key=DEMO_KEY")!
let task = URLSession.shared.dataTask(with: url) { (data, _, error) in
    if let error = error { print(error); return }
    do {
        let decoder = JSONDecoder()
        // this line is only needed if all JSON keys are decoded
        decoder.keyDecodingStrategy = .convertFromSnakeCase
        let result = try decoder.decode(Item.self, from: data!)
        let imageTask = URLSession.shared.dataTask(with: result.url) { (imageData, _, imageError) in
            if let imageError = imageError { print(imageError); return }
            DispatchQueue.main.async {
                let apodImage = UIImage(data: imageData!)
                let apodView = UIImageView(image: apodImage)
                // do something with the image view
            }
        }
        imageTask.resume()
    } catch { print(error) }
}
task.resume()
You can use this extension
extension UIImage {
    public static func loadFrom(url: URL, completion: @escaping (_ image: UIImage?) -> ()) {
        DispatchQueue.global().async {
            if let data = try? Data(contentsOf: url) {
                DispatchQueue.main.async {
                    completion(UIImage(data: data))
                }
            } else {
                DispatchQueue.main.async {
                    completion(nil)
                }
            }
        }
    }
}
Usage:
guard let url = URL(string: "http://myImage.com/image.png") else { return }
UIImage.loadFrom(url: url) { image in
    self.photo.image = image
}
Since image loading is a common task that can be implemented in many different ways, I would recommend that you not "reinvent the wheel" and have a look at an image loading library such as Nuke, since it already covers most of the cases you might need during development.
It allows you to load and show an image asynchronously in your view using a simple API:
Nuke.loadImage(with: url, into: imageView)
And also, if you need to, you can specify how the image should be loaded and presented:
let options = ImageLoadingOptions(
    placeholder: UIImage(named: "placeholder"),
    failureImage: UIImage(named: "failure_image"),
    contentModes: .init(
        success: .scaleAspectFill,
        failure: .center,
        placeholder: .center
    )
)
Nuke.loadImage(with: url, options: options, into: imageView)
Create a UIImageView extension with the following code:
extension UIImageView {
    public func imageFromServerURL(urlString: String) {
        self.image = nil
        let urlStringNew = urlString.replacingOccurrences(of: " ", with: "%20")
        guard let url = URL(string: urlStringNew) else { return }
        URLSession.shared.dataTask(with: url, completionHandler: { (data, response, error) -> Void in
            if error != nil {
                print(error as Any)
                return
            }
            guard let data = data else { return }
            DispatchQueue.main.async(execute: { () -> Void in
                let image = UIImage(data: data)
                self.image = image
            })
        }).resume()
    }
}
and call it like this:
self.UploadedImageView.imageFromServerURL(urlString: imageURLString!)
I have just extended vadian's answer and separated some concerns to make the basics clearer. His answer should suffice.
First, you have to build your structure. This will represent the JSON structure you retrieved from the webservice.
struct Item: Codable {
    let url, hdurl: URL
    let copyright, explanation, media_type, service_version, title: String
}
Then write your request methods. I usually create a separate file for them. Now, vadian mentioned completion handlers: these are represented by escaping closures. Here, a closure is passed to both functions and called with the decoded data as its argument.
struct RequestCtrl {
    func fetchItem(completion: @escaping (Item?) -> Void) {
        let url = URL(string: "https://api.nasa.gov/planetary/apod?api_key=DEMO_KEY")!
        // URLSessionDataTask handles the request and returns the data, which you decode based on the Item structure defined above.
        let task = URLSession.shared.dataTask(with: url) { (data, _, _) in
            let jsonDecoder = JSONDecoder()
            if let data = data,
               let item = try? jsonDecoder.decode(Item.self, from: data) {
                // jsonDecoder requires the type of our structure (represented by .self) and the data from the request.
                completion(item)
            } else {
                completion(nil)
            }
        }
        task.resume()
    }

    func fetchItemPhoto(usingURL url: URL, completion: @escaping (Data?) -> Void) {
        let task = URLSession.shared.dataTask(with: url) { (data, _, _) in
            if let data = data { completion(data) } else { completion(nil) }
        }
        task.resume()
    }
}
Now, in your ViewController, call your request and handle the execution of your closure.
class ViewController: UIViewController {
    let requestCtrl = RequestCtrl()

    override func viewDidLoad() {
        super.viewDidLoad()
        requestCtrl.fetchItem { (fetchedItem) in
            guard let fetchedItem = fetchedItem else { return }
            self.getPhoto(with: fetchedItem)
        }
    }

    func getPhoto(with item: Item) {
        requestCtrl.fetchItemPhoto(usingURL: item.url) { (fetchedPhoto) in
            guard let fetchedPhoto = fetchedPhoto else { return }
            let photo = UIImage(data: fetchedPhoto)
            // now you have a photo at your disposal
        }
    }
}
These are not the best practices, since I am also still learning, so by all means do some research on topics such as closures, iOS concurrency, and URLComponents in Apple's documentation :)
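Since URLComponents came up above, here is a minimal sketch (my own addition, not part of the answer) of building the same APOD request URL with URLComponents instead of a hard-coded string:
var components = URLComponents()
components.scheme = "https"
components.host = "api.nasa.gov"
components.path = "/planetary/apod"
components.queryItems = [URLQueryItem(name: "api_key", value: "DEMO_KEY")]
if let url = components.url {
    // pass this URL to URLSession exactly as in the examples above
    print(url) // https://api.nasa.gov/planetary/apod?api_key=DEMO_KEY
}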
You need to convert the URL string into a URL, load its data, and assign the resulting image to the image view:
if let imageURL = URL(string: YourImageURL),
   let data = try? Data(contentsOf: imageURL) {
    Yourimage.image = UIImage(data: data)
}
First add the pods to your Podfile:
pod 'Alamofire'
pod 'AlamofireImage'
You can check this link for how to install pods => https://cocoapods.org/pods/AlamofireImage
// Use this function to load an image from a URL into an image view
imageView.af_setImage(
    withURL: url,
    placeholderImage: placeholderImage // optional, if you want to add a placeholder
)
Check this link for the AlamofireImage methods:
https://github.com/Alamofire/AlamofireImage/blob/master/Documentation/AlamofireImage%203.0%20Migration%20Guide.md
Update for Xcode 13.3, Swift 5
To load the Image asynchronously from a URL string, use this extension:
extension UIImageView {
    public func getImageFromURLString(imageURLString: String) {
        guard let imageURL = URL(string: imageURLString) else { return }
        Task {
            await requestImageFromURL(imageURL)
        }
    }

    private func requestImageFromURL(_ imageURL: URL) async {
        let urlRequest = URLRequest(url: imageURL)
        do {
            let (data, response) = try await URLSession.shared.data(for: urlRequest)
            if let httpResponse = response as? HTTPURLResponse {
                if httpResponse.statusCode == 200 {
                    print("Fetched image successfully")
                }
            }
            // Loading the image here
            self.image = UIImage(data: data)
        } catch let error {
            print(error)
        }
    }
}
Usage:
imageView.getImageFromURLString(imageURLString: "https://apod.nasa.gov/apod/image/1811/hillpan_apollo15_4000.jpg")
I am calling an API in iOS (Swift). Everything works, but getting the response takes too much time, approximately 40 to 60 seconds. I don't know why this is happening. Let me show you my API calling method:
Code
func userDetailAPI() {
    let preferences = UserDefaults.standard
    let uid = "u_id"
    let acctkn = "acc_tkn"
    if preferences.object(forKey: uid) == nil {
        // Doesn't exist
    } else {
        let u_id = preferences.object(forKey: uid) as! String
        print(u_id)
        let acc_tkn = preferences.object(forKey: acctkn) as! String
        print(acc_tkn)
        let userprofile: [String : Any] = ["user_id": u_id, "access_token": acc_tkn]
        print(userprofile)
        Alamofire.request(userDetails, method: .post, parameters: userprofile).responseJSON { response in
            print("RESPONSE : \(response)")
            let result = response.result.value
            if result != nil {
                let data = result as! [String : AnyObject]
                let userdata = data["data"] as! NSDictionary
                let email = userdata["email"]
                let name = userdata["name"]
                let photo = userdata["photo"]
                //let u_type = userdata["user_type"]!
                self.lblUserName.text = name as? String
                self.lblEmailID.text = email as? String
                let proimgurl = NSURL(string: photo as! String)
                self.imgProPic.image = UIImage(data: NSData(contentsOf: proimgurl! as URL)! as Data)
                // }
            }
        }
    }
}
Please check and help me - is this the right method for API calling or is there any other, better way?
Because of this line
self.imgProPic.image = UIImage(data: NSData(contentsOf: proimgurl! as URL)! as Data)
So you have the Alamofire request plus blocking the main thread until the image is downloaded. Consider using the asynchronous, automatically caching SDWebImage:
self.imgProPic.sd_setImage(with: proimgurl!, placeholderImage: UIImage(named: "placeholder.png"))
Also, in Swift avoid using NS classes, as here:
let userdata = data["data"] as! NSDictionary // use [String:Any]
and
let proimgurl = NSURL(string: photo as! String) // use URL
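For illustration, here is a rough sketch (my own, based on the code in the question, assuming the same outlet and key names) of those lines rewritten with native Swift types and SDWebImage:
if let data = response.result.value as? [String: Any],
   let userdata = data["data"] as? [String: Any] {
    self.lblUserName.text = userdata["name"] as? String
    self.lblEmailID.text = userdata["email"] as? String
    if let photo = userdata["photo"] as? String, let proimgurl = URL(string: photo) {
        // loads asynchronously and caches, instead of blocking the main thread
        self.imgProPic.sd_setImage(with: proimgurl, placeholderImage: UIImage(named: "placeholder.png"))
    }
}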
You should download the image view's image from the URL on another thread; doing it on the main thread blocks the UI and slows down your app.
The line that causes the problem is this one:
self.imgProPic.image = UIImage(data: NSData(contentsOf: proimgurl! as URL)! as Data)
I suggest you use the SDWebImage library.
You can do something like below:
let imageUrl = URL(string: photo as! String)
self.imgProPic.sd_setImage(with: imageUrl, placeholderImage: UIImage(named: "profile"), options: .refreshCached, completed: nil)
If this doesn't solve your problem, try calling the same web service using API clients such as Postman. If it's taking the same amount of time, then you can't do much about it. Ask the web service developer to optimize the performance.
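If you want to confirm this from inside the app first, here is a rough sketch (my own, assuming Alamofire 4 and the same userDetails/userprofile values as in the question) that measures the API round trip on its own:
let start = Date()
Alamofire.request(userDetails, method: .post, parameters: userprofile).responseJSON { response in
    // If this prints roughly 40-60 seconds, the server or network is the bottleneck;
    // if it prints a small number, the delay comes from work done after the response,
    // such as the synchronous NSData(contentsOf:) image download on the main thread.
    print("API round trip took \(Date().timeIntervalSince(start)) seconds")
}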
Hey, by the way, there is also the AlamofireImage pod available.
https://github.com/Alamofire/AlamofireImage
e.g., import AlamofireImage in your file and load the image URL like below:
Alamofire.request(image_url, method: .get).responseImage(completionHandler: { (response) in
self.your_UIImage_variable.image = response.result.value
})
I am using Core Data in an app. I have set the image attribute as Binary Data in the data model, but I fetch the image from the server as a UIImage, and it throws this error:
cannot assign value of type 'UIImage?' to type 'NSData?'
I searched but couldn't find a solution for it. Can anyone help me, in Swift 3?
My code is:
let url1: URL = URL(string: self.appDictionary.value(forKey: "image") as! String)!
let picture = "http://54.243.11.100/storage/images/news/f/"
let strInterval = String(format: "%@%@", picture as CVarArg, url1 as CVarArg)
let url = URL(string: strInterval)
SDWebImageManager.shared().downloadImage(with: url, options: [], progress: nil, completed: { [weak self] (image, error, cached, finished, url) in
    if self != nil {
        task.imagenews = image // Error: cannot assign value of type 'UIImage?' to type 'NSData?'
    }
})
The error message is pretty clear - you cannot assign a UIImage object to a variable of type NSData.
To convert a UIImage to Swift's Data type, use UIImagePNGRepresentation (note that it returns an optional):
let data: Data? = UIImagePNGRepresentation(image)
Note that if you're using Swift, you should be using Swift's type Data instead of NSData
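For completeness, here is a small sketch (my own addition, assuming imagenews is the NSData? attribute from the question and imageView is a hypothetical UIImageView) of the reverse direction, i.e. reading the stored bytes back out of Core Data and turning them into an image again:
if let storedData = task.imagenews {
    imageView.image = UIImage(data: storedData as Data)
}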
You must convert the image into Data (or NSData) to match the imagenews data type.
Try this
let url1: URL = URL(string: self.appDictionary.value(forKey: "image") as! String)!
let picture = "http://54.243.11.100/storage/images/news/f/"
let strInterval = String(format: "%@%@", picture as CVarArg, url1 as CVarArg)
let url = URL(string: strInterval)
SDWebImageManager.shared().downloadImage(with: url, options: [], progress: nil, completed: { [weak self] (image, error, cached, finished, url) in
    if self != nil, let image = image {
        if let data = image.pngRepresentationData { // If image type is PNG
            task.imagenews = data as NSData
        } else if let data = image.jpegRepresentationData { // If image type is JPG/JPEG
            task.imagenews = data as NSData
        }
    }
})
// UIImage extension, helps to convert an image into data
extension UIImage {
    var pngRepresentationData: Data? {
        return UIImagePNGRepresentation(self)
    }
    var jpegRepresentationData: Data? {
        return UIImageJPEGRepresentation(self, 1.0)
    }
}
My iOS app (Swift 3) needs to import images from other apps using an Action Extension. I'm using the standard Action Extension template code, which works just fine for apps like iOS Mail and Photos, where the image shared is a URL to a local file. But for certain apps, where the image being shared is the actual image data itself, my action extension code isn't getting the image.
for item: Any in self.extensionContext!.inputItems {
    let inputItem = item as! NSExtensionItem
    for provider: Any in inputItem.attachments! {
        let itemProvider = provider as! NSItemProvider
        if itemProvider.hasItemConformingToTypeIdentifier(kUTTypeImage as String) { // we'll take any image type: gif, png, jpg, etc.
            // This is an image. We'll load it, then place it in our image view.
            weak var weakImageView = self.imageView
            itemProvider.loadItem(forTypeIdentifier: kUTTypeImage as String, options: nil, completionHandler: { (imageURL, error) in
                OperationQueue.main.addOperation {
                    if let strongImageView = weakImageView {
                        if let imageURL = imageURL as? NSURL {
                            strongImageView.image = UIImage(data: NSData(contentsOf: imageURL as URL)! as Data)
                            let imageData = NSData(contentsOf: imageURL as URL)! as Data
                            self.gifImageView.image = UIImage.gif(data: imageData)
                            let width = strongImageView.image?.size.width
                            let height = strongImageView.image?.size.height
                            // .... my custom logic
                        }
                    }
For reference, I reached out to the developer for one of the apps where things aren't working and he shared this code on how he is sharing the image to the Action Extension.
//Here is the relevant code. At this point the scaledImage variable holds a UIImage.
var activityItems = Array<Any?>()
if let pngData = UIImagePNGRepresentation(scaledImage) {
    activityItems.append(pngData)
} else {
    activityItems.append(scaledImage)
}
//Then a little later it presents the share sheet:
let activityVC = UIActivityViewController(activityItems: activityItems, applicationActivities: [])
self.present(activityVC, animated: true, completion: nil)
Figured it out thanks to this post, which explains the challenge quite well: https://pspdfkit.com/blog/2017/action-extension/. In summary, we don't know whether the sharing app is giving us a URL to an existing image or just raw image data, so we need to modify the out-of-the-box action extension template code to handle both cases.
for item: Any in self.extensionContext!.inputItems {
    let inputItem = item as! NSExtensionItem
    for provider: Any in inputItem.attachments! {
        let itemProvider = provider as! NSItemProvider
        if itemProvider.hasItemConformingToTypeIdentifier(kUTTypeImage as String) { // we'll take any image type: gif, png, jpg, etc.
            // This is an image. We'll load it, then place it in our image view.
            weak var weakImageView = self.imageView
            itemProvider.loadItem(forTypeIdentifier: kUTTypeImage as String, options: nil, completionHandler: { (imageURL, error) in
                OperationQueue.main.addOperation {
                    if let strongImageView = weakImageView {
                        if let imageURL = imageURL as? NSURL {
                            strongImageView.image = UIImage(data: NSData(contentsOf: imageURL as URL)! as Data)
                            let imageData = NSData(contentsOf: imageURL as URL)! as Data
                            self.gifImageView.image = UIImage.gif(data: imageData)
                            let width = strongImageView.image?.size.width
                            let height = strongImageView.image?.size.height
                            // .... my custom logic
                        } else {
                            guard let imageData = imageURL as? Data else { return } // can we cast to image data?
                            strongImageView.image = UIImage(data: imageData)
                            // custom logic
                        }
                    }
                }
            })
        }
    }
}
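As a side note (my own addition, assuming your deployment target is iOS 11 or later, so not part of the original answer): NSItemProvider can also hand you a UIImage directly via loadObject(ofClass:), which hides the URL-versus-raw-data difference entirely:
if itemProvider.canLoadObject(ofClass: UIImage.self) {
    itemProvider.loadObject(ofClass: UIImage.self) { object, error in
        guard let image = object as? UIImage else { return }
        OperationQueue.main.addOperation {
            self.imageView.image = image // the same image view as in the code above
        }
    }
}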
I have some code that creates an NSSecureCoding variable called "content" and I want to convert that variable into NSData that can then be made into a UIImage or be sent to a local server. How do I convert this properly? I want this for a Share Extension I am making in my iOS app, so when you press share on a photo, it gets the photo contents and converts it into NSData. Here is my code:
inputItem = extensionContext!.inputItems.first as! NSExtensionItem
attachment = inputItem.attachments![0] as! NSItemProvider
if attachment.hasItemConformingToTypeIdentifier(kUTTypeImage as String) {
    attachment.loadItemForTypeIdentifier(kUTTypeImage as String,
                                         options: nil,
                                         completionHandler: { (content, error: NSError!) in
        // insert code to convert "content" (NSSecureCoding) to an NSData variable
    })
}
DispatchQueue.global().async {
    attachment.loadItem(forTypeIdentifier: kUTTypeImage as String, options: nil, completionHandler: { (item, error) in
        if let error = error {
            print(error.localizedDescription)
            return
        }
        var image: UIImage?
        if item is UIImage {
            image = item as? UIImage
        }
        if item is URL {
            if let data = try? Data(contentsOf: item as! URL) {
                image = UIImage(data: data)
            }
        }
        if item is Data {
            image = UIImage(data: item as! Data)
        }
        if let image = image {
            DispatchQueue.main.async {
                // image here
            }
        }
    })
}
Kinda late, but this happened to me today and I solved it like this,
inside the completionHandler:
if let url = content as? URL, let data = try? Data(contentsOf: url) {
    self.imageData = UIImage(data: data)
}
imageData is of type UIImage.
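Since the question specifically asked for NSData, here is a minimal sketch (my own, not from the answers above) that produces a Data value from the loaded item, covering the three shapes it commonly arrives in (file URL, raw data, or UIImage):
attachment.loadItem(forTypeIdentifier: kUTTypeImage as String, options: nil) { item, error in
    var imageData: Data?
    if let url = item as? URL {
        imageData = try? Data(contentsOf: url)      // item is a file URL
    } else if let data = item as? Data {
        imageData = data                            // item is already raw image data
    } else if let image = item as? UIImage {
        imageData = UIImagePNGRepresentation(image) // item is a UIImage (image.pngData() on newer SDKs)
    }
    // imageData can now be turned into a UIImage or uploaded to the server
}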