If I assign an image to a UIImageView in a xib, is that image cached, so that if I later access the image using [UIImage imageNamed:] I am getting cached image data?
I'm using iOS 5.1
[UIImage imageNamed:] does its own caching of any images you use. The first time you use it for a given image, it'll populate the cache, and subsequently it'll use the cached version.
UIImageView in Interface Builder takes a string to tell it what image to use. It appears that the object that is actually encoded in the Xib to represent the image is a private class called UIImageNibPlaceholder, which contains a private NSString variable called runtimeResourceName. It's this class that implements the initWithCoder: method which is used when the system is loading objects from a xib.
So the question is: inside UIImageNibPlaceholder's initWithCoder:, does it use UIImage's imageNamed: method? I think it's reasonable to assume that it does, since the thing stored in the xib is the string runtimeResourceName, and the system is turning that string into an actual image when loading the xib.
This post on the Apple developer forums seems to clarify the point (under NDA so I can't copy it here). I couldn't find any publicly accessible information on the subject.
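One way to check this for yourself (just a sketch, not documented behavior; the outlet name backgroundImageView and the resource name "Background" are placeholders): if the nib-loaded image shares the imageNamed: cache, the two references should point at the very same UIImage instance.

// Hypothetical check in viewDidLoad of a controller whose xib assigns "Background"
// to the backgroundImageView outlet. Pointer equality suggests a shared cache entry.
UIImage *cached = [UIImage imageNamed:@"Background"];
BOOL sharesCache = (self.backgroundImageView.image == cached);
NSLog(@"nib-loaded image shares the imageNamed: cache: %@", sharesCache ? @"YES" : @"NO");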
I have a variable that's of type .Image and class XCUIElement. Something like this:
var image = app.descendantsMatchingType(.Image).elementAtIndex(0)
Is there a way I can extract the actual image so I can compare it to another image?
I've tried calling the value method, but it returns a string. Casting it to a UIImage always fails.
I had a conversation about this with the Apple Developer Tools evangelist recently. There is currently no way of accessing the actual image from an image view, button, etc. Similarly, there is no way to access other properties of views that might be of interest, like "isHidden" or "attributedText". I was told that the engineers on the UI Testing team are interested in the use cases people have for accessing these properties, so it would be very helpful, both for them and for everyone else who wants this feature, if you would file a bug report / feature request at https://bugreport.apple.com
As a tip regarding the "value" property on an XCUIElement: at least for now, this appears to map to the "accessibilityValue" property of whatever view the XCUIElement is referencing. So if you set the accessibilityValue of a view you are interested in to contain the information you want to verify, that can help in testing (there's a short sketch after the two caveats below). Two things to be aware of though:
1) Even though the "value" property of an XCUIElement is of type "id", the type of the accessibilityValue property is "NSString". I don't know what would happen if you try to force some non-string value (like an image) into accessibilityValue and then try to retrieve it from the "value" property of XCUIElement, but I suspect it wouldn't work well. Partially because:
2) The accessibilityValue property of a view is actually used by Apple's VoiceOver feature for the vision impaired. When the value is set, it will be read out loud when the user taps on that element (which is why it's supposed to be a string).
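To illustrate the tip (a sketch only, in Objective-C; the accessibility string, the imageView outlet, and the element index are all hypothetical, and the exact query API names vary between Xcode releases), set the accessibilityValue in the app and read it back through the element's value in the test:

// App code: expose something verifiable about the currently displayed image.
self.imageView.accessibilityValue = @"sunsetPhoto";

// UI test: XCUIElement's value appears to surface that accessibilityValue string.
XCUIApplication *app = [[XCUIApplication alloc] init];
[app launch];
XCUIElement *image = [app.images elementBoundByIndex:0];
XCTAssertEqualObjects(image.value, @"sunsetPhoto");

Keep in mind caveat 2 above: whatever you put in accessibilityValue will also be read aloud by VoiceOver.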
I also covered the issue with not being able to access properties of view via XCUIElement in more detail here: http://www.danielhall.io/exploring-the-new-ui-testing-features-of-xcode-7
I know it may be not exactly what you're looking for, but I managed to write a test that checks if the visual representation of a UIImage on the screen has changed.
I'm using the screenshot() method of XCUIElement, which returns an instance of XCUIScreenshot:
let myImage = XCUIApplication().images["myAccessibilityIdentifier"]
let screenshotBefore = myImage.screenshot()
//...
//do some actions that change the image being displayed
//...
let screenshotAfter = myImage.screenshot()
//Validating that the image changed as intended
XCTAssertNotEqual(screenshotBefore.pngRepresentation, screenshotAfter.pngRepresentation)
The screenshots will be the size of the image as rendered on the screen, which may of course differ from the original image.
It's important to compare the PNG representations via the pngRepresentation property rather than the XCUIScreenshot objects themselves, because the two objects will always differ internally.
This technique can't verify that the image displayed on the screen is exactly the one you need, but at least it can detect changes in the image.
I have an iPhone application that displays details of local businesses. I read the business details from a .csv file to first show all businesses in a UITableViewController, and then depending on the selection, show the details for each business in a UIViewController.
All information is displaying and working as intended, except for my images - they are not displaying. I have created an IBOutlet for the image view, the same as for all the other labels, and this is how I set the image in the UIViewController's viewDidLoad:
self.businessImage.image = [UIImage imageNamed:businessDetailContent.imageName];
Where businessDetailContent is an NSObject that holds the business's details, and imageName is the property containing the image file name.
All images are also imported and present. When I NSLog after the above code, I can see the image is present, but it's not showing up in my UIViewController.
And here is how the information in the .csv file looks
ID, name, address, tel, email, URL, description, hotel_someHotelName
Where hotel_someHotelName is the name of the image file (hotel_someHotelName.jpg).
I have tried adding the .jpg extension to the end of hotel_someHotelName in .csv but still no image.
Some of the images are low quality; could this be the issue?
You are saying:
self.businessImage.image = [UIImage imageNamed:businessDetailContent.imageName];
And no image is appearing in your interface. So there are three possibilities:
[UIImage imageNamed:businessDetailContent.imageName] is nil
self.businessImage is nil
self.businessImage is a non-nil image view, and you are successfully assigning it an image, but that image view is not in your interface (or constraints are resizing it to zero size, so it is effectively invisible)
The first two may be readily tested by careful logging. One way to test the third possibility is to give the image view a colored background (set its backgroundColor; you can do this in Interface Builder too). If you still don't see it, that's the problem.
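A minimal diagnostic along those lines (names taken from the question; the logging format is just a sketch):

UIImage *image = [UIImage imageNamed:businessDetailContent.imageName];
NSLog(@"image = %@, imageView = %@", image, self.businessImage); // covers possibilities 1 and 2
self.businessImage.backgroundColor = [UIColor redColor];         // covers possibility 3: no red rectangle means the view isn't visible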
Just found my issue. In my .csv file, there was a trailing space after the image name. When I removed this space (found purely by accident), the images are working fine.
Thanks for all the help all the same
I've created a UIButton subclass and I need to get the path of the image that the button has. I can write self.path, but not self.imageView.image.path. Any ideas?
I don't believe you can get the path. If you look at the API docs for UIImage, you can instantiate it with a file, but by the time you have an instance the file reference is gone and all that's still there is the image data.
This wouldn't be very efficient, but if you know the candidate files that were used to create the image, then perhaps you could hash the NSData, and compare with a hash of the NSData on the UIButton's UIImage?
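A rough sketch of that idea, assuming you know the candidate resource names (the method name, the array of names, and the png file type are all hypothetical; re-encoding both sides as PNG keeps the comparison consistent, though it remains a heuristic):

// Returns the name of the bundled PNG whose re-encoded data matches the button's current image, or nil.
- (NSString *)imageNameForButton:(UIButton *)button candidates:(NSArray *)names {
    UIImage *current = [button imageForState:UIControlStateNormal];
    if (!current) return nil;
    NSData *buttonData = UIImagePNGRepresentation(current);
    for (NSString *name in names) {
        NSString *path = [[NSBundle mainBundle] pathForResource:name ofType:@"png"];
        UIImage *candidate = [UIImage imageWithContentsOfFile:path];
        if (!candidate) continue;
        if ([UIImagePNGRepresentation(candidate) isEqualToData:buttonData]) {
            return name;
        }
    }
    return nil;
}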
I've never used NSCoding before and I'm very confused about how it should be implemented.
My current iPad app has a UIImageView (called "background") which is a property of my main view controller. "background" has a UIImage "image" property (obviously) and various subviews which are added by the user. The added subviews are my own custom subclasses of UIImageView.
I need to be able to save the state of the "background" UIImageView so it can be restored with the same image and all the subviews in place as it was when archived.
I understand UIImageView conforms to the NSCoding protocol, but I'm not sure where to implement encodeWithCoder and initWithCoder. Do I call these from my main view controller? Do I need to create a category for UIImageView which allows me to override these methods?
Do I need to write code for archiving every property of my "background" UIImageView and its subviews? I have read elsewhere on SO that UIImage does not conform to NSCoding so needs to be subclassed or have a category added in order to be able to archive UIImageView.
I thought there would be a simple way to save to disk an object including all its properties, subviews etc. It seems there's a lot that needs to be done in order for me to save this "background" UIImageView and restore it later. I'm struggling to visualise everything I need to do. Any pointers much appreciated!
Serialization (aka archiving and unarchiving) is actually pretty complicated, but the degree to which Cocoa makes it easy is a pretty impressive feat.
Once you've set things up so that the UIImageView and all of its properties that you want to keep conform to NSCoding, then all you have to do to save the object is:
NSData *dataToSave = [NSKeyedArchiver archivedDataWithRootObject:yourImageView];
And then store that NSData somewhere. Then, to unarchive the object,
UIImageView *restoredImageView = [NSKeyedUnarchiver unarchiveObjectWithData:dataToRestore];
after recovering the NSData from somewhere.
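For "store that NSData somewhere", one concrete option (a sketch; the file name is arbitrary and "background" is the image view property from the question) is to write the archive into the Documents directory and read it back later:

// Saving
NSString *docs = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) firstObject];
NSString *path = [docs stringByAppendingPathComponent:@"background.archive"];
NSData *dataToSave = [NSKeyedArchiver archivedDataWithRootObject:self.background];
[dataToSave writeToFile:path atomically:YES];

// Restoring
NSData *dataToRestore = [NSData dataWithContentsOfFile:path];
UIImageView *restoredImageView = [NSKeyedUnarchiver unarchiveObjectWithData:dataToRestore];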
As for making everything conform to NSCoding: UIImageView conforms to NSCoding, as does UIView, so between your UIImageView, its subviews, and their properties, everything probably conforms except for the actual UIImage. For that, a quick search turns up plenty of categories people have written to make UIImage conform to NSCoding; include one of them in your project and you should be fine.
Looking for an easy way to support retina displays. It occurred to me that if I could look through a nib-loaded view and get the names of all the image resources used there, I could check if they have a corresponding retina image and load it (if it's a retina device).
I know how to iterate through the subviews after the nib is loaded, but I don't know how (or whether you can) get the resource name set in Interface Builder. I'm trying to avoid having to set all the image names in code.
What I'd like to do (in pseudo code):
for subView in self.view.subviews:
    if subView is UIImageView:
        resourceName = (UIImageView *)subView.imageName
        if retinaResourceFileExists(resourceName) and isRetinaDisplay:
            (UIImageView *)subView.image = retinaImage(resourceName)
(Bonus: Maybe there is a way to iterate through IBOutlet variables, but I doubt it?)
Use [UIImage imageNamed:@"Foo"]; and the device will load the @2x image automatically, provided both Foo.png and Foo@2x.png are in your bundle.
Interface Builder loads the retina image automatically, too.