In my app I need to display stores around the user's location. Each store has a name, a tagline, and a logo, and we want to display all of this information in the callout bubble that comes up on the map once I touch a pin. Considering that I need to load the image remotely, and that waiting three seconds to see the callout after touching the pin is not acceptable, what would be the best solution?
The file containing an array of around 20 stores is about 10 KB, but if we load the logos for all of them straight away it might be around 110 KB (estimating 5 KB per image), and I'm not sure that is a good idea.
In one of my projects this case works just fine.
I'm using SDWebImage for the remote async load of the image.
Here is what I did:
Subclass MKPinAnnotationView:
.h

@interface TLStoreResultMapAnnotationView : MKPinAnnotationView

@property (assign) BOOL imageSet;

@end

.m

#import "TLStoreResultMapAnnotationView.h"
#import "TLStoreResultMapAnnotation.h"
#import "UIImageView+WebCache.h"

@implementation TLStoreResultMapAnnotationView

@synthesize imageSet = _imageSet;

- (void)layoutSubviews {
    if (self.selected && !self.imageSet) {
        TLStoreResultMapAnnotation *annotation = (TLStoreResultMapAnnotation *)self.annotation;
        NSURL *url = [NSURL URLWithString:[annotation.store.imageURL stringByAddingPercentEscapesUsingEncoding:NSUTF8StringEncoding]];

        UIImageView *storeImageView = (UIImageView *)self.leftCalloutAccessoryView;
        storeImageView.frame = CGRectMake(storeImageView.frame.origin.x, storeImageView.frame.origin.y, 30.0, 30.0);
        storeImageView.contentMode = UIViewContentModeScaleAspectFill;
        storeImageView.clipsToBounds = YES;
        [storeImageView setImageWithURL:url
                       placeholderImage:[UIImage imageNamed:@"webloading.png"]
                                options:SDWebImageCacheMemoryOnly];

        self.imageSet = YES;
    }
    [super layoutSubviews];

    UIImageView *storeImageView = (UIImageView *)self.leftCalloutAccessoryView;
    storeImageView.frame = CGRectMake(storeImageView.frame.origin.x, storeImageView.frame.origin.y, 30.0, 30.0);
}

@end
Of course, you'll need to adapt the code a bit.
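For completeness, here is a rough sketch of how the annotation view above might be wired up in the map view delegate; the reuse identifier and the 30×30 accessory size are assumptions, not part of the original code:

- (MKAnnotationView *)mapView:(MKMapView *)mapView viewForAnnotation:(id<MKAnnotation>)annotation
{
    if (![annotation isKindOfClass:[TLStoreResultMapAnnotation class]]) return nil;

    static NSString *reuseId = @"StorePin"; // assumed identifier
    TLStoreResultMapAnnotationView *pinView =
        (TLStoreResultMapAnnotationView *)[mapView dequeueReusableAnnotationViewWithIdentifier:reuseId];
    if (!pinView) {
        pinView = [[TLStoreResultMapAnnotationView alloc] initWithAnnotation:annotation reuseIdentifier:reuseId];
        pinView.canShowCallout = YES;
        // the layoutSubviews override above expects an image view as the left callout accessory
        pinView.leftCalloutAccessoryView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 30, 30)];
    } else {
        pinView.annotation = annotation;
        pinView.imageSet = NO; // reset so the newly selected store's logo is loaded
    }
    return pinView;
}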
I am working on playing a video on a GMSMarker pin, and below is my code. Displaying and hiding/showing the pin view all works fine, except the video does not play. I am using a storyboard for the pin view.
@interface MapViewController () <GMSMapViewDelegate>

@property (strong, nonatomic) IBOutlet GMSMapView *mapView;
@property (strong, nonatomic) IBOutlet UIView *pinView;
@property (strong, nonatomic) GMSMarker *london;
@property (strong, nonatomic) AVPlayer *player;
@property (strong, nonatomic) AVPlayerLayer *playerLayer;

@end

@implementation MapViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view.
    GMSCameraPosition *camera = [GMSCameraPosition cameraWithLatitude:51.5 longitude:-0.127 zoom:18];
    self.mapView.camera = camera;
    self.mapView.delegate = self;

    CLLocationCoordinate2D position = CLLocationCoordinate2DMake(51.5, -0.127);
    _london = [GMSMarker markerWithPosition:position];
    _london.title = @"London";
    _london.tracksViewChanges = YES;
    _london.map = self.mapView;

    [self setUpVideoPlayer];
}

- (UIView *)mapView:(GMSMapView *)mapView markerInfoWindow:(GMSMarker *)marker {
    [self.player play];
    return self.pinView;
}

- (void)setUpVideoPlayer {
    NSString *videoFilePath = [[NSBundle mainBundle] pathForResource:@"SampleVideo" ofType:@"mp4"];
    AVAsset *avAsset = [AVAsset assetWithURL:[NSURL fileURLWithPath:videoFilePath]];
    AVPlayerItem *avPlayerItem = [[AVPlayerItem alloc] initWithAsset:avAsset];
    self.player = [[AVPlayer alloc] initWithPlayerItem:avPlayerItem];
    self.playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
    [self.playerLayer setFrame:self.pinView.frame];
    [self.pinView.layer addSublayer:self.playerLayer];
    [self.player seekToTime:kCMTimeZero];
}
Please help me fix this issue. Why is the video not playing?
Thanks in advance.
Normally, the info window will not regenerate after it is shown (it is added as an image). To show video frames it needs to regenerate while the video plays, which means the window needs to refresh automatically. To do that you need to set one property on your GMSMarker, as follows:
_london.tracksInfoWindowChanges = YES;
So your full code is
CLLocationCoordinate2D position = CLLocationCoordinate2DMake(51.5, -0.127);
_london = [GMSMarker markerWithPosition:position];
_london.title = @"London";
_london.tracksViewChanges = YES;
_london.tracksInfoWindowChanges = YES;
_london.map = self.mapView;
The Google documentation is at https://developers.google.com/maps/documentation/ios-sdk/marker#set_an_info_window_to_refresh_automatically
I believe that the reason your code doesn't work is that Google Maps for iOS renders custom info views as images. I'm not sure, but I believe so, because I have also tried with animations and they don't work either: I first see the initial state of the animation, and when I open the marker info window again I see the final state, without the animation. Also, when I add a text field to the custom info window view, I can't tap on that text field. When I try with a video like you did, I see a blank window as if the player is about to start and I hear the sound, but the video never starts.
I have found this note in many questions on Stack Overflow (they say it is from the Google docs), but I don't see it in the Google docs:
Note: The info window is rendered as an image each time it is displayed on the map. This means that any changes to its properties, while it is active, will not be immediately visible. The contents of the info window will be refreshed the next time that it is displayed.
Source: Google Maps Showing speech bubble in marker info window iOS
Maybe you can try with tracksInfoWindowChanges = YES:
- (UIView *)mapView:(GMSMapView *)mapView markerInfoWindow:(GMSMarker *)marker {
    marker.tracksInfoWindowChanges = YES;
    [self.player play];
    return self.pinView;
}
Source: How to force refresh contents of the markerInfoWindow in Google Maps iOS SDK
I know that you have already set it in the viewDidLoad method, but perhaps setting it inside the
- (UIView *)mapView:(GMSMapView *)mapView markerInfoWindow:(GMSMarker *)marker
method will work, although I doubt it. A lot of people say that setting tracksInfoWindowChanges to YES works, but I have tried it, and it doesn't work.
I have tried using Apple's MapKit instead and the video plays without problems; I have never had an issue with Apple Maps. So I'm sure Google Maps isn't a good solution for this problem. Maybe you could fix it with some hack, but I have lost several hours trying to find a solution, without results. For example, it might be possible to move a UIView above the map while the user pans or zooms, or some other workaround, but I prefer system solutions over hacks. In this case I don't see a system solution; maybe I'd find one if I spent more time researching. If I were you, I wouldn't use Google Maps for this.
Edit: I've just seen that my solution works in the simulator. On a real device (iPhone 7 Plus) it doesn't: I see the player controls animating and I hear the sound, but I don't see the video. And even if it worked on a real device the way it works in the simulator, the user couldn't play or pause the video, because they would only see images that are refreshed regularly.
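If you do want to experiment with the overlay-view workaround mentioned above, a rough sketch might look like the following; self.videoView is an assumed UIView that already hosts the AVPlayerLayer and is added as a subview above the map, rather than being returned as an info window:

- (void)mapView:(GMSMapView *)mapView didChangeCameraPosition:(GMSCameraPosition *)position
{
    // keep the video view pinned above the marker while the user pans or zooms
    CGPoint markerPoint = [mapView.projection pointForCoordinate:self.london.position];
    self.videoView.center = CGPointMake(markerPoint.x,
                                        markerPoint.y - self.videoView.bounds.size.height / 2.0);
}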
I have a larger app that should be able to share multiple images.
I implemented this using UIActivityViewController and UIActivityItemProvider for asynchronous handling of the items (so that I only have to prepare one image at a time rather than filling memory with all of them at once to share them).
I was "inspired" by the Apple Airdrop example:
AirdropSample download
However, when using my app to share e.g. 9 images (to the camera roll, i.e. "Save 9 Images"), only 4 to 7 images end up in the camera roll, with no error messages whatsoever.
If I repeat it over and over, I sometimes get 5 images, sometimes 6; it seems random.
I cannot post my app here, but I modified the sample above so that it also randomly "fails" to deliver all images to the camera roll...
If you download the sample above and replace four of its files with these, it shows the problem:
APLAsyncImageViewController.h:

#import <UIKit/UIKit.h>
#import "APLAsyncImageActivityItemProvider.h"

@interface APLAsyncImageViewController : UIViewController

@end
APLAsyncImageViewController.m:

#import "APLAsyncImageViewController.h"
#import "APLProgressAlertViewController.h"

NSString * const kProgressAlertViewControllerIdentifier = @"APLProgressAlertViewController";

@interface APLAsyncImageViewController ()

@property (strong, nonatomic) UIWindow *alertWindow;
@property (strong, nonatomic) APLProgressAlertViewController *alertViewController;
@property (strong, nonatomic) UIPopoverController *activityPopover;
@property (weak, nonatomic) IBOutlet UIButton *shareImageButton;

- (IBAction)openActivitySheet:(id)sender;

@end

@implementation APLAsyncImageViewController
- (IBAction)openActivitySheet:(id)sender
{
    NSMutableArray *itemArray = [[NSMutableArray alloc] init];
    for (int i = 0; i < 9; i++)
    {
        APLAsyncImageActivityItemProvider *aiImageItemProvider = [[APLAsyncImageActivityItemProvider alloc] init];
        [itemArray addObject:aiImageItemProvider];
    }

    // Create an activity view controller with the activity provider item.
    // UIActivityItemProvider (AsyncImageActivityItemProvider's superclass) conforms to the UIActivityItemSource protocol.
    UIActivityViewController *activityViewController = [[UIActivityViewController alloc] initWithActivityItems:itemArray applicationActivities:nil];

    if ([[UIDevice currentDevice] userInterfaceIdiom] == UIUserInterfaceIdiomPhone) {
        // iPhone, present activity view controller as is
        [self presentViewController:activityViewController animated:YES completion:nil];
    }
    else
    {
        // iPad, present the view controller inside a popover
        if (![self.activityPopover isPopoverVisible]) {
            self.activityPopover = [[UIPopoverController alloc] initWithContentViewController:activityViewController];
            [self.activityPopover presentPopoverFromRect:[self.shareImageButton frame] inView:self.view permittedArrowDirections:UIPopoverArrowDirectionAny animated:YES];
        }
        else
        {
            // Dismiss if the button is tapped while the popover is visible
            [self.activityPopover dismissPopoverAnimated:YES];
        }
    }
}

@end
APLAsyncImageActivityItemProvider.h:

#import <UIKit/UIKit.h>

@interface APLAsyncImageActivityItemProvider : UIActivityItemProvider

@end
APLAsyncImageActivityItemProvider.m:

#import "APLAsyncImageActivityItemProvider.h"
#import "UIImage+Resize.h"

@implementation APLAsyncImageActivityItemProvider

- (id)activityViewControllerPlaceholderItem:(UIActivityViewController *)activityViewController
{
    return [[UIImage alloc] init];
}

- (id)item
{
    UIImage *image = [UIImage imageNamed:@"Flower.png"];
    //CGSize imageSize;
    //imageSize.height = 1000;
    //imageSize.width = 1000;
    //image = [UIImage imageWithImage:image scaledToFitToSize:imageSize];
    return image;
}

- (UIImage *)activityViewController:(UIActivityViewController *)activityViewController thumbnailImageForActivityType:(NSString *)activityType suggestedSize:(CGSize)size
{
    // The filtered image is the image to display on the other side.
    return [[UIImage alloc] init];
}

@end
If you run the sample like this (use the menu item "Send Image After Preprocessing" and press the "SHARE" button), it will often fail to deliver all 9 images to the camera roll.
If you uncomment the 4 lines in APLAsyncImageActivityItemProvider.m that simply scale the output image, it will always work.
Can you tell me why? I feel that if I know the answer to that riddle, I can also fix my app.
Thank you,
Nils
The reason for this issue is pretty simple.
There is a limited number of writer threads available to the app, so writing too many images simultaneously may fail with a "Write Busy" error. You can verify that by manually writing these images using UIImageWriteToSavedPhotosAlbum (or the corresponding Assets/Photos framework counterparts).
So, resizing the images just hides the real problem, because writing a smaller image takes less time.
The problem can be solved by limiting the number of writers and/or retrying in case of an error.
Also note that -[UIActivityItemProvider item] is a blocking call, but there is no synchronous way to write to the gallery out of the box. This can be handled with dispatch_semaphore_wait, NSCondition, etc.
Or, in case you're using ReactiveCocoa, by simply calling waitUntilCompleted.
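For illustration, here is a rough sketch of serializing the writes with a semaphore using the Photos framework; the helper name and surrounding setup are assumptions, not part of the original sample:

#import <Photos/Photos.h>

// Save images one at a time; each write must finish before the next one starts.
- (void)saveImagesSequentially:(NSArray *)images
{
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        for (UIImage *image in images) {
            dispatch_semaphore_t done = dispatch_semaphore_create(0);
            [[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
                [PHAssetChangeRequest creationRequestForAssetFromImage:image];
            } completionHandler:^(BOOL success, NSError *error) {
                if (!success) NSLog(@"Save failed: %@", error); // a retry could be scheduled here
                dispatch_semaphore_signal(done);
            }];
            // block this background queue until the current write completes
            dispatch_semaphore_wait(done, DISPATCH_TIME_FOREVER);
        }
    });
}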
I would venture to say that the image created by [UIImage imageWithImage:image scaledToFitToSize:imageSize] is released before it is saved to the camera roll, but I can't tell what is going on in that method without seeing its code. That method is not a standard UIImage method; is it from a category, maybe?
EDIT
Replace the for loop that adds providers to your item array with this code, then uncomment your resize lines, and see if that works.
NSMutableArray *itemArray = [[NSMutableArray alloc] init];
for (int i = 0; i < 9; i++)
{
    UIImage *aiImage = [[[APLAsyncImageActivityItemProvider alloc] init] image];
    [itemArray addObject:aiImage];
}
In a simple test app for iPhone I display (using SDWebImage) a user avatar on the top:
When the app user taps the avatar I'd like to display that picture "fullscreen" in another view.
So I have added that view to my Xcode 5 storyboard, along with a tap recognizer and a push segue which I've called "pushZoom" (here fullscreen):
The tap recognizer works okay, I can see its method being called.
My question: is there maybe a simple way in Interface Builder to pass the image to the new view, or do I have to go the tedious way (add ZoomViewController .h/.m files and define an outlet for the image view there)?
Below is a copy of my ViewController.m, in case I modify it later at GitHub:
#import "ViewController.h"
#import <SDWebImage/UIImageView+WebCache.h>
static NSString* const kAppleMaps = #"https://maps.apple.com/?q=%#";
static NSString* const kGoogleMaps = #"comgooglemaps-x-callback://?q=%#&x-success=myphone://?resume=true&x-source=MyPhone";
static NSString* const kAvatar = #"https://lh6.googleusercontent.com/-6Uce9r3S9D8/AAAAAAAAAAI/AAAAAAAAC5I/ZZo0yzCajig/photo.jpg";
#implementation ViewController
- (void)viewDidLoad
{
[super viewDidLoad];
[self setTitle:#"Google+"];
_firstName.text = #"Alex";
[_cityBtn setTitle:#"Bochum" forState:UIControlStateNormal];
[_imageView setImageWithURL:[NSURL URLWithString:kAvatar]
placeholderImage:[UIImage imageNamed:#"Male.png"]];
}
- (IBAction)avatarTapped:(id)sender {
NSLog(#"%s", __PRETTY_FUNCTION__);
}
- (IBAction)cityPressed:(id)sender
{
NSURL* testURL = [NSURL URLWithString:#"comgooglemaps-x-callback://"];
NSString* fmt = ([[UIApplication sharedApplication] canOpenURL:testURL] ? kGoogleMaps : kAppleMaps);
NSString* city = [self urlencode:_cityBtn.currentTitle];
NSString* str = [NSString stringWithFormat:fmt, city];
NSLog(#"%s: city=%# str=%#", __PRETTY_FUNCTION__, city, str);
[[UIApplication sharedApplication] openURL:[NSURL URLWithString:str]];
}
- (NSString*)urlencode:(NSString*)str
{
return (NSString*)CFBridgingRelease(CFURLCreateStringByAddingPercentEscapes(
NULL,
(__bridge CFStringRef) str,
NULL,
CFSTR(":/?#!$&'()*+,;="),
kCFStringEncodingUTF8));
}
#end
Why do you need another view controller? You could keep a full-size image view over your view, with no image initially, and keep it hidden; on tapping the avatar, unhide it and just set the image. Then you avoid all of these problems. It seems simpler to me.
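A minimal sketch of that idea, assuming a full-screen, initially hidden UIImageView outlet named fullImageView (not part of the original code):

- (IBAction)avatarTapped:(id)sender {
    NSLog(@"%s", __PRETTY_FUNCTION__);
    self.fullImageView.image = _imageView.image; // reuse the avatar SDWebImage already loaded
    self.fullImageView.hidden = NO;              // unhide the overlay instead of pushing a new view controller
}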
My experience is limited, but here is my idea:
- (IBAction)avatarTapped:(id)sender {
    // here use your choice of a singleton or NSUserDefaults for passing the image URL
    // from one VC to another VC; here I used NSUserDefaults
    [[NSUserDefaults standardUserDefaults] setObject:kAvatar forKey:@"userimage"];
}
In the ViewController.m of your full-image view:
- (void)viewDidLoad
{
    [super viewDidLoad];

    NSString *userimg = [[NSUserDefaults standardUserDefaults] stringForKey:@"userimage"];
    [_imageView setImageWithURL:[NSURL URLWithString:userimg]
               placeholderImage:[UIImage imageNamed:@"Male.png"]];
}
Pass the image URL as a string or NSURL to the next screen and show it in an image view using:
[self.myImageView setImageWithURL:[NSURL URLWithString:photoURL] placeholderImage:[UIImage imageNamed:@"loading.png"]];
That's what I would do. Please post here if you find a better idea.
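If you do go the segue route, a minimal sketch of passing the URL in prepareForSegue:sender: might look like this; ZoomViewController and its photoURL property are assumptions, matching the "tedious way" described in the question:

- (void)prepareForSegue:(UIStoryboardSegue *)segue sender:(id)sender
{
    if ([segue.identifier isEqualToString:@"pushZoom"]) {
        // the destination scene's class and property are assumptions
        ZoomViewController *zoomVC = segue.destinationViewController;
        zoomVC.photoURL = kAvatar;
    }
}

// Then, in ZoomViewController's viewDidLoad:
// [self.imageView setImageWithURL:[NSURL URLWithString:self.photoURL]
//                placeholderImage:[UIImage imageNamed:@"Male.png"]];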
I have a UIImageView in a storyboard, connected to an outlet in the header of my UIViewController subclass like this:
@property (retain, nonatomic) IBOutlet UIImageView *displayPhoto;
In the .m of the UIViewController subclass, I try to set the image of the displayPhoto.
@synthesize displayPhoto;

- (void)awakeFromNib {
    UIImage *image = [UIImage imageNamed:@"dp.jpg"];
    [displayPhoto setImage:image];
    displayPhoto.layer.cornerRadius = displayPhoto.image.size.width / 2;
    [self.view addSubview:displayPhoto];
}
but it does absolutely nothing. If I add one more line, adding another view of the image like so: [self.view addSubview:[[UIImageView alloc] initWithImage:image]];, making my method
@synthesize displayPhoto;

- (void)awakeFromNib {
    UIImage *image = [UIImage imageNamed:@"dp.jpg"];
    [self.view addSubview:[[UIImageView alloc] initWithImage:image]];
    [displayPhoto setImage:image];
    displayPhoto.layer.cornerRadius = displayPhoto.image.size.width / 2;
    [self.view addSubview:displayPhoto];
}
it suddenly starts working and adds two instances of the image to the view: one in the top-left corner (from the line I added) and a second in the storyboard's image location.
What's changing? How does adding that one line make my code work, and is there any workaround so I can make the image just show up as normal?
If you added the UIImageView in your storyboard and connected the outlet correctly, there is no need to add a new image view programmatically. Try this:
- (void)viewDidLoad {
    [super viewDidLoad];
    self.displayPhoto.image = [UIImage imageNamed:@"dp.jpg"];
}
Checking the connection of the image view to the storyboard will solve your problem.
Use this:
self.imgObj.image = [UIImage imageNamed:@"Your string"];
I am creating a real estate app. I have a screen which displays a listing of all entries, each with a thumbnail and a little text on the side; these I load from the server when the app launches. Each entry can have up to 5 photos, which I do not pre-load for obvious reasons. My issue is this: when the user selects an entry, the app downloads the larger photos from the server. Depending on circumstances this can take a few seconds, and right now the app just hangs for those few seconds. I don't know of any practical way to use an activity indicator in a list, and a header space seems like wasted space if used only to display "Loading…". Does anyone have ideas on how I can let the user know that loading is in progress?
Clarification: Once an entry is selected from the list, I load up another table view controller which has the photos in its list of selections. I currently load the photos in viewDidLoad using:
NSData *myPhoto = [[NSData alloc] initWithContentsOfURL:[NSURL URLWithString:myURL]];
You can:
1. Use UIActivityIndicatorView to show a spinning activity indicator in the precise spot where the image will eventually be loaded.
2. Download the image on a separate queue. While the code below uses GCD, it's actually better to use NSOperationQueue because, on a slow network, GCD can consume all of the available worker threads, detrimentally affecting the app's performance. An NSOperationQueue with a reasonable maxConcurrentOperationCount (such as 4 or 5) avoids this (a minimal sketch follows this list).
3. When the download is complete, dispatch the updating of the UI back to the main queue (e.g. turn off the activity indicator and set the image).
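For example, a minimal sketch of steps 2 and 3 using NSOperationQueue; imageURL, activityIndicator, and imageView are assumed to already exist in your view controller:

// keep a shared queue so the number of concurrent downloads stays bounded
NSOperationQueue *downloadQueue = [[NSOperationQueue alloc] init];
downloadQueue.maxConcurrentOperationCount = 4;

[activityIndicator startAnimating];

[downloadQueue addOperationWithBlock:^{
    // background: fetch the image data
    NSData *data = [NSData dataWithContentsOfURL:imageURL];
    UIImage *image = data ? [UIImage imageWithData:data] : nil;

    // main queue: update the UI
    [[NSOperationQueue mainQueue] addOperationWithBlock:^{
        [activityIndicator stopAnimating];
        imageView.image = image;
    }];
}];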
This is sample code from a gallery app that shows how you might do it. This is probably more complicated than you need and might be hard to repurpose via cut-and-paste, but the loadImage method shows the basic elements of the solution.
@interface MyImage : NSObject

@property (nonatomic, strong) NSString *urlString;
@property (nonatomic, strong) UIImageView *imageView;
@property (nonatomic, strong) UIActivityIndicatorView *activityIndicator;
@property (nonatomic, strong) UIView *view;
@property BOOL loading;
@property BOOL loaded;

@end

@implementation MyImage

// I find that I generally can get away with loading images in main queue using Documents
// cache, too, but if your images are not optimized (e.g. are large), or if you're supporting
// older, slower devices, you might not want to use the Documents cache in the main queue if
// you want a smooth UI. If this is the case, change kUseDocumentsCacheInMainQueue to NO and
// then use the Documents cache only in the background thread.

#define kUseDocumentsCacheInMainQueue NO
- (id)init
{
    self = [super init];
    if (self)
    {
        _view = [[UIView alloc] initWithFrame:CGRectMake(0.0, 0.0, IMAGE_WIDTH, IMAGE_HEIGHT)];
        _imageView = [[UIImageView alloc] initWithFrame:CGRectMake(0.0, 0.0, IMAGE_WIDTH, IMAGE_HEIGHT)];
        _imageView.contentMode = UIViewContentModeScaleAspectFill;
        _imageView.clipsToBounds = YES;
        [_view addSubview:_imageView];

        _loading = NO;
        _loaded = NO;
    }
    return self;
}
- (void)loadImage:(dispatch_queue_t)queue
{
    if (self.loading)
        return;

    self.loading = YES;

    ThumbnailCache *cache = [ThumbnailCache sharedManager];

    if (self.imageView.image == nil)
    {
        // I've implemented a caching system that stores images in my Documents folder
        // as well as, for optimal performance, a NSCache subclass. Whether you go through
        // this extra work is up to you
        UIImage *imageFromCache = [cache objectForKey:self.urlString useDocumentsCache:kUseDocumentsCacheInMainQueue];

        if (imageFromCache)
        {
            if (self.activityIndicator)
            {
                [self.activityIndicator stopAnimating];
                self.activityIndicator = nil;
            }
            self.imageView.image = imageFromCache;
            self.loading = NO;
            self.loaded = YES;
            return;
        }

        // assuming we haven't found it in my cache, then let's see if we need to fire
        // up the spinning UIActivityIndicatorView
        if (self.activityIndicator == nil)
        {
            self.activityIndicator = [[UIActivityIndicatorView alloc] initWithActivityIndicatorStyle:UIActivityIndicatorViewStyleGray];
            self.activityIndicator.center = CGPointMake(self.view.frame.size.width / 2.0, self.view.frame.size.height / 2.0);
            [self.view addSubview:self.activityIndicator];
        }

        [self.activityIndicator startAnimating];

        // now, in the background queue, let's retrieve the image
        dispatch_async(queue, ^{
            if (self.loading)
            {
                UIImage *image = nil;

                // only requery the Documents cache if we didn't do so in the main queue;
                // for small images, doing it in the main queue is fine, but apps
                // with larger images might do this in this background queue
                if (!kUseDocumentsCacheInMainQueue)
                    image = [cache objectForKey:self.urlString useDocumentsCache:YES];

                // if we haven't gotten the image yet, retrieve it from the remote server
                if (!image)
                {
                    NSData *data = [[NSData alloc] initWithContentsOfURL:[NSURL URLWithString:self.urlString]];
                    if (data)
                    {
                        image = [UIImage imageWithData:data];

                        // personally, I cache my image to optimize future access ... you might just store in the Documents folder, or whatever
                        [cache setObject:image forKey:self.urlString data:data];
                    }
                }

                // now update the UI in the main queue
                dispatch_async(dispatch_get_main_queue(), ^{
                    if (self.loading)
                    {
                        [self.activityIndicator stopAnimating];
                        self.activityIndicator = nil;
                        self.imageView.image = image;
                        self.loading = NO;
                        self.loaded = YES;
                    }
                });
            }
        });
    }
}
// In my gallery view controller, I make sure to unload images that have scrolled off
// the screen. And because I've cached the images, I can re-retrieve them fairly quickly.
// This sort of logic is critical if you're dealing with *lots* of images and you want
// to be responsible with your memory.

- (void)unloadImage
{
    // remove from the image view, but not from the cache
    self.imageView.image = nil;

    self.loaded = NO;
    self.loading = NO;
}

@end
By the way, if the image you're downloading goes into a UIImageView in a UITableViewCell, the final update back to the table should check whether the cell is still visible (to make sure it wasn't reused because the UITableViewCell scrolled off the screen). In that case, the final UI update after a successful download of the image might look like this:
dispatch_async(dispatch_get_main_queue(), ^{
    // if the cell is visible, then set the image
    UITableViewCell *cell = [self.tableView cellForRowAtIndexPath:indexPath];
    if (cell)
    {
        cell.imageView.image = image;
    }
});
Note, this is using the UITableView method cellForRowAtIndexPath:, which should not be confused with the UITableViewDataSource method tableView:cellForRowAtIndexPath:.
For one of my projects I used this custom class for UIImageView:
https://github.com/nicklockwood/AsyncImageView
A small tutorial is located here: http://www.markj.net/iphone-asynchronous-table-image/
With just a few lines of code I managed to implement asynchronous loading of images, caching, etc. Just give it a look.