iOS 13.2 MKTileOverlay occasionally won't render

I'm having an issue where, in iOS 13.2 (and probably since iOS 13), offline map tiles loaded using MKTileOverlay occasionally fail to render, leaving the tile blank. There appears to be no issue with my MKTileOverlay subclasses themselves, as they worked fine on iOS 12 and below. I have two MKTileOverlay classes (one adds a grid, the other loads map tile files via the default MKTileOverlay); neither renders on the blank tile with the default MKTileOverlayRenderer, while other overlays appear fine.
The issue seems to resolve itself if I go to the home screen and return to the app, which causes the tiles to reload. Is this a bug in iOS MapKit itself? Does anyone have a temporary workaround? Thank you.
Code for adding overlay:
let overlay = MKTileOverlay(urlTemplate: urlTemplate)
overlay.canReplaceMapContent = true
overlay.maximumZ = 19
mapView.insertOverlay(overlay, at: 0, level: .aboveLabels)
Renderer:
func mapView(_ mapView: MKMapView, rendererFor overlay: MKOverlay) -> MKOverlayRenderer {
    if let tileOverlay = overlay as? MKTileOverlay {
        return MKTileOverlayRenderer(tileOverlay: tileOverlay)
    }
    return MKOverlayRenderer()
}

This is clearly a MapKit issue/bug.
I opened a feedback ticket about it on 9 December 2020.
The root cause of this issue is not certain.
MapKit, and MKTileOverlay in particular, has always had issues with "heavy" tiles such as 24-bit PNGs. When an MKTileOverlay uses PNGs (heavy tiles), the tiles sometimes flash and the map reloads continuously, especially on wide screens (iPad Pro, etc.).
So, since JPEG tiles are often lighter than PNGs, JPEG can be a workaround.
BUT this new iOS 13.2+ issue is not the same! Random tiles are not rendered. If you remove and re-add the MKTileOverlay, or call the reloadData method of MKTileOverlayRenderer, the missing tiles are rendered, but other random tiles go missing instead.
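A stopgap along these lines is to keep a reference to the tile renderer and call its reloadData() method when blank tiles appear. A minimal sketch (when and how to trigger the reload is up to you, e.g. a button, a timer, or a map-region callback):

```swift
import MapKit

class MapController {
    let mapView = MKMapView()
    // Keep the renderer around so we can force a redraw later.
    var tileRenderer: MKTileOverlayRenderer?

    func mapView(_ mapView: MKMapView, rendererFor overlay: MKOverlay) -> MKOverlayRenderer {
        if let tileOverlay = overlay as? MKTileOverlay {
            let renderer = MKTileOverlayRenderer(tileOverlay: tileOverlay)
            tileRenderer = renderer
            return renderer
        }
        return MKOverlayRenderer()
    }

    // Call this when tiles come up blank to make the renderer redraw them.
    func forceTileReload() {
        tileRenderer?.reloadData()
    }
}
```

Note that, as described above, this tends to fix the currently missing tiles while leaving a different random set blank, so it is a mitigation rather than a fix.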
The only real fix is to report the issue via a feedback ticket: https://feedbackassistant.apple.com
Edit: I've just tried replacing my 8-bit PNGs with 85%-quality JPEGs in the very simple MKTileOverlay sample project I sent to Apple with my ticket. Same issue... no improvement.
Edit 2: Loading the NSData into a UIImage and then using UIImageJPEGRepresentation seems to do the trick... Ugly...
- (void)loadTileAtPath:(MKTileOverlayPath)path result:(void (^)(NSData * _Nullable, NSError * _Nullable))result
{
    NSString *tilePath = [self PATHForTilePath:path];
    NSData *data = nil;
    if (![[NSFileManager defaultManager] fileExistsAtPath:tilePath])
    {
        NSLog(@"Z%ld/%ld/%ld does not exist!", (long)path.z, (long)path.x, (long)path.y);
    }
    else
    {
        NSLog(@"Z%ld/%ld/%ld exists", (long)path.z, (long)path.x, (long)path.y);
        // Re-encode the tile as JPEG instead of handing MapKit the raw file data.
        UIImage *image = [UIImage imageWithContentsOfFile:tilePath];
        data = UIImageJPEGRepresentation(image, 0.8);
        // Instead of: data = [NSData dataWithContentsOfFile:tilePath];
        if (data == nil)
        {
            NSLog(@"Error!!! Unable to read an existing file!");
        }
    }
    dispatch_async(dispatch_get_main_queue(), ^{
        result(data, nil);
    });
}

As I noted in a comment on the original question, I was having the same problem, but it is now largely resolved, so I thought I'd post what worked for me.
The problem for me occurred in the following method
- (void)loadTileAtPath:(MKTileOverlayPath)path result:(void (^)(NSData * __nullable tileData, NSError * __nullable error))result{
where the custom tile would fail to load even when it was being presented with what appeared to be valid NSData.
I found that the problem was reduced if I used JPEGs instead of PNGs for my custom tiles, but it was only when I changed the way I handled the tile data that the problem largely went away. (I say largely, because I still get the occasional unloaded tile, but I'd say it's 100x less often than before.)
The following method is my Xamarin.iOS implementation, but you should be able to see the principle for Swift or Objective-C.
The key is the difference in how the NSData is created: instead of calling the URLForTilePath method, I create a UIImage from the tile path and then use UIImageJPEGRepresentation (AsJPEG in C#) to create the NSData.
public override void LoadTileAtPath(MKTileOverlayPath path, MKTileOverlayLoadTileCompletionHandler result)
{
    //I was using this prior to iOS 13.2:
    //NSUrl url = this.URLForTilePath(path);
    //NSData tileData = NSData.FromFile(url.AbsoluteString);
    //result(tileData, null);

    //Now I use this:
    String folderPath = "tiles/" + path.Z + "/" + path.X + "/";
    String tilePath = NSBundle.MainBundle.PathForResource(path.Y.ToString(), "jpg", folderPath);
    String blankPath = NSBundle.MainBundle.PathForResource("tile", "jpg");
    try
    {
        //does the file exist?
        UIImage tile;
        if (File.Exists(tilePath))
        {
            tile = UIImage.FromFile(tilePath);
            if (tile == null)
            {
                Console.WriteLine("Error Loading " + path.Z + " " + path.Y + " " + path.X);
                //This may be redundant, as I'm not getting any errors here, even when the tile doesn't display
                //Fall back to the blank tile rather than passing a null image on
                tile = UIImage.FromFile(blankPath);
            }
        }
        else
        {
            tile = UIImage.FromFile(blankPath);
        }
        NSData tileData = tile.AsJPEG();
        result(tileData, null);
    }
    catch (Exception ex)
    {
        Console.WriteLine("LoadTileAtPath failed: " + ex);
    }
}

Related

UIImageJPEGRepresentation using large amount of memory (Swift 3.0)

I'm trying to compress and get the NSData from between 20 and 30 UIImages with a for-loop like this:
for theImage in selectedUIImages {
    let data = UIImageJPEGRepresentation(theImage, 0.5)
    // doing something with the data
}
I tried this on an iPhone 7 with no issues besides my app using up to 700 MB of memory while going through the loop, but on an older iPhone I get the message:
*Message from debugger: Terminated due to memory issue.*
The main objective is to get the NSData from each UIImage so I can put the image in a directory for uploading. Let me explain:
The Amazon S3 transfer utility wants a path/URL to the image, so I need to create a path/URL for the UIImage, and the only way I know is:
data.write(to: URL(fileURLWithPath: localPath), options: .atomic)
Try using an autorelease pool inside the loop, so each iteration's temporary objects are released promptly rather than accumulating until the whole loop finishes:
for theImage in selectedUIImages {
    autoreleasepool {
        let data = UIImageJPEGRepresentation(theImage, 0.5)
        // doing something with the data
    }
}
and move the work to a background thread.
Your app runs out of memory because all of the compressed data is held in memory at once.
You can save each image to the Documents directory after compressing it, then upload the files to the server one by one, so the whole batch never sits in memory at the same time.
You can also decrease the data size by lowering the compression-quality parameter, e.g. 0.3 instead of 0.5:
for theImage in selectedUIImages {
    let data = UIImageJPEGRepresentation(theImage, 0.3)
    // doing something with the data
}
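Putting the two suggestions together, here is a rough sketch of the "compress to disk, then upload by path" approach (Swift 3 style to match the question; the file names are illustrative):

```swift
import UIKit

func writeImagesForUpload(_ selectedUIImages: [UIImage]) -> [URL] {
    var localURLs: [URL] = []
    let docs = FileManager.default.urls(for: .documentDirectory,
                                        in: .userDomainMask)[0]
    for (index, theImage) in selectedUIImages.enumerated() {
        // The pool releases each iteration's compressed data promptly.
        autoreleasepool {
            if let data = UIImageJPEGRepresentation(theImage, 0.3) {
                let fileURL = docs.appendingPathComponent("upload_\(index).jpg")
                try? data.write(to: fileURL, options: .atomic)
                localURLs.append(fileURL)
            }
        }
    }
    // Hand these file URLs to the S3 transfer utility one by one.
    return localURLs
}
```

Only one image's worth of JPEG data is alive at a time; the rest lives on disk until the uploader reads it back.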

GPUImageView stop responding to "Filter Change" after two times

I'm probably missing something. I'm trying to change the filter on my GPUImageView. It actually works the first two times (sometimes only once), and then stops responding to changes. I couldn't find a way to remove the target from my GPUImageView.
Code
for x in filterOperations {
    x.filter.removeAllTargets()
}
let f = filterOperations[randomIntInRange].filter
let media = GPUImagePicture(image: self.largeImage)
media?.addTarget(f as! GPUImageInput)
f.addTarget(g_View)
media?.processImage()
Any suggestions? (I'm processing a still image from my library.)
UPDATE
Updated Code
//Global
var g_View: GPUImageView!
var media = GPUImagePicture()

override func viewDidLoad() {
    super.viewDidLoad()
    media = GPUImagePicture(image: largeImage)
}

func changeFilter(filterIndex: Int) {
    media.removeAllTargets()
    let f = returnFilter(indexPath.row) //i.e. GPUImageSepiaFilter()
    media.addTarget(f as! GPUImageInput)
    f.addTarget(g_View)
    //second part
    f.useNextFrameForImageCapture()
    let sema = dispatch_semaphore_create(0)
    media.processImageWithCompletionHandler({
        dispatch_semaphore_signal(sema)
        return
    })
    dispatch_semaphore_wait(sema, DISPATCH_TIME_FOREVER)
    let img = f.imageFromCurrentFramebufferWithOrientation(largeImage.imageOrientation)
    if img != nil {
        //usable - update UI
    } else {
        //something went wrong
    }
}
My primary suggestion would be to not create a new GPUImagePicture every time you want to change the filter or its options that you're applying to an image. This is an expensive operation, because it requires a pass through Core Graphics and a texture upload to the GPU.
Also, since you're not maintaining a reference to your GPUImagePicture beyond the above code, it is being deallocated as soon as you pass out of scope. That tears down the render chain and will lead to a black image or even crashes. processImage() is an asynchronous operation, so it may still be in action at the time you exit your above scope.
Instead, create and maintain a reference to a single GPUImagePicture for your image, swap out filters (or change the options for existing filters) on that, and target the result to your GPUImageView. This will be much faster, churn less memory, and won't leave you open to premature deallocation.
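A minimal sketch of that pattern, assuming GPUImage's usual Objective-C API bridged to Swift (the filter and view names are illustrative):

```swift
// Assumes the GPUImage framework is linked; GPUImagePicture, GPUImageView,
// and the filter classes come from it.
class FilterHost {
    let imageView = GPUImageView()
    // Created once from the source image and kept as a strong reference,
    // so the render chain is never torn down mid-processImage().
    let picture: GPUImagePicture

    init(sourceImage: UIImage) {
        picture = GPUImagePicture(image: sourceImage)
    }

    // Swap the filter without re-uploading the source texture.
    func apply(filter: GPUImageOutput) {
        picture.removeAllTargets()
        filter.removeAllTargets()
        picture.addTarget(filter as! GPUImageInput)
        filter.addTarget(imageView)
        picture.processImage()
    }
}
```

The expensive Core Graphics pass and GPU texture upload happen once, in init; each filter change only rewires targets and re-renders.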

Problems accurately timing the loading of a image from file into UIImageView

I am trying to measure the time taken to load a large photo (JPEG) from file into a UIImageView on iOS 8.0.
My current code:
import UIKit

class ViewController: UIViewController {
    @IBOutlet weak var imageView: UIImageView!

    @IBAction func loadImage(sender: UIButton) {
        if let imageFile = NSBundle.mainBundle().pathForResource("large_photo", ofType: "jpg") {
            // start our timer
            let tick = Tick()
            // loads a very large image file into imageView
            // the test photo used is a 4608 × 3456 pixel JPEG
            // using contentsOfFile: to prevent caching while testing timer
            imageView.image = UIImage(contentsOfFile: imageFile)
            // stop our timer and print execution time
            tick.tock()
        }
    }
}

class Tick {
    let tickTime: NSDate

    init() {
        tickTime = NSDate()
    }

    func tock() {
        let tockTime = NSDate()
        let executionTime = tockTime.timeIntervalSinceDate(tickTime)
        println("[execution time]: \(executionTime)")
    }
}
When I load a very large image (4608 x 3456 JPEG) on my test device (5th gen iPod touch), I can see that the execution time is ~2-3 seconds and blocks the main thread. This is observable by the fact that the UIButton remains in a highlighted state for this period of time and no other UI elements allow interaction.
I would therefore expect my timing function to report a time of ~2-3 seconds. However, it reports a time of milliseconds - eg:
[execution time]: 0.0116159915924072
The tick.tock() call prints its message to the console before the image is loaded. This confuses me, as the main thread appears blocked until after the image is loaded.
This leads me to ask the following questions:
1. If the image is being loaded asynchronously in the background, why is user interaction/the main thread blocked?
2. If the image is being loaded on the main thread, why does the tick.tock() function print to the console before the image is displayed?
There are 2 parts to what you are measuring here:
Loading the image from disk:
UIImage(contentsOfFile: imageFile)
And decompressing the image from a JPEG to a bitmap to be displayed:
imageView.image = ....
The first part involves actually retrieving the compressed JPEG data from the disk (disk I/O) and creating a UIImage object. The UIImage object holds a reference to the compressed data, until it needs to be displayed. Only at the moment that it's ready to be rendered to the screen does it decompress the image into a bitmap to display (on the main thread).
My guess is that your timer is only catching the disk-load part, and the decompression is happening on the next run loop. The decompression of an image that size is likely to take a while, probably the lion's share of the time.
If you want to explicitly measure how long the decompression takes, you'll need to do it manually, by drawing the image to an off screen context, like so:
let tick = Tick()
// Load the image from disk
let image = UIImage(contentsOfFile: imageFile)!
// Decompress the image into a bitmap
var newImage: UIImage
UIGraphicsBeginImageContextWithOptions(image.size, true, 0)
image.drawInRect(CGRect(x: 0, y: 0, width: image.size.width, height: image.size.height))
newImage = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
tick.tock()
Here we are replicating the decompression that would happen when you assign the image to imageView.image.
A handy trick to keep the UI responsive when dealing with images this size is to kick the whole process onto a background thread. This works well because once you have manually decompressed the image, UIKit detects this and doesn't repeat the process.
// Switch to a background thread
dispatch_async(dispatch_get_global_queue(Int(DISPATCH_QUEUE_PRIORITY_DEFAULT.value), 0)) {
    // Load the image from disk
    let image = UIImage(contentsOfFile: imageFile)!
    // Ref to the decompressed image
    var newImage: UIImage
    // Decompress the image into a bitmap
    UIGraphicsBeginImageContextWithOptions(image.size, true, 0)
    image.drawInRect(CGRect(x: 0, y: 0, width: image.size.width, height: image.size.height))
    newImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    // Switch back to the main thread
    dispatch_async(dispatch_get_main_queue()) {
        // Display the decompressed image
        imageView.image = newImage
    }
}
A disclaimer: The code here has not been fully tested in Xcode, but it's 99% correct if you decide to use it.
I would try to time this using a unit test, since the XCTest framework provides some good performance measurement tools. I think this approach would get around the lazy loading issues... although I'm not 100% on it.
func testImagePerformance() {
    measureBlock() {
        if let imageFile = NSBundle.mainBundle().pathForResource("large_photo", ofType: "jpg") {
            imageView.image = UIImage(contentsOfFile: imageFile)
        }
    }
}
(Just an aside: you mentioned that the loading blocks the main app thread. You should look into using an NSOperationQueue to make sure that doesn't happen. You probably already know that: http://nshipster.com/nsoperation/)
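For reference, a minimal sketch of pushing the load onto a queue and hopping back to the main thread (NSOperationQueue, Swift 1.x style to match the question; `imageFile` and `imageView` are assumed to be in scope as above):

```swift
// Off-main-thread image load via NSOperationQueue.
let queue = NSOperationQueue()
queue.addOperationWithBlock {
    // Disk I/O happens on a background queue.
    let image = UIImage(contentsOfFile: imageFile)
    // UI updates must happen back on the main queue.
    NSOperationQueue.mainQueue().addOperationWithBlock {
        imageView.image = image
    }
}
```

Combine this with the manual decompression shown above to keep the lazy JPEG decode off the main thread as well.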

iOS Monotouch UIImagePickerController multiple photos / videos from camera

We're experiencing a strange problem with a UIImagePickerController. In our application users are able to fill out a series of forms and also attach images and videos within these forms.
We allow users to add multiple photos / videos either from the camera roll or to be captured at the time of filling the form out.
We're using the UIImagePickerController to do this. The problem occurs when one or two images/videos have been taken with the camera.
Once one or two images/videos have been captured, when the camera screen is re-entered a third time the image is static and doesn't update; the view is stuck on the last frame of whatever was captured last.
If the capture button is pressed, the image/video suddenly updates and captures what the camera was pointing at; from then on the picker behaves normally for another go. Additionally, selecting a picture/video from the camera roll appears to make everything behave again for another picture/video. Finally, when the screen isn't responding and the user has chosen to take a picture, the view shrinks to a small rectangle within the view. The controller is being set up as follows:
private void SourceChosen(EventHandler<UIImagePickerMediaPickedEventArgs> captureEvent, int buttonIndex, string[] mediaTypes)
{
    var picker = ConfigurePicker(mediaTypes, captureEvent);
    if (CameraAvailable && buttonIndex == 0)
    {
        picker.SourceType = UIImagePickerControllerSourceType.Camera;
        picker.CameraDevice = UIImagePickerControllerCameraDevice.Rear;
        this.NavigationController.PresentViewController(picker, true, () => { });
    }
    if ((!CameraAvailable && buttonIndex == 0) || (CameraAvailable && buttonIndex == 1))
    {
        picker.SourceType = UIImagePickerControllerSourceType.PhotoLibrary;
        this.NavigationController.PresentViewController(picker, false, () => { });
    }
}

private UIImagePickerController ConfigurePicker(string[] mediaTypes, EventHandler<UIImagePickerMediaPickedEventArgs> captureEvent)
{
    var mediaPicker = new UIImagePickerController();
    mediaPicker.FinishedPickingMedia += captureEvent;
    mediaPicker.Canceled += (sender, args) => mediaPicker.DismissViewController(true, () => { });
    mediaPicker.SetBarDefaults();
    mediaPicker.MediaTypes = mediaTypes;
    return mediaPicker;
}
An example of a captureEvent is as follows:
void PhotoChosen(object sender, UIImagePickerMediaPickedEventArgs e)
{
    UIImage item = e.OriginalImage;
    string fileName = string.Format("{0}.{1}", Guid.NewGuid(), "png");
    string path = Path.Combine(IosConstants.UserPersonalFolder, fileName);
    NSData imageData = item.AsPNG();
    CopyData(imageData, path, fileName, ViewModel.Images, ((UIImagePickerController)sender));
}

private void CopyData(NSData imageData, string path, string fileName, List<AssociatedItem> collectionToAddTo, UIImagePickerController picker)
{
    byte[] imageBytes = new byte[imageData.Length];
    System.Runtime.InteropServices.Marshal.Copy(imageData.Bytes, imageBytes, 0, Convert.ToInt32(imageData.Length));
    File.WriteAllBytes(path, imageBytes);
    AssociatedItem item = new AssociatedItem
    {
        StorageKey = fileName
    };
    collectionToAddTo.Add(item);
    picker.DismissViewController(true, ReloadTables);
}
At the moment, as you can see, we're not holding a reference to the picker, but we have tried variations of this code where we store a reference to the picker and dispose of it after the CopyData method. We've also added picker.Release(); after CopyData and before the dispose (which results in subsequent pickers crashing the application when displayed), and pretty much every other variation on the theme.
Does anyone have any idea why this might be occurring and how to fix it? My assumption was that we might be running low on memory, but neither disposing of the picker each time, nor only ever creating one instance and switching its mode between pictures and videos, has any effect; we always see the same behaviour.
EDIT
Thanks to Kento and the answer below, what we needed to get it all working as intended was something along these lines:
public class PickerDelegate : UIImagePickerControllerDelegate
{
    private readonly Action<UIImagePickerController, NSDictionary> _captureEvent;

    public PickerDelegate(Action<UIImagePickerController, NSDictionary> captureEvent)
    {
        _captureEvent = captureEvent;
    }

    public override void FinishedPickingMedia(UIImagePickerController picker, NSDictionary info)
    {
        _captureEvent(picker, info);
    }
}
Then to get an image
void PhotoChosen(UIImagePickerController picker, NSDictionary info)
{
    UIImage item = (UIImage)info.ObjectForKey(UIImagePickerController.OriginalImage);
    string fileName = string.Format("{0}.{1}", Guid.NewGuid(), "png");
    string path = Path.Combine(IosConstants.UserPersonalFolder, fileName);
    NSData imageData = item.AsPNG();
    CopyData(imageData, path, fileName, ViewModel.Images, picker);
}
Or to get a video
void VideoChosen(UIImagePickerController picker, NSDictionary info)
{
    var videoURL = (NSUrl)info.ObjectForKey(UIImagePickerController.MediaURL);
    NSData videoData = NSData.FromUrl(videoURL);
    string fileName = string.Format("{0}.{1}", Guid.NewGuid(), "mov");
    string path = Path.Combine(IosConstants.UserPersonalFolder, fileName);
    CopyData(videoData, path, fileName, ViewModel.Videos, picker);
}
I had the same problem.
The post here is not marked as the answer, but it did solve it for me: https://stackoverflow.com/a/20035698/2514318
I'm guessing this is a bug with MonoTouch when using the FinishedPickingMedia event. I have read that there are leaks when using UIImagePickerController (regardless of using Objective-C or Mono), so I prefer to keep the instance around and re-use it. If you do re-create it each time, I would recommend disposing of the previous instance.
Can anyone from Xamarin weigh in on if this is a bug or not?
This post helped me a lot, so I decided to make a very simple sample and post it on GitHub for anyone who may need it: https://github.com/GiusepeCasagrande/XamarinSimpleCameraSample

UIImage AsPNG and AsJPEG fails

I'm using MonoTouch and I have a UIImage (displayed in a UIImageView, where it looks good) that I'm trying to convert to NSData, but AsJPEG and AsPNG return null. What could be the problem?
My code looks like this:
NSError err;
NSData imageData = CroppedImageView.Image.AsJPEG(); // imageData is null!
if (!imageData.Save("tmp.png", true, out err)) {
    Console.WriteLine("Saving of file failed: " + err.Description);
}
The AsJPEG method calls UIImageJPEGRepresentation, and its return value is documented as:
A data object containing the JPEG data, or nil if there was a problem generating the data. This function may return nil if the image has no data or if the underlying CGImageRef contains data in an unsupported bitmap format.
This is similar to many APIs in iOS (and OS X), where exceptions are not commonly used (and null is used to report some kinds of error).
Anyway, you should check your image dimensions and properties; they might give you a hint about something that would not translate into a JPEG bitmap.
Also, since the NSData can represent a very large amount of memory, you should try to limit its lifetime, e.g.:
using (NSData imageData = CroppedImageView.Image.AsJPEG()) {
    NSError err;
    if (!imageData.Save("tmp.jpg", true, out err)) {
        Console.WriteLine("Saving of file failed: " + err.Description);
    }
}
It looks like you are writing to a file in the app's current directory, which is read-only.
You should use:
var path = System.IO.Path.GetTempFileName();
or
var path = System.IO.Path.Combine(System.IO.Path.GetTempPath(), "tmp.png");
Like you would do on other platforms, and use a file from there.
You can also use Environment.SpecialFolder.MyDocuments.
AsJPEG returned null because the image was too big (it was taken with an iPhone 5). After I scaled it down by a factor of 2, it generated the data properly.
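One way to do that scaling, sketched in Swift for brevity (the same draw-into-a-smaller-context approach works via MonoTouch's UIGraphics bindings; the function name and factor are illustrative, matching the factor of 2 that worked above):

```swift
import UIKit

// Redraws the image into a context 1/factor the original size,
// producing a smaller bitmap that JPEG encoding can handle.
func scaledImage(image: UIImage, factor: CGFloat) -> UIImage {
    let newSize = CGSize(width: image.size.width / factor,
                         height: image.size.height / factor)
    UIGraphicsBeginImageContextWithOptions(newSize, true, 1.0)
    image.drawInRect(CGRect(x: 0, y: 0, width: newSize.width, height: newSize.height))
    let result = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return result
}
```

Encoding the scaled image (e.g. with UIImageJPEGRepresentation, or AsJPEG in C#) should then succeed where the full-size image returned nil.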
