Long loading time for map - iOS

I'm facing an issue related to the long loading time of an empty map from the HERE SDK for iOS.
I opened the sample project and it looks like it freezes for some time. The initialisation code is:
override func loadView() {
    super.loadView()
    NSLog("%@", ">>>>>>>>>> Load View \(Date())")
}

override func viewDidLoad() {
    super.viewDidLoad()
    mapView = MapView(frame: view.bounds)
    view.addSubview(mapView)
    mapView.mapScene.loadScene(mapScheme: .normalDay, completion: self.onLoadScene)
    mapView.gestures.tapDelegate = self
    NSLog("%@", ">>>>>>>>>> View Did Load \(Date())")
}

override func didReceiveMemoryWarning() {
    super.didReceiveMemoryWarning()
    mapView.handleLowMemory()
}

func onLoadScene(_ error: MapError?) {
    guard error == nil else {
        print("Error: Map scene not loaded, \(String(describing: error))")
        return
    }

    // Configure the map.
    let camera = mapView.camera
    camera.lookAt(point: GeoCoordinates(latitude: 52.518043, longitude: 13.405991),
                  distanceInMeters: 1000 * 10)
}
The log:
2021-06-25 08:45:27.500549+0300 testDrawing[50019:31547554] >>>>>>>>>> Load View 2021-06-25 05:45:27 +0000
...
2021-06-25 08:45:31.301998+0300 testDrawing[50019:31547554] >>>>>>>>>> View Did Load 2021-06-25 05:45:31 +0000
As you can see, it takes about 4 seconds to initialise the map with the initial location. Note: I use a freemium account for testing.
I can also see additional warnings like
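Out of interest, the 4 seconds above only covers the time until viewDidLoad returns; the asynchronous scene load finishes later. A small sketch for timing that part separately - the sceneLoadStart property is a hypothetical addition of mine, the rest reuses the calls already shown:

private var sceneLoadStart = Date()   // hypothetical property, set right before loadScene in viewDidLoad

func onLoadScene(_ error: MapError?) {
    guard error == nil else {
        print("Error: Map scene not loaded, \(String(describing: error))")
        return
    }
    // How long the asynchronous scene load took, separate from the main-thread freeze.
    NSLog("%@", ">>>>>>>>>> Scene loaded after \(Date().timeIntervalSince(sceneLoadStart)) s")
    // ... camera setup as above ...
}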
[WARN ] harp-sdk - Ignore adding invalid observer
[WARN] ResponseFromJsonBuilder - Absent value,
response=olp::authentication::IntrospectAppResult, field=description
[INFO ] harp-sdk - Adding data source
[INFO ] Storage.LevelDB - Cleared other DB in folder: "/Users//Library/Developer/CoreSimulator/Devices/61082972-C4BE-42E2-9696-0D2458D475D5/data/Containers/Data/Application/74E993BF-7111-4E08-A27F-A7F62B3ADA1D/Library/Caches/v1/sSR8TFucGSrS94S4sDvrsA/analyticsData/events.sqlite
[INFO] ThreadPoolTaskScheduler - Starting thread 'OLPSDKPOOL_0'
The most time-consuming operation is "Cleared other DB in folder" - about 3 seconds.
Can anyone advise what the reason for this behaviour is?
---- UPDATE -----
I am also receiving a random crash:
version = heresdk-explore-ios-4.7.5.0.5737
---- UPDATE -----
A few more tech details:
For the "freeze on start" issue:
Tested on a Mac with M1 and the simulator (12 Pro, iOS 14.5), and on a device, a 12 Pro with iOS 14.5.
On the device the lag is a bit smaller - 1-3 seconds; on the simulator - up to 7 seconds.
heresdk-explore-ios-4.7.5.0.5737
heresdk-navigate-ios-4.7.6.0.5863
For "crash on start":
12 Pro, iOS 14.5 both
heresdk-explore-ios-4.7.5.0.5737
---- UPDATE ----
Hi, the crash with the analytics service should be fixed in an upcoming release. "The most time consuming operation - is Cleared other DB in folder - about 3 sec." There is no evidence that this operation consumes 3 seconds; probably something happens after this operation. – Hsilgos
@Hsilgos,
Yes, there is no evidence that exactly this operation consumes 3 seconds - it's just a guess from my side (when I remove the SDK, everything works instantly, so this is a simple check).
Here are a few screenshots from the profiler:
Here is the trace file.
Here you can see that HARP.SDK.RENDERER (the map renderer, I guess) eats a lot of resources, and it looks like the main thread is waiting until it's done.
Again - this is just a guess.
Another point for improvement - add support for running a fat framework natively on the arm64 Darwin simulator on M1. For now I need to use Rosetta, which is disappointing...
---- UPDATE ----
An issue with drawing performance - an example (around 200 items are drawn at once, or to be more precise one-by-one, since there is no API for batch drawing):
Look at the second drawing area - the performance issue is more visible there.
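For context, a sketch of what the one-by-one drawing looks like as a method on the view controller above - this assumes the HERE SDK 4.x MapMarker/addMapMarker API, a hypothetical items array and a placeholder asset name; it is not the exact code from the project:

func drawItems(_ items: [GeoCoordinates]) {
    guard let image = UIImage(named: "dot"),                 // placeholder asset name
          let pngData = image.pngData() else { return }
    let mapImage = MapImage(pixelData: pngData, imageFormat: .png)

    // No batch API, so each of the ~200 markers is added individually on the main thread.
    for coordinates in items {
        let marker = MapMarker(at: coordinates, image: mapImage)
        mapView.mapScene.addMapMarker(marker)
    }
}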

The HERE SDK team announced today that it is planned to add explicit support for M1 arm64 simulator builds. It is not clear yet when and how this will happen, but since it is missing right now, running a simulator on an Apple M1 processor will slow things down a bit in comparison to real devices. However, for me, running the simulator on an M1 is still faster than running a simulator on a non-M1 Mac, simply because the M1 is incredibly fast and my non-M1 MacBook does not have the fastest setup.
Also, the analytics crash seems to be a known issue. So, hopefully, we can see a fix for both soon.

Related

XCTest doesn't measure CPU and memory

I have this performance test to check memory usage during scrolling:
let app = XCUIApplication()

func testMemoryUsage() {
    let measureOptions = XCTMeasureOptions()
    measureOptions.invocationOptions = [.manuallyStop]

    measure(
        metrics: [XCTMemoryMetric(application: app)],
        options: measureOptions
    ) {
        app.buttons["Listing Page"].tap()
        swipeUp()
        stopMeasuring()
        tapBack()
    }
}

func tapBack() {
    app.navigationBars.buttons.element(boundBy: 0).tap()
}

func swipeUp() {
    collectionView.swipeUp(velocity: .fast)
}

func swipeDown() {
    collectionView.swipeDown(velocity: .fast)
}

var collectionView: XCUIElement {
    app.collectionViews["collectionViewId"]
}
But when I run the test, it doesn't display any metrics at all.
I tried to change
XCTMemoryMetric(application: app) to XCTMemoryMetric()
In this case it at least shows some result, but the result is incorrect: as seen in the screenshot below, the app consumes around 130 MB of memory, but the test shows only 9 KB. BTW the real memory consumption is around 130-150 MB, because there are a lot of images in the collection view.
My guess is that it doesn't show the correct result because the app is not passed as a parameter. Although when I pass the app, it doesn't show any results at all 🙃
The same issue happens when I write a test to check CPU usage with XCTCPUMetric.
Questions:
How do I write a performance test that shows the memory and CPU usage of some UI tests?
(Optional) Why, when I run the test in Debug mode, does it show that 2 processes are running (ExampleUITests - the target for UI tests, and Example - the main target)? Is that normal, and when I measure memory consumption, am I supposed to get the consumption of the main target Example?
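There is no accepted fix shown here, but one thing worth checking (my assumption, not something the question confirms) is whether the target app is actually launched before measuring: XCTMemoryMetric() with no arguments samples the current process, i.e. the ExampleUITests runner, which would explain the tiny 9 KB figure, while XCTMemoryMetric(application:) needs the app to be running. A minimal sketch, reusing the identifiers from the question:

import XCTest

final class ListingMemoryTests: XCTestCase {   // hypothetical test class name
    let app = XCUIApplication()

    func testMemoryUsageWhileScrolling() {
        // Launch the target app first so the metric has a live process to sample.
        app.launch()

        let measureOptions = XCTMeasureOptions()
        measureOptions.invocationOptions = [.manuallyStop]

        // XCTMemoryMetric(application:) samples the launched app,
        // not the ExampleUITests runner process.
        measure(metrics: [XCTMemoryMetric(application: app)], options: measureOptions) {
            app.buttons["Listing Page"].tap()
            app.collectionViews["collectionViewId"].swipeUp(velocity: .fast)
            stopMeasuring()
            app.navigationBars.buttons.element(boundBy: 0).tap()
        }
    }
}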

os_signpost works on simulator but not when running on device

I have the following lines of code in one of the functions I want to track time for using Instruments. When I run the app on the simulator from Instruments, the measured times with os_signpost do show up and I can accurately measure what I need.
Then I switch to a device, same code. However, when it runs on the device, Instruments does not show the measured times. It only shows this:
The os_signpost times do not show up at all.
So, it all works great on the simulator, but not when I switch to my iPhone device.
Any idea?
let spid = OSSignpostID(log: SignpostLog.myLog, object: myObj as AnyObject)
defer { os_signpost(.end, log: SignpostLog.myLog, name: "operation1", signpostID: spid) }
os_signpost(.begin, log: SignpostLog.myLog, name: "operation1", signpostID: spid)
Make sure you did not disable OS_ACTIVITY_MODE via an environment variable (see the screenshot below).
I removed it and everything worked =)
It turns out os_signpost would not show up because the recording mode was set to Immediate. Setting it to Deferred made it so I could run the app, then stop recording, and then Instruments shows the custom os_signpost calls I had in code.
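For reference, a self-contained version of the snippet from the question - the SignpostLog container and the "operation1" name come from the question; the subsystem and category strings are placeholders:

import os.signpost

enum SignpostLog {
    // Placeholder subsystem/category; the intervals appear in Instruments' os_signpost track.
    static let myLog = OSLog(subsystem: "com.example.myapp", category: "performance")
}

func operation1() {
    let spid = OSSignpostID(log: SignpostLog.myLog)
    os_signpost(.begin, log: SignpostLog.myLog, name: "operation1", signpostID: spid)
    defer { os_signpost(.end, log: SignpostLog.myLog, name: "operation1", signpostID: spid) }

    // ... the work being timed ...
}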

Why do I get memory warnings with only 7 MB of memory allocated?

I am running my iOS app on an iPod touch device and I get memory warnings even though the total allocation peak is only 7 MB, as shown below (this happens when the Game Scene is pushed):
What I find strange is that:
the left peak (at time 0.00) corresponds to 20 MB of memory allocated (Introduction Scene) and despite this DOES NOT give any memory warning.
the central peak (at time 35.00) corresponds to roughly 7 MB of memory allocated (Game Scene is being pushed) and DOES give a memory warning.
I do not understand why I get those warnings if the total memory is only 7 MB. Is this normal? How can I avoid this?
Looking at the allocation density we can see the following schema, which (to me) does not show much difference between the moment when the Intro Scene is being pushed (0.00) and the moment the Game Scene is being pushed (35.00). Since the density peaks are similar, I would assume that the memory warnings are due to something else that I am not able to spot.
EDIT:
I have been following a suggestion to use Activity Monitor instead, but unfortunately my app crashes when loading the Game Scene with only 30 MB of memory allocated. Here is the Activity Monitor report.
Looking at the report I can see a total real memory usage sum of about 105 MB. Given that this should refer to RAM, and given that my model should have 256 MB of RAM, this should not cause app crashes or memory-leak problems.
I ran the Leaks instrument and it does not show any leak in my app. I also killed all the other apps.
However, analyzing the report, I see an astonishing 167 MB of Virtual Memory associated with my app. Is this normal? What does that value mean? Can this be the reason for the crash? How can I detect which areas of my code are responsible for this?
My iPod is a 4th-generation model with 6.4 GB of capacity (memory) and only 290 MB free. I am not sure if this somehow affects the virtual memory paging performance.
EDIT 2: I have also looked more at SpringBoard and its Virtual Memory usage is 180 MB. Is this normal? I found some questions/answers that seem to suggest that SpringBoard is responsible for autoreleasing objects (it should be the process for managing the screen and home button, but I am not sure if it also has to do with memory management). Is this correct?
Another note: I am using ARC. However, I am not sure this has much to do with the issue, as there are no apparent memory leaks and Xcode should convert the code, adding release/dealloc/retain calls to the compiled binary.
EDIT 3: As said before, I am using ARC and Cocos2d (2.0). I have been playing around with Activity Monitor. I found out that if I remove the Game Center authentication mechanism, then Activity Monitor runs fine (new doubt: did anyone else have a similar issue? Is the Game Center authentication view being retained somewhere?). However, I noticed that every time I navigate back and forth among the various scenes prior to the GameScene (Initial Scene -> Character Selection -> Planet Selection -> Character Selection -> Planet Selection -> etc. -> Character Selection...) the REAL MEMORY usage increases. After a while I start to get memory warnings and the app gets killed by iOS. Now the question is:
-> am I replacing the scenes in the correct way? I call the following from the various scenes:
[[CCDirector sharedDirector] replaceScene: [MainMenuScene scene]];
I have Cocos2d 2.0 as a static library and the code of replaceScene is this:
-(void) replaceScene: (CCScene*) scene
{
    NSAssert( scene != nil, @"Argument must be non-nil");

    NSUInteger index = [scenesStack_ count];

    sendCleanupToScene_ = YES;
    [scenesStack_ replaceObjectAtIndex:index-1 withObject:scene];
    nextScene_ = scene;     // nextScene_ is a weak ref
}
I wonder if somehow the scene does not get deallocated properly. I verified that the cleanup method is being called; however, I also added a CCLOG call in the CCLayer dealloc method and rebuilt the static library. The result is that the dealloc method doesn't seem to be called.
Is this normal? :D
I found that other people had similar issues. I am wondering if it has to do with retain cycles and blocks capturing self. I really need to spend some time studying this unless, from EDIT 3, anyone can already tell me what I am doing wrong :-)
All memory capacity is shared across all apps and processes running in iOS. So other apps can use a lot of memory, and your app receives memory warnings too. You'll keep receiving memory warnings as long as memory is scarce.
To understand what actually happens with memory in your app you should:
Profile your app with Leaks (ARC does not guarantee that you don't have leaks, e.g. self-capturing issues).
Use heapshot analysis (briefly described here: http://bentrengrove.com/blog/2013/4/26/heapshot-analysis)
And check out this post about memory and virtual memory in iOS: http://liam.flookes.com/wp/2012/05/03/finding-ios-memory/
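Since the "self-capturing issue" comes up twice here (EDIT 3 above also suspects retain cycles in blocks), here is a minimal Swift illustration of the pattern that ARC cannot break on its own; the class and names are purely illustrative, not from the question:

// Illustration only: ARC cannot collect a cycle like this one.
final class GameScene {
    var onUpdate: (() -> Void)?

    func start() {
        // The closure captures self strongly, and self stores the closure:
        // neither can ever be released, so the scene is never deallocated.
        onUpdate = { self.tick() }

        // Capturing self weakly breaks the cycle:
        // onUpdate = { [weak self] in self?.tick() }
    }

    func tick() { /* per-frame work */ }
}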
I solved this by printing the process's effective memory usage to the console. In this way I could get a precise measurement of the real memory used by the app process. Using Instruments proved to be imprecise, as the real memory used did not match the one shown in Instruments.
This code can be used to get the effective memory usage:
-(vm_size_t)report_memory
{
    struct task_basic_info info;
    mach_msg_type_number_t size = sizeof(info);
    kern_return_t kerr = task_info(mach_task_self(),
                                   TASK_BASIC_INFO,
                                   (task_info_t)&info,
                                   &size);
    if( kerr == KERN_SUCCESS ) {
        // info.resident_size now holds the resident memory of the process, in bytes.
    } else {
        NSLog(@"Error with task_info(): %s", mach_error_string(kerr));
    }
    return info.resident_size;
}
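For reference, a Swift sketch of the same task_info() call, using the newer MACH_TASK_BASIC_INFO flavour; this is an assumption-level port on my part, not code from the answer:

import Foundation

// Returns the resident size of the current process in bytes, or 0 on failure.
func reportMemory() -> UInt64 {
    var info = mach_task_basic_info()
    var count = mach_msg_type_number_t(MemoryLayout<mach_task_basic_info>.size) /
                mach_msg_type_number_t(MemoryLayout<natural_t>.size)

    let kerr = withUnsafeMutablePointer(to: &info) { infoPtr in
        infoPtr.withMemoryRebound(to: integer_t.self, capacity: Int(count)) { intPtr in
            task_info(mach_task_self_, task_flavor_t(MACH_TASK_BASIC_INFO), intPtr, &count)
        }
    }

    guard kerr == KERN_SUCCESS else {
        print("task_info() failed: \(String(cString: mach_error_string(kerr)))")
        return 0
    }
    return info.resident_size
}

// Usage: print resident memory in MB.
// print(String(format: "Resident: %.1f MB", Double(reportMemory()) / 1_048_576))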

iOS iPad has 1GB RAM why is my app killed after using 30MB

Is it possible to write an app that uses, say, 200 MB? My iPad has 1 GB, but I get
didReceiveMemoryWarning
after using 20 or 30 MB, and shortly after my app is killed. (I am the foreground app, so I don't really see why I have to get this warning; why doesn't the OS close the background apps? But whatever.) I am taking no action in didReceiveMemoryWarning (just logging it and calling super); is that why I am killed? Or are there other possible reasons?
So I understand I am supposed to free up memory when I get the warning, but I don't want to! (Let's assume my app REALLY does need 200 MB to operate.)
If I did free up some memory when I get the warning (how much?), would my app then not be killed? And could I then carry on and use up MORE memory? If so, I could create some "balloon" memory just so I can free it when warned, and then at least my app survives. This seems insane though.
Or is it basically impossible to have an iPad app that uses more than a few tens of MB?
I recently had this problem. It basically comes down to the speed at which you allocate memory. If you try to grab a lot of memory up front, then iOS will terminate you for using too much memory and not responding to memory warnings. iOS memory handling is ridiculous really. The worst thing is that my problems only arose AFTER I'd released the app on the App Store. It took me ages to track down what the problem was :(
The way I managed to handle this was to allocate the RAM I needed at startup (64 MB) slowly and hold off when I receive memory warnings. I create my own view controller that displays an animated splash screen while I'm initialising the memory. In viewDidLoad I do the following (Meg is a simple inline function that multiplies by 1024 * 1024):
mAllocBlockSize = Meg( 2 );
mAllocBlock = (char*)malloc( mAllocBlockSize );
//[mpProgressLabel setText: @"Initialising Memory: 1MB"];
mpInitTimer = [NSTimer scheduledTimerWithTimeInterval: 0.5f target: self selector: @selector( AllocMemory ) userInfo: nil repeats: YES];
In my AllocMemory selector I do this:
- (void) AllocMemory
{
    if ( self.view == nil )
        return;

    if ( mMemoryWarningCounter == 0 )
    {
        if ( mAllocBlockSize < Meg( 64 ) )
        {
            mAllocBlockSize *= 2;
            mAllocBlock = (char*)realloc( mAllocBlock, mAllocBlockSize );
            ZeroMemory( mAllocBlock, mAllocBlockSize );

            if ( mAllocBlockSize == Meg( 64 ) )
            {
                mMemoryWarningCounter = 8;
            }
        }
        else
        {
            free( mAllocBlock );
            // Initialise main app here.
        }
    }
    else
    {
        mMemoryWarningCounter--;
    }
}
And to handle the memory warnings I do as follows:
- (void)didReceiveMemoryWarning
{
    [super didReceiveMemoryWarning];
    mMemoryWarningCounter += 4;
}
Also do note the ZeroMemory step. When I didn't have this here, I would allocate 64 MB and still get booted. I assume that touching the memory fully commits it to my app, so zeroing the memory was necessary to eliminate the memory warning and eviction problems I was suffering.
Something is not right. Any app can use about 1/3-1/2 of the total physical RAM on any device under almost any version of iOS from 6-8 without being jetsammed (killed).
I can write a simple app that instantly takes 400 MB on a 1 GB device, and it's not killed - unless iOS can't terminate other services fast enough.
iOS 8 is more forgiving than 6 or 7, as many of the launch daemons now have a jetsam priority flag which decides in which order things get killed - as well as a memory limit, so if a launch daemon exceeds that high-water mark, it's killed. iOS should let any app keep using memory until all the lower-priority jetsam services have been killed off. There's also another setting for launch daemons that terminates them when memory is under pressure - regardless of jetsam priority.
Once all that's left are services with a higher priority than a user app, that's when jetsam kills it - and the app should get memory warnings before it gets slammed.
Programming on the iPad 6 (Air 2) is MUCH easier: 2 GB RAM, 300-400 MB free after boot, and iOS will back down to using 700 MB, allowing 1.3 GB for an app. And I bet the iPhone 7 and mini 4s will have 2 GB. That will let us see PlayStation 3 or better games for iOS, if and only if users will pay the price of a normal PS3 game ($20-80). Most people will complain, but most spend more on these free-to-play apps with $4.99-$129.99 IAPs - absurd (Apple should limit IAPs to $29.99).
Gone are the days of the 10% rule (where your app should use no more than 10% of system RAM).
Look at more hardcore, major iOS games: they use 300-400 MB on 1 GB devices and won't run on 512 MB devices.
So if you are being killed at 30 MB, something really is not right.

Cocos2d-X: spontaneous unload of resources

On the loading scene I preload all resources like sprites, sounds, etc. But one of my test devices (HTC Desire, Android 2.2.2) unloads resources after loading, so when the game tries to play some sound or draw a sprite, it freezes for a moment to load the resource again.
This problem appears only on the HTC Desire; I didn't meet this problem on my other devices (Samsung Galaxy Ace, Android 2.3.6 & Acer A100 tab, Android 4.0.3).
Can someone tell me why this happens? Thanks.
This is how I preload resources:
for (.....)
{
    CCString* file = CCString::create(path.c_str());
    if (file) {
        CCTexture2D* texture = CCTextureCache::sharedTextureCache()->addImage(file->getCString());
    }
}
I've also tried it like this, but it gives the same result:
for (.....)
{
    CCString* file = CCString::create(path.c_str());
    if (file) {
        CCSpriteFrame* frame = new CCSpriteFrame();
        CCSpriteFrameCache::sharedSpriteFrameCache()->addSpriteFrame(frame, file->getCString());
        frame->retain();
    }
}
In both cases CCTextureCache::sharedTextureCache()->dumpCachedTextureInfo() says that all textures are loaded:
01-29 15:18:36.111: D/cocos2d-x debug info(7579): cocos2d: CCTextureCache dumpDebugInfo: 53 textures, for 103840 KB (101.41 MB)
I also tried to reduce the number of preloaded textures to 31 (42.76 MB), but nothing changed.
P.S.: I repeat, this problem appears only on the Desire with Android OS 2.2...
One possible explanation for this behavior is described here, under "avoid purging caches during memory warnings".
By default cocos2d (and I suppose cocos2d-x is no different) will purge all caches when a memory warning is received. This means you can preload assets as much as you want; it only takes one memory warning for all of the preloaded (and currently unused) textures to be unloaded.