How to instantiate an NMARoute object from a GPX file - iOS

We are working with the iOS Premium HERE Maps SDK. Our basic question is how to instantiate an NMARoute object from a quite detailed GPX file. This object should be used for custom turn-by-turn navigation. At the moment we take the following steps, sketched in code below:
Create an array of NMAGeoCoordinates objects from the GPX file
Create an array of NMAWaypoints
Call [NMACoreRouter calculateRouteWithStops:]
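A minimal sketch of these three steps in Swift, assuming the Premium SDK's NMAKit names (exact signatures may differ between SDK versions); the GPX parsing is simplified to <trkpt> elements and the waypoint thinning is illustrative:

import NMAKit

// Step 1: collect NMAGeoCoordinates from the GPX file's <trkpt lat=".." lon=".."> elements.
class GPXParser: NSObject, XMLParserDelegate {
    var coordinates: [NMAGeoCoordinates] = []

    func parser(_ parser: XMLParser, didStartElement elementName: String,
                namespaceURI: String?, qualifiedName qName: String?,
                attributes attributeDict: [String: String] = [:]) {
        guard elementName == "trkpt",
              let lat = attributeDict["lat"].flatMap(Double.init),
              let lon = attributeDict["lon"].flatMap(Double.init) else { return }
        coordinates.append(NMAGeoCoordinates(latitude: lat, longitude: lon))
    }
}

func calculateRoute(fromGPX url: URL) {
    let gpx = GPXParser()
    let xml = XMLParser(contentsOf: url)
    xml?.delegate = gpx
    xml?.parse()

    // Step 2: thin the trace to a manageable number of waypoints (here roughly 20).
    let step = max(1, gpx.coordinates.count / 20)
    let waypoints = gpx.coordinates.enumerated()
        .filter { $0.offset % step == 0 }
        .map { NMAWaypoint(geoCoordinates: $0.element) }

    // Step 3: let the router reconstruct a route through those waypoints.
    let router = NMACoreRouter()
    router.calculateRoute(withStops: waypoints,
                          routingMode: NMARoutingMode()) { result, error in
        guard error == NMARoutingError.none,
              let route = result?.routes?.first else { return }
        print("Route length: \(route.length) m")  // route is the resulting NMARoute
    }
}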
Unfortunately we are facing multiple limitations:
The [NMACoreRouter calculateRouteWithStops:] method limits the number of waypoints. Although we have a GPX file that is detailed enough to create the whole route, we have to cherry-pick waypoints and let the HERE service calculate the route again. This process does not ensure we will get exactly the route we had in the GPX file.
NMAWaypoints cause a "You reached your stopover" voice callout during turn-by-turn navigation every time a waypoint is reached. We know that it is possible to use the NMAViaWaypoint type instead, but this is unsuitable for us because via waypoints are dismissed during a reroute, which could happen during navigation. Moreover, stopovers cause a break in the navigation, e.g. in the distances displayed.
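For reference, the via-waypoint variant under discussion looks roughly like this (a sketch; waypointType and .viaWaypoint are taken from the Premium SDK's NMAWaypoint reference):

let waypoint = NMAWaypoint(geoCoordinates: NMAGeoCoordinates(latitude: 52.53, longitude: 13.38))
// A via waypoint shapes the route without triggering the stopover callout,
// but, as noted above, it is dropped when the route is recalculated.
waypoint.waypointType = .viaWaypoint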
For some GPS points that are located in the middle of a crossing, the HERE Maps routing calculation sometimes chooses a different route than intended. This is again due to the problem that HERE Maps wants to calculate the route on its own, despite the fact that we have a detailed GPX file containing the route.
What we are actually looking for is a better way to get an NMARoute object from a GPX file. To our surprise, there is a REST endpoint provided by HERE to convert a GPX file to JSON data, but unfortunately no way to feed this data into the iOS SDK.

This will sadly not work with the current HERE iOS SDK. There is a tight coupling of turn-by-turn navigation and routing, so navigation can only work with the internal routing engine at the moment.
Some background:
Imagine you deviate from your pre-calculated route while navigating (this can even happen when you don't deviate, e.g. due to GPS jumps, wrong map matching in complex situations, temporary loss of signal, or starting in unmapped or private road networks): navigation will ask routing to recalculate. And what if your GPX trace does not match the road network and map data in the iOS Mobile SDK based application? If your trace tells guidance to drive somewhere no road is available anymore, guidance would refuse and force a recalculation. And in your case you most probably don't want traffic-optimized navigation, but what to do with blocked roads (not just slow free-flow speed, but fully blocked)? I'd suggest enabling optimization here, but that would also not work with static GPX traces.
So as you already said, the better solution is to recalculate locally a route that is very close to your traces but takes the local map data and constraints into account.
The limitations you mention are correct, but:
The waypoint limit has been completely removed as of SDK 3.6. But please still take care with the number of waypoints, especially with more than 500 and complex segments in between them.
Did you try using the NMAAudioManager delegate to intercept the audio output? https://developer.here.com/documentation/ios-premium/topics_api_nlp_hybrid_plus/protocolnmaaudiomanagerdelegate-p.html#topic-apiref
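A minimal sketch of that interception, assuming the audioManager(_:shouldPlayOutput:) delegate method from the linked reference (exact names may differ between SDK versions):

import NMAKit

// Suppress unwanted voice callouts by filtering audio output before it plays.
class CalloutFilter: NSObject, NMAAudioManagerDelegate {
    func audioManager(_ audioManager: NMAAudioManager, shouldPlayOutput output: NMAAudioOutput) -> Bool {
        // Inspect `output` and return false for the prompts you want to drop;
        // returning true lets everything through.
        return true
    }
}

// Install it on the shared manager (keep the filter alive elsewhere, since delegates are weak):
// NMAAudioManager.shared().delegate = CalloutFilter()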
As described above, there are many reasons why the route deviates. Sometimes it's due to map data, sometimes due to the calculation constraints. A 100% reconstruction might be tricky (sometimes you are probably right with the GPX trace, but in some situations the HERE SDK might be right), so try to play around with the number of waypoints and routing options to get as close as possible.
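As a starting point for that experimentation, a short sketch; NMARoutingMode is the SDK's options class, and the values shown are illustrative assumptions, not a recommendation:

let mode = NMARoutingMode()
mode.transportMode = .car      // match the vehicle the GPX trace was recorded with
mode.routingType = .shortest   // .fastest optimizes for time and tends to deviate more
// Pass `mode` to calculateRoute(withStops:routingMode:) and vary the waypoint
// density until the result tracks the GPX trace closely enough.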

Related

In the HERE API, how do we show a region package downloaded with NMAMapLoader in the map view?

I downloaded and ran the map-downloader-ios-swift project (https://github.com/heremaps/here-ios-sdk-examples/blob/master/map-downloader-ios) and it worked well. After downloading one region package, how do we show that region in the map view without an internet connection? Also, is it possible to add routes in that downloaded region in offline mode?
Thank you.
In offline mode, cached map data is used for rendering, routing, searching, etc. So since you have downloaded the map region package of interest, viewing and routing requests related to this region will be done with the cached data.
Please read this reference (also applicable to iOS):
https://developer.here.com/documentation/android-premium/dev_guide/topics/routing-offline.html
And as stated on the reference page:
There is no guarantee that online and offline routes will be the same as different algorithms are used for online and offline route calculation. Online route calculation is performed on high performance servers, therefore more computationally intensive algorithms are used online, which cannot be used offline. Online route calculation should be preferred and offline routes are expected to be used as backup especially when there is no connectivity.
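Once the region package is downloaded, routing against the cached data can be forced through the router's connectivity setting; a sketch, assuming NMACoreRouter's connectivity property from the Premium SDK (the coordinates are placeholders):

import NMAKit

let router = NMACoreRouter()
router.connectivity = .offline  // use only the downloaded/cached map data

let stops = [NMAWaypoint(geoCoordinates: NMAGeoCoordinates(latitude: 52.53, longitude: 13.38)),
             NMAWaypoint(geoCoordinates: NMAGeoCoordinates(latitude: 52.50, longitude: 13.45))]
router.calculateRoute(withStops: stops, routingMode: NMARoutingMode()) { result, error in
    // Fails with a routing error if the region was not downloaded beforehand.
    guard error == NMARoutingError.none else { return }
    print("Offline route: \(result?.routes?.first?.length ?? 0) m")
}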

AVAudioEngine: schedule sample-accurate parameter changes

I am trying to create an app using a combination of AVAudioPlayerNode instances and other AUAudioUnits for EQ, compression, etc. Everything connects up well, and using the V3 version of the API certainly makes configuration easier for connecting nodes together. However, during playback I would like to be able to automate parameter changes such as the gain on a mixer, so that the changes are ramped (e.g. fade-out or fade-in) and feel confident that the changes are sample-accurate.
One solution I have considered is to install a tap on a node (perhaps the engine's mixer node) and within that adjust the gain for a given unit, but since the tap is on the output of a unit this is always going to be too late to have the desired effect (I think) without doing offset calculations and then delaying my source audio playback to match up to the parameter changes. I have also looked at the scheduleParameterBlock property on AUAudioUnit, but it seems I would need to implement my own custom unit to make use of that rather than use built-in units, even though it was mentioned in WWDC session 508: "...So the first argument to do schedule is a sample time, the parameter value can ramp over time if the Audio Unit has advertised it as being rampable. For example, the Apple Mixer does this. And the last two parameters, of course, are function parameters are the address of the parameter to be changed and the new parameter value..."
Perhaps this meant that internally the Apple Mixer uses it and not that we can tap into any rampable capabilities. I can't find many docs or examples other than implementing a custom audio unit as in Apple's example attached to this talk.
Other potential solutions I have seen include using NSTimer, CADisplayLink or dispatch_after, but these feel worse and less sample-accurate than offsetting from the installed tap block on the output of a unit.
I feel like I've missed something very obvious since there are other parts of the new AVAudioEngine API that make a lot of sense and the old AUGraph API allowed more access to sample accurate sequencing and parameter changing.
This is not as obvious as you'd hope it would be. Unfortunately in my tests, the ramp parameter on scheduleParameterBlock (or even the underlying AudioUnitScheduleParameters) simply doesn't do anything. Very odd for such a mature API.
The bottom line is that you can only set a parameter value within a single buffer, not at the sample level. Setting a parameter value at a sample time will automatically ramp from the current value to the new value by the end of the containing buffer. There seems to be no way to disable this automatic ramping.
Longer fades have to be done in sections by setting fractional values across multiple buffers and keeping track of the fade's relative progress. In reality, for normal-duration fades, this timing discrepancy is unlikely to be a problem because sample accuracy would be overkill.
So to sum up, sample-level parameter changes seem to be impossible, but buffer-level parameter changes are easy. If you need to do very short fades (within a single buffer or across a couple of buffers) then this can be done at the sample-level by manipulating the individual samples via AURenderCallback.
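To illustrate the sample-level option, a Swift sketch that bakes a linear fade-in directly into an AVAudioPCMBuffer before scheduling it on an AVAudioPlayerNode; since the gain is written into the samples themselves, this side-steps the parameter system entirely:

import AVFoundation

// Apply a linear fade-in across the first `fadeFrames` frames of a buffer.
// Sample-accurate because the gain is baked into the audio data itself.
func applyFadeIn(to buffer: AVAudioPCMBuffer, fadeFrames: AVAudioFrameCount) {
    guard let channels = buffer.floatChannelData else { return }
    let frames = Int(min(fadeFrames, buffer.frameLength))
    for ch in 0..<Int(buffer.format.channelCount) {
        for frame in 0..<frames {
            channels[ch][frame] *= Float(frame) / Float(frames)  // 0.0 -> 1.0 ramp
        }
    }
}

// Usage: fade in the first 100 ms of a file, then schedule it.
// (playerNode is assumed to be attached and connected in a running AVAudioEngine.)
// let file = try AVAudioFile(forReading: url)
// let buffer = AVAudioPCMBuffer(pcmFormat: file.processingFormat,
//                               frameCapacity: AVAudioFrameCount(file.length))!
// try file.read(into: buffer)
// applyFadeIn(to: buffer, fadeFrames: AVAudioFrameCount(0.1 * file.processingFormat.sampleRate))
// playerNode.scheduleBuffer(buffer, completionHandler: nil)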

Passing object/NSData between iOS devices

I'm creating a turn-based game, and I was thinking of using Game Center to handle it, but the passed game object is evidently capped at 64 KB. Is there another way to pass objects between devices for this use, without having to create a database or storage server as middle man? The game object itself for me is probably a lot less than 64 KB, but there are some initial variables I would like to send, such as images. By my calculations, the initial data for one game is about 500 KB, but after getting those images once, the passed game object is just a couple of KBs and is never going to include those images again.
Is there a way to send these images directly?
There are a few ways to get around the limit.
This answer mentions AllJoyn, which would allow you to transfer files of that size.
You could also send them indirectly by transferring them to your own server, then passing a link to the file to the other player. For a turn-based game, this has the advantage of enhanced reliability, as you could put in retries on error for both the upload to the server and the download to the device and control it yourself. I would also recommend AFHTTPClient for this.
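A sketch of that indirect approach with Foundation networking (URLSession here stands in for the AFHTTPClient suggestion; the upload endpoint and its plain-text URL response are hypothetical):

import Foundation
import GameKit

// Hypothetical endpoint that stores the image and responds with its download URL.
let uploadURL = URL(string: "https://example.com/upload")!

func sendTurn(match: GKTurnBasedMatch, imageData: Data, gameState: [String: Any]) {
    var request = URLRequest(url: uploadURL)
    request.httpMethod = "POST"
    URLSession.shared.uploadTask(with: request, from: imageData) { data, _, error in
        guard error == nil, let data = data,
              let link = String(data: data, encoding: .utf8) else { return }

        // The match data stays far below 64 KB: game state plus a link, not the image.
        var state = gameState
        state["imageURL"] = link
        guard let payload = try? JSONSerialization.data(withJSONObject: state) else { return }

        match.endTurn(withNextParticipants: match.participants,
                      turnTimeout: GKTurnTimeoutDefault,
                      match: payload) { error in
            if let error = error { print("endTurn failed: \(error)") }
        }
    }.resume()
}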
Is there another way to pass objects between devices for this use, without having to create a database or storage-server as middle man?
Without your own server, there isn't.

Firefox OS device GPS

I'm currently looking for a way for a Firefox OS device to communicate with the device's GPS directly, so it can get exact positioning, rather than using the W3C Geolocation API, which is not as accurate as real-time GPS. Thanks!
Simple answer: it isn't possible to access the "device's GPS" directly. You only have the Geolocation API that you already know.
Long answer: my experience with it is not bad at all. So I can think of only two possibilities for not getting "exact location positioning", as you call it:
maybe you're not using the right options to get a precise position. In this case, you could tweak your options a bit to get better results;
maybe you're not waiting until the underlying software can use your GPS instead of some less accurate instrument/estimation (like Wi-Fi positioning estimation).
It can be a combination of both =P
In the first case, you can verify that you're using enableHighAccuracy, like this:
function successCallback(position) {
  // position.timestamp and position.coords.accuracy help diagnose stale or imprecise fixes
  console.log(position.coords.latitude, position.coords.longitude, position.coords.accuracy);
}
function errorCallback(error) { console.error(error.code, error.message); } // code 3 is TIMEOUT
navigator.geolocation.watchPosition(
  successCallback,
  errorCallback,
  { enableHighAccuracy: true }
);
This will ask the browser for better results, as the standard indicates. Watch out that this may use more battery, and this may not be available anyway. This may take more time too, which is related to my other observation.
In the second case, you may be using a value for timeout that is too small, and maybe it's combined with a maximumAge that may be too high.
If maximumAge is high and timeout is small, you get an outdated position, as there won't be enough time to get a new position and you accept an old one.
If both are small, you'll start to get lots of TIMEOUT errors (the value is 3), as there'll be no positions for you.
You need to find the right balance between all 3 options to get the best positions. And you have to be patient sometimes.
Play with all 3 options and take a look at the errors you get. They'll tell you a lot about your issue getting precise and accurate coordinates.
The position object has some attributes that may come in handy to analyze what's happening:
the position.timestamp attribute will tell you how old that position object is. If it is old, you know you should tweak the options;
the position.coords.accuracy attribute will tell you the accuracy level of the lat/long coordinates. If it is too big (it's in meters), you know you should tweak the options.
If you wait forever, in a place where the GPS should work well (say, outdoors, in an open field), and you keep getting inaccurate results, maybe you can't do much better. I'd say it's not possible, then, with your software+hardware =(
As of now, Firefox OS only has support for GPS positioning (with the recent addition of A-GPS to the mix). As a result, most of the time you will have to wait from one to several minutes for the GPS module to acquire a lock on your location, and you will need a clear view of the sky for the lock to be acquired.
That said, after a lock is acquired, and with the right settings in the call itself (like setting the enableHighAccuracy flag to true), the GPS should provide a position as accurate as any other device would.
Right now cell-based and Wi-Fi-based geolocation is not available in the current versions of the OS (1.0.1 and 1.1.0), but it is in the pipeline.
You can use the Firefox OS Geolocation API or Google Maps (I do not remember where I got it).

How should a parser filter behave in DirectShow Editing Services?

We've created a custom push source / parser filter that is expected to work in a DirectShow Editing Services timeline.
Now everything is great, except that the filter does not stop delivering samples when the current cut has reached its end. The rendering stops, but the downstream filter continues to consume samples. The filter delivers samples until it reaches EOF. This causes high CPU load, so the application is simply unusable.
After a lot of investigation, I'm not able to find a suitable mechanism that can inform my filter that the cut is over and the filter needs to stop:
The Deliver function on the connected decoder pins always returns S_OK, meaning the attached decoder is also not aware that the IMediaSamples are being discarded downstream.
There is no flushing in the filter graph.
The IMediaSeeking::SetPositions interface is used, but only the start positions are set; our filter is always instructed to play up to the end of the file.
I would expect that using IAMTimelineSrc::SetMediaTimes(Start, Stop) from the application would set a stop time too, but this does not happen.
I've also tried to manipulate the XTL timeline, adding 'mstop' attributes to all the clips in the hope that this would imply a stop position being set, but to no avail.
From the filter's point of view, the output buffers are always available (as the IMediaSamples are being discarded downstream), so the filter fills samples as fast as it can until the source file is finished.
Is there any way the filter can detect when to stop, or can we do anything from the application side?
Many thanks
Tilo
You can try adding a custom interface to your filter and calling a method on it externally from your client application. See this SO question for a bit more detail on this approach. You should be careful with thread safety while implementing this method, and it is indeed possible that there is a neater way of detecting that the capturing should be stopped.
I'm not that familiar with DES, but I have tried my demux filters in DES and the stop time was set correctly when there was a "stop=" tag for the clip.
Perhaps your demux does not implement IMediaSeeking correctly. Do you expose IMediaSeeking through the pins?
I had a chance to work with DES and a custom push source filter recently. From my experience:
DES actually does return an error code to the Receive() call, which is in turn returned by Deliver() to the source, when the cut reaches its end.
I hit a similar situation where the source does not receive it and continues to run to the end of the stream.
The problem I found (after a huge amount of ad-hoc trial and error) is that the source needs to call DeliverNewSegment() at each restart after a seek. DES seems to take incoming samples only after that notification. It looks like DES accepts the samples with S_OK even without that notification, but it just throws them away.
I don't see DES set an end time via IMediaSeeking::SetPositions, either.
I hope this helps, although this question is very old and I suppose Tilo does not care about this any more...
