I'm developing an iOS application that needs to determine the probability that a user is following a given path.
If they are not following the path, I'd like to give them the option to recalculate.
This should be a relatively simple algorithm, for inputs I have a location (x,y) and n paths (two x,y points).
What is the best way to do this?
You might look at Dijkstra's algorithm to find the shortest distance between two points. I think you should always feed in the current location of the vehicle, so that if the user takes a wrong turn the route is recalculated from where they actually are and the updated path can be shown on the graph. Hope it helps.
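Separately from the routing suggestion above, for the question's inputs (a current location and n path segments) one simple way to decide "is the user off the path?" is the minimum distance from the location to each segment, compared against a tolerance. A rough sketch, with the tolerance value and function names being my own:

```python
import math

def distance_to_segment(px, py, ax, ay, bx, by):
    """Distance from point (px, py) to the segment (ax, ay)-(bx, by)."""
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    # Parameter of the projection of the point onto the segment, clamped to [0, 1]
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def is_off_path(location, segments, tolerance=30.0):
    """True if the location is farther than `tolerance` from every path segment."""
    px, py = location
    nearest = min(distance_to_segment(px, py, ax, ay, bx, by)
                  for (ax, ay), (bx, by) in segments)
    return nearest > tolerance

# Example: a location 50 units away from a single horizontal segment
print(is_off_path((50, 50), [((0, 0), (100, 0))]))  # True
```

In practice you would probably want several consecutive off-path readings before prompting the user to recalculate, to avoid reacting to GPS noise.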
I'm looking for an algorithm to find a polygon that can represent a set of points in 2D space. Specifically, if given a set of points like this
It should ideally produce something similar to this:
(The arrows are segments)
Basically, the output would be a set of segments that "best" capture the features of the points. The algorithm could take some parameters to control the number of output segments.
I currently do not have any ideas on what algorithms I'm looking for. Any papers or advice would be appreciated.
This is a possible algorithm.
For every point, look at the 2 points closest to it and connect them.
Then use Douglas-Peucker to refine the edges.
Essentially you first create a polygon containing all the points, and then try to eliminate points whose removal doesn't change the shape too much.
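A minimal sketch of the Douglas-Peucker step, assuming the nearest-neighbour connection above has already produced an ordered polyline (helper names are mine):

```python
import math

def line_distance(p, a, b):
    """Perpendicular distance from p to the infinite line through a and b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    # |cross product| / |b - a| gives the distance to the line
    return abs(dx * (ay - py) - dy * (ax - px)) / math.hypot(dx, dy)

def douglas_peucker(points, epsilon):
    """Drop points that deviate less than epsilon from the chord between the endpoints."""
    if len(points) < 3:
        return list(points)
    # Find the point farthest from the chord
    dmax, index = 0.0, 0
    for i in range(1, len(points) - 1):
        d = line_distance(points[i], points[0], points[-1])
        if d > dmax:
            dmax, index = d, i
    if dmax > epsilon:
        # Keep the farthest point and simplify both halves recursively
        left = douglas_peucker(points[:index + 1], epsilon)
        right = douglas_peucker(points[index:], epsilon)
        return left[:-1] + right
    return [points[0], points[-1]]

# Example: a noisy "L" shape reduced to its two legs
pts = [(0, 0), (1, 0.1), (2, -0.1), (3, 0), (3.1, 1), (2.9, 2), (3, 3)]
print(douglas_peucker(pts, 0.5))  # [(0, 0), (3, 0), (3, 3)]
```

The epsilon parameter is the knob mentioned in the question: a larger value gives fewer output segments.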
I have an array of MKLocationCoordinate2D in iOS and I'd like to create a heat map of those points based on the clustering of them.
i.e. the more points there are in a certain area, the higher the weight.
I've found a load of different frameworks for generating the heat maps and they all require you to calculate the weights yourself (which makes sense).
I'm just not sure where to start with the calculation.
I could do something like calculating the mean distance between each point and every other point, but I'm not sure if that's a good idea.
Could someone point me in the direction of how to weight each point based on its closeness to other points?
Thanks
I solved this by implementing a quad tree and using that to quickly get the number of neighbours within a certain radius.
I can then change the radius to tweak the result, but it very quickly returns weights based on how many neighbours each point has.
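A minimal sketch of the same neighbour-count weighting, using a coarse grid bucket index instead of a quad tree (the indexing structure is swapped, but the weight is still just the number of neighbours within a radius):

```python
from collections import defaultdict
import math

def neighbour_weights(points, radius):
    """Weight each (x, y) point by how many other points lie within `radius` of it."""
    cell = radius  # grid cell size equal to the search radius
    grid = defaultdict(list)
    for i, (x, y) in enumerate(points):
        grid[(int(x // cell), int(y // cell))].append(i)

    weights = []
    for x, y in points:
        cx, cy = int(x // cell), int(y // cell)
        count = 0
        # Only the 3x3 block of cells around the point can contain neighbours
        for gx in range(cx - 1, cx + 2):
            for gy in range(cy - 1, cy + 2):
                for j in grid.get((gx, gy), []):
                    nx, ny = points[j]
                    if math.hypot(nx - x, ny - y) <= radius:
                        count += 1
        weights.append(count - 1)  # exclude the point itself
    return weights

# Example: a tight cluster of three points plus one outlier
print(neighbour_weights([(0, 0), (1, 1), (1, 0), (50, 50)], radius=5))  # [2, 2, 2, 0]
```

A quad tree does the same job with better worst-case behaviour when the points are very unevenly spread.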
I have implemented a particle filter to localise a robot. If I want to get the most likely path, what would be the best way to do it?
Is taking the particle with the highest weight a correct way to do it?
First, each particle should track its own path. This can be done by adding a list of waypoints to each particle. When you want the most likely path, you can take the path from the particle with the highest weight. Note that this is not the same as taking the most likely position in each time step and aggregating those into a path!
You can also use the weighted average of all the particles' paths. Which is better depends on the distribution you expect. If it has only one mode, the average may give a more precise path. In contrast, if you expect a multimodal distribution (imagine an obstacle where half of the particles pass on the left and the other half on the right), the weighted average can give wrong results.
I would stick with the particle with the highest weight.
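A minimal sketch of the per-particle bookkeeping described above (the Particle fields and the motion/measurement callbacks are illustrative; resampling is omitted, and when you do resample, the waypoint list has to be copied along with the particle):

```python
from dataclasses import dataclass, field

@dataclass
class Particle:
    x: float
    y: float
    weight: float = 1.0
    path: list = field(default_factory=list)  # waypoints this hypothesis has visited

def step(particles, motion, measurement_likelihood):
    """One predict/update cycle that also records each particle's path."""
    for p in particles:
        p.path.append((p.x, p.y))                      # remember where this hypothesis was
        p.x, p.y = motion(p.x, p.y)                    # propagate with the motion model
        p.weight *= measurement_likelihood(p.x, p.y)   # reweight with the sensor model

def most_likely_path(particles):
    """Take the full path of the highest-weight particle (not per-step averages)."""
    best = max(particles, key=lambda p: p.weight)
    return best.path + [(best.x, best.y)]
```

This keeps each path internally consistent, which is exactly what per-step averaging can break in the multimodal case.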
I am trying to estimate the pose and position of a satellite given an image of it. I have a 3D model of the satellite. Using either PnP solvers or POSIT works great when I pick out the point correspondences myself, but I need to find a method to match the points up automatically. Using a corner detector (the best one I've found so far is based on the contour) I can find all the relevant points in the image, in addition to a few spurious points. However, I need to match a given point in the image to the correct point in the 3D model. The articles I have read on the subject always seem to assume that the point pairs have already been found, without going into detail about how to do so.
Is there an approach usually taken that can determine these correspondences based on some invariant features? Or should I resort to a different method not based on corner points?
You can have a look at the SoftPOSIT algorithm, which determines the 3D-2D correspondences and then runs the POSIT algorithm. As far as I know, Matlab code is available for SoftPOSIT.
You have to do PnP with RANSAC, see the OpenCV function solvePnPRansac(). This method can tolerate a high percentage of mismatches, so you don't need to be precise with all your matches, just have a certain percentage of correct ones (even as low as 30%). Of course, the minimum number of correct correspondences is 4.
Speaking of invariant features: if the amount of rotation between neighbouring frames is small, you don't need to use invariant features. Even a small patch of grey intensities would suffice to find a match. The only problem is that you have to update your descriptor, or even choose a different feature point on your model, depending on the model rotation. The latter may be hard to do, since you have to know the 3D coordinate of every feature.
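A minimal sketch of the RANSAC-PnP call in OpenCV's Python bindings (the point arrays and camera intrinsics below are placeholders, not real detections or calibration data):

```python
import numpy as np
import cv2

# Candidate correspondences: 3D model points and their candidate 2D detections.
# These would come from your corner detector / matching step; values here are placeholders.
object_points = np.array([[0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0],
                          [0, 0, 1], [1, 0, 1]], dtype=np.float64)
image_points = np.array([[320, 240], [400, 238], [405, 310], [318, 312],
                         [330, 200], [410, 198]], dtype=np.float64)

# Intrinsics assumed known from calibration (fx, fy, cx, cy are made up)
camera_matrix = np.array([[800, 0, 320],
                          [0, 800, 240],
                          [0, 0, 1]], dtype=np.float64)
dist_coeffs = np.zeros(5)  # assume no lens distortion

ok, rvec, tvec, inliers = cv2.solvePnPRansac(
    object_points, image_points, camera_matrix, dist_coeffs,
    reprojectionError=8.0)  # pixels; mismatched pairs end up outside the inlier set

if ok:
    print("rotation vector:", rvec.ravel())
    print("translation:", tvec.ravel())
    print("inlier indices:", inliers.ravel())
```

The inlier set it returns also tells you which of your automatic matches were actually consistent with the recovered pose.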
After managing to calculate the shortest distance using Dijkstra's algorithm by manually feeding in vertex points (getting the lat and long from Google Maps), I'm searching for a more dynamic way to do the same.
Assuming I have a shapefile representing my map (with boundaries and obstacles), which algorithm can I use to decompose it?
Googling a little, I found that I should do a "cell decomposition", but honestly I haven't figured out how to do it.
Thank you.
If you only have obstacles in the shapefile, then you could construct a visibility graph and run Dijkstra on that.
If you have regions with different passabilities, then you need more complicated techniques: for example, overlay the map with a grid (rectangular or triangular), triangulate, assign weights to the edges, and then run Dijkstra on that as well.
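A minimal sketch of the grid variant, skipping the triangulation step: rasterise the shapefile into blocked/free cells (not shown here) and run Dijkstra on the 4-connected grid. Uniform edge costs are an assumption; substitute per-cell passability weights if you have them.

```python
import heapq

def dijkstra_grid(blocked, start, goal):
    """Shortest path on a 4-connected grid; blocked[r][c] is True for obstacle cells."""
    rows, cols = len(blocked), len(blocked[0])
    dist = {start: 0.0}
    prev = {}
    queue = [(0.0, start)]
    while queue:
        d, cell = heapq.heappop(queue)
        if cell == goal:
            break
        if d > dist.get(cell, float("inf")):
            continue  # stale queue entry
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and not blocked[nr][nc]:
                nd = d + 1.0  # uniform cost; use passability weights here instead
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = cell
                    heapq.heappush(queue, (nd, (nr, nc)))
    # Reconstruct the path by walking predecessors back from the goal
    path, cell = [], goal
    while cell in prev or cell == start:
        path.append(cell)
        if cell == start:
            break
        cell = prev[cell]
    return list(reversed(path)) if path and path[-1] == start else []

# Example: 3x3 grid with a wall in the middle column except the bottom row
blocked = [[False, True, False],
           [False, True, False],
           [False, False, False]]
print(dijkstra_grid(blocked, (0, 0), (0, 2)))
```

The cell size trades accuracy for speed; a visibility graph avoids that discretisation entirely when the map only contains polygonal obstacles.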