Here is what I tried:
[SKRoutingService sharedInstance].navigationDelegate = self;
SKNavigationSettings *navSettings = [SKNavigationSettings navigationSettings];
navSettings.transportMode = SKTransportPedestrian;
navSettings.showStreetNamePopUpsOnRoute = YES;
navSettings.viaPointNotificationDistance = 5;
navSettings.navigationType = SKNavigationTypeReal;
navSettings.distanceFormat = SKDistanceFormatMetric;
[[SKRoutingService sharedInstance] startNavigationWithSettings:navSettings];
I don't see any difference in behavior when I try different transportMode values. I believe the audio advices are targeted at cars, where the user is alerted at large distances ("take right turn in 30 meters..."). But I want it to alert the user at short distances ("take right turn in 2 meters ...").
What can I do to get advices tuned for pedestrians rather than for car/bike riders?
You can change the pre-defined distances for pedestrian advices in SKMaps.bundle > AdvisorConfigs > Pedestrian > advice_places.adv, where the first column represents distances_outside_city and the second distances_in_city.
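For illustration only, a shortened configuration might look like this (hypothetical values, not the shipped defaults, assuming one advice entry per row with distances in meters):

    10 5
    8 4

Lowering the numbers makes an audio advice fire closer to the maneuver.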
Here you can find more info and know-how on the SDK audio advices.
Related
In my app I want to detect how strong the surrounding magnetic/electromagnetic fields are. What I want to achieve is to measure changes in the magnetic field, to know whether it is stronger or weaker than in a control measurement. This is my code:
- (void)setupLocationManager {
    self.locationManager = [[CLLocationManager alloc] init];
    if ([CLLocationManager headingAvailable] == NO) {
        self.locationManager = nil;
    } else {
        self.locationManager.headingFilter = kCLHeadingFilterNone;
        self.locationManager.delegate = self;
        [self.locationManager startUpdatingHeading];
    }
}

// CLLocationManagerDelegate
- (void)locationManager:(CLLocationManager *)manager didUpdateHeading:(CLHeading *)heading {
    // Magnitude of the raw geomagnetic vector, in microteslas
    CGFloat magnitude = sqrt(heading.x * heading.x + heading.y * heading.y + heading.z * heading.z);
    if (self.defaultMagnitudeValue == 0.f) {
        self.defaultMagnitudeValue = magnitude;
    }
    self.curentMagnitudeValue = magnitude;
}
The magnitude value reacts to the magnetic fields surrounding the device; the problem is that you need to be REALLY close to the source of such a field.
So, the question is: is there any possibility for an iOS app to measure magnetic fields at distances of more than 10-20 centimeters? If so, how?
P.S.: I have also checked the Teslameter app by Apple; it uses nearly the same code and has exactly the same problems.
Sure you can, the magnetic field just has to be very strong!
Your first limit might be the hardware. As noted in "In iOS, what is the difference between the Magnetic Field values from the Core Location and Core Motion frameworks?", the range of those raw values may be limited: some hardware and iOS versions restricted them to -128 to +128 microteslas. On an iPhone 6 I can get microtesla readings well outside that range, but that doesn't necessarily help. Apple doesn't seem to publish any reference for the accuracy of magnetometer readings; we can read values out to nanotesla precision, but the results might be meaningless noise.
Earth's magnetic field at the surface is roughly 25 to 65 microteslas. Any field you hope to measure will need to be measurably stronger than that at the desired measurement distance. What are you trying to measure, and is it really strong enough to move a compass needle at the distance you want to measure it from?
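If you want to experiment with the raw (uncalibrated) field rather than the heading-derived values, Core Motion exposes the magnetometer directly. A minimal sketch (the 0.1 s update interval is an arbitrary choice; values are in microteslas):

    #import <CoreMotion/CoreMotion.h>

    CMMotionManager *motionManager = [[CMMotionManager alloc] init];
    if (motionManager.magnetometerAvailable) {
        motionManager.magnetometerUpdateInterval = 0.1;
        [motionManager startMagnetometerUpdatesToQueue:[NSOperationQueue mainQueue]
                                           withHandler:^(CMMagnetometerData *data, NSError *error) {
            if (data) {
                CMMagneticField f = data.magneticField; // raw field components
                double magnitude = sqrt(f.x * f.x + f.y * f.y + f.z * f.z);
                // Compare against a baseline captured away from the source;
                // Earth's field alone contributes roughly 25-65 microteslas.
                NSLog(@"field magnitude: %.1f microteslas", magnitude);
            }
        }];
    }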
I am using SpriteKit's built-in physics engine to build a game for iOS. Basically it involves a bouncing ball which moves via me manually setting its initial velocity, and bounces via resetting the velocity in the contact event with the floor.
The issue is that the actual maths for this environment do not add up. Using the SUVAT equations it's easy to determine how far the ball's x-displacement should be when it reaches the floor after being thrown with a certain velocity; however (with gravity set to -9.81), it barely moves a couple of pixels.
I simplified the problem to just trying to shoot a ball a certain distance upwards (in the y-direction) and the same thing happened: it moves a couple of points up and then just falls to the floor, perhaps a twentieth of how far it should move.
This is how I have set the physics environment up:
self.physicsWorld.contactDelegate = self;
self.physicsWorld.gravity = CGVectorMake(0.0, -9.81);
And this is my function for the ball-generation (shooting upwards) example; mathematically the ball should reach the height of the tower:
- (void)generateTestBall {
    self.ball = [SKSpriteNode spriteNodeWithImageNamed:@"ball"];
    SKSpriteNode *tower = [SKSpriteNode spriteNodeWithImageNamed:@"player"];

    self.ball.position = CGPointMake(self.scene.size.width/2, self.scene.size.height/2);
    self.ball.size = CGSizeMake(20, 20);
    self.ball.color = [SKColor redColor];
    self.ball.colorBlendFactor = 1;

    tower.position = CGPointMake(self.scene.size.width/2 + 20, self.scene.size.height/2 + 100);
    tower.size = CGSizeMake(20, 200);
    tower.color = [SKColor blueColor];
    tower.colorBlendFactor = 1;

    [self addChild:tower];
    [self addChild:self.ball];

    self.ball.physicsBody = [SKPhysicsBody bodyWithCircleOfRadius:10];
    self.ball.physicsBody.affectedByGravity = YES;
    self.ball.physicsBody.linearDamping = 0.0; // linearDamping is a CGFloat, not a BOOL
    self.ball.physicsBody.dynamic = YES;

    // v = sqrt(2*g*s): the launch speed needed to reach the tower's height
    CGFloat ballVel = sqrt(2 * 9.81 * tower.size.height);
    NSLog(@"%f", ballVel);
    self.ball.physicsBody.velocity = CGVectorMake(0.0f, ballVel);
}
Please can someone explain what I am doing wrong? I've double-checked my maths (I'm a maths student, so fingers crossed that's not the issue)!
Thanks!
Steve
So I FINALLY managed to figure it out. Just in case anyone else is experiencing the same issue, I'll post the answer here:
The issue was that, although gravity is in m/s^2 and velocity in m/s (to replicate Earth), distances in SpriteKit are measured in POINTS rather than the required METRES. Therefore any calculation done with x/y position or size values taken from SKSpriteNodes etc. will be off by a constant factor.
After running a few tests I found the factor to be roughly 157. This means that you must multiply any sizes/distances in POINTS by 157 to get the corresponding 'METRE' value that works with SUVAT.
The actual numbers themselves seem a bit ridiculous, as they're all very big (velocity, distance, etc.), but that doesn't pose a problem in practice since they are all consistent relative to each other!
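As a concrete illustration, here is a minimal sketch of the corrected launch-velocity calculation, assuming the ~157 points-per-metre factor above (other sources commonly cite 150, so treat the exact value as something to calibrate for your own scene):

    // Empirical scale factor between SpriteKit points and physics-world metres
    // (an assumption: measured here as ~157, often cited as 150).
    static const CGFloat kPointsPerMetre = 157.0;

    // Evaluate v^2 = 2*g*s with the distance rescaled into consistent units:
    CGFloat scaledHeight = tower.size.height * kPointsPerMetre;
    CGFloat ballVel = sqrt(2 * 9.81 * scaledHeight); // points per second
    self.ball.physicsBody.velocity = CGVectorMake(0.0f, ballVel);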
Hope this helps anyone!
Steve
I'm following the examples in CorePlot to add new points to a scatter plot as I acquire them. I modeled my graph on the RealTimePlot example that ships with CorePlot. I noticed that when I'm running my graphs, the CPU pegs at 100% once the screen starts scrolling to render the new data. I've tried different ways of adding the data into my data source, as well as different update calls, but I can't get anything that's both responsive and an acceptable CPU load.
After much trial and error, I fired up the RealTimePlot example to see what its CPU load was, and it pegs at 100% as well. This is on an iPhone 5s, so I'm scared to see how an iPhone 4 will act. My question: is there an alternative way to update the graph?
Anyhow, here is my update loop. I'm taking accelerometer data and graphing it.
- (void)motionController:(VWWMotionController *)sender didUpdateAcceleremeters:(CMAccelerometerData *)accelerometers {
    static NSInteger counter = 0;
    CPTGraph *theGraph = self.graph;
    CPTPlot *thePlot = [theGraph plotWithIdentifier:@"Blue Plot"];
    if ( thePlot ) {
        if ( self.dataForPlot.count >= NUM_POINTS ) {
            [self.dataForPlot removeObjectAtIndex:0];
            [thePlot deleteDataInIndexRange:NSMakeRange(0, 1)];
        }

        CPTXYPlotSpace *plotSpace = (CPTXYPlotSpace *)theGraph.defaultPlotSpace;
        NSUInteger location = (counter >= NUM_POINTS ? counter - NUM_POINTS + 2 : 0);

        CPTPlotRange *oldRange = [CPTPlotRange plotRangeWithLocation:CPTDecimalFromUnsignedInteger( (location > 0) ? (location - 1) : 0 )
                                                              length:CPTDecimalFromUnsignedInteger(NUM_POINTS - 2)];
        CPTPlotRange *newRange = [CPTPlotRange plotRangeWithLocation:CPTDecimalFromUnsignedInteger(location)
                                                              length:CPTDecimalFromUnsignedInteger(NUM_POINTS - 2)];

        [CPTAnimation animate:plotSpace
                     property:@"xRange"
                fromPlotRange:oldRange
                  toPlotRange:newRange
                     duration:CPTFloat(0)];

        NSNumber *x = @(counter);
        NSNumber *y = @(accelerometers.acceleration.x);
        [self.dataForPlot addObject:@{ @"x": x, @"y": y }];

        dispatch_async(dispatch_get_main_queue(), ^{
            [thePlot insertDataAtIndex:self.dataForPlot.count - 1 numberOfRecords:1];
        });
        counter++;
    }
}
FYI, I'd ideally like to have 30 FPS. I have adjusted the accelerometer callback interval from 1/5 to 1/30 of a second with some CPU improvement, but then I'm losing frame rate.
If I comment out the insertDataAtIndex line, the CPU stays back at 1%, but of course the graph won't follow the new data.
I don't see any variants of the insert method.
I've also read in other threads that recalculating the axis labels is resource intensive. I've tried removing them, with very little improvement.
I'm thinking of going to OpenGL or CoreGraphics for this job.
What do you suggest?
NOTE: I've not tested on an iPhone 5s (I've tested on iPhone 4s, 5, iPad 3, iPad 4, and iPad Mini 1) and our apps seem to have different requirements, so I'm not positive this will help you.
Regardless, in my testing I've found that I get the best, most responsive results by redrawing all points each time I get a new data point. This is best for me because I have to update the graph ~120 times per second. I don't have an iPhone around to check our app's CPU, but on an iPad Mini the CPU is < 50% with the graph running 2 plots (~125 points each, though it stays fairly responsive up to around 250 points each). The graph only has labels on the y-axis, and they don't change. The x-axis is time, and its labels are unnecessary for us.
Our requirements may be too different for this implementation to help you, but it may be worth a shot. If you want to try it, it goes something like this:
- (void)UpdateGraph
{
    // foreach scatterplot
    {
        // Remove points from array
        // [linePlot deleteDataInIndexRange:NSMakeRange(0, [array count])];
    }

    // Insert new data point into array(s)

    // foreach scatterplot
    {
        // [linePlot insertDataAtIndex:0 numberOfRecords:[array count]];
    }

    // Schedule next update if necessary
    // [self performSelector:@selector(UpdateGraph) withObject:nil afterDelay:(1/(CGFloat)Your_FPS_Here)];
}
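A slightly more concrete sketch of the same idea, assuming a single scatter plot backed by an NSMutableArray property named dataForPlot plus sampleIndex/latestSample properties (all assumed names, not from the answer above):

    - (void)updateGraph {
        CPTPlot *linePlot = [self.graph plotWithIdentifier:@"Blue Plot"]; // assumed identifier

        // Tell the plot to forget everything it has drawn so far.
        [linePlot deleteDataInIndexRange:NSMakeRange(0, self.dataForPlot.count)];

        // Mutate the backing array: append the newest sample, trim the oldest.
        [self.dataForPlot addObject:@{ @"x": @(self.sampleIndex), @"y": @(self.latestSample) }];
        self.sampleIndex += 1;
        if (self.dataForPlot.count > NUM_POINTS) {
            [self.dataForPlot removeObjectAtIndex:0];
        }

        // Re-insert the whole data set so it is redrawn in a single pass.
        [linePlot insertDataAtIndex:0 numberOfRecords:self.dataForPlot.count];

        // Schedule the next frame (~30 FPS here).
        [self performSelector:@selector(updateGraph) withObject:nil afterDelay:1.0 / 30.0];
    }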
Hope that helps you and best of luck!
The key is to make the graph simple and fast to draw. Set the labeling policy so that you don't have too many labels. Don't use minor ticks or grid lines (set the line styles to nil). Turn off major grid lines, too, if you don't need them. Don't use an area fill on the plot.
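A minimal sketch of those simplifications, assuming an already-configured CPTXYGraph in self.graph and a CPTScatterPlot in scatterPlot (both names are assumptions):

    CPTXYAxisSet *axisSet = (CPTXYAxisSet *)self.graph.axisSet;
    CPTXYAxis *x = axisSet.xAxis;
    CPTXYAxis *y = axisSet.yAxis;

    // Fewer labels: let Core Plot pick a sparse, readable set.
    x.labelingPolicy = CPTAxisLabelingPolicyAutomatic;
    y.labelingPolicy = CPTAxisLabelingPolicyAutomatic;

    // No minor ticks or minor grid lines.
    x.minorTickLineStyle = nil;
    y.minorTickLineStyle = nil;
    x.minorGridLineStyle = nil;
    y.minorGridLineStyle = nil;

    // No major grid lines either, if you can live without them.
    x.majorGridLineStyle = nil;
    y.majorGridLineStyle = nil;

    // No area fill under the plot.
    scatterPlot.areaFill = nil;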
I'm trying to detect three actions: when a user begins walking, jogging, or running. I then want to know when they stop. I've been successful in detecting when someone is walking, jogging, or running with the following code:
- (void)update:(CMAccelerometerData *)accelData {
    [(id)self setAcceleration:accelData.acceleration];
    NSTimeInterval secondsSinceLastUpdate = -([self.lastUpdateTime timeIntervalSinceNow]);

    // Note: fabs() is needed for doubles (labs() is for long integers), and the
    // largest threshold must be tested first or the later branches never run.
    if (fabs(_acceleration.x) > 4.0) {
        NSLog(@"sprinting: %f", _acceleration.x);
    }
    else if (fabs(_acceleration.x) > 2.0) {
        NSLog(@"jogging: %f", _acceleration.x);
    }
    else if (fabs(_acceleration.x) >= 0.10000) {
        NSLog(@"walking: %f", _acceleration.x);
    }
}
The problem I run into is two-fold:
1) update is called multiple times for every motion, probably because it checks so frequently that when the user begins walking (i.e. _acceleration.x >= 0.1000) it is still >= 0.1000 when update is called again.
Example Log:
2014-02-22 12:14:20.728 myApp[5039:60b] walking: 1.029846
2014-02-22 12:14:20.748 myApp[5039:60b] walking: 1.071777
2014-02-22 12:14:20.768 myApp[5039:60b] walking: 1.067749
2) I'm having difficulty figuring out how to detect when the user has stopped. Does anybody have advice on how to implement "stop detection"?
According to your logs, accelerometerUpdateInterval is about 0.02 s. Updates could be made less frequent by changing that property of CMMotionManager.
Checking only the x-acceleration isn't very accurate. I can put a device on a table in such a way (say, on its left edge), or tilt it a bit, that the x-acceleration equals 1; this would put the program in walking mode (x > 0.1) instead of idle.
Here's a link to the publication ADVANCED PEDOMETER FOR SMARTPHONE-BASED ACTIVITY TRACKING. They track changes in the direction of the acceleration vector: d is the cosine of the angle between two consecutive acceleration vector readings,

d_i = (a_i · a_{i-1}) / (|a_i| |a_{i-1}|)

Obviously, without any motion the angle between two vectors is close to zero and cos(0) = 1; during other activities d < 1. To filter out noise, they use a weighted moving average of the last 10 values of d.
After implementing this, your values will look something like this (red = walking, blue = running).
Now you can set a threshold for each activity to separate them. Note that the average step frequency is 2-4 Hz, so you should expect the current value to exceed the threshold at least a few times per second in order to identify the action.
Other helpful publications:
ERSP: An Energy-efficient Real-time Smartphone Pedometer (analyzes peaks and troughs)
A Gyroscopic Data based Pedometer Algorithm (threshold detection of gyro readings)
UPDATE
_acceleration.x, _acceleration.y, _acceleration.z are the coordinates of one and the same acceleration vector; you use all three of them in the d formula. To calculate d you also need to store the acceleration vector from the previous update (the one with index i-1 in the formula).
The WMA simply takes the 10 most recent d values into account with different weights: the most recent d values have more weight and therefore more impact on the resulting value. You need to store the 9 previous d values in order to calculate the current one, and you compare the WMA value to the corresponding threshold.
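A minimal sketch of both pieces in Objective-C (the function names and the linear 1..10 weighting are assumptions, not taken from the paper):

    #import <CoreMotion/CoreMotion.h>
    #include <math.h>

    // d: the cosine of the angle between two consecutive acceleration vectors.
    static double DirectionCosine(CMAcceleration a, CMAcceleration b) {
        double dot = a.x * b.x + a.y * b.y + a.z * b.z;
        double normA = sqrt(a.x * a.x + a.y * a.y + a.z * a.z);
        double normB = sqrt(b.x * b.x + b.y * b.y + b.z * b.z);
        if (normA == 0.0 || normB == 0.0) {
            return 1.0; // degenerate reading; treat as "no motion"
        }
        return dot / (normA * normB);
    }

    // Weighted moving average of the last 10 d values; d[9] is the most
    // recent sample and gets the largest weight.
    static double WeightedMovingAverage(const double d[10]) {
        double weightedSum = 0.0, weightTotal = 0.0;
        for (int i = 0; i < 10; i++) {
            double weight = i + 1;
            weightedSum += weight * d[i];
            weightTotal += weight;
        }
        return weightedSum / weightTotal;
    }

Store the previous CMAcceleration between updates, compute d on each new reading, push it into the 10-element history, and compare the WMA against your per-activity thresholds.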
If you are using iOS 7 and an iPhone 5s, I suggest you look into CMMotionActivityManager, which is available on the iPhone 5s because of the M7 chip. It is also available in a couple of other devices:
M7 chip
Here is a code snippet I put together to test when I was learning about it.
#import <CoreMotion/CoreMotion.h>

@property (nonatomic, strong) CMMotionActivityManager *motionActivityManager;

- (void)inSomeMethod
{
    self.motionActivityManager = [[CMMotionActivityManager alloc] init];

    // Register for Core Motion activity updates
    [self.motionActivityManager startActivityUpdatesToQueue:[NSOperationQueue mainQueue]
                                                withHandler:^(CMMotionActivity *activity)
    {
        NSLog(@"Got a core motion update");
        NSLog(@"Current activity date is %f", activity.timestamp);
        NSLog(@"Current activity confidence from a scale of 0 to 2 - 2 being best - is: %ld", (long)activity.confidence);
        NSLog(@"Current activity type is unknown: %i", activity.unknown);
        NSLog(@"Current activity type is stationary: %i", activity.stationary);
        NSLog(@"Current activity type is walking: %i", activity.walking);
        NSLog(@"Current activity type is running: %i", activity.running);
        NSLog(@"Current activity type is automotive: %i", activity.automotive);
    }];
}
I tested it and it seems to be pretty accurate. The only drawback is that it will not give you a confirmation as soon as you start an action (walking, for example): some black-box algorithm waits to ensure that you are really walking or running. But then you have a confirmed action.
This beats messing around with the accelerometer. Apple took care of that detail!
You can use this simple library to detect whether the user is walking, running, in a vehicle, or not moving. It works on all iOS devices, and no M7 chip is required.
https://github.com/SocialObjects-Software/SOMotionDetector
In the repo you can find a demo project.
I'm following this paper (PDF via RG) in my indoor navigation project to determine user dynamics (static, slow walking, fast walking) from accelerometer data alone, in order to assist location determination.
Here is the algorithm proposed in the paper: compute the Euclidean norm of each accelerometer sample, collect one second's worth of samples, take the variance of that window, and classify the pedestrian state by comparing the variance against the static and slow-walking thresholds.
And here is my implementation in Swift 2.0:
import CoreMotion

let motionManager = CMMotionManager()

motionManager.accelerometerUpdateInterval = 0.1

motionManager.startAccelerometerUpdatesToQueue(NSOperationQueue.mainQueue()) { (accelerometerData: CMAccelerometerData?, error: NSError?) -> Void in
    if (error != nil) {
        print(error)
    } else {
        self.estimatePedestrianStatus((accelerometerData?.acceleration)!)
    }
}
After all of the classic Swifty iOS code to set up Core Motion, here is the method crunching the numbers and determining the state:
func estimatePedestrianStatus(acceleration: CMAcceleration) {
    // Obtain the Euclidean norm of the accelerometer data
    accelerometerDataInEuclidianNorm = sqrt((acceleration.x.roundTo(roundingPrecision) * acceleration.x.roundTo(roundingPrecision)) + (acceleration.y.roundTo(roundingPrecision) * acceleration.y.roundTo(roundingPrecision)) + (acceleration.z.roundTo(roundingPrecision) * acceleration.z.roundTo(roundingPrecision)))

    // Significant-figure setting
    accelerometerDataInEuclidianNorm = accelerometerDataInEuclidianNorm.roundTo(roundingPrecision)

    // Record 10 values, meaning one second of data
    // (accelerometerUpdateInterval (0.1 s) * 10 = 1 s)
    while accelerometerDataCount < 1 {
        accelerometerDataCount += 0.1

        accelerometerDataInASecond.append(accelerometerDataInEuclidianNorm)
        totalAcceleration += accelerometerDataInEuclidianNorm

        break // required since we want to obtain data every acc cycle
    }

    // When the acc values are recorded, interpret them
    if accelerometerDataCount >= 1 {
        accelerometerDataCount = 0 // reset for the next round

        // Calculate the variance of the Euclidean norm of the accelerometer data
        let accelerationMean = (totalAcceleration / 10).roundTo(roundingPrecision)
        var total: Double = 0.0

        for data in accelerometerDataInASecond {
            total += ((data - accelerationMean) * (data - accelerationMean)).roundTo(roundingPrecision)
        }

        total = total.roundTo(roundingPrecision)

        let result = (total / 10).roundTo(roundingPrecision)
        print("Result: \(result)")

        if (result < staticThreshold) {
            pedestrianStatus = "Static"
        } else if ((staticThreshold < result) && (result <= slowWalkingThreshold)) {
            pedestrianStatus = "Slow Walking"
        } else if (slowWalkingThreshold < result) {
            pedestrianStatus = "Fast Walking"
        }
        print("Pedestrian Status: \(pedestrianStatus)\n---\n\n")

        // Reset for the next round
        accelerometerDataInASecond = []
        totalAcceleration = 0.0
    }
}
I've also used the following extension to simplify significant-figure setting:
extension Double {
    func roundTo(precision: Int) -> Double {
        let divisor = pow(10.0, Double(precision))
        return round(self * divisor) / divisor
    }
}
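For example, with a precision of 3, 0.123456.roundTo(3) returns 0.123.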
With raw values from CoreMotion, the algorithm went haywire.
Hope this helps someone.
EDIT (4/3/16)
I forgot to provide my roundingPrecision value: I define it as 3. Plain mathematics suggests three significant figures are decent enough for this purpose; you can use more if you like.
Also, one more thing to mention: at the moment, this algorithm requires the iPhone to be held in your hand while walking.
My GitHub Repo hosting Pedestrian Status
You can use Apple's machine learning framework CoreML to find out the user's activity. First you need to collect labeled data and train a classifier; then you can use the resulting model in your app to classify user activity. You may follow this series if you are interested in CoreML activity classification:
https://medium.com/@tyler.hutcherson/activity-classification-with-create-ml-coreml3-and-skafos-part-1-8f130b5701f6
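For orientation only, here is a minimal Objective-C sketch of loading a compiled model and asking it for one prediction. The model name, feature names, and scalar inputs are all hypothetical: a real Create ML activity classifier consumes windows of sensor samples, and Xcode also generates a typed wrapper class you would normally use instead of the raw MLModel API.

    #import <CoreML/CoreML.h>

    NSURL *modelURL = [[NSBundle mainBundle] URLForResource:@"ActivityClassifier" // hypothetical name
                                              withExtension:@"mlmodelc"];
    NSError *error = nil;
    MLModel *model = [MLModel modelWithContentsOfURL:modelURL error:&error];

    // Hypothetical input features; the real names come from your trained model.
    MLDictionaryFeatureProvider *input =
        [[MLDictionaryFeatureProvider alloc] initWithDictionary:@{ @"accelX": @0.01,
                                                                   @"accelY": @0.98,
                                                                   @"accelZ": @0.05 }
                                                          error:&error];

    id<MLFeatureProvider> output = [model predictionFromFeatures:input error:&error];
    NSLog(@"predicted activity: %@", [output featureValueForName:@"activity"]); // hypothetical output name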
I've been checking the Tapku Calendar code for a bit, and have searched and read all the relevant questions and responses here, but none seem to really offer the correct solution to the problem: how to select multiple dates, either programmatically or by tapping. Just a simple blue tile over two adjacent dates would make me happy :-) The post below seems to have a similar question; however, the answer does not work: that place in the code is not hit unless the month changes, which is not exactly what I am looking for. What would be great is a higher-level implementation of selectDate: that would select multiple dates, but just the right place to tweak in the library would be a great start if anyone is more familiar with the code. Much appreciated.
iOS: Tapku calendar library - allow selecting multiple dates for current month
So after a bit of stepping through code, I have this rudimentary method, built with a hammer. I adopted most of the code from the TKCalendarMonthView.m -> selectDay:day method. The method I created basically creates a new TKCalendarMonthTiles object, fills in the details, and then adds subviews onto the main TKCalendarMonthTiles object (self). I tag the subviews so I can first get rid of them if they already exist at the beginning of the method, as I only want to select one additional day (you could leave the subviews attached if you want them to remain in the UI). I don't track or store the dates or anything, but this meets my needs.
The idea is to simply create a view with the correct tile image you want to use, and one that contains the text label of the actual "date", like "14", then add those views as subviews to self. The borrowed code does all the calculations for "where" that date tile resides in the grid, so the view is drawn at the correct location. Code:
- (void)markDay:(int)day {
    // First, remove any old subviews
    [[self viewWithTag:42] removeFromSuperview];
    [[self viewWithTag:43] removeFromSuperview];

    int pre = firstOfPrev < 0 ? 0 : lastOfPrev - firstOfPrev + 1;
    int tot = day + pre;
    int row = tot / 7;
    int column = (tot % 7) - 1;

    TKCalendarMonthTiles *deliveryTile = [[TKCalendarMonthTiles alloc] init];
    deliveryTile.selectedImageView.image = [UIImage imageWithContentsOfFile:TKBUNDLE(@"TapkuLibrary.bundle/Images/calendar/MyDateTile.png")];
    deliveryTile.currentDay.text = [NSString stringWithFormat:@"%d", day];

    if (column < 0) {
        column = 6;
        row--;
    }

    CGRect r = deliveryTile.selectedImageView.frame;
    r.origin.x = (column * 46);
    r.origin.y = (row * 44) - 1;
    deliveryTile.selectedImageView.frame = r;
    deliveryTile.currentDay.frame = r;

    [[deliveryTile selectedImageView] setTag:42];
    [[deliveryTile currentDay] setTag:43];

    [self addSubview:deliveryTile.selectedImageView];
    [self addSubview:deliveryTile.currentDay];
} // markDay:
I call this method at the end of TKCalendarMonthView.m -> selectDay:day as well as at the end of TKCalendarMonthView.m -> reactToTouch:down. Limited testing, but so far so good. Off to figure out why the timezone setting keeps thinking it's tomorrow (I am in the Pacific time zone).
Cheers, Michael