In my app the iPhone is moved in two cases:
Moved in a linear direction
Moved in an arc
How can I detect which kind of movement is taking place, and when the direction of the movement changes?
You'll have to pull in the CoreMotion framework and start the device accelerometer and/or gyroscope.
CoreMotion Reference
What you're looking to get is CMAccelerometerData. This is done by instantiating a CMMotionManager object and calling startAccelerometerUpdatesToQueue:withHandler:
Something like this:
CMMotionManager *manager = [[CMMotionManager alloc] init];
manager.accelerometerUpdateInterval = 0.5; // half a second
NSOperationQueue *queue = [[NSOperationQueue alloc] init];
[manager startAccelerometerUpdatesToQueue:queue withHandler:^(CMAccelerometerData *accelerometerData, NSError *error) {
    double x = accelerometerData.acceleration.x;
    double y = accelerometerData.acceleration.y;
    double z = accelerometerData.acceleration.z;
    // post back to main queue to update UI
    dispatch_async(dispatch_get_main_queue(), ^{
    });
}];
You'll need to use some good old-fashioned geometry to detect arcs vs. lines.
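One rough heuristic, not from the answer above: an arc involves rotating the device, so the gyroscope reports a sustained rotation rate, while a purely linear move shows acceleration with near-zero rotation. A minimal sketch reusing the manager and queue from the snippet above; the 0.5 rad/s threshold is an arbitrary assumption you would need to tune against recordings of real gestures:
[manager startGyroUpdatesToQueue:queue withHandler:^(CMGyroData *gyroData, NSError *error) {
    CMRotationRate rate = gyroData.rotationRate;
    // Magnitude of the rotation-rate vector, in radians per second.
    double rotation = sqrt(rate.x * rate.x + rate.y * rate.y + rate.z * rate.z);
    // Hypothetical threshold: sustained rotation suggests an arc rather than a line.
    BOOL probablyArc = rotation > 0.5;
    NSLog(@"%@", probablyArc ? @"arc-like motion" : @"linear-like motion");
}];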
After seeing this question, I tried to code up a quick program that would save the watch's accelerometer and gyroscope data to a file.
@implementation InterfaceController {
    NSMutableArray *accData;
    BOOL recording;
}

- (void)awakeWithContext:(id)context {
    [super awakeWithContext:context];
    // Configure interface objects here.
    self.motionManager = [[CMMotionManager alloc] init];
    [self.motionManager setAccelerometerUpdateInterval:.01]; // 100 Hz
}

- (IBAction)startStopRecording {
    if (!recording) { // We are starting to record.
        recording = YES;
        accData = [[NSMutableArray alloc] init];
        [self.startRecording setTitle:@"Stop Recording"];
        [self.motionManager startAccelerometerUpdatesToQueue:[NSOperationQueue currentQueue] withHandler:^(CMAccelerometerData *accelerometerData, NSError *error) {
            [accData addObject:[NSString stringWithFormat:@"%f, %f, %f", accelerometerData.acceleration.x, accelerometerData.acceleration.y, accelerometerData.acceleration.z]];
        }];
    } else {
        recording = NO; // We are stopping the recording.
        [self.motionManager stopAccelerometerUpdates];
        [self.startRecording setTitle:@"Start Recording"];
        [InterfaceController openParentApplication:@{ @"accData": accData } reply:^(NSDictionary *replyInfo, NSError *error) { // This method saves the array to a CSV file.
            NSLog(@"Data has been saved.");
        }];
    }
}
I plotted this data and, for the life of me, no matter how hard I shook the watch, all my plots looked like this: [plot of an essentially flat, motionless trace]
Eight hours later, I started to suspect that I wasn't grabbing the acceleration data from the watch, but rather from the phone (sitting still on the table next to me). I ran some tests and confirmed that this was exactly what was happening.
Which leads me to the original question: how do I pull acceleration/gyro data from the watch and not from the iPhone?
The problem was that I wasn't running watchOS 2. I assumed I was, but it was still in beta and I hadn't installed it. The data I was getting was accelerometer data from the phone. Also, at the time of writing, you can only get accelerometer data from the watch using watchOS 2, and not gyro data.
You can use the CoreMotion framework to get activity data.
Note that I can only get accelerometer data; the gyro availability check often returns false.
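If by activity data you mean the classified motion activity (stationary, walking, running, and so on), a minimal sketch using CMMotionActivityManager might look like this; it's my own illustration rather than code from this thread:
#import <CoreMotion/CoreMotion.h>

CMMotionActivityManager *activityManager = [[CMMotionActivityManager alloc] init];
if ([CMMotionActivityManager isActivityAvailable]) {
    [activityManager startActivityUpdatesToQueue:[NSOperationQueue mainQueue]
                                     withHandler:^(CMMotionActivity *activity) {
        // Each CMMotionActivity carries boolean classifications and a confidence.
        if (activity.stationary) NSLog(@"stationary");
        if (activity.walking)    NSLog(@"walking");
        if (activity.running)    NSLog(@"running");
    }];
}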
Currently I'm trying to use CMAcceleration to get accelerometer values for the x, y, and z axes. Is it possible to get the acceleration time and acceleration magnitude from these x, y, and z values? When I checked the developer forum I couldn't find any delegates/methods that return these two. Guide me to find the values.
Using the following code:
int updatesensorFrequencyInterval = [sensorFrequencySliderTxt.text intValue];
self.motionManager.accelerometerUpdateInterval = updatesensorFrequencyInterval;
self.motionManager.gyroUpdateInterval = updatesensorFrequencyInterval;

[self.motionManager startAccelerometerUpdatesToQueue:[NSOperationQueue currentQueue]
                                         withHandler:^(CMAccelerometerData *accelerometerData, NSError *error) {
    [self outputAccelertionData:accelerometerData.acceleration];
    if (error) {
        NSLog(@"%@", error);
    }
}];
The x, y, z values of the CMAccelerometerData acceleration struct (CMAcceleration) are expressed in g, where 1 g is about 9.81 m/s^2; multiply by 9.81 to convert to m/s^2. The overall acceleration magnitude is the Euclidean norm sqrt(x^2 + y^2 + z^2).
CMAccelerometerData is a subclass of CMLogItem, so it inherits the timestamp property, an NSTimeInterval measured from when the phone was booted. This can be used to compute the elapsed time between any pair of samples.
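For illustration, a sketch of both computations inside the accelerometer handler; the previousTimestamp bookkeeping is my own addition, not part of the API:
__block NSTimeInterval previousTimestamp = 0;
[self.motionManager startAccelerometerUpdatesToQueue:[NSOperationQueue currentQueue]
                                         withHandler:^(CMAccelerometerData *data, NSError *error) {
    CMAcceleration a = data.acceleration;
    // Magnitude (Euclidean norm) of the acceleration vector, in g.
    double magnitudeInG = sqrt(a.x * a.x + a.y * a.y + a.z * a.z);
    double magnitudeInMS2 = magnitudeInG * 9.81; // convert g to m/s^2
    // Elapsed time since the previous sample, in seconds.
    NSTimeInterval elapsed = (previousTimestamp > 0) ? data.timestamp - previousTimestamp : 0;
    previousTimestamp = data.timestamp;
    NSLog(@"magnitude = %f m/s^2, dt = %f s", magnitudeInMS2, elapsed);
}];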
I'd like to write an app that detects whether the person is holding an iPhone or iPad level, or whether the device is angled along the x/y/z axes, and at what angle. I've seen many apps that provide similar functionality, but not much code.
Can someone point me to an online tutorial, or provide code that demonstrates these capabilities?
CoreMotion is the relevant framework. MotionGraphs is a great sample app that visualizes CoreMotion data nicely in real time.
As @timothykc noted, the MotionGraphs sample is a great example of using the CoreMotion library.
https://developer.apple.com/library/ios/samplecode/MotionGraphs/Introduction/Intro.html
Another nice example:
http://www.captechconsulting.com/blog/john-morrison/ios-getting-started-accelerometer-data
Here are the highlights:
1) Use a singleton pattern for accessing the CMMotionManager (this snippet is straight from the sample project).
@interface AppDelegate ()
{
    CMMotionManager *motionmanager;
}
@end

@implementation AppDelegate

- (CMMotionManager *)sharedManager
{
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        motionmanager = [[CMMotionManager alloc] init];
    });
    return motionmanager;
}

@end
2) Use this code to register for updates to the pitch/roll/yaw:
CMMotionManager *mManager = [(AppDelegate *)[[UIApplication sharedApplication] delegate] sharedManager];

APLDeviceMotionGraphViewController * __weak weakSelf = self;
if ([mManager isDeviceMotionAvailable] == YES) {
    [mManager setDeviceMotionUpdateInterval:0.1];
    [mManager startDeviceMotionUpdatesToQueue:[NSOperationQueue mainQueue] withHandler:^(CMDeviceMotion *deviceMotion, NSError *error) {
        // Access the pitch, roll, and yaw from the attitude and do something with them:
        //   deviceMotion.attitude.yaw
        //   deviceMotion.attitude.roll
        //   deviceMotion.attitude.pitch
    }];
}
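For the "is the device level" part of the question, one possible check (my own sketch, with an arbitrary tolerance) is whether pitch and roll are both near zero inside that handler:
// Inside the device-motion handler above; pitch and roll are in radians.
double tolerance = 0.05; // hypothetical tolerance, roughly 3 degrees
BOOL isLevel = fabs(deviceMotion.attitude.pitch) < tolerance &&
               fabs(deviceMotion.attitude.roll) < tolerance;
double pitchDegrees = deviceMotion.attitude.pitch * 180.0 / M_PI;
double rollDegrees = deviceMotion.attitude.roll * 180.0 / M_PI;
NSLog(@"level: %@ (pitch %.1f deg, roll %.1f deg)", isLevel ? @"YES" : @"NO", pitchDegrees, rollDegrees);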
I'm using the CMMotionManager to gather accelerometer data. I am trying to set the update interval to every half second with the following:
[_motionManager setDeviceMotionUpdateInterval:.5];
[_motionManager startAccelerometerUpdatesToQueue:[[NSOperationQueue alloc] init]
                                     withHandler:^(CMAccelerometerData *accelerometerData, NSError *error) {
    dispatch_async(dispatch_get_main_queue(), ^{
        [self performSelectorOnMainThread:@selector(update:) withObject:accelerometerData waitUntilDone:NO];
    });
}];
yet I receive updates far more frequently than every half second. Any idea why?
I wasn't setting the update interval for the accelerometer itself; setDeviceMotionUpdateInterval: only affects device-motion updates.
[_motionManager setAccelerometerUpdateInterval:.5];
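Each start…Updates family honors only its own interval property, so a corrected setup might look like this:
[_motionManager setAccelerometerUpdateInterval:.5]; // governs startAccelerometerUpdates...
[_motionManager setDeviceMotionUpdateInterval:.5];  // governs startDeviceMotionUpdates... only
[_motionManager startAccelerometerUpdatesToQueue:[[NSOperationQueue alloc] init]
                                     withHandler:^(CMAccelerometerData *accelerometerData, NSError *error) {
    // Now delivered roughly every half second.
}];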
Good day, here's what I am trying to do:
I have a photo-processing app that takes images using AVFoundation.
I have a DeviceMotion queue that processes device position at 60 Hz.
When an image is taken, it needs to be cropped and saved; DeviceMotion needs to keep running and the interface updated without delays.
What I am seeing is: updates to the interface from the DeviceMotion queue freeze for the duration of the image crop.
This is how I start updates for DeviceMotion:
self.motionManager.deviceMotionUpdateInterval = 1.0f / 60.0f;
gyroQueue = [[NSOperationQueue alloc] init];
[self.motionManager startDeviceMotionUpdatesToQueue:gyroQueue withHandler:^(CMDeviceMotion *motion, NSError *error) {
    [NSThread setThreadPriority:1.0];
    [self processMotion:motion withError:error];
}];
When an image is returned from AVFoundation it is added to a queue for processing:
imageProcessingQueue = [[NSOperationQueue alloc] init];
[imageProcessingQueue setName:@"ImageProcessingQueue"];
[imageProcessingQueue setMaxConcurrentOperationCount:1];

//[imageProcessingQueue addOperationWithBlock:^{
//    [self processImage:[UIImage imageWithData:imageData]];
//}];

NSInvocationOperation *operation = [[NSInvocationOperation alloc] initWithTarget:self selector:@selector(processImage:) object:[UIImage imageWithData:imageData]];
[operation setThreadPriority:0.0];
[operation setQueuePriority:NSOperationQueuePriorityVeryLow];
[imageProcessingQueue addOperation:operation];
and the method for processing the image:
- (void)processImage:(UIImage *)image {
    CGSize cropImageSize = CGSizeMake(640, 960);
    UIImage *croppedImage = [image resizedImageWithContentMode:UIViewContentModeScaleAspectFit bounds:cropImageSize interpolationQuality:kImageCropInterpolationQuality];
    NSData *compressedImageData = UIImageJPEGRepresentation(croppedImage, kJpegCompression);
    [self.doc addPhoto:compressedImageData];
}
The issue is:
DeviceMotion updates are blocked for the duration of the image crop when the image is processed via the NSOperationQueue.
If I process the image using performSelectorInBackground, it works as desired (no delays to the DeviceMotion queue):
[self performSelectorInBackground:@selector(processImage:) withObject:[UIImage imageWithData:imageData]];
Any ideas on where my understanding of background threading needs an update? :)
P.S. I asked this question earlier but it got nowhere, so this is a re-post.
I have found a solution (or a solid workaround) for this issue:
Instead of routing deviceMotion updates to a queue using startDeviceMotionUpdatesToQueue:withHandler:, I created a CADisplayLink timer. It does not interfere with the other background queues; since it matches the screen refresh rate, it is given high priority by its nature:
[self.motionManager startDeviceMotionUpdates];
gyroTimer = [CADisplayLink displayLinkWithTarget:self selector:@selector(processMotion)];
[gyroTimer addToRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
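With this approach the display link fires once per frame and the target polls the latest sample itself. A minimal processMotion could look like the sketch below; the nil check is my own defensive assumption for the first frames before a sample arrives:
- (void)processMotion {
    // With startDeviceMotionUpdates (no queue/handler), CoreMotion simply
    // keeps the deviceMotion property up to date for polling.
    CMDeviceMotion *motion = self.motionManager.deviceMotion;
    if (motion == nil) {
        return; // no sample available yet
    }
    [self processMotion:motion withError:nil];
}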